Following on from my comment in the February issue of DW, I thought it worthwhile to spend a few moments contemplating just how much, or how little, companies (as well as individuals) can learn from the history books. Thankfully, major health scares such as the current coronavirus pandemic are few and far between. Nevertheless, they do occur, and organisations might do well to spend a little time, after we have weathered the storm, thinking about the future in terms of the human aspect of their business. Yes, AI and robots are, we hope, immune to illness, but humans are still a vital part of any organisation, and large-scale illness is something that might just need a bit more planning for in the future, at least as part of a business continuity/disaster recovery plan, where the emphasis tends to be on the machines, not the humans.
Financially, there are always peaks and troughs in economies – both individual ones and the global one. These are difficult to predict – otherwise economists would nearly always be in agreement, something they rarely are! So it is difficult to plan for them. However, the current situation might just encourage more businesses to keep more money in reserve, ‘just in case’. And, just maybe, shareholders will not be quite so demanding of seemingly endless, ever-increasing dividends if paying them leaves the company coffers bare when recession or unforeseen events threaten to derail the economic landscape.
More generally, students of history can tell anyone who cares to listen that history really does have a habit of repeating itself. Unfortunately, there is much uncertainty as to when these repetitions might occur, but recur they do.
For example, there will always be threats to national, regional and global security. The 20th century was rather volatile on this front. And when the Communist empire finally collapsed and everyone imagined eternal, global peace, along came the terrorist threat from the Middle East. More recently, cyber warfare would appear to be a growing concern.
Add to this environmental stresses – political commentators love to say that many of the wars of the future will be fought over water (access to it, or the lack of it, in certain locations) – other aspects of climate change, denied by the most powerful leader in the world (!), the odd health scare, the ongoing migration crises in parts of the world, and various other events, all of which have been witnessed previously in some shape or form (yes, even climate change has had a major impact over the years), and the future is not so certain.
Of course, planning for a series of ‘might happens’ is not that easy, but maybe, just maybe, the present global crisis will serve as a timely warning for countries, corporations and individual citizens that we can no longer just keep our heads down, stick our fingers firmly in our ears (having washed them well beforehand!) and carry on regardless.
Integration challenges continue to slow down digital transformation initiatives, as many organisations don’t have a company-wide API strategy.
MuleSoft has published the findings of the 2020 Connectivity Benchmark Report on the state of IT and digital transformation. The global survey of 800 IT decision makers (ITDMs) in organisations with at least 1,000 employees revealed that more than half (59%) of organisations were unable to deliver on all their projects last year, creating a backlog for 2020.
“Businesses are under increasing pressure to digitally transform as a failure to do so risks negatively impacting revenues. However, traditional IT operating models are broken, forcing organisations to find new ways of accelerating project delivery and reusing integrations,” said Ian Fairclough, vice-president of services, EMEA at MuleSoft. “For organisations with hundreds of different applications, integration remains a significant challenge towards them being able to deliver the connected experiences customers strive for. This report highlights that while ITDMs recognise the value of APIs, many organisations have yet to fully realise their potential.”
Businesses are failing to capture the full value of APIs without a company-wide strategy: The vast majority of organisations understand the power and potential of APIs – 80% currently use public or private APIs. However, very few have developed a strategic approach to enabling API usage across the business.
API reuse is directly linked to speed of innovation, operational efficiency and revenue
By establishing API strategies that promote self-service and reuse, businesses put themselves in a much better position to innovate at speed, increase productivity and open up new revenue streams. However, only 42% of ITDMs (30% in the UK) are leveraging APIs to increase the efficiency of their application development processes.
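The reuse idea behind an API-led strategy can be made concrete with a small sketch. The names here (`CustomerAPI`, the "team" functions) are entirely hypothetical and illustrative, not taken from the report: one team publishes an API over a system of record once, and other teams consume it instead of each building their own point-to-point integration.

```python
# A minimal, hypothetical sketch of API reuse: validation and lookup
# logic live in one shared API rather than in every consuming team.

class CustomerAPI:
    """A reusable internal API wrapping a system of record."""

    def __init__(self, records):
        # 'records' stands in for the underlying system of record.
        self._records = records

    def get_customer(self, customer_id):
        # Single, shared integration point.
        if customer_id not in self._records:
            raise KeyError(f"unknown customer: {customer_id}")
        return self._records[customer_id]


# Two different "teams" reuse the same API rather than re-integrating.
def billing_team_lookup(api, customer_id):
    return api.get_customer(customer_id)["billing_email"]


def support_team_lookup(api, customer_id):
    return api.get_customer(customer_id)["name"]


api = CustomerAPI({"c1": {"name": "Acme Ltd",
                          "billing_email": "ap@acme.example"}})
print(billing_team_lookup(api, "c1"))  # ap@acme.example
print(support_team_lookup(api, "c1"))  # Acme Ltd
```

Each new consumer of the API is a few lines of code rather than a fresh integration project, which is the productivity effect the survey figures point to.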
“CIOs are uniquely positioned to lead their organization’s digital transformation. IT leaders across all industries must be focused on creating a new operating model that accelerates the speed of delivery, increases organizational agility and delivers innovation at scale,” said Simon Parmett, CEO, MuleSoft. “With an API-led approach, CIOs can change the clock speed of their business and emerge as the steward of a composable enterprise to democratize access to existing assets and new capabilities.”
Study highlights lack of investment in areas most likely to have a positive CX impact.
Organizations must tear down the walls between IT and the business and make more customer-centric investments if they are to improve customer experience (CX), according to new research from Pegasystems Inc. Pega’s 2020 Global Customer Experience Study was conducted among decision makers spanning 12 countries and seven different industries by research firm Savanta.
The study highlighted four key pain points businesses must address if they are to provide a better, more personalized customer experience, successfully differentiate themselves from the competition, and improve customer satisfaction and loyalty:
Infoblox has published new research that exposes the significant threat posed by shadow IoT devices on enterprise networks. The report, titled “What’s Lurking in the Shadows 2020”, surveyed 2,650 IT professionals across the US, UK, Germany, Spain, the Netherlands and UAE to understand the state of shadow IoT in modern enterprises.
Shadow IoT devices are defined as IoT devices or sensors in active use within an organisation without IT’s knowledge. They can be any number of connected technologies, including laptops, mobile phones, tablets, fitness trackers or smart home gadgets like voice assistants, that are managed outside of the IT department. The survey found that over the past 12 months, a staggering 80% of IT professionals discovered shadow IoT devices connected to their network, and nearly one third (29%) found more than 20.
The report revealed that, in addition to the devices deployed by the IT team, organisations around the world have countless personal devices, such as personal laptops, mobile phones and fitness trackers, connecting to their network. The majority of enterprises (78%) have more than 1,000 devices connected to their corporate networks.
“The amount of shadow IoT devices lurking on networks has reached pandemic proportions, and IT leaders need to act now before the security of their business is seriously compromised,” said Malcolm Murphy, Technical Director, EMEA at Infoblox.
“Personal IoT devices are easily discoverable by cybercriminals, presenting a weak entry point into the network and posing a serious security risk to the organisation,” he added. “Without a full view of the security policies of the devices connected to their network, IT teams are fighting a losing battle to keep the ever-expanding network perimeter safe.”
Nearly nine in ten IT leaders (89%) were particularly concerned about shadow IoT devices connected to remote or branch locations of the business.
“As workforces evolve to include more remote and branch offices and enterprises continue to go through digital transformations, organisations need to focus on protecting their cloud-hosted services the same way in which they do at their main offices,” the report recommends. “If not, enterprise IT teams will be left in the dark and unable to have visibility over what’s lurking on their networks.”
To manage the security threat posed by shadow IoT devices to the network, 89% of organisations have introduced a security policy for personal IoT devices. While most respondents believe these policies to be effective, levels of confidence range significantly across regions. For example, 58% of IT professionals in the Netherlands feel their security policy for personal IoT devices is very effective, compared to just over a third (34%) of respondents in Spain.
“Whilst it’s great to see many organisations have IoT security policies in place, there’s no point in implementing policies for their own sake if you don’t know what’s really happening on your network,” Murphy said. “Gaining full visibility into connected devices, whether on premises or while roaming, as well as using intelligent systems to detect anomalous and potentially malicious communications to and from the network, can help security teams detect and stop cybercriminals in their tracks.”
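The visibility gap Murphy describes comes down to comparing what is actually on the network with what IT believes is on it. The sketch below is a hypothetical illustration of that one step, with hard-coded example data; a real deployment would feed it from DHCP leases, DNS logs or ARP tables rather than a static list.

```python
# Hypothetical sketch: flag "shadow" devices by comparing addresses seen
# on the network against an approved inventory of managed devices.

def find_shadow_devices(seen_devices, approved_inventory):
    """Return devices observed on the network but absent from inventory."""
    # Normalise MAC addresses so case differences don't hide matches.
    approved = {mac.lower() for mac in approved_inventory}
    return [d for d in seen_devices if d["mac"].lower() not in approved]


# Entirely illustrative example data.
seen = [
    {"mac": "AA:BB:CC:00:00:01", "hostname": "printer-3f"},
    {"mac": "AA:BB:CC:00:00:99", "hostname": "smart-speaker"},  # unmanaged
]
inventory = ["aa:bb:cc:00:00:01"]

shadow = find_shadow_devices(seen, inventory)
print([d["hostname"] for d in shadow])  # ['smart-speaker']
```

Anything flagged this way can then be checked against the organisation's IoT policy before it becomes the weak entry point the report warns about.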
Global research highlights how organisations are capitalising on emerging technologies to enhance finance and operations for competitive advantage.
Organisations that are adopting Artificial Intelligence (AI) and other emerging technologies in finance and operations are growing their annual profits 80 percent faster, according to a new study from Enterprise Strategy Group and Oracle. The global study, Emerging Technologies: The competitive edge for finance and operations, surveyed 700 finance and operations leaders across 13 countries and found that emerging technologies – AI, Internet of Things (IoT), blockchain, digital assistants – have passed the adoption tipping point, exceed expectations, and create significant competitive advantage for organisations.
AI and Digital Assistants Improve Accuracy and Efficiency in Finance
Organisations embracing emerging technologies in finance are experiencing benefits far greater than anticipated:
AI, IoT, and Blockchain Drive More Responsive Supply Chains
AI, IoT, blockchain and digital assistants are helping organisations improve accuracy, speed and insight in operations and the supply chain, and respondents expect additional business value as blockchain applications become mainstream.
Emerging Tech Equals Competitive Advantage
The vast majority of organisations have now adopted emerging technologies and early adopters (those using three or more solutions) are seeing the greatest benefit and are more likely to outperform competitors.
A study from Juniper Research has found that total operator-billed revenue from 5G IoT connections will reach $8 billion by 2024, rising from $525 million in 2020 – growth of over 1,400% in just five years. The report identified the automotive and smart cities sectors as key growth drivers for 5G adoption over that period.
The new research, 5G Networks in IoT: Sector Analysis & Impact Assessment 2020-2025, anticipates that revenue from these 5G connections will be a highly sought-after new revenue stream for operators. It notes that 5G IoT connections represent new connections, and so will not cannibalise existing operator connectivity revenue from current IoT technologies.
5G Value-Added Services Key for Operators
The research urges operators to develop comprehensive value-added services that enable IoT service users to manage their 5G connections. It forecasts that tools such as network slicing and multi-access edge computing solutions will be essential to attracting the highest-spending IoT service users to their 5G networks.
The research also forecasts that value-added services will become crucial in the automotive and smart cities sectors, which it expects to account for 70% of all 5G IoT connections by 2025, with higher than anticipated levels of device support for 5G radios accelerating the uptake of 5G connectivity.
Maximising the New Revenue Stream
The research claims that the initial high pricing of 5G connectivity in the IoT sector will dissuade all but high-value IoT users. It urges operators to roll out holistic network management tools that complement the enhanced IoT capabilities of 5G networks.
Research author Andrew Knighton remarked: “Management tools for the newly-enabled services are key for users managing large-scale deployments. We believe that only 5% of 5G connections will be attributable to the IoT, but as these are newly enabled connections, operators must view them as essential to securing a return on their 5G investment.”
New data from Extreme Networks reveals that IoT is barreling toward the enterprise, but organisations remain highly vulnerable to IoT-based attacks.
The report, which surveyed 540 IT professionals across industries in North America, Europe, and Asia Pacific, found that 84% of organisations have IoT devices on their corporate networks. Of those organisations, 70% are aware of successful or attempted hacks, yet more than half do not use security measures beyond default passwords. The results underscore the vulnerabilities that emerge from a fast-expanding attack surface and enterprises’ uncertainty in how to best defend themselves against breaches.
Key findings include:
● Organisations lack confidence in their network security: 9 out of 10 IT professionals are not confident that their network is secured against attacks or breaches. Financial services IT professionals are the most concerned about security, with 89% saying they are not confident their networks are secured against breaches. This is followed by the healthcare industry (88% not confident), then professional services (86% not confident). Education and government are the least concerned of any sector about their network being a target for attack.
● Enterprises underestimate insider threats: 55% of IT professionals believe the main risk of breaches comes mostly from outside the organisation and over 70% believe they have complete visibility into the devices on the network. But according to Verizon’s 2019 Data Breach Investigations Report, insider and privilege misuse was the top security incident pattern of 2019, and among the top three causes of breaches.
● Europe’s IoT adoption catches up to North America: 83% of organisations in EMEA are now deploying IoT, compared to 85% in North America, which was an early adopter. Greater IoT adoption across geographies is quickly expanding the attack surface.
● Skills shortage and implementation complexity cause NAC deployments to fail: NAC is critical to protect networks from vulnerable IoT devices, yet a third of all NAC deployment projects fail. The top reasons for unsuccessful NAC implementations are a lack of qualified IT personnel (37%), too much maintenance cost/effort (29%), and implementation complexity (19%).
● SaaS-based networking adoption grows: 72% of IT professionals want network access to be controlled from the cloud. This validates 650 Group’s prediction that more than half of enterprise network systems will transition to SaaS-based networking by the end of 2023.
Extreme provides the multi-layered security capabilities that modern enterprises demand, from the wireless and IoT edge to the data centre, including role-based access control, network segmentation and isolation, application telemetry, real-time monitoring of IoT, and compliance automation. As the mass migration of business systems to the cloud continues, cloud security becomes ever more important. Extreme’s security solutions extend in lockstep with the expanding network perimeter to harden enterprises’ environments both on-premises and in the cloud.
Ivanti has released survey results highlighting the challenges faced by IT organisations when it comes to aligning their IT Service Management (ITSM) and IT Asset Management (ITAM) processes
According to data collected by Ivanti, 43% of IT professionals surveyed reported using spreadsheets as one of their resources to track IT assets. Further, 56% do not currently manage the entire asset lifecycle, risking redundant assets and unnecessary, costly purchases.
Findings from the survey demonstrate the need for greater alignment between ITSM and ITAM processes, especially when looking at the time spent reconciling inventory/assets. Nearly a quarter of respondents reported spending hours per week on this process. Another time-intensive process for IT professionals is dealing with out-of-warranty/out-of-support-policy assets, with 28% of respondents reporting they spend hours per week supporting these assets. And, when asked how often they have spent time fixing devices that were later identified to still be under warranty, 50% of respondents said “sometimes.”
“It’s clear that there is room for improvement when it comes to managing assets,” said Ian Aitchison, senior product director at Ivanti. “While IT teams are starting to better track their assets, collaborating with other teams and understanding the benefits of combining asset and service processes, time and money advantages are being lost as they don't have the data they need to effectively manage and optimise their assets and services.”
When asked about the benefits of combining ITSM and ITAM processes, the survey found that respondents expected to see:
Aitchison added, “When ITSM and ITAM are closely aligned and integrated, many activities and processes become more automated, efficient and responsive, with fewer things ‘falling through the cracks.’ IT teams gain more insight and are better positioned to move from reactive activities to more proactive practices, delivering higher service levels and efficiency at lower costs.”
According to the survey, IT professionals are also somewhat dissatisfied with the available asset information, or data, they have access to within their organisations. When asked if they incorporate and monitor purchase data, contracts and/or warranty data as part of their IT asset management program, 39% of respondents said yes, 42% said partially and 19% said no. This means more than 60% of IT professionals are missing key information in their IT asset management program to effectively manage their IT assets from cradle to grave.
Over 80% of C-Suite, business and IT decision makers believe that digital performance is critical to business growth.
Riverbed has published the results of a survey which found that over 70% of C-Suite decision makers believe business innovation and staff retention are driven by improved visibility into network and application performance. The survey findings unveiled in the ‘Rethink Possible: Visibility and Network Performance – The Pillars of Business Success’ report cites a positive correlation between effective technology and company health, a finding that is supported by the fact that 86% of C-Suite and IT decision makers (ITDMs), and 87% of business decision makers (BDMs), reveal that digital performance is increasingly critical to business growth.
Slow running systems and a lack of visibility
Three-quarters of all groups surveyed have felt frustrated by current network performance, with IT infrastructure being given as the key reason for the poor performance. This problem is exacerbated by a lack of full and consistent visibility, as one in three ITDMs report that they don’t have full visibility over applications, their networks and/or end-users. Furthermore, almost half of the C-Suite (49%) believe that slow running and outdated technology is directly impacting the growth of their businesses. This highlights the importance of implementing new technology to drive productivity, creativity and innovation.
Business priorities and challenges
Business priorities and challenges are evolving, so the technology businesses rely on needs to advance too. Over three quarters (76%) of ITDMs acknowledge that their IT infrastructure will have to change dramatically in the next five years to support new ways of doing business, and fully 95% of all respondents recognise that innovation and breaking boundaries are crucial to business success, emphasising the need to embrace new technology. This may be why 80% of BDMs and 77% of the C-Suite believe that investing in next-generation technology is vital.
Commenting on the research findings, Colette Kitterhing, Senior Director UK&I at Riverbed Technology, said: “All leaders recognise that visibility, optimised network infrastructure, and the ability to accelerate cloud and SaaS performance are the next frontier in business success. Given this, it’s time the C-Suite, business decision leaders, and IT decision makers come together to invest in the right solutions, prioritise measurement, and place visibility and infrastructure at the top of their agenda.”
Kitterhing continued: “At Riverbed, we are helping businesses evolve their capabilities, whether it’s by monitoring networks and the apps that run on them, application performance, or updating the network infrastructure that underpins their digital services. We fundamentally believe this is the key to supporting our customers’ staff and their ambitions, driving innovation, creativity, and helping them Rethink what’s possible.”
Rethink Possible: Evolving the Digital Experience
With over 80% of all leaders (82% C-Suite, 84% BDMs and ITDMs) agreeing that businesses must rethink what’s possible to survive in today’s unpredictable world, technology needs to be an enabler in the process. Riverbed’s portfolio of next-generation solutions is giving customers across the globe the visibility, acceleration, optimization and connectivity that maximizes performance and visibility for networks and applications.
FireMon has released its 2020 State of Hybrid Cloud Security Report, the annual benchmark of the cloud security landscape.
The latest report finds that as enterprises rapidly transition to the public cloud, complexity is increasing while visibility and team sizes are decreasing and security budgets remain flat, posing a significant obstacle to preventing data breaches. The 2020 State of Hybrid Cloud Security Report features insights from more than 500 respondents, including 14 percent from the executive ranks, detailing cloud security initiatives in the era of digital transformation.
“As companies around the world undergo digital transformations and migrate to the cloud, they need better visibility to reduce network complexity and strengthen security postures,” said Tim Woods, VP of Technology Alliances for FireMon. “It is shocking to see the lack of automation being used across the cloud security landscape, especially in light of the escalating risk around misconfigurations as enterprises cut security resources. The new State of Hybrid Cloud Security Report shows that enterprises are most concerned about these challenges, and we know that adaptive and automated security tools would be a welcomed solution for their needs.”
Cloud Adoption, Complexity and Scale Create Security Challenges
As enterprises increasingly transition to public and hybrid cloud environments, their network complexity continues to grow, creating security risks. Meanwhile, they are losing the visibility needed to protect their cloud systems (the biggest concern, cited by 18 percent of C-suite respondents), while also requiring more vendors and enforcement points for effective security.
The 2020 FireMon State of Hybrid Cloud Security Report found that:
Business acceleration outpaces effective security implementations:
Cloud Growth Outpaces Security: Nearly 60 percent believed their cloud deployments had surpassed their ability to secure the networks in a timely manner. This number was virtually unchanged from 2019, showing no improvement against a key industry progress indicator.
Cloud Complexity Increases: The number of vendors and enforcement points needed to secure cloud networks is also increasing; 78.2 percent of respondents are using two or more enforcement points, up substantially from 59 percent last year. Meanwhile, almost half are using two or more public cloud platforms, which further increases complexity and decreases visibility.
Shrinking Budgets and Security Teams Create Gaps in Protection
Despite increasing cyberthreats and ongoing data breaches, respondents also reported a substantial reduction in their security budgets and teams from 2019. These shrinking resources are creating gaps in public cloud and hybrid infrastructure security.
Budget Reductions Increase Risk: There was a 20.7 percent increase from 2019 in the number of enterprises spending less than 25 percent of their total security budget on cloud security; 78.2 percent now spend less than 25 percent (vs. 57.5 percent in 2019). Meanwhile, 44.8 percent of this group spent less than 10 percent of their total security budget on the cloud.
Security Teams are Understaffed and Overworked: While the cyberattack surface and potential for data breaches continue to expand in the cloud, many organisations trimmed the size of their security teams – 69.5 percent had security teams of fewer than 10 people (compared with 52 percent in 2019). The number of five-person security teams also nearly doubled, with 45.2 percent having this smaller team size versus 28.5 percent in 2019.
Lack of Automation and Third-Party Integration Fuels Misconfigurations
While cloud misconfigurations due to human-introduced errors remain the top vulnerability for data breaches, an alarming 65.4 percent of respondents are still using manual processes to manage their hybrid cloud environments. Other key automation findings included:
Misconfigurations are Biggest Security Threat: Almost a third of respondents said that misconfigurations and human-introduced errors are the biggest threat to their hybrid cloud environment. However, 73.5 percent of this group are still using manual processes to manage the security of their hybrid environments.
Better Third-Party Security Tools Integration Needed: The lack of automation and integration across disparate tools is also making it harder for resource-strapped security teams to secure hybrid environments. As such, 24.5 percent of respondents said that not having a “centralised or global view of information from their security tools” was their biggest challenge to managing multiple network security tools across their hybrid cloud.
By harnessing automated network security tools, robust API structures and public cloud integrations, enterprises can gain real-time control across all environments, minimising the challenges created by manual processes, increasing complexity and reduced visibility. Automation is also the antidote to shrinking security budgets and teams, enabling organisations to maximise resources and personnel for their most strategic uses.
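To illustrate the kind of automation the report calls for, the sketch below checks security-group-style firewall rules for one common human-introduced misconfiguration: sensitive ports left open to the world. The rule format and names here are hypothetical, not from any specific cloud provider's API; a real tool would pull live rules via the provider's SDK and run continuously rather than against a hard-coded list.

```python
# Hypothetical sketch: automated scan of firewall-style rules for a
# common misconfiguration (sensitive ports exposed to 0.0.0.0/0).

SENSITIVE_PORTS = {22, 3389}  # SSH and RDP as example sensitive services


def find_open_sensitive_rules(rules):
    """Return rules exposing sensitive ports to the whole internet."""
    return [
        r for r in rules
        if r["cidr"] == "0.0.0.0/0" and r["port"] in SENSITIVE_PORTS
    ]


# Entirely illustrative rule set.
rules = [
    {"name": "web-in", "port": 443, "cidr": "0.0.0.0/0"},       # fine
    {"name": "ssh-anywhere", "port": 22, "cidr": "0.0.0.0/0"},  # risky
    {"name": "ssh-office", "port": 22, "cidr": "10.0.0.0/8"},   # fine
]

flagged = find_open_sensitive_rules(rules)
print([r["name"] for r in flagged])  # ['ssh-anywhere']
```

A check like this, run automatically on every change, catches in seconds the sort of misconfiguration that manual review of hundreds of rules routinely misses.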
SolarWinds has released the findings of its “2019 Trends in Managed Services” report, showing the health of managed services and the forces shaping the market across North America and Europe.
SolarWinds MSP partners with The 2112 Group to create the annual report, based on data from its benchmarking tool, MSP Pulse, which provides deep insight into potential growth opportunities for IT providers, areas where they can excel or improve, and how they might invest in their business. The comprehensive report compares revenue, profit, service selection, sales capacity, customer engagement, and growth potential.
“Our research has revealed that 97% of respondents offer some form of managed services—which is a clear demonstration of the managed services transformation remaking the technology channel. It’s safe to say the state of managed services is strong,” said John Pagliuca, president of SolarWinds MSP. “This latest research also underscored the major growth opportunities we’re already looking to help our MSPs leverage including automation, security, and operations. With our customer success-focused initiatives like MSP Pulse, the MSP Institute, and Empower MSP, we’re supporting partners with much more than technology. We’re working to fuel the success of our MSPs in 2020 and beyond. The future looks bright for those who want to expand their comfort zone, and we’re here to help them do just that.”
The results show that MSPs are comfortable with the security basics such as antivirus, backup, and firewalls:
For solutions in North America, respondents were most comfortable offering and using antivirus (89%), firewalls (83%), data backup and recovery (81%), and endpoint security (75%).
In Europe, respondents were most comfortable offering and using antivirus (93%), data backup and recovery (82%), firewalls (82%), and antispam (80%) as solutions.
However, MSPs have room for growth in some of the more advanced security solutions and offerings, as respondents were less confident in the more complex controls:
European and North American respondents selected the same top three solutions they were least comfortable with: biometrics, cloud access security brokers (CASBs), and digital rights management.
On the services end, European respondents were least comfortable with penetration testing (52%), auditing and compliance management (39%), and risk assessments (36%). North American respondents were least comfortable with auditing and compliance management (53%), penetration testing (47%), and security system architecture (39%).
The results also showed MSPs are starting to increase the use of automation to handle day-to-day tasks such as patch management and backup, but don’t feel comfortable with automating the advanced tasks:
Automation saves North American MSPs an average of 15.6 full-time employee hours per week and in Europe, an average of 23 full-time employee hours per week.
In North America, respondents were least comfortable automating client onboarding (44%), with identity and access management in second place (38%). In Europe, respondents were least comfortable automating SQL query workflows (57%) but shared their North American counterparts’ discomfort with automating identity and access management.
In the 2018 report, MSPs were losing customers almost as fast as they gained them, but 2019 showed an improvement in customer retention. Two of the top three reasons for losing customers stemmed from the customer rather than the service provider:
In North America, respondents pick up an average of four clients every three months while losing one in the same period.
In Europe, respondents pick up an average of three clients every two months while losing more than one on average in the same period.
Top causes of customer loss included the customer going out of business (26% in North America and 16% in Europe) or being fired by the partner (25% in North America and 16% in Europe).
Another key finding showed core business operations are still amongst the biggest growth obstacles for MSPs including lack of resources/time, sales, and marketing:
North American MSPs claimed their biggest obstacles toward growth were sales (43%), lack of resources/time (42%), and marketing (26%).
European MSPs claimed their biggest obstacles toward growth were lack of resources/time (41%), sales (32%), and security threats (32%).
Many providers claim a lack of sales and marketing expertise is a major anchor on their growth; hiring specialized staff or training existing employees could help close the gap. The SolarWinds MSP Institute, a learning portal within the SolarWinds Customer Success Center, gives SolarWinds partners access to a variety of courses, including sales and marketing tracks, to help coach users in the management and growth of their business, and provides go-to-market strategies and advice on these issues.
“We’re so pleased to be partnering with SolarWinds MSP again on the annual Trends in Managed Services report,” stated Larry Walsh, CEO of The 2112 Group. “It gives MSPs an opportunity to see where the IT channel currently stands, where the opportunities are for growth and where they could improve as it pinpoints the crucial trends shaping the managed services market.”
A strong Q4 ensured that the four largest colocation FLAP markets of Frankfurt, London, Amsterdam and Paris reached a combined 201MW in take-up for 2019 – a new European record, according to the Q4 European Data Centres Report from CBRE, the world's leading data centre real estate advisor.
CBRE analysis shows that Frankfurt was the star performer during 2019, responsible for 45% of the 201MW. This is both the first time Frankfurt has topped the annual rankings and the first time any European market has seen 90MW of take-up in a single year. Furthermore, half of the pre-let capacity in facilities under construction also took place in Frankfurt.
CBRE expects that take-up in the core FLAP markets will remain high at 200MW for the next two years as the hyperscale companies continue to utilise wholesale colocation services in these markets.
The developer-operators, confident of prolonged strong demand, are now building at a larger scale than ever before to secure these requirements. To this end, 314MW of new capacity was brought online in the FLAP markets during 2019, equating to 24% market growth. This new capacity would require EUR 2.2bn of capital to deliver.
Mitul Patel, Head of EMEA Data Centre Research at CBRE, commented: “The data centre market in Europe continues to grow like no other and represents one of the most exciting asset classes anywhere. The 201MW procured in 2019, as well as pre-lets and optioned capacity, is a remarkable achievement. Our expectation is that this level of activity will continue as the hyperscalers continue their accelerated procurement of data centre services.”
“As the hyperscale companies roll out their services more widely across Europe, the rate of hyperscale procurement of data centre capacity in other European cities will increase rapidly. As a result, markets such as Madrid, Milan, Warsaw and Zurich, will witness a substantial increase in colocation activity.”
The third annual Verizon Mobile Security Index takes a deep dive into the state of mobile security, looking at different types of threats and offering tips to protect your environment.
The third annual Verizon Mobile Security Index finds that a large number of organizations are still compromising mobile security to get things done, which can leave entities at risk. About four out of 10 respondents (43 percent) reported their organization had sacrificed mobile security in the past year. Those that did were twice as likely to suffer a compromise.
In fact, the study found that 39 percent of respondents reported having a mobile-security-related compromise. Sixty-six percent of organizations that suffered a compromise called the impact “major,” and 55 percent said the compromise they experienced had lasting repercussions.
“In today’s world, mobile connectivity is more important than ever. Organizations of all sizes and in all industries rely on mobile devices to run much of their day-to-day operations, so mobile security is a priority,” said Bryan Sartin, executive director, global security services with Verizon. “The types of devices, diverse applications and further emergence of IoT devices further complicate security. Everyone has to be deliberate and diligent about mobile security to protect themselves and their customers.”
Because mobile attacks aren’t industry specific, this year’s Verizon Mobile Security Index 2020 features supplemental vertical reports in key segments including: financial services; healthcare; manufacturing; public sector; retail and small and medium business. The report also discusses the importance of mobile security in pivotal technologies like cloud and IoT and how the emergence of 5G will impact security. And with 80 percent of organizations saying that mobile will be their primary means of accessing cloud services within five years, now is the time to home in on mobile security.
The obvious question is: what should organizations do? The report highlights users, apps, devices and networks as the four key mobile attack vectors. The report includes a number of tips on how organizations can safeguard against mobile security threats, including establishing a “security-first” focus, developing and enforcing policies and encrypting data over unsecured networks.
2020 is likely to be a time of uncertainty in both economic and business cycles; customers are more demanding than ever, and channels need to understand the markets and where to apply investment and resources.
With global GDP rising gently (Goldman Sachs forecasts around 3.4% worldwide growth for 2020; Europe will be lower, at around 1.1%, with Germany, Italy and the UK shrinking, and issues in France), IT sales across Europe can be expected to grow. They have been rising as enterprises see the need to drive efficiency and keep up with the competition, and as SMBs increasingly turn to technology. There have been minor wobbles in confidence, particularly in the UK market, but the trend is up, especially towards more targeted solutions. It will place more of a burden on channels, however, as the requirement is for more sophisticated and integrated solutions, which in turn place more demands on the channel’s resources and its ability to successfully sell advanced ideas.
At the same time, customers are asking about the use of technologies such as AI, analytics, blockchain and the many varieties of security. Any supplier or channel player not able to talk convincingly about these technologies may find themselves displaced by an upstart.
This message was powerfully driven home by Microsoft research at the end of 2019. It reported that while customers had previously looked at using tools from Microsoft and others, organisations are “moving beyond adopting the latest applications and are developing their own proprietary digital capabilities to help propel success and gain a competitive advantage”.
According to a Gartner survey, operational excellence, cost, digital and growth are top of mind in CIO plans for 2020, well above other concerns such as cyber security, innovation and recruiting.
Your customers don’t want the same as everyone else; they want to build and develop their own solutions, even their own intellectual property (IP). There’s widespread agreement that the applied use of a “creative, entrepreneurial mindset to invent new digital capabilities using advanced technologies such as AI and IoT — will have a significant impact on global communities and organizational culture”.
This is where managed services play their part: they are scalable, come with known, fixed costs, and make it possible to explore new technologies through pilots.
The global managed services market was valued at $166.8bn in 2018 and is expected to reach $319.5bn by 2024, registering a CAGR of 11.5% during the 2019-2024 forecast period, says researcher Mordor Intelligence. The market for managed services will be fuelled by the increasing shortage of expertise as businesses become more technology oriented. Furthermore, rapid digitalization requires companies to continually innovate and upgrade their infrastructure to remain competitive.
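Figures like these can be sanity-checked with the standard compound annual growth rate formula, CAGR = (end/start)^(1/n) − 1. A minimal sketch (the six-year window from the 2018 base to 2024 is an assumption for illustration, not stated by the researcher):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` periods."""
    return (end / start) ** (1 / years) - 1

# Mordor Intelligence figures: $166.8bn (2018) rising to $319.5bn (2024)
rate = cagr(166.8, 319.5, 6)
print(f"{rate:.1%}")  # roughly 11.4%, consistent with the cited 11.5% CAGR
```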
Industry trade body CompTIA surveyed members in late 2019 and found general agreement on the areas of growth to 2022: managed services, IoT, data, security and consulting services. Managed services was out in front, with some 82% agreeing that it was a growth engine: 35% expected it to grow significantly, half moderately and 14% expected it to stay the same.
But the issue of purchasing such services remains a source of difficulty for customers. As Gartner told the Managed Services Summits in London and Amsterdam last year, people are more tech savvy than ever before, which means it’s not just the IT department that makes technology choices. And the internet now opens up many more sources of information and access to trusted sources (independent sites) which used to cost a fortune.
But with all that complexity and wealth of information comes a problem: the buyer has to sort through so much of it, and worse still, often can’t find the right information to make a quick buying decision.
And so the buying cycle becomes fluid and sometimes erratic. The net result: buying takes 97% longer than planned, according to a Gartner survey last year. What’s more, Gartner surveys show that up to 59% of buyers cancelled at least one buying effort.
And the conversations are becoming more detailed, going to the core of the customer’s business. A lively panel at the Managed Services and Hosting Summit Manchester on 30 October 2019 offered some useful ideas on meeting current customer challenges. Nigel Church from MSP First Solution: “We’re speaking to more customers about digital transformation and enablement processes, and we have seen a shift, starting with conversations about cloud and then going onto business strategy.”
So the Managed Services Summits in Amsterdam and London this year aim to reflect the new pressures on MSPs to engage successfully with customers and offer new types of service, in security, in data, in newer technologies such as AI.
Concerns that the assumptions underpinning organizational strategy may be outdated or misaligned to current growth objectives topped business leaders’ concerns in Gartner, Inc.’s latest Emerging Risks Monitor Report.
Gartner surveyed 136 senior executives across industries and geographies and the results showed that “strategic assumptions” had risen to the top emerging risk in the 4Q19 Emerging Risks Monitor survey, up from the third position the previous quarter (see Table 1). Last quarter’s top emerging risk, “digitalization misconceptions,” has now become an established risk after ranking on four previous emerging risk reports.
“This quarter saw a number of external risks converge in executives’ thinking, from increasing concerns about the impact of extreme weather events to trade policy,” said Matt Shinkman, vice president with Gartner’s Risk and Audit Practice. “Currently, however, business leaders are most acutely concerned with the beliefs underpinning their own strategic assumptions and the ramifications of getting them wrong.”
Table 1. Top Five Risks by Overall Risk Score: 1Q19-4Q19
Recurring risks across the four quarters included accelerating privacy regulation, pace of change, extreme weather events and U.S.-China trade talks.
Source: Gartner (February 2020)
Strategic Planning Must Account for Critical Uncertainties
The study defined a strategic assumption as a plan based on a belief that a certain set of events must occur. Gartner research has found that executives believe more than half of their time spent in strategic planning is wasted, and that the quality of those plans fails to meet expectations. Incorrect strategic assumptions often result in stalled growth that can derail planned results.
“Strategic assumptions are often sound when they are first formed, but in today’s environment are more vulnerable to becoming outdated or obsolete due to a rapid increase in the pace of change,” noted Mr. Shinkman. Senior executives ranked the pace of change as a top emerging risk in the second quarter of 2019.
Organizations with a poorly formed set of strategic assumptions typically produce a high number of projects that are not aligned with their stated objectives. Moreover, time and budget for the execution of key initiatives consistently overrun planned targets.
“Risk teams should play a vital role in mitigating the impact of inaccurate strategic assumptions. A key component of clarifying strategic assumptions is discerning between likely truths and critical uncertainties,” Mr. Shinkman said. “Risk leaders should involve themselves early in the strategic planning process and add value by developing a set of criteria to stress-test assumptions and root out biases and flaws before they become cemented in a strategic plan.”
Cyber-Physical Convergence Presents New Risks
The second most cited risk was a convergence of cyber-physical risks, as previously unconnected physical assets become part of an organization’s cyber network. Nearly 90 percent of organizations with connected operational technology (OT) have already experienced a breach related to cyber-physical architecture. Despite these threats, organizations continue to move forward with integrating Internet of Things devices, smart buildings and other OT, often without dedicated security policies.
“The risks of OT are still not widely appreciated throughout most organizations,” said Mr. Shinkman. “Top risk teams partner with IT to develop dedicated strategies that account for the security deficiencies in such assets and include regular meetings to review issues such as access rights and employee training.”
Organizations are taking customer experience (CX) more seriously by committing more resources and talent to the discipline, according to Gartner, Inc. Gartner’s 2019 Customer Experience Management Survey revealed that in 2017, more than 35% of organizations lacked a chief experience officer (CXO) or chief customer officer (CCO) or equivalents, but in 2019, only 11% and 10% lacked one or the other role, respectively (see Figure 1).
“There has been significant growth in the presence of CXOs and CCOs or equivalents in many organizations over the last two years,” said Augie Ray, VP analyst, Gartner for Marketers. “However, these roles rarely report to CMOs despite marketing taking control of more CX initiatives.”
Figure 1: Key CX Leader Roles Are More Common and Rarely Report to the CMO
Source: Gartner (February 2020)
The survey — which covered a variety of departments where CX efforts are run and supported, such as marketing, IT, customer service, operations, sales and stand-alone CX departments — found that responsibility for CX budgets and initiatives has begun to shift into the marketing department.
“As marketing continues to take on a larger role in CX, marketing leadership faces a potential challenge coordinating companywide CX,” said Ray. “CMOs and marketing leaders responsible for aspects of their organization’s CX must ensure that roles are understood, redundancy and conflict are minimized, and collaboration is prioritized.”
To do this, Gartner recommends that CMOs and marketing leaders take the following actions:
Gartner, Inc. predicts that by 2025, at least two of the top 10 global retailers will establish robot resource organizations to manage nonhuman workers.
“The retail industry continues to transform through a period of unprecedented change, with customer experience as the new currency,” said Kelsie Marian, senior research director at Gartner. “The adoption of new digital technologies and the ever-changing expectations of customers continues to challenge traditional retailers, forcing them to investigate new-human hybrid operational models, including artificial intelligence (AI), automation and robotics.”
Gartner research shows that 77% of retailers plan to deploy AI by 2021, with the deployment of robotics for warehouse picking as the No. 1 use case. Warehouse picking involves smart robots working independently or alongside humans. In the future, retailers will establish units within the organization for procuring, maintaining, training, taxing, decommissioning and proper disposal of robot resources. In addition, they will create the governance required to ensure that people and robots can effectively collaborate.
Many retail workers want to use AI specifically as an on-demand or predictive assistant, meaning the robot will need to work alongside humans. “This means the robot will have to ‘mesh’ with the human team — essentially meaning that both sides will need to learn how to ‘collaborate’ to operate effectively together,” said Ms. Marian.
An example is an autonomous robotic kitchen assistant that learns an operator’s specific recipes and prepares them according to the wishes of the operator. The robot can work in harmony with the operators who, in turn, are having to adapt to changing consumer tastes.
Choosing the right candidate — human and machine — for the job is critical for success. A combined effort from HR, IT and the line-of-business hiring managers will be required to identify the skills needed to ensure the pair work together effectively. “Retail CIOs must provide ongoing maintenance and monitoring performance for effectiveness. If not, the team may be counterproductive and lead to a bad customer experience,” said Ms. Marian.
The introduction of AI and robotics will likely cause fear and anxiety among the workforce — particularly among part-time workers. It will be vital for retail CIOs to work with HR and business leaders to address and manage employees’ skills and concerns; and change their mindset around the development of robot resource units.
“When an organization adopts blockchain smart contracts — whether externally imposed or voluntarily adopted — they benefit from the associated increase in data quality, which will increase by 50% by 2023,” said Lydia Clougherty Jones, senior research director at Gartner.
However, governance frameworks for blockchain participation, or the terms and conditions within the smart contract, can dictate the availability of the data generated from the smart contract transaction, from none to limited to unlimited. “This variable could leave participants in a worse position than if they did not participate in the blockchain smart contract process. As such, an organization’s overall data asset availability would decrease by 30% by 2023,” added Ms. Clougherty Jones.
The net impact however, is a positive result for data and analytics (D&A) ROI. The impact of blockchain smart contract adoption on analytical decision making is profound. It enhances transparency, speed and granularity of decision making. It also improves the quality of decision making, as its continuous verification makes the data more accurate, reliable and trustworthy.
“Smart contracts are important and D&A leaders should focus on them because they promise a near certainty of trusted exchange. Once deployed, blockchain smart contracts are immutable and irrevocable through nonmodifiable code, which enforces a binding commitment to do or not do something in the future. Moreover, they eliminate third-party intermediaries (e.g., bankers, escrow agents, and lawyers) and their fees, as smart contracts perform the intermediary functions automatically,” said Ms. Clougherty Jones.
Gartner analysts recommend D&A leaders pilot blockchain smart contracts now. Organizations should start deploying them to automate a simple business process, such as non-sensitive data distribution or a simple contract formation for contract performance or management purposes. Then organizations should engage with their affiliates and partners to pilot blockchain smart contracts to automate multiparty contracts within a well-defined ecosystem, such as banking and finance, real estate, insurance, utilities, and entertainment.
Privacy Compliance Technology to rely on AI
Over 40% of privacy compliance technology will rely on artificial intelligence (AI) by 2023, up from 5% today, according to Gartner, Inc.
“Privacy laws, such as General Data Protection Regulation (GDPR), presented a compelling business case for privacy compliance and inspired many other jurisdictions worldwide to follow,” said Bart Willemsen, research vice president at Gartner.
“More than 60 jurisdictions around the world have proposed or are drafting postmodern privacy and data protection laws as a result. Canada, for example, is looking to modernize their Personal Information Protection and Electronic Documents Act (PIPEDA), in part to maintain the adequacy standing with the EU post-GDPR.”
Privacy leaders are under pressure to ensure that all personal data processed is brought in scope and under control, which is difficult and expensive to manage without technology aid. This is where AI-powered applications that reduce administrative burdens and manual workloads come in.
AI-Powered Privacy Technology Lessens Compliance Headaches
At the forefront of a positive privacy user experience (UX) is the ability of an organization to promptly handle subject rights requests (SRRs). SRRs cover a defined set of rights, where individuals have the power to make requests regarding their data and organizations must respond to them in a defined time frame.
According to the 2019 Gartner Security and Risk Survey, many organizations are not capable of delivering swift and precise answers to the SRRs they receive. Two-thirds of respondents indicated it takes them two or more weeks to respond to a single SRR. Often handled manually as well, the average cost of these workflows is roughly $1,400, which piles up over time.
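Those per-request costs compound quickly at volume. A minimal sketch of the arithmetic (the monthly request volume below is a hypothetical chosen for illustration, not a survey figure):

```python
# Gartner figure: ~$1,400 average manual handling cost per SRR
COST_PER_SRR = 1_400

def annual_srr_cost(requests_per_month, cost_per_request=COST_PER_SRR):
    """Rough yearly spend on manually processed subject rights requests."""
    return requests_per_month * 12 * cost_per_request

# Hypothetical: an organization fielding 50 SRRs a month
print(annual_srr_cost(50))  # 840000, i.e. $840k a year before any automation
```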
“The speed and consistency by which AI-powered tools can help address large volumes of SRRs not only saves an organization excessive spend, but also repairs customer trust,” said Mr. Willemsen. “With the loss of customers serving as privacy leaders’ second highest concern, such tools will ensure that their privacy demands are met.”
Global Privacy Spending on Compliance Tooling Will Rise to $8 Billion Through 2022
Through 2022, privacy-driven spending on compliance tooling will rise to $8 billion worldwide. Gartner expects privacy spending to impact connected stakeholders’ purchasing strategies, including those of CIOs, CDOs and CMOs. “Today’s post-GDPR era demands a wide array of technological capabilities, well beyond the standard Excel sheets of the past,” said Mr. Willemsen.
“The privacy-driven technology market is still emerging,” said Mr. Willemsen. “What is certain is that privacy, as a conscious and deliberate discipline, will play a considerable role in how and why vendors develop their products. As AI turbocharges privacy readiness by assisting organizations in areas like SRR management and data discovery, we’ll start to see more AI capabilities offered by service providers.”
International Data Corporation (IDC) has announced the release of IDC FutureScape: Worldwide Developer and DevOps 2020 Predictions — China Implications (IDC #CHC45894020, January 2020). The new IDC paper puts forward ten developer and DevOps market predictions for 2020 with a specific focus on China, as well as actionable insights for technology buyers in the DevOps market for the next five years.
Nan Wang, Senior Market Analyst, Enterprise System and Software Research, IDC China, said that the study provides a well-rounded analysis and discussion of developer and DevOps market trends and their impact on digital transformation and business’ technology departments.
“Senior IT leaders and business executives will come away from this report with guidance for managing the implications of these technologies,” she said. “Furthermore, our market predictions will help inform their IT investment priorities and implementation strategies.”
IDC’s top 10 developer and DevOps market predictions for 2020 are as follows:
1. Enhanced AI optimizations for developers
By 2024, 56% of companies will not confine their artificial intelligence (AI)/machine learning (ML) use to application development. By then, close to 10% of AI/ML-based optimizations will focus on software development, design, quality management, security and deployment. By 2023, 70% of companies will invest in employee retraining and development, including third-party services to meet the needs of new skills and working methods brought about by AI applications.
2. Wide use of container platforms
By 2024, 70% of new applications developed with programming languages will be deployed in containers for improved deployment speed, application consistency and portability.
3. Growth of part-time developers
By 2023, the number of part-time developers (including business analysts, data analysts and data scientists) in China will be double the number in 2019. Specifically, part-time developers will increase from 1.8 million in 2019 to 3.6 million, representing a CAGR of 12.2%. By 2020, 15% of customer experience applications will deliver hyper-personalization through reinforcement learning algorithms continuously trained on a wide range of data and innovations.
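For predictions phrased as a doubling, the implied CAGR depends only on the window length: CAGR = 2^(1/n) − 1 over an n-year window. A quick sketch (the window lengths below are chosen purely for illustration):

```python
def doubling_cagr(years):
    """CAGR implied by a value doubling over the given number of years."""
    return 2 ** (1 / years) - 1

for n in (4, 5, 6):
    print(n, f"{doubling_cagr(n):.1%}")

# The cited 12.2% matches roughly a six-year doubling window,
# since 2 ** (1/6) - 1 is about 12.25%.
```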
4. DevOps as a daily activity
By 2023, the proportion of organizations releasing code for specific applications will increase to 30% from 3% in 2019. By 2024, at least 90% of new versions of enterprise-grade AI applications will feature embedded AI functions, though those disruptive AI-focused applications will only account for 10% of the total market.
5. Accelerating transformation of traditional applications
By 2022, the accelerating modernization of traditional applications and the development of new applications will increase the percentage of cloud-native production applications to 25%, driven by the utilization of microservices, containers and dynamic orchestration.
6. DevOps focus on business KPIs
By 2023, 40% of DevOps teams will invest in tooling and focus on business KPIs such as cost and revenue as operations will begin to play a more important role in the performance of end-to-end applications and business impact.
7. Use of related analytical tools driven by open source software
The increasing reliance of applications on open source components has driven the rapid growth of software component analysis and related tools. By 2023, software component analysis tools, which are currently only used by a minority of organizations, will be used by 45% of organizations.
8. Companies establishing their own development ecosystems
By 2023, 60% of Chinese G2000 companies will establish their own software ecosystems, and 50% will access important reusable code components from publicly accessible community code libraries.
9. Growth of open source codebases
By 2024, open source software derived from open code libraries as a percentage of all enterprise applications will double to 25%, with the remaining 75% being customized according to organizations’ business models or use scenarios.
10. Recognized applicability of DevOps
By 2024, applications that completely use DevOps will account for less than 35%. Enterprises have recognized that not all applications can benefit from the complex operations spanning development and production related to continuous integration and continuous delivery (CI/CD).
As shown by the above predictions, emerging business models and frequently changing service needs have created challenges for enterprises’ IT infrastructure. Going forward, the ability to quickly establish new business systems and continuously respond to system iterations brought about by market changes will emerge as the top priority for IT departments.
In response to these external pressures, companies have begun adopting modern methods for application development, packaging and testing. In this new and challenging environment, machine learning and AI are being widely used. IDC predicts that the work of the future will, to a large extent, be influenced by how developers and the DevOps community evolve over the next five years.
Worldwide ICT spending to reach $4.3 trillion
A new forecast from International Data Corporation (IDC) predicts worldwide spending on information and communications technology (ICT) will be $4.3 trillion in 2020, an increase of 3.6% over 2019. Commercial and public sector spending on information technology (hardware, software and IT services), telecommunications services, and business services will account for nearly $2.7 trillion of the total in 2020 with consumer spending making up the remainder.
"The slow economy, weak business investment, and uncertain production expectations combined with protectionist policies and geopolitical tensions — including the US-China trade war, threats of US tariffs on EU automobiles and the EU's expected response, and continued uncertainty around the Brexit deal — are still acting as inhibitors to ICT spending across regions," said Serena Da Rold, program manager in IDC's Customer Insights and Analysis group. "On the upside, our surveys indicate a strong focus on customer experience and on creating innovative products and services driving new ICT investments. Companies and organizations across industries are shifting gears in their digital transformation process, investing in cloud, mobility, the Internet of Things, artificial intelligence, robotics, and increasingly in DevOps and edge computing, to transform their business processes."
IT spending will make up more than half of all ICT spending in 2020, led by purchases of devices (mainly mobile phones and PCs) and enterprise applications. However, when combined, the three IT services categories (managed services, project-oriented services, and support services) will deliver more than $750 billion in spending this year as organizations look to accelerate their digital transformation efforts. The application development & deployment category will provide the strongest spending growth over the 2019-2023 forecast period with a five-year compound annual growth rate (CAGR) of 11.1%.
Telecommunications services will represent more than one third of all ICT spending in 2020. Mobile telecom services will be the largest category at more than $859 billion, followed by fixed telecom services. Both categories will see growth in the low single digits over the forecast period. Business services, including key horizontal business process outsourcing and business consulting, will be about half the size of the IT services market in 2020 with solid growth (8.2% CAGR) expected for business consulting.
Consumer ICT spending will grow at a much slower rate (0.7% CAGR) resulting in a gradual loss of share over the five-year forecast period. Consumer spending will be dominated by purchases of mobile telecom services (data and voice) and devices (such as smartphones, notebooks, and tablets).
Four industries – banking, discrete manufacturing, professional services, and telecommunications – will deliver 40% of all commercial ICT spending in 2020. IT services will represent a significant portion of the spending in all four industries, ranging from 50% in banking to 26% in professional services. From there, investment priorities will vary as banking and discrete manufacturing focus on applications while telecommunications and professional services invest in infrastructure. The industries that will deliver the fastest ICT spending growth over the five-year forecast are professional services (7.2% CAGR) and media (6.6% CAGR).
More than half of all commercial ICT spending in 2020 will come from very large businesses (more than 1,000 employees), while small businesses (10-99 employees) and medium businesses (100-499 employees) will account for nearly 28%. IT services will represent a significant portion of the overall spending for both market segments – 54% for very large businesses and 35% for small and medium businesses. Application and infrastructure spending will be about equal for very large businesses while small and medium businesses will invest more in applications.
"SMBs are increasingly embracing digital transformation to take advantage of both the opportunities it presents, and the disruption it can mitigate," said Shari Lava, research director, Small and Medium Business Markets. "Digitally determined SMBs, defined as those that are making investments in digital transformation related technology, are almost twice as likely to report double-digit revenue growth versus their technology indifferent peers."
$124 billion to be spent on Smart Cities initiatives in 2020
A new forecast from the International Data Corporation (IDC) Worldwide Smart Cities Spending Guide shows global spending on smart cities initiatives will total nearly $124 billion this year, an increase of 18.9% over 2019. The top 100 cities investing in smart initiatives in 2019 represented around 29% of global spending, and while growth will be sustained among the top spenders in the short term, the market is quite dispersed across midsize and small cities investing in relatively small projects.
"This new release of IDC's Worldwide Smart Cities Spending Guide brings further expansion of our forecasts into smart ecosystems with the addition of smart ports alongside smart stadiums and campus," said Serena Da Rold, program manager in IDC's Customer Insights & Analysis group. "The Spending Guide also provides spending data for more than 200 cities and shows that fewer than 80 cities are investing over $100 million per year. At the same time, around 70% of the opportunity lies within cities that are spending $1 million or less per year. There is a great opportunity for providers of smart city solutions who are able to leverage the experience gained from larger projects to offer affordable smart initiatives for small and medium sized cities."
In 2019, use cases related to resilient energy and infrastructure represented over one third of the opportunity, driven mainly by smart grids. Data-driven public safety and intelligent transportation represented around 18% and 14% of overall spending respectively.
Looking at the largest use cases, smart grids (electricity and gas combined) still attract the largest share of investments, although their relative importance will decrease over time as the market matures and other use cases become mainstream. Fixed visual surveillance, advanced public transportation, intelligent traffic management, and connected back office follow, and these five use cases together currently represent over half of the opportunity. The use cases that will see the fastest spending growth over the five-year forecast are vehicle-to-everything (V2X) connectivity, digital twin, and officer wearables.
Singapore will remain the top investor in smart cities initiatives. Tokyo will be the second largest spender in 2020, driven by investments for the Summer Olympics, followed by New York City and London. These four cities will each see smart city spending of more than $1 billion in 2020.
On a regional basis, the United States, Western Europe, and China will account for more than 70% of global smart cities spending throughout the forecast. Latin America and Japan will experience the fastest growth in smart cities spending in 2020.
"Regional and municipal governments are working hard to keep pace with technology advances and take advantage of new opportunities in the context of risk management, public expectations, and funding needs to scale initiatives," said Ruthbea Yesner, vice president of IDC Government Insights and Smart Cities and Communities. "Many are moving to incorporate Smart City use cases into budgets, or financing efforts through more traditional means. This is helping to grow investments."
Oliver Krebs, Vice President of EMEA at Cherwell Software, discusses how digital transformation can leverage Artificial Intelligence to improve the employee and customer experience.
Was there ever a buzzword as buzzy as “digital transformation”? It’s perfect: it sounds futuristic and exciting, but it’s also vague enough to be used for any number of different products and services. It could be a process, a technology suite, a mindset, or all of the above. But aside from the inevitable hype that accompanies every other innovation, digital transformation is not just technology for technology's sake; it will be a real game changer for your business.
Is digital transformation for you?
Digital transformation is, put simply, the use of new and fast evolving digital technology to solve a business problem. Gartner, in its glossary, describes it as “anything from IT modernization (for example, cloud computing), digital optimization, to the invention of new digital business models.” Sounds obvious that you would want it, doesn’t it?
But if you are still having doubts about whether it is quite right for you, then consider the following four questions. If you can answer ‘yes’ to at least one of these when evaluating the relevance of a new technology for your organisation, then you can be pretty confident that digital transformation is for you.
1. Does this solution make us more efficient?
2. Does this solution enhance the customer experience?
3. Will this solution help attract and retain the right talent?
4. Will this solution carry a measurable return on investment?
It is worth reiterating here that digital transformation is no longer something reserved for the future. It is the here and now. According to a Smart Insights survey, 65% of businesses already have a transformation program in place, with only 35% saying they currently have no plans to run one. This means that businesses that have not yet started their transformation are already behind two thirds of their competitors, a worrying thought in the competitive world we live in. So, digital transformation is clearly not proceeding at the same pace everywhere, but what can it actually do?
One of the biggest innovations to sit under the digital transformation umbrella is Artificial Intelligence (AI). In its most basic form, you would consider yourself to be in the presence of AI when a computer system performs tasks which would normally be performed by humans. AI has moved beyond theory and into reality so quickly that it is fast becoming a ‘must have’ rather than a ‘nice to have’. Human skills such as visual perception, psychological insights, speech recognition, decision-making, and translation between languages are now routinely being replicated by AI algorithms that provide the instructions for computer systems to follow.
And according to a recent (2018) study, which Cherwell conducted in partnership with global research firm IDG, a whopping 71 percent of IT organisations have already implemented one or more AI projects. And around one-third of those surveyed say their AI projects are already generating a return on investment, with an overwhelming majority—90 percent—anticipating that the ROI will be positive in the next five years.
Many businesses have already begun to make use of AI-enabling technologies as part of their digital transformation, with basic information capture and process automation being at the top of their ‘To Do’ list. All of this is being facilitated by integrating processes across departments.
Integrating processes is an essential building block of digital transformation because it breaks down the silos of information that commonly grow up within organisations. Integration also makes communication and collaboration between staff and customers much easier, promoting self-serve options that eliminate departmental bottlenecks.
Work process integration has also been shown to encourage higher employee engagement, satisfaction and productivity. According to a recent Lawless Research study of cross-functional processes commissioned by Cherwell Software, (‘Work Process Integration: Bad News is Good News – Small Footprint Today Signals Big Opportunities Ahead’) a variety of standard processes aren’t highly integrated, and poorly designed (i.e., non-integrated) processes reduce productivity and negatively affect the employee experience.
At least one-quarter of cross-functional team managers expressed frustration that different team members were using different apps. The issues considered most frustrating in this context were inefficiencies (cited by 43% of managers), repetitive work (40%), miscommunication (37%), errors (27%), and software incompatibility (26%). Respondents also reported that, on average, manual processes (which could be automated) consumed nearly half of their workday. 43% said that they spent at least half of their workday on manual processes. These processes included onboarding/offboarding an employee (86%), resolving customer issues (also 86%), conducting performance reviews (85%), and participating in cross-functional projects (also 85%).
Whether your ‘customers’ are inside or outside your organisation, their demands for a service that utilises data across the organisation are only increasing. ‘Inside’ customers, such as employees who use the company’s IT services, and ‘outside’ customers, who buy the company’s products and services, can benefit equally from AI-enabled activities that draw from a common well of resources.
The IT support desk is a good example of where organisations can start gathering information around the most frequent ‘pain points’ of their end users, rather than just reacting to problems after they have occurred. Anticipating problems proactively can prevent them happening in the first place. If they have the resources and know-how, organisations can find solutions and innovate. Predictive analytics can also be used for incident management, demand planning, and workflow improvement.
Your IT employees might be relying on manually processed IT tickets for every request they receive, whereas moving to a self-service IT model would give the IT team more time to focus on transformation initiatives, while also increasing employee efficiency and effectiveness.
AI can also reduce the costs associated with high-volume, low-value service desk activities because it takes over routine and repetitive tasks. Chatbots, knowledge curation, and incident/request routing are three big categories of AI features that are already in widespread use. AI-assisted knowledge management now includes an intelligent search function that doesn’t just rely on specific keywords, but actually understands context and meaning in the way a human being does. It should be pointed out that while AI does change the way IT support staff work, it won’t replace them completely, but will add value to what they are already doing.
The use of chatbots, for example, means there is always ‘someone’ available 24/7, 365 days a year, even if it’s not an actual human being. While many chatbots are used simply to deflect routine and repetitive queries and requests before escalating more complex interactions to a live IT support agent, virtual support assistants (VSAs) are now even capable of carrying out actions for the customer, such as resetting passwords, deploying software, escalating support requests and restoring IT services.
And what about customer service? Chatbots, for example, are one of a number of different engagement options that can be made available to customers. A chatbot AI can give customers the answers they need, when they want them, without having to wait on hold for ages until a human employee becomes available. This can drastically improve the time both parties spend resolving the query.
And how much better would it be if you could put customer communications, history and documents into context and use that knowledge to provide a better, faster service? Human agents draw on a wide range of skills to solve customer problems, but so too can non-human agents if they have the information available. And non-human agents have the advantage that they can read, digest and process huge amounts of data far faster than humans can. It is also worth remembering that chatbots are not the only option for customers – organisations should be able to customise the available interfaces depending on the context of the business and the customers themselves.
The benefits of getting quick and easy resolutions to problems and getting perfectly tailored offers that suit your specific requirements make it easy to see why these AI systems are becoming so popular. In the future, we can look forward to enjoying the convenience of a fully tailored, always ‘on’, user experience when at work or at home.
Connectivity as a concept has become an essential part of life, as opposed to just a luxury. The Internet of Things (IoT) has already become commonplace in our lives, thanks to all the connected devices and smart technologies we own, interacting with one another to create a fully connected network. With the global number of IoT devices projected to triple by 2025 and 5G technologies very soon to become a cohesive part of the UK’s telecoms infrastructure, as a country we will soon be more connected than ever.
By David Higgins, EMEA Technical Director at CyberArk.
Constant connectivity provides opportunities for innovation and modernisation. Conversely though, it also creates cybersecurity threats that can compromise extremely sensitive information.
With the world heading swiftly into an age of ever-more-enhanced connectivity, individuals and organisations need to familiarise themselves with these developing threats and this volatile landscape, while ensuring they have a robust way to protect themselves.
Finding a Place for CSPs in a Volatile Landscape
Communications services providers (CSPs) specialising in mobile services, media, or web services live in a world of relentless innovation. A need to stay relevant forces CSPs to deliver value beyond basic connectivity. This opens lucrative new markets and opportunities for all industries.
The IoT industry will play a pivotal role in these innovations. The technology is on track to embed itself into countless aspects of our day-to-day lives, central to the creation of smart cities and infrastructure, connected vehicles, digital healthcare, smart homes and more, at a pace that is hard to keep up with. 5G is also being rolled out just as the IoT reaches its peak, ensuring substantial potential disruption.
Similarly, Over-the-Top (OTT) businesses – content providers that distribute messaging and streaming media over the internet – are booming. Years after Yahoo! Messenger and AOL’s AIM came and went, they keep finding new ways to undermine CSPs’ business models. Tencent, the parent company of Chinese messaging platform WeChat, currently has a market cap of over £300bn (compared to Verizon’s £190bn) and the meteoric rise of OTT streaming players like Netflix has been well documented.
And, of course, the counterpoint to all this innovation is that cyber attacks are more prolific than ever, displaying ever-evolving tactics as cybercriminals learn and adapt. Telecoms companies are frequently targeted because they build, control, and operate critical infrastructure that is widely used to communicate and store large amounts of sensitive data for consumers, businesses, and government. Data breaches or denial of service attacks on CSPs can reverberate far beyond the initial incident. Moreover, end user equipment – home routers, smartphones, IoT devices and more – is not entirely under CSP control. It can be easy to compromise, making it an ideal target for hackers looking to steal data.
Following a long year of social media giants battling with digital regulations, data privacy is a higher priority than ever before. Since prominent communications brands have also been implicated in major data breaches, CSPs are beginning to recognise the need to embrace trust as a competitive differentiator.
For consumers, the dramatic expansion in bandwidth and connectivity that will come with 5G technologies and emerging IoT devices will provide more options to engage with media. It will also present new opportunities for both media providers and network operators. There’s no doubt that it’s an exciting time in the telecommunications sector.
Although CSPs are best positioned to enable these new business models, they can’t just sit back and enjoy the riches of growth. They still need to work to secure their customers from the risks inherent to the data economy.
Telecommunications Infrastructure is a Unique Access Point to National Security for Cyber Criminals
The infrastructure of telecommunications organisations is inherently more exposed to hard-hitting cyber attacks than that of other consumer-oriented organisations. Bad publicity, brand damage, and regulatory fines can cause short to medium-term damage, but an attack on a telecoms company has the potential for a much deeper impact than most other services need to worry about.
Telecommunications systems are embedded so deeply within the networks of nations across the globe that their security has become paramount. They are constantly functioning as facilitators of not only financial and business transactions, but also emergency response communications, meaning that the consequences of a breach are substantial. Steps must be taken to ensure that every blunt edge in telecoms cyber security is sharpened and secured.
Guarding Assets with Privileged Access Management
Companies today look after a whole host of information and data, much of it confidential and of critical importance. To guard this data while still allowing certain individuals to access it, privileged credentials exist across almost every enterprise’s IT environment. Cybercriminals know this. That is why almost all advanced attacks today gain access to a target’s most sensitive data, applications, and infrastructure by exploiting a company’s privileged credentials. Telecommunications is by no means an exception.
Despite this knowledge, organisations allow privileged access to critical assets and systems to remain unsecured and unmanaged. Assets are therefore left vulnerable to damaging cyber-attacks that could impact telecommunications companies and citizens far beyond the limits of a simple data breach.
Companies must up their game in securing, controlling, and monitoring the use of powerful privileged accounts to minimise disruptive damage to these systems.
In order to proactively reduce the risk of privileged access abuse, telecoms companies must firstly understand the most common types of attacks that exploit privileged access. They must know how an attacker thinks and behaves in each case to exploit the organisation’s vulnerabilities. They must then prioritise the most important privileged accounts, credentials, and secrets. Identifying the potential points of attack, and then focusing especially on those that could jeopardise critical infrastructure or the organisation’s most vital information, is also essential.
After understanding weaknesses and access points for attackers, telecoms companies must determine the most effective actions to close the gap in these areas. Which actions are the highest priority? What can be achieved quickly, and which actions require a long-term plan?
As an overarching rule, organisations should recognise that attackers are constantly looking for new ways to gain access, and act appropriately in response. By taking the time to plan out a strategy for managing privileged access and returning to reassess it as your organisation and the threat landscape evolve, you can develop a formidable defence.
Securing Telecommunications’ Exciting Future
As a global society, we’re moving into an era where technology is our most important asset and tool. We are innovating the very networks that this world runs on daily, creating, in turn, services that improve our standard of living exponentially. IoT devices create a network of tools that are versatile and agile for human needs, and 5G connectivity will bring all of it to our fingertips. But just as a shepherd tends his flock at night, so must we keep alert and vigilant to potential threats that attempt to disrupt positive growth.
By understanding threats, proactively prioritising the weakest points in privileged access infrastructure, determining the best course of action, and striving for continuous improvement, CSPs will be able to minimise damage from cyber threats. With a solid strategy in place, the rewards of a hyper-connected world will be reaped.
Common automation frameworks and the end of speed and security compromises.
By Bart Salaets, Senior Solution Architect Director, F5 Networks.
In the service provider realm’s not-too-distant past, there was a distinct line in the sand.
On the one side, networking and security teams spearheaded the evolution to an NFV architecture, with a strong focus on virtualising network and security functions. Forays into the world of automation were tentative at best.
On the other side, developers enthusiastically embraced cloud platforms, DevOps methodologies and automation via CI/CD pipelines. Rapid application deployment and delivery was, and still is, the name of the game.
The edge is where they both come together, and applications can live harmoniously side-by-side with network functions.
Thanks to its distributed nature, and fuelled by the gradual global rollout of 5G, edge computing is finally starting to empower service providers to offer new solutions and services that simultaneously increase revenue streams and cut network transport costs.
Rather than transmitting data to the cloud or a central data warehouse for analysis, processing can take place at the ‘edge’ of a network, reducing network latency, increasing bandwidth and delivering significantly faster response times.
Take self-driving cars.
Hosting applications, with all their associated data, in a centralised location like a public cloud can yield end-to-end latency to the tune of tens of milliseconds. That is far too slow. You’ll get the same result if the application stays central and only the network function is moved to the edge. However, when you move both the application and the network function to the edge, it is possible to slash latency to milliseconds. Now we’re in business.
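To put rough numbers on this, a quick propagation-only calculation shows the order-of-magnitude difference. The distances below are illustrative assumptions, not measurements; signals in optical fibre travel at roughly 200,000 km/s.

```python
def round_trip_ms(distance_km: float, fibre_speed_km_s: float = 200_000) -> float:
    """Propagation-only round-trip time in milliseconds (no processing delay)."""
    return 2 * distance_km / fibre_speed_km_s * 1000

central_cloud = round_trip_ms(1_500)  # app hosted in a cloud region ~1,500 km away
edge_site = round_trip_ms(50)         # app hosted at an edge site ~50 km away

print(central_cloud, edge_site)  # 15.0 vs 0.5 milliseconds
```

Even before any queuing or processing delay is added, the assumed central deployment sits in the "tens of milliseconds" regime while the edge deployment is well under a millisecond of propagation delay.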
Virtualised Content Delivery Networks (CDN) are another compelling case in point.
To date, third-party CDNs have tended to be hosted at a peering point or in a centralised data centre. This is changing, with some canny service providers building their own edge-computing-based CDNs to serve local video content and IPTV services, all while saving on transit and backhaul costs.
There are different business models available to bring these kinds of use cases to life.
The simplest scenario is a service provider allowing physical access to an edge compute site. Third parties bring their own hardware and manage everything, while the service provider is responsible only for space, power and connectivity. This is known as a colocation model.
A far more interesting approach is for the service provider to offer Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) options to third parties through a shared edge compute platform. Service providers can build these themselves or with specialist partners.
The power of automation
Automation is the secret ingredient to making it all work.
In the context of cloud computing, automation is critical for developers to publish new versions of code at pace and with agility.
In the networking world of NFV, automation is key to driving a service provider’s operational costs down. Previously, network and service provisioning were manual, time-consuming tasks. Today, while objectives may differ, the tooling and techniques are the same – or sharable – for both network and developer teams. Applications and network functions co-exist in an edge compute environment.
So how can developers automate the deployment of applications and associated application services in the cloud?
For the purpose of this article, we’re concentrating on application services automation. It is worth noting that the steps described below can be easily integrated into popular configuration and provisioning management tools such as Ansible or Terraform, which are then further complemented by CI/CD pipeline tools such as Jenkins.
The first step is bootstrapping: introducing the virtual machine that delivers application services into the cloud of choice.
Next is onboarding, which means applying a basic configuration with networking and authentication parameters (e.g. IP addresses, DNS servers etc.). Finally, there’s the actual deployment of application services – such as ADC or security policies – using declarative Application Programming Interfaces (APIs).
The last point is critical.
Imperative APIs, which most vendors offer, mean you tell the system what to do at every juncture. Firewalls are a good example. You’d need to create address lists and align them with firewall rules. These are then grouped together in a policy, which is then assigned to an interface. The REST API calls must go through these distinct steps in sequence, otherwise everything fails. Contorting all of this into an automation tool is expensive and takes time.
Declarative APIs are a different beast. You can tell the system what you want, and it figures out the way ahead. With one declaration (in JSON or YAML format) you could, for instance, define all ADC and security service parameters and give it to the system with a single REST API call. In this case, the outcome is either a success (the service has been deployed) or a failure, but the overall system remains unaffected. There is no requirement for intelligence in the automation system. The intelligence stays within the systems you are configuring, which dramatically reduces automation costs.
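The contrast can be sketched in a few lines of code. The REST client, endpoint paths and payload fields below are invented for illustration; real imperative and declarative APIs differ in detail, but the call pattern is the point: four ordered, interdependent calls versus one declaration.

```python
class FakeClient:
    """Stands in for a REST client; records the path of each call it receives."""
    def __init__(self):
        self.calls = []

    def post(self, path, payload):
        self.calls.append(path)
        return {"id": f"{path}-id", "status": "success"}

def deploy_firewall_imperative(client):
    # The automation tool must know every step and its exact ordering;
    # a failure mid-sequence leaves the system half-configured.
    addresses = client.post("/address-lists", {"name": "web", "hosts": ["10.0.0.5"]})
    rule = client.post("/rules", {"action": "allow", "source": addresses["id"]})
    policy = client.post("/policies", {"rules": [rule["id"]]})
    client.post("/interfaces/eth0", {"policy": policy["id"]})

def deploy_firewall_declarative(client):
    # One declaration describes the desired end state; the system itself
    # works out the intermediate steps, and either converges or rejects
    # the whole declaration, leaving itself unchanged.
    declaration = {
        "firewall": {
            "addressLists": {"web": ["10.0.0.5"]},
            "rules": [{"action": "allow", "source": "web"}],
            "attachTo": "eth0",
        }
    }
    return client.post("/declare", declaration)

imperative, declarative = FakeClient(), FakeClient()
deploy_firewall_imperative(imperative)
deploy_firewall_declarative(declarative)
print(len(imperative.calls), len(declarative.calls))  # 4 calls vs 1
```

All the sequencing knowledge in the imperative version is exactly the "intelligence" that, with a declarative API, stays inside the system being configured rather than inside the automation tool.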
The exact same steps can be taken to provision a virtual network function in an NFV environment. A declarative API markedly simplifies integration with end-to-end NFV orchestration tools. The orchestrator doesn’t need to know the individual steps to configure a service or network function, it simply pushes a single JSON declaration with the parameters the system needs to set up the service. Again, the intelligence on ‘how’ to configure the service stays within the system you are configuring.
Through closer alignment between networking and developer disciplines, we can now build a distributed telco cloud with a common automation framework for applications and network functions. It is agile and secure at every layer of the stack – from the central data centre all the way to the far edge – and can even span into the public cloud.
Industry-wide, we expect common automation frameworks that enable the deployment of applications and their services, as well as network functions, to become the norm in the coming years – particularly as 5G rollout continues worldwide. The pressure is building for service providers to unify silos, get agile and start living more on the edge.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 1.
Challenges lie ahead, says Subhankar Pal, Assistant Vice President for Technology and Innovation at Altran:
Enterprises are entering the early stages of Machine Learning (ML) development, but a number of challenges hinder them from extracting value from their investments. These include developing ML capabilities at scale, version control, model reproducibility and aligning stakeholders. The time it takes to deploy an ML model is stuck somewhere between 30 and 90 days for most companies. Enterprise budgets for AI programs are growing by around 25 percent, and the banking, manufacturing, and IT industries have seen the largest budget growth this year.
Questions have also been raised over who will be responsible if AI makes serious errors, such as an autonomous car causing an accident or AI-based medical equipment putting a patient’s life at risk through an incorrect diagnosis. Incomplete models of responsibility undermine trust and consequently deter the adoption and use of this revolutionary technology. Deploying AI is no longer just about training it to perform a given task, but about creating a system whose output can be trusted by us humans.
AI systems can have human-in-the-loop or human-out-of-the-loop strategies. Human-in-the-loop systems require human interaction, whereas human-out-of-the-loop systems can function without it. Rather than rushing to create untrusted, fully autonomous systems, a “human-on-the-loop” approach is desirable, where humans actively monitor the AI system, which remains receptive to requests for human control. A prime example of this is Tesla’s Autopilot system. Some drivers placed undue faith in the Autopilot and took their hands off the wheel. The Tesla manual states that Autopilot is a driver assistance system, and that drivers must keep their hands on the wheel at all times when using it.
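The human-on-the-loop idea can be sketched in a few lines: the system acts autonomously, but surfaces its low-confidence decisions for a monitoring human to review. The "AI" here is a trivial stand-in function and the confidence rule is invented for illustration, not a real model.

```python
def ai_decide(reading):
    """Stand-in for an AI model: returns a decision plus its own confidence.

    Confidence is high when the sensor reading is far from the ambiguous
    midpoint of 0.5 (an invented rule for this sketch).
    """
    action = "brake" if reading > 0.7 else "cruise"
    confidence = abs(reading - 0.5) * 2
    return action, confidence

def human_on_the_loop(readings, confidence_floor=0.6):
    """The AI acts on every reading, but decisions below the confidence
    floor are escalated so the monitoring human can intervene."""
    escalated = []
    for reading in readings:
        action, confidence = ai_decide(reading)
        if confidence < confidence_floor:
            escalated.append((reading, action))  # queued for human review
    return escalated

flagged = human_on_the_loop([0.1, 0.55, 0.9])
print(flagged)  # only the ambiguous 0.55 reading is escalated
```

The clear-cut readings are handled autonomously; only the borderline case reaches the human, which is the essential difference from human-in-the-loop (where every decision requires a person) and human-out-of-the-loop (where none does).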
A number of aspects differentiate AI systems from traditional software systems.
Enterprises need a “Trusted AI” framework that safeguards human centricity and allows the correct degree of autonomy to the AI system. Trusted AI systems use metrics-driven data quality measures to remove subjectivity from assessing the quality of the training data. They must also safeguard and respect user data privacy, provide data protection and allow data access only to qualified personnel.
Maintaining data privacy is not only an essential requirement for gaining user trust in AI systems, but is also required for legislative compliance with regulations such as GDPR and the CCPA (California Consumer Privacy Act). A trusted AI system will use ML algorithms that are robustly tested, free of bias and non-discriminatory.
There are some interesting AI trends worth keeping an eye on:
AutoML and MLOps are shaping the future of enterprise data science workflow automation and enabling the deployment and management of AI models at scale in a highly distributed network. Automated machine learning, or AutoML, is an umbrella term for the techniques involved in automating the process of building a machine learning model from raw data. The goal of AutoML is to provide an easy-to-use interface for non-experts to train large sets of machine learning models without prior knowledge or effort on the part of the data scientist. Leading cloud vendors like Microsoft Azure and Google Cloud also provide AutoML capabilities as part of their data science platforms.
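At its core, the AutoML loop enumerates candidate configurations, scores each one, and keeps the best. The dependency-free toy below illustrates that loop with an invented one-parameter "model"; real AutoML platforms layer automated feature engineering, model selection, ensembling and deployment on top of this same idea.

```python
# Toy 1-D dataset: the label is 1 when the feature exceeds 5.
data = [(x, int(x > 5)) for x in range(11)]

def accuracy(threshold, dataset):
    """Score a trivial 'model' that predicts 1 when the feature > threshold."""
    correct = sum(int(x > threshold) == y for x, y in dataset)
    return correct / len(dataset)

# The search space: candidate hyperparameter values the loop will try.
search_space = [1.5, 3.5, 5.5, 7.5]

# The AutoML-style loop: evaluate every candidate, keep the best scorer.
best = max(search_space, key=lambda t: accuracy(t, data))
print(best, accuracy(best, data))  # 5.5 achieves perfect accuracy here
```

Swap the toy threshold model for real estimators and the flat list for a structured search over model families and hyperparameters, and this becomes the skeleton that AutoML tooling automates for the non-expert user.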
Asheesh Mehra, CEO & Co-Founder, AntWorks, on the quest for ‘automation nirvana’:
“There are many enterprises out there who are giving in to the hype around RPA, putting considerable amounts of their time and resources behind automating tasks. The issue here is that what enterprises are really looking for in their digital transformation journey is the intelligent automation of a business process end-to-end, with little to no human intervention. That’s automation nirvana.
Yet, organisations are time and time again hitting a major roadblock in their digital journey, and that’s the ingestion, curation, and digitisation of unstructured data. Business data can be classified into two types of data sets – structured and unstructured. The former can be described as data in the form of fixed field code text, whilst the latter is found in images, web pages, legal documents, and personal records. Currently, businesses are unwittingly utilising RPA tools that can only analyse the simpler structured data, whilst neglecting unstructured data, which makes up approximately 80 per cent of all business data. These businesses are selling themselves short by only automating certain aspects of their business processes, and not all of it as intended.
Automation that solely processes structured data cannot be scaled to meet the growth requirements of most businesses, simply because the ROI isn’t feasible when you’re only automating 20 per cent of all the data you’re using. If your business uses common unstructured data on a regular basis, like billing logistics and tax documents, emails, invoice documents and imagery, then RPA isn’t going to cut it.
The only fix for this issue is implementing tools that can process both structured and unstructured data. This includes Intelligent Automation Platforms (IAP), which use Cognitive Machine Reading (CMR) to organise, process and analyse both structured and unstructured data. The first step for companies is to identify what types of data they need to process, rather than making premature decisions on implementing tools that may not be right for the data they’re using.”
Artificial intelligence: are we there yet?
Recently, Atheneum hosted a panel of AI experts for a discussion on where the technology is currently, what the future looks like, and how businesses can, and are, enabling it:
AI is often misinterpreted: we’re not there yet
The panellists discussed how AI is often misinterpreted, straying from John McCarthy’s 1950s definition of “thinking machines”.
According to Atheneum’s panel of AI experts, the reality of ‘strong AI’ – machines that can reason and do everything humans can do – is still another 100 years or so away from development. “We don’t have AI yet. We have statistics. There is still a need for human analysis,” explained Dr Diana Deca.
What we do have, however, is an increasing use of automation – Dr Fran Cardells sees it as the ‘new electricity’, with “100 bots working for us every day, from our daily searches in Google to online banking.”
How AI can enable businesses to improve decision making
Investing in AI technology has the potential to enable businesses to streamline operations, advance market research and better manage client relations.
The panel discussed how this automation of processes allows businesses to make better-informed decisions, faster, by enabling access to real-time data. It also diverts a skilled workforce away from admin tasks towards areas that will better benefit the business.
AI in Industry
In insurance, for example, AI has the potential to replace manual processes with automated systems and improve how policies are tailored for individuals and businesses. This technology can reduce the cost of losses by ensuring the most well-informed decisions are consistently made. Just as data helps marketers to better tailor advertising, insurers could save billions of pounds from having better knowledge of their customers. By analysing customer data, they can more accurately predict behaviour, understand preferences and customise product offerings.
What does the future look like for AI?
Atheneum’s AI panel described how more advanced techniques are being developed to train for complex data processing tasks. “While AI can currently make a decision based on existing data, the next evolution will be when we’re able to review the process the AI took to get to that decision. This will enable the next decision to be addressed faster,” explained Dr Cardells.
“We’re currently able to measure the temperature and volume of a room with 4D cameras, but we will get to the point where we can analyse how many items there are and how much they weigh, for example, which will revolutionise the retail, insurance and logistics industries.”
“The next step is how to get AI to the point where human monitoring is no longer necessary.”
Should vs could: AI helping humans to make better decisions
As Dr Deca explained, AI based on statistics is commonly utilised today to present information learned from previous user activity. However, in theory, AI could be developed to recommend what humans should do in certain situations, rather than what they could do.
For example, in the ecommerce industry, recommendations such as “Customers that bought this, also bought this…” could instead suggest products that are genuinely most beneficial to the customer’s needs. Otherwise, automated recommendations have the potential to encourage a cycle of customers all making the same wrong decisions.
Ram Chakravarti, CTO, BMC Software, on the need for organisations to evolve into Autonomous Digital Enterprises:
Adapting to the age of analytics has not been easy, even in the digital enterprise. However, businesses know that in order to compete they need to capture real value from the massive amounts of data within, as well as incorporate insights from that data into their processes that drive and optimize experiences. This is why we're seeing the demand for and investment in AIOps in the enterprise continue to rise: to help manage an expansion in the number of workloads, both in public cloud and on-premises, and an increase in application complexity – and ultimately lead to better business outcomes.
While myriad AIOps solutions on the market provide visibility into pieces of the application stack, today’s challenges place a premium on differentiated vendor solutions powered by AI/ML and big data analytics techniques that can help modern IT operations evolve from traditional monitoring to observability to actionability. Therefore, enterprises most comfortable with and benefiting the most from AIOps solutions seek capabilities that can shift their approach of simply finding a problem and discovering root cause, to one of actionability, where automatic remediation can be applied to problems based on business priority.
As AI and ML become more prevalent, organizations will need to evolve into Autonomous Digital Enterprises (ADE), a state where intelligent, integrated, value-creating functions can operate with minimal (ideally zero) human involvement or interference, across every facet of the organization and its ecosystem of partners. Using AIOps to get predictive and proactive today will help them thrive in this futurescape.
‘Any sufficiently advanced technology is indistinguishable from magic’ - Sir Arthur C. Clarke’s Third Law (1973) goes a long way to explaining the state of AI in business at the moment.
AI certainly qualifies as an advanced technology (even though it’s been around since 1956*) and, without meaning to be rude, for most people it is indistinguishable from magic – most of us have no idea how it actually works, and many have over-inflated expectations of what it can actually do (thanks Elon). This makes it hard to understand its strengths and limitations, to judge where it makes sense to deploy it, and to differentiate AI from merely complex software. The result is highly varied degrees of success in the use of AI within businesses.
To see why understanding the limitations of AI is essential to using it successfully, consider the following: for several years, Amazon used AI to help filter applicant CVs and identify suitable candidates, essentially letting the machine objectively decide who to take forward. However, the AI was trained on historical CVs that came mostly from men, and so it was found to be rejecting CVs from women. Not so objective after all. It is remarkable that Amazon, one of the leading AI players, had not considered the inherent bias of the model’s input sooner.
Now, it may seem boring to talk about understanding and limitations when everyone is hyping up the potential for AI, but, as the Amazon example shows, it is of fundamental importance. Even more boring still, and what many companies really struggle with when it comes to deploying AI, is access to data - the first step in any AI project is to ensure data availability, cleanliness, and integrity, without which the benefits of AI will never be truly realised. It is no mean feat to get this right, and it should come as no surprise that the companies achieving the greatest success in AI are those that are, in relative terms, young – for example, Facebook, Google, Amazon, Uber, and Tesla.
In summary, many companies, even those using it, are unclear what AI is and how it works, and many don’t have the data foundations on which to exploit it to its fullest. That said, its popularity will continue to grow, and its use will no doubt continue to increase over the coming years, but the companies that will achieve the greatest successes are those that have understood its limitations and exploited what it is great at – learning from huge amounts of data and helping humans to do better jobs, not replacing them entirely.
*The term Artificial Intelligence was coined back in 1956, but it’s only relatively recently that we’ve been accumulating the mass data sets and developed the computational horsepower needed to use AI (consider that the most recent iPhones have over 100,000 times the processing power of the computer used for the Apollo 11 moon landing).
Artificial intelligence (AI) has become integrated into our everyday lives. It powers what we see in our social media newsfeeds, activates facial recognition (to unlock our smartphones), and even suggests music for us to listen to. Machine learning, a subset of AI, is progressively integrating into our everyday and changing how we live and make decisions.
By Grainne McKeever, Marketing and Communications Consultant at Imperva.
Machine Learning in Finance
Business changes all the time, but advances in today’s technologies have accelerated the pace of change. Machine learning analyses historical data and behaviours to predict patterns and make decisions. It has proved hugely successful in retail for its ability to tailor products and services to customers. Unsurprisingly, retail banking and machine learning are also a perfect combination. Thanks to machine learning, functions such as fraud detection and credit scoring are now automated. Banks also leverage machine learning and predictive analytics to offer their customers a much more personalised user experience, recommend new products, and animate chatbots that help with routine transactions such as account checking and paying bills.
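To make the fraud-detection point above concrete, here is a toy illustration of the kind of pattern-based check such systems build on: flagging transactions that sit far outside a customer’s historical spending pattern. The z-score threshold and the sample figures are assumptions for illustration; production systems use far richer features and models:

```python
from statistics import mean, stdev

# Toy anomaly check: flag amounts more than `threshold` standard
# deviations away from the customer's historical mean spend.
def flag_outliers(history, new_amounts, threshold=3.0):
    mu, sigma = mean(history), stdev(history)
    return [amt for amt in new_amounts
            if sigma and abs(amt - mu) / sigma > threshold]

history = [42.0, 38.5, 45.0, 40.0, 41.5, 39.0]   # typical weekly spend
print(flag_outliers(history, [43.0, 950.0]))      # only 950.0 is flagged
```

The same mechanism, generalised across many behavioural signals, is what lets banks automate first-pass fraud screening while routing only the anomalies to human analysts.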
Machine learning is also disrupting the insurance sector. As more connected devices provide deeper insights into customer behaviours, insurers are enabled to set premiums and make payout decisions based on data. Insurtech firms are shaking things up by harnessing new technologies to develop enhanced solutions for customers. The potential for change is huge and, according to McKinsey, “the [insurance] industry is on the verge of a seismic, tech-driven shift.”
Few industries have as much historical and structured data as the financial services industry, making it the perfect playing field for machine learning technologies. Investment banks were pioneers of AI technologies, using machine learning since as early as the 1980s. Nowadays, traders and fund managers rely on AI-driven market analysis to make investment decisions, paving the way for fintech companies to develop new digital solutions for financial trading. AI-driven solutions such as stock-ranking based on pattern matching and deep learning for formulating investment strategies are just some of the innovations available on the market today.
Despite these technological advances, the concept of machine learning replacing human interaction for financial trading is not a done deal. While index and quantitative investing account for over half of all equity trading, recent poor performance has exposed weaknesses in the pattern-matching model on which investing strategies are based and demonstrates that, no matter how fancy the maths, computers are still no replacement for the human mind when it comes to capturing the nuances of financial markets. At least, not yet.
Data Analytics for Security and Compliance
Managing enormous volumes of data makes compliance and security two of the biggest challenges for financial organisations. It is no longer enough to protect your network perimeter from attack, as the exponential growth of data and an increase in legitimate access to that data increases the likelihood of a breach on the inside. Additionally, banks are storing large volumes of data across hybrid and multi-cloud environments that provide even more opportunity for cybercriminals to get their hands on valuable assets. In short, the same data that brings new opportunities for business growth increases the security risk for financial firms.
Data analytics using machine learning has been transformational in helping firms overcome these challenges as it picks up on unusual user behaviour to detect suspicious activity and minimise the risk of fraud, money laundering, or a breach. Similarly, data analytics technologies can be applied to compliance activities such as database auditing processes, reducing the need for human intervention and thereby easing the burden for compliance managers.
As the financial services industry continues to leverage machine learning and predictive analytics, the volume of data these firms generate and store is ballooning. Protecting that data, other sensitive assets, and business operations will only become more challenging. Firms will have to adopt new security technologies that can mitigate their security and compliance risk.
For developers and the IT industry, the introduction of DevOps has had a profound effect, changing mindsets and making concepts like continuous integration and continuous delivery more popular.
By Luca Ravazzolo, Product Manager, InterSystems.
The increasing adoption of DevOps is down to several factors. For instance, it allows organisations to capture all processes in an auditable and replicable way. It also adapts quickly, which keeps the cost of change low; it encourages cross-functional collaboration, often involving different teams working together; and it results in working at a much higher speed. Since its introduction, DevOps has also served to highlight that organisations need to be more agile, and has inspired many to become so.
As DevOps has matured and become more mainstream, there has been a gradual evolution of the approach. With a similar evolution taking place in the cloud world, more intelligent tools have started to become available too. Consequently, developers are now able to follow DevOps processes with more discipline and greater efficiency, and the approach is showing the potential to revolutionise the enterprise.
Among the most significant developments in the evolution of DevOps so far has been the emergence of DevSecOps – the practice of integrating security into development. Historically, security was largely overlooked in DevOps because including it during development was seen to hinder speed. Instead, security was commonly retrofitted after a build. However, as developers and organisations have begun to realise that this approach not only makes the process more difficult but is also not the most security-conscious method, some have started to integrate security into DevOps from the outset. This DevSecOps approach allows developers to address any security issues at the time of development, rather than retrospectively.
Beyond this, DevSecOps is also helping businesses to break down siloes by encouraging greater collaboration across teams to ensure that security experts are involved and knowledge is being shared.
As more organisations adopt a DevSecOps approach, they should adopt the following two initiatives:
Detailed review processes
A vital principle of DevSecOps is to continually review security. This means compliance monitoring for PCI and GDPR, determining what the process is if security senses a threat and deciding how the business will assess if code is susceptible to a certain vulnerability. In order to do this successfully, it’s important for an organisation to establish a review process from the moment it thinks about architecting a new solution. From here, it can move to ongoing monitoring and management of security as the code progresses through every stage, from the developer desk to the building of the solution and the testing of it. It’s also crucial to ensure developers are given training and are taught to be aware of security throughout the development journey.
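The continual-review principle above can be sketched as a simple pipeline gate: each stage runs a list of named checks and is blocked from proceeding if any fail. The check names below are illustrative stand-ins for real scanners (dependency audits, PCI and GDPR compliance rules, vulnerability scans), not a real tool’s API:

```python
# Hedged sketch of a DevSecOps review gate. Each pipeline stage runs a
# list of (name, passed) checks; any failure blocks the stage.

def review_gate(checks):
    """Return names of failed checks; an empty list means the stage may proceed."""
    return [name for name, passed in checks if not passed]

checks = [
    ("dependency-audit", True),
    ("pii-field-scan", True),    # e.g. GDPR: no unmasked personal data
    ("card-data-scan", False),   # e.g. PCI DSS: card data found in logs
]
failures = review_gate(checks)
if failures:
    print("build blocked by:", failures)
```

Running the same gate at every stage – developer desk, build, test – is what turns security review from a one-off retrofit into the continuous process DevSecOps calls for.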
A culture of collaboration
Enterprises must ensure they have the right mindset and embrace a collaborative culture which recognises that different expertise from across the business is required for DevSecOps to be effective. Traditionally, developers have been focused solely on aspects such as logic and algorithms, with security factoring only as an afterthought. However, by adopting a culture that encourages collaboration at the start of every build, organisations will be able to create secure, stable, resilient solutions which will pay dividends.
The next phase of DevOps
The advent of DevSecOps signals a continued evolution of DevOps, and while there is no guarantee regarding where the approach might go next, there are two main theories regarding its future:
NoOps is the idea that solutions will feature everything they require, such as code standards, security, libraries and legislation protocols, from the outset, and that everything will be completely automated. Technically, as everything would be automated within the software provisioning pipeline, there would be no need for manual, human-based operations; instead, humans would be required merely to monitor and raise questions as they verify the software. As everything would automatically meet a certain standard, this could potentially guarantee a higher level of security and resilience.
Rather than DevOps disappearing completely, different types of Ops may emerge. Ops could be augmented by machine learning (ML), or MLOps could be developed to form a machine learning-driven operation that would be able to certify the standards that organisations want software to be written with and even flag issues with it.
The evolution of DevOps is likely to continue as organisations become more familiar with it and technology continues to advance at pace. In time, this will result in DevOps beginning to encompass new technologies, such as ML, and all of the requirements of development being brought together. Ultimately, this will be extremely beneficial, encouraging collaboration across departments and ensuring that new solutions meet required standards and security from the outset.
Containerised applications are fast becoming an established fact in the IT infrastructure of global organisations.
By John Rakowski, vice president of strategy at LogicMonitor.
According to a Gartner survey, by 2022, more than 75 percent of companies will be running containerised applications in production – a sweeping uptake from the fewer than 30 percent that do so today. While containers can help teams gain scalability, flexibility and, ultimately, delivery speed, they also create a lot of complexity as applications and associated infrastructure become more distributed. It should also be noted that if an organisation uses containers, it will also likely use an orchestration tool, such as Kubernetes, for the deployment and management of those containers. As such, it is important that DevOps teams have monitoring in place to increase visibility, able to seamlessly spot performance or availability issues that originate in containers or traditional infrastructure architecture, in order to prevent business problems. To ensure this, there are a few pivotal rules to follow to maximise container investment.
Monitor apps alongside infrastructure
For traditional infrastructure, the ideal practice is to monitor server performance metrics and the health of the application running on it, including traces or calls made to other components. However, the work of IT teams is frustrated by the layers of complexity added by Kubernetes. The IT team must undertake the daunting task of monitoring not only the server and application, but also the health of the containers, pods, nodes and the Kubernetes control plane itself.
To maximise the return on investment (ROI) of container investments, the monitoring of the Kubernetes control plane and master components is highly important. Unhealthy components lead to issues in the scheduling of workloads, and this can directly impact scalability and flexibility benefits, plus the running of business applications. When these applications fail, it can put serious strain on an organisation’s service level agreements (SLAs), customer commitments and the overall brand.
Beyond the monitoring of components, a keen eye must be kept on the overall business service being delivered. This multi-level monitoring necessitates a tool that is not just capable of monitoring containerised applications and all elements of infrastructure, but can easily roll up information into an overall service view. This ensures that DevOps and IT support teams have holistic context on how components link together and how any emerging issue impacts related processes.
Monitor at the service level
The underlying infrastructure of applications is ever-growing in complexity. This being the case, it is important that an organisation’s IT team prioritises the applications and services critical to business functionality. To help guarantee a clear perspective of networking infrastructure, IT teams must not be too focused upon the individual container view – after all, if one specific container has raised an alert, this does not necessarily mean multiple other containers are failing. It may be that, despite the alert, business services have not been negatively impacted.
An effective method to maximise ROI by staying focused upon what is integral to business functionality is by identifying and monitoring key performance indicators (KPIs) across different containers. This will provide an overall service or application level view and a telling perspective of how your applications are performing.
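A minimal sketch of that service-level roll-up, under the assumption (the threshold here is illustrative) that a service is degraded rather than down while most of its pods are still running:

```python
# Roll individual pod health up into a single service-level status, so an
# alert on one container does not by itself imply the business service is
# impacted. The 75% threshold is an assumed example policy.

def service_status(pod_states, degraded_below=0.75):
    ready = sum(1 for s in pod_states if s == "Running")
    ratio = ready / len(pod_states)
    if ratio == 1.0:
        return "healthy"
    return "degraded" if ratio >= degraded_below else "critical"

print(service_status(["Running"] * 9 + ["CrashLoopBackOff"]))  # degraded
```

This is the logic behind the point above: one failing container out of ten raises an alert, but the service-level view shows the business function is still being delivered.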
Automated monitoring saves time
When using Kubernetes, containerised workloads are scheduled across nodes in the cluster so that resources are used efficiently to meet workload requirements. The process of manually adding containers and pods to and from monitoring is time consuming, inefficient and, simply put, unrealistic. The oft-recycled truism ‘time equals money’ is certainly correct when an organisation’s IT team is stuck with a monitoring solution that requires manual changes. In this scenario, teams are faced with numerous tasks such as adding monitoring agents, configuring metrics to be collected, and even specifying when alerts should be triggered by changing thresholds so that they reflect the needs of the business service.
Underlining the need for automation is the fact that the resources themselves can be short-lived to begin with. Sysdig’s 2018 Docker report demonstrated the ephemeral nature of containers by finding that 95 percent of containers live less than a week, 85 percent live less than a day and 74 percent live less than an hour, while 27 percent live between five and 10 minutes, and 11 percent live less than 10 seconds.
These fleeting lifespans are not necessarily a drawback – that is why companies choose to implement containers. However, to maximise ROI, it is imperative that IT teams automate container monitoring, including automatically adding and deleting cluster resources to be monitored, in order to reduce manual effort involved.
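The automated add-and-delete step described above reduces, at its core, to a reconciliation loop: diff the resources the orchestrator currently reports against the set being monitored, then add and remove monitors accordingly. The pod names are invented for illustration:

```python
# Sketch of the reconciliation step behind automated container monitoring:
# compare what the cluster reports against what is currently monitored.

def reconcile(discovered, monitored):
    to_add = discovered - monitored      # new resources to start monitoring
    to_remove = monitored - discovered   # dead resources to stop monitoring
    return to_add, to_remove

discovered = {"pod-a", "pod-b", "pod-d"}   # from the orchestrator's API
monitored = {"pod-a", "pod-b", "pod-c"}    # currently in the monitoring tool
add, remove = reconcile(discovered, monitored)
print(sorted(add), sorted(remove))
```

Run continuously, a loop like this is what keeps monitoring coverage in step with containers that may live for only minutes or seconds.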
A unified view of monitored resources is essential
Companies that use Kubernetes are often complex in infrastructure – they may operate both in the cloud and on-premise, while using containers as a unifying layer to standardise application management and deployment. To effectively manage such labyrinthine systems, it is essential to have unified visibility of business services and their underlying infrastructure components, including containers and traditional components. But more importantly, this visibility must provide automatic context, e.g. if an issue in a container starts to arise, then the impact on other pertinent infrastructure components and the overall business service is made known.
It is a fact of modern IT infrastructure that diverse environments are connected, so without a unified view, it can be immensely challenging to troubleshoot issues that transcend environments. Given this complexity, monitoring tools must go beyond only monitoring what has happened, to understanding why – a unified intelligence approach that helps users remain proactive when encountering these challenges.
Effective monitoring ensures ROI on container investments
Containerised applications can be a useful tool in the employ of IT teams, offering scalability, flexibility and delivery speeds. However, their utility is often matched by the complexity that they, and the Kubernetes orchestration tool, bring to infrastructure. To maximise ROI, effective monitoring is all but essential. The ideal means of doing this is by following these few pivotal rules, and when these are followed, an organisation can truly make the most out of containerised applications.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 2.
Breanna Wilson, Manager of Corporate Customer Success at Conversica, on the rise of the virtual assistant:
“We’ve heard a lot about how artificial intelligence (AI) will transform the back office function within businesses, but very little about its transformational impact on the front end: the marketing, sales and customer success functions, where there are opportunities to help drive revenue. This is changing though. Many businesses are already starting to augment their sales and marketing teams with Intelligent Virtual Assistants (IVAs) that allow them to scale their business and accelerate revenue across the customer journey.
“One example is CenturyLink, the third largest telecom company in the US. The company started using our AI-powered sales assistant to drive warm leads through their sales funnel. To begin with, our AI assistant was tasked with reaching out to smaller prospects that did not have a designated sales rep. The AI assistant was able to contact these prospects and make many feel like they had a personal relationship with CenturyLink.
“The AI Assistants that they named ‘Angie’ and ‘Ashley’ immediately went to work setting up 300 calls for four sales reps, which quickly became a case of allowing them to follow up with 800 telemarketing leads per week because the quality was so good. CenturyLink is now using Conversica to contact 90,000 prospects every quarter.
“Using Intelligent Virtual Assistants to augment their workforce has been a game-changer for their business. IVAs identify and deliver hundreds of qualified leads each week who are truly ready to engage with salespeople. For CenturyLink, this has resulted in increased excitement among their sales team, which in turn has led to a 20:1 ROI on a monthly basis.
“This steady approach is often required as it helps sales teams, who can often be hesitant of the idea of technology augmenting their work, to steadily become accustomed to the idea. It also allows businesses to test the personalised approach to AI in sales in a safe environment. As more businesses become more digitally adept, expect intelligent automation to transform the sales and marketing process in these ways.”
Assessing where we are now with regards to AI in the business world is somewhat challenging. On the one hand, there are a few flagship organisations that have been trailblazing when it comes to AI and have built genuinely competitive offerings with it – e.g. Tesla has built a genuinely competitive autonomous driving experience (Autopilot) for its cars by utilising the data from the millions of miles driven by Tesla users. In the financial sector, the new FinTech companies apply AI and machine learning algorithms not only in core banking services but also in back-office processes to improve the end customer experience and service quality. This has forced the incumbent banks to digitally transform themselves by using technologies like AI and ML too. In fact, a 2019 survey of financial service institutions conducted by Open Text found that 75% of respondents had some form of AI strategy in place.
However, for the most part, companies are behind the curve when it comes to investment in AI. This is largely because organisations either simply do not have enough usable data to make use of AI, or lack a clear vision and strategy around digital transformation. In regulated industries like healthcare, there has lately been plenty of research into applications of AI, yet few applications have reached production. This is mostly due to regulatory issues, data silos, and the interpretability of AI models. Interpretable AI is required in mission-critical processes so that humans have confidence in the decisions made by AI.
Many businesses have experience of one-off, ad-hoc AI projects without a clear connection to the company’s vision and strategy, and these projects end up being very expensive. All of these new technologies must be managed and developed holistically, not individually, to get the best possible business results, and they must be aligned to the strategy in order to fully enable new services and business models.
Born-digital companies operate this way automatically, but for many older businesses the regeneration process is too slow. The technology exists and is in place, but an understanding of how to derive value from it in the new competitive environment is still missing. Boardrooms have yet to understand how AI and robotics can help businesses meet their objectives.
"Whether it is data visualisation (Descriptive Analytics), finding correlation in a massive amount of data (Diagnostic Analytics) or making predictions (Predictive Analytics), AI can augment humans in what they do. However, there is some anxiety about algorithms or machines being able to give instructions based on specific events and data (Prescriptive Analytics) or even to execute those instructions with little or no human intervention (Automated Decisions): will they take over and fire us all?
The reality is very different, and we are very far from such a scenario, if it ever happens. There are three practical examples of the use of AI in today’s businesses.
We have all experienced some of this, even unknowingly – e.g. when an online shop proposes other products while we browse its catalogue, or when we interact with an online chat. However, at the moment, mainly the larger businesses have managed to make some use of AI.
Smaller businesses might still feel a bit intimidated, as they look at AI as mere technology, rather than as a tool to improve the business.
It’s important to stress that AI is about augmenting what your staff do — and freeing up their time for better things — not replacing them. Your goal is to solve business issues with the right solutions. That requires business acumen as the starting point.
Think about the core business challenge, not technology. The tail shouldn’t wag the dog!
So, if you are an SME, how do you want to use AI?
Rather than taking a ‘big bang’ approach, split your challenge into bite-sized chunks. Use AI to solve little bottlenecks first or to glean specific insights that inform your decision-making.
You might discover that the software you need exists already. If not, then think creatively. For example, you could team up with a local university or college and work collaboratively, providing some modest financial incentives and prizes. Smart minds like a challenge and may relish the opportunity to solve a real-life business headache.
This way, you’ll solve the business issue, break the ice with AI, and gain new tools that sharpen your competitive edge."
It has long been held that the world is flat when it comes to technological progress connecting users at an ever-increasing pace. The sewing machine took 1119 years to spread beyond Europe, whereas WhatsApp messaging took only six years to reach 700 million users.
It is evident that digital advances move at pace, and these days digital has become synonymous with Artificial Intelligence, from Alexa to Siri to Eggplant.
Some people are nervous about adopting AI-powered technologies, but our experience is that if vendors can clearly explain how algorithms work and show how AI is driving business outcomes, enterprises are hungry to adopt. In a recent Eggplant global survey, 98% of respondents said they are looking to attain 50% automation of their software development cycle and will likely use AI to get there.
AI is vastly impacting customer experience and how users interact with organisations. Smart companies are using it to personalise, analyse, monitor and test across any platform, simulating real user interactions and determining how technical factors are affecting business outcomes. Today personalisation is at the cosmetic and content level, but we now have the technology to personalise the entire digital experience at a much deeper level.
Ultimately, businesses working with AI are increasing competitive advantage, improving efficiency and creating amazing digital experiences - all of which lead to positive business outcomes.
Terence Tse, Associate Professor of Finance, and leader of the Master in Digital Transformation Management & Leadership programme at ESCP Business School, offers a perspective:
The answer: slow motion, at 128x. In spite of the picture painted by the media, the truth is that the speed of putting AI into business operations has been excruciatingly slow. If anything, AI is getting smarter and smarter by the day. Instead, the bottleneck relates to something a lot more mundane: it turns out to be difficult, indeed extremely difficult, to embed AI models into legacy IT infrastructure. The result is that a lot of companies have produced plenty of proofs of concept for AI models, but have gone no further.
What many companies have failed to understand is that building outstanding, workable AI models is not the key to deploying AI. The key lies in integrating AI into the business. Take the analogy of an AI model as a powerful car engine. What your business is truly seeking, however, is how to get from point A to point B. You need the rest of the vehicle to make the journey and maximise the benefit from the engine. No matter how powerful your AI model is, it is relatively useless if your IT systems cannot turn the model’s output into successful business propositions.
Most businesses are unclear about the importance of AI Operations, or “AIOps”. The term evolved from “DevOps” – a software engineering culture and practice that aims to integrate software development (Dev) and software operations (Ops). The idea is to apply automation and monitoring at all stages, from software construction to infrastructure management. Deploying AI is not very different. Harnessing the power of this technology requires companies to have strong AIOps capabilities, which involves system developers and engineers, not data scientists, building a reliable and rigorous “production environment”.
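The gap between a model and a production environment can be sketched in a few lines of Python. Everything here is illustrative – the model is a trivial stand-in and all the names are invented – but it shows how much of “AIOps” is the validation, logging and graceful degradation wrapped around a model, rather than the model itself.

```python
# A minimal sketch (all names hypothetical) of an AIOps-style wrapper:
# the model is one line, the production plumbing is everything else.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("aiops-sketch")

def score_model(features):
    # Stand-in for a trained model: a trivial average of two inputs.
    return 0.5 * features["tenure"] + 0.5 * features["usage"]

def predict_service(payload, fallback=0.0):
    """Validate input, call the model, log the outcome, degrade gracefully."""
    required = ("tenure", "usage")
    if not all(k in payload and isinstance(payload[k], (int, float)) for k in required):
        log.warning("invalid payload %s -- returning fallback", payload)
        return {"score": fallback, "degraded": True}
    score = score_model(payload)
    log.info("scored payload -> %.3f", score)
    return {"score": score, "degraded": False}

print(predict_service({"tenure": 2, "usage": 5}))   # healthy path
print(predict_service({"tenure": 2}))               # degraded path
```

In practice the “vehicle” around the engine also includes deployment pipelines, monitoring and retraining triggers, but even this toy version makes the point: the model call is a single line.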
I would go as far as saying that companies have very, very little chance of succeeding in deploying AI into their operations if they do not have a clear idea of how AIOps works and how to set up a robust production environment. Even assuming that companies have a crystal-clear idea of what AI actually is and what it can do, given that embedding AI models into IT infrastructure is often both a complex business affair and a huge technical feat, there is still a long, long way to go before AI becomes a common fixture in companies.
AI and machine learning (ML) applications have been at the centre of several high-profile controversies, witness the recent Apple Card credit limit differences and Amazon’s recruitment tool bias. Mind Foundry has been a pioneer in the development and use of ‘humble and honest’ algorithms from the very beginning of its applications development. As Davide Zilli, Client Services Director at Mind Foundry explains, ‘baked in’ transparency and explainability will be vital in winning the fight against biased algorithms and inspiring greater trust in AI and ML solutions.
Today in so many industries, from manufacturing and life sciences to financial services and retail, we rely on algorithms to conduct large-scale machine learning analysis. They are hugely effective for problem-solving and beneficial for augmenting human expertise within an organisation. But they are now under the spotlight for many reasons – and regulation is on the horizon, with Gartner projecting four of the G7 countries will establish dedicated associations to oversee AI and ML design by 2023. It remains vital that we understand their reasoning and decision-making process at every step.
Algorithms need to be fully transparent in their decisions, easily validated and monitored by a human expert. Machine learning tools must introduce this full accountability to evolve beyond unexplainable ‘black box’ solutions and eliminate the easy excuse of “the algorithm made me do it”!
The need to put bias in its place
Bias can be introduced into the machine learning process as early as the initial data upload and review stages. There are hundreds of parameters to take into consideration during data preparation, so it can often be difficult to strike a balance between removing bias and retaining useful data.
Gender, for example, might be a useful parameter when looking to identify specific disease risks or health threats, but using gender in many other scenarios is completely unacceptable if it risks introducing bias and, in turn, discrimination. Machine learning models will inevitably exploit any parameters – such as gender – in the data sets they have access to, so it is vital for users to understand the steps taken for a model to reach a specific conclusion.
Lifting the curtain on machine learning
Removing the complexity of the data science procedure will help users discover and address bias faster – and better understand the expected accuracy and outcomes of deploying a particular model.
Machine learning tools with built-in explainability allow users to demonstrate the reasoning behind applying ML to tackle a specific problem, and ultimately justify the outcome. First steps towards this explainability would be features in the ML tool to enable the visual inspection of data – with the platform alerting users to potential bias during preparation – and metrics on model accuracy and health, including the ability to visualise what the model is doing.
Beyond this, ML platforms can take transparency further by introducing full user visibility, tracking each step through a consistent audit trail. This records how and when data sets have been imported, prepared and manipulated during the data science process. It also helps ensure compliance with national and industry regulations – such as the European Union’s GDPR ‘right to explanation’ clause – and helps effectively demonstrate transparency to consumers.
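As a rough sketch of such an audit trail (the class and method names here are invented, not any vendor’s API), each preparation step can be recorded with a timestamp so that how and when a data set was imported and manipulated remains reconstructable:

```python
# A minimal, hypothetical audit trail for data preparation steps.
from datetime import datetime, timezone

class AuditedDataset:
    def __init__(self, rows):
        self.rows = rows
        self.audit_log = []
        self._record("import", f"{len(rows)} rows imported")

    def _record(self, action, detail):
        self.audit_log.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "detail": detail,
        })

    def drop_column(self, name):
        self.rows = [{k: v for k, v in r.items() if k != name} for r in self.rows]
        self._record("drop_column", f"removed '{name}'")
        return self

    def filter_rows(self, predicate, reason):
        before = len(self.rows)
        self.rows = [r for r in self.rows if predicate(r)]
        self._record("filter_rows", f"{reason}: {before} -> {len(self.rows)} rows")
        return self

ds = AuditedDataset([{"age": 34, "gender": "F"}, {"age": -1, "gender": "M"}])
ds.drop_column("gender").filter_rows(lambda r: r["age"] >= 0, "remove invalid ages")
for entry in ds.audit_log:
    print(entry["action"], "-", entry["detail"])
```

A production system would persist this log immutably and tie entries to user identities, which is what makes it usable as evidence for a GDPR-style ‘right to explanation’.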
Providing humans with the tools to make breakthrough discoveries
There is a further advantage here in allowing users to quickly replicate the same preparation and deployment steps, guaranteeing the same results from the same data – particularly vital for achieving time efficiencies on repetitive tasks. In the Life Sciences sector, for example, we find users are particularly keen on replicability and visibility for ML, where it becomes an important facility in areas such as clinical trials and drug discovery.
Models need to be held accountable…
There are so many different model types that it can be a challenge to select and deploy the best model for a task. Deep neural network models, for example, are inherently less transparent than probabilistic methods, which typically operate in a more ‘honest’ and transparent manner.
Here’s where many machine learning tools fall short. They’re fully automated with no opportunity to review and select the most appropriate model. This may help users rapidly prepare data and deploy a machine learning model, but it provides little to no prospect of visual inspection to identify data and model issues.
An effective ML platform must be able to help identify and advise on resolving possible bias in a model during the preparation stage, and provide support through to creation – where it will visualise what the chosen model is doing and provide accuracy metrics – and then on to deployment, where it will evaluate model certainty and provide alerts when a model requires retraining.
…and subject to testing procedures
Beyond building greater visibility into data preparation and model deployment, we should look towards ML platforms that incorporate testing features, where users can test a new data set against the model and receive performance scores. This helps identify bias and make changes to the model accordingly.
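A minimal, hypothetical sketch of such a testing feature: score a stand-in model on a fresh data set, broken down by subgroup, so that a performance gap between groups – a common symptom of bias – surfaces immediately. The model and data here are invented for illustration.

```python
# Per-subgroup accuracy testing: a gap between groups flags possible bias.
def accuracy_by_group(model, rows, label_key, group_key):
    """Return per-group accuracy for a predict(row) callable."""
    scores = {}
    for row in rows:
        group = row[group_key]
        hit = int(model(row) == row[label_key])
        total, correct = scores.get(group, (0, 0))
        scores[group] = (total + 1, correct + hit)
    return {g: correct / total for g, (total, correct) in scores.items()}

# A model that only looks at income -- fair on its face,
# yet the test reveals it performs worse for one group.
model = lambda row: int(row["income"] > 30)
test_set = [
    {"income": 40, "group": "A", "label": 1},
    {"income": 20, "group": "A", "label": 0},
    {"income": 40, "group": "B", "label": 0},
    {"income": 20, "group": "B", "label": 0},
]
print(accuracy_by_group(model, test_set, "label", "group"))
# -> {'A': 1.0, 'B': 0.5}
```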
During model deployment, the most effective platforms will also extract extra features from data that are otherwise difficult to identify and help the user understand what is going on with the data at a granular level, beyond the most obvious insights.
The end goal is to put power directly into the hands of the users, enabling them to actively explore, visualise and manipulate data at each step, rather than simply delegating to an ML tool and risking the introduction of bias.
Industry leaders can drive the ethics debate forward
The introduction of explainability and enhanced governance into ML platforms is an important step towards ethical machine learning deployments, but we can and should go further.
Researchers and solution vendors hold a responsibility as ML educators to inform users of the uses and abuses of bias in machine learning. We need to encourage businesses in this field to set up dedicated education programmes on machine learning, including specific modules that cover ethics and bias, explaining how users can identify and, in turn, tackle or outright avoid the dangers.
Raising awareness in this manner will be a key step towards establishing trust for AI and ML in sensitive deployments such as medical diagnoses, financial decision-making and criminal sentencing.
Time to break open the black boxes
AI and machine learning offer truly limitless potential to transform the way we work, learn and tackle problems across a range of industries – but ensuring these operations are conducted in an open and unbiased manner is paramount to winning and retaining both consumer and corporate trust in these applications.
The end goal is truly humble, honest algorithms that work for us and enable us to make unbiased, categorical predictions and consistently provide context, explainability and accuracy insights.
Recent research shows that 84% of CEOs agree that AI-based decisions must be explainable in order to be trusted. The time is ripe to embrace AI and ML solutions with baked in transparency.
These days, data is viewed as the lifeblood of organisations. Gartner has been heavily focused on the importance of developing a data-driven culture in the past year, stating that: “Leaders need to cultivate an organisational culture that is data-literate and that values information as an asset.”
By Matt Middleton-Leal, EMEA & APAC General Manager at Netwrix.
Against this backdrop, it’s critical that businesses help their employees to use data properly. However, without effective information governance, even data-savvy employees struggle to derive true value from their enterprise’s content, spending as much as 36% of their time searching for information and succeeding in only half of cases, according to IDC. Therefore, a solid foundation for a data-driven culture includes rethinking the ways an organisation deals with information, as well as implementing the right technologies.
The root causes of poor information governance
Today, organisations generate unprecedented quantities of data, the major part of which is unstructured, including complex data such as PDFs, pictures and other documents. Using traditional manual methods for data management has become inefficient and, as a result, organisations are flooded with information. A recent report by The Compliance, Governance and Oversight Council found that 60% of corporate data has no “business, legal or regulatory value.” Such information is commonly disorganised, often old or irrelevant, and incurs high IT labour and storage costs due to its large volume.
Moreover, many organisations store their content in siloed systems, each with its own search functions. Predominantly keyword-based search has proven ineffective due to the complexity of modern enterprise content. For example, when an average employee searches for a document in a corporate storage system, he or she usually uses fragmented information, such as brand or practice names, rather than the exact file name. A keyword-based search will return every document that contains these words, most of which will be irrelevant to the request. As a result, the user has to spend significant time filtering out unnecessary information. In fact, a recent survey found that 82% of employees believe poor information management adversely impacts their productivity. Some organisations even classify documents manually in order to meet this challenge. However, this is time-consuming and introduces the risk of human error – after all, tagging documents incorrectly is worse than not tagging them at all. Partial search results lead to inconsistent business practices, which can negatively impact strategic initiatives.
Information governance best practices
How can organisations help their employees derive more value from enterprise content? Ideally, employees should be able to quickly find the data and information relevant to their request and share it amongst colleagues and teams. This assumes that every employee can apply existing corporate knowledge to their task, rather than creating an asset from scratch. For that, every piece of content needs to be indexed and managed thoroughly. Such an approach allows an organisation to take maximum advantage of its content, as it supports employees’ efforts to share best practice and encourages connected thinking, which in turn creates new business opportunities.
A robust information governance strategy enables an organisation to remove the roadblock caused by mountains of disorganised data and the limitations of keyword-based data exploration, and to accelerate data literacy across the organisation. The latter leads to benefits such as improved decision-making, increased employee productivity and makes organisations more competitive.
Here are the best practices for improving information governance:
1. Set priorities and design the structure
Once an organisation has decided to enhance information governance, it is recommended to appoint a dedicated team. The project team should start by asking questions such as: “How should data be organised so that employees can use it effectively?”; “Should it be searchable by business practice or sector?”; or “How many languages does the organisation operate in?”. To answer these questions, the team should arrange meetings with subject matter experts, which will help them transform this knowledge into structured taxonomies.
2. Employ machine learning to overcome the limitations of keyword-based search
Smart search should be aware of the context of the specific business environment and the user’s goals. Such context is derived from the metadata that describes what a document is about. To enable the search system to leverage metadata, complex tags need to be applied to each document. While this task is virtually impossible for a human to complete, due to its labour intensity and high probability of error, automated data classification copes with it easily. Machine learning then analyses an organisation’s content enterprise-wide, adapts to its specific environment and derives the terms and taxonomies that best describe the business context. Smart search uses these taxonomies to provide business users with accurate search results. Moreover, such a search uses not only traditional single keywords, but also “compound terms” – that is, short phrases that have a very specific meaning in the organisation-specific context. Users who wish to improve their results further can use refiners to narrow them down. With such an intelligent search system, the organisation can ensure employees quickly access the information necessary to achieve their business goals, as well as identify duplicate and irrelevant data for disposal.
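The “compound term” idea can be sketched very simply: derive recurring two-word phrases from an organisation’s own documents, tag each document with them, and search against the tags rather than raw keywords. The corpus below is invented purely for illustration; real systems use far more sophisticated term extraction.

```python
# Derive organisation-specific "compound terms" (recurring bigrams),
# tag documents with them, and search against the tags.
from collections import Counter

def compound_terms(docs, min_count=2):
    """Return two-word phrases that recur across the corpus."""
    counts = Counter()
    for doc in docs:
        words = doc.lower().split()
        counts.update(zip(words, words[1:]))
    return {" ".join(bg) for bg, n in counts.items() if n >= min_count}

def tag_documents(docs, terms):
    return [{"text": d, "tags": sorted(t for t in terms if t in d.lower())}
            for d in docs]

def search(tagged, query):
    q = query.lower()
    return [d["text"] for d in tagged if any(q in tag for tag in d["tags"])]

docs = [
    "Clinical trial protocol for phase two",
    "Results of the clinical trial in oncology",
    "Cafeteria menu for next week",
]
terms = compound_terms(docs)          # {'clinical trial'}
tagged = tag_documents(docs, terms)
print(search(tagged, "clinical trial"))
```

Note how the phrase “clinical trial” emerges from the corpus itself, rather than being configured by hand – the essence of letting the content define the taxonomy.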
3. Educate employees
Employees are often resistant to change, since getting used to new technologies might take them additional time. Therefore, the project team should explain the value of information governance to every employee. Moreover, senior management should take a leadership-driven approach and encourage mid-level managers to explain how the established information governance contributes to their everyday work and opens up new business opportunities for an organisation.
Taking such an approach to information governance, empowered by machine learning, helps organisations build a data-driven culture: it enables them not only to organise their data, but also to change the way of thinking inside the organisation, encouraging employees to keep, share and reuse good-practice knowledge. One positive business outcome of robust information governance is the ability to leverage corporate expertise to provide better service and long-lasting value to customers. Moreover, with such governance in place, organisations can easily identify and delete duplicate, old and irrelevant data, cleaning up storage and cutting costs.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 3.
How AI is empowering customer service agents, according to Jaime Scott, CEO and co-founder of EvaluAgent:
Automation has been a hot topic in the customer service industry for some time now, with excitement growing around the potential for AI to revolutionise the way businesses interact with their customers and, in theory, cut costs.
However, as seen with the offshoring boom, viewing customer service as merely a cost to be optimised can have a devastating effect on customer satisfaction levels. So, what role should AI play in the industry going forward, and how can businesses avoid repeating past mistakes?
The rise of the ‘super-agent’
As AI technology becomes more accessible, greater numbers of businesses will undoubtedly choose to start automating some of the more basic, transactional customer-facing tasks previously undertaken by agents.
However, rather than replacing the need for live agents, the rise of AI has in fact increased demand for human interaction from customers, making the role of live agents all the more important.
Thanks to AI, agents are increasingly dealing exclusively with more complex, emotive enquiries that require them to engage with customers in a sensitive, personal way that is not yet possible for technology.
Forward-thinking businesses have identified this as an opportunity to differentiate themselves from their competitors, and have begun providing their agents with more extensive, specialised training, with an emphasis on emotional intelligence and soft skills in addition to product knowledge.
These so-called ‘super-agents’ are customer service experts, capable of building a genuine rapport with customers and providing a consistently excellent experience thanks to their empathy, authenticity and effectiveness.
While customer-facing technologies such as chatbots are the first to come to mind when thinking about AI in customer service, AI will actually prove just as, if not more, useful behind the scenes.
AI is capable of interrogating large volumes of data extremely quickly, something that is already being put to good use in other sectors, such as law, with growing numbers of law firms developing their own in-house AI systems to automate laborious research tasks.
Customer support leaders should look to emulate this, in order to improve agents’ access to customer data. For example, AI is capable of immediately gathering all relevant data from a company’s CRM system, speeding up the process and arming the agent with the tools they need at the beginning of the call to effectively and sensitively handle the query.
Importantly, this can include information about previous interactions with a customer across all channels, such as social media. This is a really effective way of minimising customer frustrations as a result of having to re-explain certain issues, and can flag certain customers as high priority if they have reached out across multiple channels already.
It’s important that businesses do not simply view AI as the latest cost-cutting tool, as this will inevitably lead to unhappy, dissatisfied customers. However, implemented correctly, AI will help to pave the way for more highly skilled, compassionate agents who, thanks to clever use of technology, are able to provide the best possible level of service to every customer.
Gasan Awad VP, Product Management, Financial Crimes and Risk Management, Fiserv, talks about getting sci-fi smart in the fight against fraud:
In only a handful of years, the financial services industry has swung between being enamoured with the potential of Artificial Intelligence (AI) to disenchantment with its current capabilities, with opinions varying based on who you asked and when you asked them. As the use of AI has spread it has become clear that, as in many a sci-fi story, maximising its potential requires a balance between man and machine. Far from heralding a headlong robot-driven rush into the future, the growing use of AI has underscored that human intelligence remains critical to the success of financial services in general, and the fight against fraud in particular.
Using data analytics to detect fraud is not a new concept. Credit card companies, financial institutions and others have been using behavioural and predictive analytics to reduce risk and fraud for years. AI technologies have taken these capabilities to a new level, using greater automation, unsupervised learning and more sophisticated algorithms to strengthen fraud detection and prevention. Ultimately, it is the combination of this intelligence with smart, machine-driven automation that adds power to the fight against fraud.
A key element of successful fraud prevention is the ability to leverage data. Machine learning (ML) is a particularly promising element of AI, because it enables solutions to build on previous knowledge and evolve to face new fraud threats without being expressly programmed to do so. As these capabilities continue to advance, ML systems could be fine-tuned to deliver the best response for any situation, providing a higher level of security with fewer disruptions to the customer experience.
The speed and scale of AI-powered solutions – and their ability to look at data in a different way – enable financial institutions to successfully monitor for suspicious activity that would otherwise go unnoticed. Today, many organisations are using software to review data from limited sources using predetermined rules. AI enables more comprehensive and continuous monitoring of data and behaviour across multiple channels.
By reviewing all account activity in real time, including ACH transfers, wires, P2P transfers, credit, debit and bill payments, an AI system identifies behavioural anomalies and suspicious transactions faster and more effectively. This facilitates more accurate fraud detection and reduces friction for legitimate transactions, creating an enhanced customer experience. Aggregated account data can also be combined with market information to produce better insights that help detect emerging threats – enabling financial institutions to spot and prepare for threats they may otherwise not have known existed.
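Behavioural anomaly detection of this kind can be illustrated with a deliberately simple sketch – a real system uses far richer models across many channels, but the underlying principle of comparing each transaction against the account’s own history is the same. The figures below are invented.

```python
# Flag transactions far outside an account's own historical pattern.
from statistics import mean, stdev

def is_suspicious(history, amount, threshold=3.0):
    """Flag an amount more than `threshold` standard deviations
    from the account's historical mean."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu
    return abs(amount - mu) / sigma > threshold

history = [42.0, 38.5, 45.0, 40.0, 41.5]  # typical card spend
print(is_suspicious(history, 43.0))    # within the usual range
print(is_suspicious(history, 950.0))   # far outside the pattern
```

The same per-account baseline idea extends naturally to frequency, channel and merchant features; the gain over fixed rules is that each account carries its own definition of “normal”.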
Adding biometrics to the mix, including voice, face and fingerprint authentication, strengthens AI strategies by creating an additional layer of protection to stop fraud before it begins.
Artificial Intelligence and Human Potential
AI is not about replacing human interactions. It's about creating a new calibre of better, more personalised and scalable experiences driven by a combination of human potential and technology-enabled insights. With AI, people and applications can become smarter and more capable.
As financial transaction speeds continue to increase, there’s significantly less time for approvals. Deploying real-time machine learning algorithms can improve accuracy and speed of decisions that help prevent fraud. These sophisticated algorithms can dig deep into the depths of information at an institution’s disposal and are calibrated to be more accurate over time, greatly reducing false-positive fraud alerts.
Intelligent automation can also reduce investigation times and the workload involved in uncovering criminal activity. Easier access to relevant information empowers employees to make better-informed decisions more quickly and at scale and can contribute to a sense of empowerment and higher employee engagement.
When repeatable processes are automated through a combination of AI and automation, employees are needed to make decisions when conditions fall outside the norm. Businesses still come up against new situations that prove impossible to anticipate. Internal teams must be ready to think fast, think creatively, adjust policies and fine-tune AI systems to address new realities such as evolving forms of fraud.
Organisations that stay open about their goals for AI and stress the intent is to better equip and enable people, rather than replace them, will help ensure their teams fully embrace and support their strategy.
Informing the Customer Experience
Customers also stand to benefit from AI. Improved accuracy in the detection of fraud, for example, can lead to fewer false positives, minimising disruption to customers whose transactions could be unnecessarily held or delayed. As AI applications take in data and build knowledge over time, meaningful and actionable insights will increase.
As financial institutions begin planning for a future that involves AI, customer needs should remain at the centre of their decisions. The right strategy comes down to understanding how people want to be served. While certain tasks can be facilitated by automation, people are still required to make critical decisions and evaluate their potential to impact customers.
Beyond simply offering a step forward in the detection and prevention of fraudulent activity, AI can propel institutions’ strategies in new directions. By effectively connecting technology with human intelligence, financial institutions can better protect their organisation and their reputation while improving the customer experience.
We live in an era where “narrow intelligence” (artificial intelligence focused only on very specific tasks) can be used to great benefit. Cyber security teams have been using machine learning for years to better detect malware, categorise content and detect anomalies in traffic or behaviour, but a significant amount of human interaction is needed to build and maintain these solutions. This is not a bad thing – the current state of AI is such that as a technology it can be a powerful ally, but is far from being self-sufficient.
In addition to being a powerful tool for defenders, such as handling enormous amounts of data that would be impossible to do manually, AI can also be used by attackers. Although not yet very often seen, the emergence of open source tools offering capabilities ranging from creating perfect impersonations of voices to utilising state of the art reinforcement learning methods to improve the efficiency of attacks means they will soon become more prevalent.
To prepare themselves, defenders must continue to ensure they are able to respond to threats rapidly enough. For example, when a multi-agent attack bot launches and completes a data extraction from an initial breach within milliseconds, there is just no way for human defenders to keep up without the help of AI.
But we must remember that AI is not infallible. The more we use solutions that learn from data, the more we create new types of attack surfaces. AI is not inherently good or bad, but it also is not inherently secure – when building AI solutions we must take steps to not only ensure we protect our AI from intentional manipulation, but also unintentional, undesired behaviour such as bias or having blind spots where we did not have training data. These issues can be addressed, but only when we become aware of them and act when building the systems.
One could say that humankind currently looks at AI from the perspective of replicating human intelligence – a viewpoint that severely limits the potential of machine intelligence. Computers are not like people; they have very different strengths – and when building solutions that utilise machine intelligence, we should use the strengths of computers and not just try to replicate how we think. Just as humankind first tried to fly by mounting wings on our arms and flapping them wildly, but ended up building airplanes and rockets, true machine intelligence will most likely take a very different form than most expect – something we at F-Secure are exploring in Project Blackfin, for example. This does not mean general intelligence or robot overlords will be taking over the world, but I do believe we are nearing the next era of AI solutions, and it will be very exciting to see!
Brendan Dykes, Senior Director Solution and Product Marketing at Genesys, says that in the UK, businesses are adopting and thinking about artificial intelligence (AI) at a rapid rate:
Research conducted by Genesys in late 2019 found that 60% of UK businesses are already using AI or planning to start using it within the next 12 months, while 42% of company owners expect the technology to have a positive impact on their businesses within 12 months of deployment.
The benefits of AI are being recognised by employees too. Almost two thirds (64%) of employees say they value AI, and even those who are yet to work with it are overwhelmingly positive about its potential, according to the research. Many see AI as an essential tool for any successful business, with 79% saying they do not believe their companies will be competitive without it.
However, forward thinking businesses understand that AI will have the greatest impact when it is used to seamlessly work with the existing workforce. This will require businesses to combine the best data, technologies and people. Instead of replacing people, AI will automate repetitive tasks, empowering workers to perform their roles more efficiently while removing the dull and monotonous aspects of their work life and instigating a more fulfilling employee experience.
However, there are barriers that explain why not all companies are adopting the technology. The research shows 40% of employers think implementation could be too complex, while 24% say their leaders are hesitant to invest because they don’t believe the hype surrounding AI. Cost is an area of concern for 20% of businesses, and 16% are hesitant to implement the technology because they are worried about the disruption that training and deployment will cause. It is worth noting that there are platforms available that make it easier for companies to deploy technologies such as chatbots without major IT investment or the need for computer scientists.
When it comes to training, employees overwhelmingly expect it to be provided. Less than half feel that they have the skills required to work with AI, and receiving training could help to allay fears among employees who believe they will be replaced by the technology.
Historically, you distributed software to consumers and customers, but you didn’t get any insight into how they used it - which features were in use and which were not. When providing your software as a service, you gain a wealth of data and insights that can help you improve that service. You can use this information to provide insight to your customers, better understand usage patterns and, ultimately, give intelligent feedback. The SaaS era coincides with the almost ubiquitous concept of Big Data. And SaaS, which is able to leverage AI techniques, has a distinct advantage over the software provision days of old - the provider now has access to aggregated data from different customers, which it can leverage to build a better service.
Companies now hold significant volumes of customer data all in one place, and AI enables a more automated means of processing it at scale. Gartner defines big data not only by volume, but also by variety and velocity. Variety refers to the different media we use to represent data (beyond simple figures), while velocity is the speed at which data is collected, analysed and acted upon. The reality is that IT teams are dealing with increasing amounts of data and a variety of tools to monitor that data - which can mean significant delays in identifying and solving issues. And with the whole area of IT operations being challenged by this rapid data growth (which must be captured, analysed and acted on), many businesses are turning to AI solutions to help prevent, identify and resolve potential outages more quickly.
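To make the “identify and resolve more quickly” point concrete, most AIOps tooling starts with automated anomaly detection over metric streams. The sketch below is a deliberately simple, assumed baseline (not any vendor’s actual algorithm): it flags a latency sample that strays more than three standard deviations from a rolling window of recent history.

```python
from statistics import mean, stdev

def detect_anomalies(samples, window=10, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Steady latency readings (ms) with one sudden spike at index 12
latencies = [20, 21, 19, 20, 22, 20, 21, 19, 20, 21, 20, 19, 90, 20, 21]
print(detect_anomalies(latencies, window=10))  # → [12]
```

Real platforms layer seasonality models and event correlation on top, but the principle - baseline the normal, alert on the abnormal - is the same.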
Marketing is particularly well placed to leverage AI techniques. The data SaaS companies collect needs to be relevant and recent: the more up to date it is, the more efficiently it can be put to use. Large corporations can access data collected through loyalty programs and cross-promotional activities, while smaller businesses can acquire data through customer surveys, online tracking or competitor analysis. AI solutions can certainly be a golden opportunity for businesses to broaden their perspective on potential customers.
For B2B customer-centric businesses, AI allows functions which previously had a manual component to be automated - for example, many customer experience processes, such as training and onboarding, marketing campaigns and ongoing customer service. AI essentially aggregates large quantities of data - customer data, for example - and feeds it into automated processes. Customer service AI platforms like chatbots, which respond to and troubleshoot customer inquiries automatically, enable customer service departments to take on additional inquiries. That’s great news for revenue retention and churn reduction, as customers tend to show a heightened interest in a purchase following a positive customer service experience. Likewise, a negative customer service experience is a sure way to lose customers. Supplementing your customer service team with AI technology can target the seamless cross-section between convenience, problem-solving and human experience - a typical example being the use of machine learning to automate aspects of customer service (especially self-serve).
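To make the chatbot idea tangible, the toy sketch below routes an inquiry to a canned answer by keyword overlap, handing anything unrecognised to a human. The intents, keywords and responses are invented for illustration; a production platform would use a trained NLP intent classifier, but the routing pattern is the same.

```python
# Hypothetical keyword-to-intent map (illustrative, not a real product's data)
INTENTS = {
    "billing":  {"invoice", "charge", "refund", "payment"},
    "shipping": {"delivery", "shipping", "tracking", "parcel"},
    "account":  {"password", "login", "account", "email"},
}

RESPONSES = {
    "billing":  "I can help with billing. Could you share your invoice number?",
    "shipping": "Let me check that delivery. What is your order number?",
    "account":  "For account issues, let's start by verifying your email.",
    None:       "Let me connect you with a human agent.",
}

def classify(message):
    """Return the intent whose keywords overlap most with the message."""
    words = set(message.lower().split())
    best, best_score = None, 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

def reply(message):
    return RESPONSES[classify(message)]

print(reply("where is my parcel tracking update"))
```

Note the fallback: anything the bot cannot classify escalates to a person, which is exactly the AI-plus-human split the article describes.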
The main UX challenge of SaaS is remoteness. Artificial Intelligence can help to alleviate that sense of remoteness whilst delivering a more satisfying experience to the customer. There has been a lot of scaremongering about machines taking jobs from humans, and about AI bringing automation to practically all walks of working life - however, the more likely scenario is that AI will deliver most value when it is deployed in conjunction with human beings. SaaS can, and should, manage both those interactions which can be handled automatically (classic SaaS) and those which require human intervention. AI-augmented human interactions can drive SaaS interactions too.
Natural language processing and machine learning allow SaaS companies to massively boost personalisation. And, let’s face it, consumers have become much more demanding, wanting experiences that are tailored to their specific needs. AI can help here by analysing your users’ previous actions, giving you actionable insights into their preferences and interests. This allows you to configure user interfaces to provide that all-important tailored experience. Remember, if customers don’t receive it from your business, they’ll go elsewhere to find it.
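A minimal sketch of that idea: derive each user’s top interests from a log of their previous actions, then use the ranking to decide what the interface surfaces first. The event format and category names here are assumptions for illustration; a real system would feed a recommender model rather than raw counts.

```python
from collections import Counter

def top_interests(events, k=3):
    """Rank content categories by how often each user interacted with them.
    `events` is a list of (user_id, category) pairs - a simplified,
    assumed stand-in for a real clickstream."""
    by_user = {}
    for user, category in events:
        by_user.setdefault(user, Counter())[category] += 1
    return {user: [c for c, _ in counts.most_common(k)]
            for user, counts in by_user.items()}

events = [
    ("alice", "reports"), ("alice", "reports"), ("alice", "dashboards"),
    ("bob", "billing"), ("bob", "api"), ("bob", "api"),
]
print(top_interests(events, k=2))
# → {'alice': ['reports', 'dashboards'], 'bob': ['api', 'billing']}
```

Even this crude frequency ranking is enough to reorder a dashboard per user; the value of ML comes in predicting interests the raw counts do not yet show.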
The Wellcome Sanger Institute deploys EcoStruxure IT Expert™ to drive energy efficiency and ensure more funding for genomic research.
The Wellcome Sanger Institute is one of the world’s leading research facilities focused on genomic discovery.
Based at the Wellcome Genome Campus, Cambridge and established in 1993, the Institute conducts key research into improving the outcomes of human health using data derived from genomic sequencing, particularly in the areas of cancer, malaria and other pathogens. It also works closely with a wide range of collaborators – both internationally and across the Campus – to share the results of its research with the wider scientific community.
At the core of Sanger’s technical infrastructure are its DNA sequencing machines; a fleet of complex and advanced scientific instruments which generate vast quantities of data that must then be analysed within their on-premise data centre.
The nature of genomic research, a cutting-edge and evolving area of science, means that the demand for data-processing capacity is only likely to increase over time – genomic data is soon set to become the biggest source of data on the planet.
In fact, as much genomic data has been produced in the last eighteen months as in the previous eighteen years. But for an organisation concerned primarily with science, budgets for supporting infrastructure, including IT, will always be under pressure.
Simon Binley, Data Centre Manager at Sanger states: “It’s hard to predict what level of data centre capacity we’ll need over the next five to ten years, but all the trends point towards the requirement for greater IT capability; the more processing power you have the greater the scientific capability. Every year, the sequencers generate data more quickly as the technology improves. IT has to keep up with that demand.”
By way of illustration of the pace of change, as part of an international effort, Sanger took 13 years to map the first human genome. Today the same task can be achieved in less than an hour. As data is gathered more quickly and in greater quantities, scientists will turn to examining previously unexplored phenomena and interactions revealed as new information is produced.
“The human body is made up of trillions of cells,” says Binley. “As we sequence those cells we will also be gathering more genomic data via a greater number of sequencing machines. Advances in today’s technology also mean we are gathering data more quickly than ever before. This requires more power availability, greater storage, faster connectivity and higher levels of local compute.”
This requirement for local processing to be physically close to the sequencing equipment where data is being generated is an archetypal example of an ‘edge computing’ deployment. “Proximity to the sequencing equipment is a primary consideration for the data centre,” continued Binley. “The bandwidth and latency requirements for the high volume and velocity of genomic data make cloud services unsuitable. As such, no other edge data centre is as important to discoveries about human life as the one at Sanger.”
Another key consideration is the DNA sequencing equipment on which the scientific effort depends, which must be protected by uninterruptible power supply (UPS) systems at all times. Downtime within this distributed IT environment would require the chemicals used in the research process to be replaced at significant cost, in addition to lost time and data. Ongoing monitoring of UPS battery health is therefore essential to ensure runtime is available, and makes a major contribution towards the Wellcome Sanger Institute avoiding outages in both its data sequencing and research efforts.
Operational efficiency is the remaining key concern for Binley and the data centre. Simply put, any money saved on IT and physical infrastructure, either in terms of capital expenditure (CapEx) or operating costs, means greater funding is available for key scientific research. Cooling, as ever, is a major point of focus when addressing operating efficiency.
The Solution - An upgrade delivered via partnership
As part of its ongoing demand for greater processing power, Sanger recently brought a fourth data hall into operation within its existing data centre facility. Comprising more than 400 racks and consuming 4MW of power, the facility is now the largest genomic research data centre in Europe.
Additionally, the DNA sequencing equipment on which the scientific effort depends are protected by individual APC by Schneider Electric Smart-UPS™ uninterruptible power supply (UPS) systems. Ensuring resiliency within this distributed IT environment is essential, ensuring that the Wellcome Sanger Institute avoids outages and any losses of sequencing or research data.
As part of the upgrade, Sanger worked directly with EfficiencyIT (EiT), a specialist in data centre design and build, and UK Elite Partner to Schneider Electric. Having worked closely with the Institute for a number of years, whilst forming a long-term, strategic partnership with Schneider Electric, EiT were able to recommend the best course of action in terms of monitoring for this data-driven and distributed IT environment, ensuring that its scientific research would continue without interruption from downtime or from unexpected outages.
EiT specified Schneider Electric EcoStruxure IT Expert™; a cloud-based Data Centre Infrastructure Management (DCIM) software platform, which enables the user to manage all of the key infrastructure assets and improve the overall efficiency of the data centre.
Installed by EiT’s software experts, the solution provides the Wellcome Sanger Institute with insight into the operation of all key infrastructure assets in the data centre, including APC by Schneider Electric NetShelter™ racks, APC rack-metered power distribution units (PDUs), Smart-UPS™ uninterruptible power supplies (UPS) and cooling equipment.
Nick Ewing, Managing Director, EiT, said: “Partnership is absolutely crucial for today’s data centre environments. By listening closely to the Wellcome Sanger Institute’s challenges and understanding the paramount importance of their work, we were able to recommend the most cost-effective and innovative solution to support their business objectives. As such, they selected EcoStruxure, which gives them greater visibility into the IT and power infrastructure, whilst ensuring they can drive both operational and energy efficiency.”
EcoStruxure IT Expert™ is highly scalable and quick to deploy and install, allowing thousands of devices to be connected and discovered in less than 30 minutes. The software is completely vendor-neutral, allowing the user to manage all of the critical infrastructure assets on their network within a single solution, leveraging Data Analytics that enable smarter real-time decision making and ensure that any unexpected issues are identified and quickly resolved.
The platform, where data from assets is stored, pooled and analysed, is highly cyber secure and GDPR compliant, allowing users to view all of their assets at any time, from anywhere and on any device. Additionally, security, software and patch updates are automatically issued via the Schneider Electric Cloud, which is hosted on Microsoft Azure to ensure customer information is kept safe and secure.
Previously, the Sanger facility had not utilised a unified management platform for its hardware assets. Such systems as were in place had grown organically as the data centre grew. According to Simon Binley: “It was perhaps natural that the focus was on the IT equipment housed within the data centre and not necessarily on the power train. The EcoStruxure solution unifies those two important data centre elements, giving us greater visibility of the entire estate.”
EcoStruxure IT Expert replaces that disjointed system with a single management console that allows a much more comprehensive and integrated view of the operation of the entire facility. “Visibility of power utilisation and assets throughout the data centre is the first step to build a complete understanding of what the facility looks like,” says Binley. “From that we can make an informed decision on how to make it more efficient. Having the ability to see and manage more than 400 racks of critical IT equipment, and all the components contained within each, gives us a significant advantage and enables a massive improvement to the operation of the data centre.”
The Benefits and Results
EcoStruxure IT Expert was chosen, says Binley, because it was considered best in class for the Wellcome Sanger Institute’s requirements after examining several competing products. It was well suited to the existing infrastructure in the data centre, most of which has been provided by Schneider Electric, but because of its standards-based open-platform architecture, it can simply and easily integrate with hardware from other vendors.
As well as improving the operation of the data centre itself, EcoStruxure IT’s management capabilities can also be extended to other distributed assets throughout the Institute and the broader Wellcome Genome Campus, including its sequencers. Several other communications equipment rooms distributed throughout the Campus are also made visible within EcoStruxure IT’s single pane of glass.
Another benefit of EcoStruxure IT is its cloud-based architecture, which enables the Institute to outsource some of its critical IT requirement to third-party colocation providers and still manage the infrastructure internally. The solution provides the institute with complete visibility of IT health, ensuring that both the data centre manager and any accompanying maintenance or service providers, such as EiT, are instantly notified via email and the EcoStruxure IT Expert™ smartphone application should a power outage occur. In this way it allows all key stakeholders to have visibility, whilst enabling rapid collaboration to help quickly resolve critical IT issues.
The payback for installing EcoStruxure IT will be seen through reduced operating costs, especially a reduction in energy consumption. “From a power and energy perspective, we expect to save between 5 and 10% over the first two years in the data centre itself,” says Binley. “We can do that by raising the room temperature in the data halls from its current setting of 19°C to 21°C. For a 4MW facility, the reduced cooling effort required represents a significant cost saving, and as the product matures, we would look to increase that even further.”
The Institute has ambitious targets for improving the Power Usage Effectiveness (PUE) rating for the data centre from between 1.6 and 1.8 where it is now, to around 1.4, which will yield even further cost savings. “Any improvement in PUE and energy efficiency automatically reduces electrical spend and allows us to repurpose investment into other areas,” says Binley. “That means more equipment, more scientists and more benefit to humanity, which is the primary focus for the Wellcome Sanger Institute.”
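The arithmetic behind those PUE targets is worth making explicit. PUE is total facility power divided by IT load power, so with the IT load held constant, the energy bill scales linearly with PUE. The sketch below uses an assumed IT load and electricity price (not Sanger’s actual figures) to show the scale of saving when PUE drops from 1.7 to 1.4:

```python
def annual_energy_cost(it_load_kw, pue, price_per_kwh=0.12):
    """Total facility energy cost per year. PUE = total power / IT power,
    so total power = IT load x PUE. The price per kWh is an assumption."""
    hours_per_year = 24 * 365
    total_kw = it_load_kw * pue
    return total_kw * hours_per_year * price_per_kwh

it_load_kw = 2500  # hypothetical IT load, not Sanger's actual figure
before = annual_energy_cost(it_load_kw, pue=1.7)
after = annual_energy_cost(it_load_kw, pue=1.4)
print(f"saving: {before - after:,.0f} per year "
      f"({(before - after) / before:.1%} of the energy bill)")
# → saving: 788,400 per year (17.6% of the energy bill)
```

The proportional saving, (1.7 − 1.4) / 1.7 ≈ 17.6%, depends only on the PUE figures, which is why even a modest PUE improvement frees substantial funds at multi-megawatt scale.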
“Kubernetes is one of the fastest growing open source software projects in history,” notes the Cloud Native Computing Foundation (CNCF) Kubernetes Project Journey Report. Maybe you’ve heard it has something to do with containers and distributed systems. Or maybe you heard Jim Zemlin, Executive Director of the Linux Foundation, recently proclaim, “Kubernetes is becoming the Linux of the cloud.”
By Paul Burt, Technical Product Marketing Engineer at NetApp and a member of SNIA.
What’s less well known is that a lot of stateful and data-centric workloads are also starting to make their way to Kubernetes. Here is a look at which workloads and databases are a good fit for Kubernetes. If you’re looking for more educational resources on Kubernetes, don’t miss the links to great content at the end of this article.
Are containers and Kubernetes becoming a better fit for stateful workloads? The answer is yes and no.
Running stateful workloads on Kubernetes has been possible for some time now. However, it might seem to an outsider that Kubernetes features that support stateful workloads are hard to find. For example, a text search for “stateful” on the Kubernetes 1.16 changelog (a recent release) returns no results. If a source change was made for StatefulSets, or “Stateful workloads” we would hope to see a result. Of course, confirming functionality in a piece of large and complex software by doing a free text search is never quite that simple.
The point here is that stateful workloads are running the same way now as they were on many prior versions of Kubernetes. Part of the reason is stateful workloads are difficult to generalize. In the language of Fred Brooks (author of The Mythical Man Month), we could say that stateful workloads are essentially complex. In other words, they must be addressed with a complex solution.
Solutions like operators (sets of design patterns) are tackling some of this complexity, and recent changes to storage components indicate progress is occurring elsewhere. A good example is in the 1.14 release (March 2019), where local persistent volumes graduated to general availability (GA). That’s a nice resolution and fix to the complaint on this blog from 2016, which said, “Currently, we recommend using StatefulSets with remote storage. Therefore, you must be ready to tolerate the performance implications of network attached storage. Even with storage optimized instances, you won’t likely realize the same performance as locally attached, solid state storage media.”
Local persistent volumes fix some of the performance concerns around running stateful workloads on Kubernetes. Has Kubernetes made progress? Undeniably. But are stateful workloads still difficult? Absolutely.
Transitioning databases and complex stateful work to Kubernetes involves a steep learning curve. Running a traditional database through your current setup typically does not require additional knowledge. Your current database just works, and you and your team already know everything you need to know to keep it going.
Running stateful applications on Kubernetes requires you to learn about init containers, persistent volumes (PVs), persistent volume claims (PVCs), storage classes, service accounts, services, pods, config maps and more. This learning requirement has remained one of the biggest challenges to running stateful workloads on Kubernetes. Moving to Kubernetes can be very rewarding, but it also comes with a significant cost, which is why cautionary tales still abound.
What databases are best suited to run on Kubernetes? Mainly those which have embraced the container revolution which has been happening over the past five years. Typically, those databases have a native clustering capability. For example, CockroachDB has great documentation available with examples showing how to set it up on Kubernetes.
On the other hand, Vitess provides clustering capabilities on top of MariaDB or MySQL to better enable these solutions to run on Kubernetes. Vitess has been accepted as an incubation project by the CNCF, so there is a lot of development expertise behind it to ensure it runs smoothly on Kubernetes.
Traditional relational databases like Postgres or single-node databases like Neo4j are fine for Kubernetes. The big caveat is that these are not designed to scale in a cloud native way. That means the responsibility is on you to understand the limits of those databases, and any services they might support. Scaling these pre-cloud solutions tends to require sharding, or other similarly tricky techniques. As long as we maintain a comfortable distance from the point where we’d need to scale a pre-cloud database with any strategy other than, “put it on a bigger machine,” we should be fine.
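“Tricky” is worth unpacking: the simplest sharding scheme hashes a record key modulo the shard count, and its weakness is that changing the shard count remaps almost every key, forcing a bulk data migration. A minimal sketch (the key format and shard counts are illustrative assumptions):

```python
import hashlib

def shard_for(key, num_shards):
    """Map a record key to a shard via a stable hash. This naive
    modulo scheme is shown only to illustrate why scaling is hard:
    changing num_shards reassigns almost every key, which is why
    production systems reach for consistent hashing instead."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % num_shards

keys = ["user:1001", "user:1002", "user:1003"]
print({k: shard_for(k, 4) for k in keys})
```

Cloud-native databases such as CockroachDB or Vitess handle this rebalancing internally; with a pre-cloud database on Kubernetes, that burden stays with you.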
The decision to run Kubernetes in the cloud vs. on-premises is usually dictated by external factors. In the cloud we tend to benefit most from the managed services or elasticity. All of the major cloud providers have database offerings which can be exposed to Kubernetes environments as a service: for example, Amazon Aurora, Google Cloud SQL, and Microsoft Azure DB. These offerings can be appropriate if you are a smaller shop without a lot of database architects.
However, regulatory requirements like GDPR, specific country requirements, or customer requirements may mandate we run on-premises. This is often where the concept of data gravity comes into effect – it is easier to move compute to your data rather than moving your data to compute. This is one of the reasons why Kubernetes is popular. Wherever your data lives, you’ll find you can bring your compute closer (with minimal modifications), thanks to the consistency provided by Kubernetes and containers.
Aside from the steep learning curve, it can seem like there are a number of other major challenges that come with Kubernetes. These challenges are often due to other design choices, like adopting microservices, infrastructure as code, etc. These approaches are shifts in perspective that change how we have to think about infrastructure.
As an example, James Lewis and Martin Fowler note how a shift toward microservices will complicate storage: “As well as decentralizing decisions about conceptual models, microservices also decentralize data storage decisions. While monolithic applications prefer a single logical database for persistent data, enterprises often prefer a single database across a range of applications - many of these decisions driven through vendors’ commercial models around licensing. Microservices prefer letting each service manage its own database, either different instances of the same database technology, or entirely different database systems - an approach called Polyglot Persistence.”
Failing to move from a single enormous database to a unique datastore per service can lead to a distributed monolith. That is, an architecture which looks superficially like microservices, but behind the scenes still contains many of the problems of a monolithic architecture.
Kubernetes and containers align well with newer cloud native technologies like microservices. It’s no surprise then, that a project to move toward microservices will often involve Kubernetes. A lot of the perceived complexity of Kubernetes actually happens to be due to these accompanying approaches. Microservices and other approaches are often paired with Kubernetes, and anytime we stumble over a rough area, it can be easy to just blame Kubernetes for the issue.
Kubernetes and these newer development methodologies are popular for a number of reasons. They’ve been proven to work at extreme scale at companies like Google and Netflix. When done right, development teams also seem more productive.
If you are a business operating at a larger scale and struggling with productivity, this probably sounds like a great solution. On the other hand, if you have yet to feel the pain of scaling, all of this Kubernetes and microservices stuff might seem a bit silly. There are a number of good reasons to avoid Kubernetes. There are also a number of great reasons to embrace it. We should be mindful that the value of Kubernetes and associated technologies is very dependent on where our businesses are on, “feeling the pain of scaling.”
Kubernetes is flexible, and if we find running projects outside of it easier, we still have that option. Kubernetes is designed so that it’s easy to mix and match with other solutions.
There are a multitude of educational resources on Kubernetes. The SNIA Cloud Storage Technologies Initiative (CSTI) has a great three-part “Kubernetes in the Cloud” webcast series, viewable on demand: Kubernetes in the Cloud (Part 1) and Kubernetes in the Cloud (Part 2).
As part of this series, the CSTI has also compiled more than 25 links to useful and informative resources during our live webcast. You can access all of them here.
The SNIA Cloud Storage Technologies Initiative (CSTI) is committed to the adoption, growth and standardization of storage in cloud infrastructures, including its data services, orchestration and management, and the promotion of portability of data in multi-cloud environments. To learn more about the CSTI’s activities and how you can join, visit snia.org/cloud.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 4.
A new benchmarking study conducted by leading futurologist Dr Ian Pearson and Jimmy McCann of digital marketing specialists Search Laboratory has revealed the future technology trends that are set to transform businesses across the globe, showing that AI, trust and data protection will play a vital role:
AI will shape the future
Artificial Intelligence will become a leading technology in the future, from helping doctors find cures for some of the world’s most resilient diseases, to training, upskilling and assisting workforces.
We will see more time consuming and repetitive tasks become streamlined by AI - from the basics, such as setting up impactful PPC or social media advertising campaigns, to the complex such as building new websites and content to feed the algorithms. This will see the role of the marketeer shift completely away from set up and administration, to focus more on higher level strategy and creative thinking: how to engage with consumers, clients and customers in a way that is truly personalised and authentic. Brain power will replace tech ability - the human mind's aptitude for thinking flexibly and responsively will be key for successful marketers.
Of course, the alleviation of admin extends to the management of resource and skills too. Take agency management for example – rather than spending time drafting briefs, checking in on deadlines and chasing down suppliers, marketers will be able to rely on AI to manage these processes for them.
AI will go from strength to strength, but it also has its drawbacks - it isn’t as simple as machines replacing humans. Future businesses will need to invest in data scientists and engineers who can build the AI and ML models to do the jobs required. Even then, huge amounts of clean data are needed for AI to function effectively. If the initial data has any bias, the algorithms that AI uses can become systemically prejudiced or inaccurate, damaging the brand and costing businesses money. Marketers need to be wary of these issues when embracing the technology.
Other emerging trends must also be considered as part of the wider marketing and business approach. Take analogue as an example. We are seeing analogue come back into fashion, and this links back to consumers’ desire to feel something tangible and to feel that products are genuine. The sweet spot will be balancing tech demands with the want and need for personalised and authentic products and services.
This desire for authenticity will see the rise of the co-bot; humans working in partnership with AI to produce quality services and products. AI will streamline processes and help with automation, whilst humans use their emotional intelligence to create meaningful products and experiences. This will see the budgets available for creativity in marketing increase, as manual tasks which would have been completed by entry-level team members are instead carried out using AI, leaving more budget for creating added value to clients and customers.
In the first instance, the greatest challenge when embracing new technologies for marketers will be the implementation of machine learning models. This will be essential to ensure that the work this new technology completes is accurate. As such there will be a huge need for data scientists and engineers to create machine learning models that are effective.
The hard work for marketers will then become the conceptual part of the job. Whilst AI will take care of defining audiences and setting up ads, marketers will be required for building creative, immersive and experiential campaigns that work across a plethora of personalised audience types. In the future the marketing objective will be to ensure a business stands out from the crowd and offers truly engaging services and products for clients.
In the legal sector, adoption of AI is primarily being driven by law firms, and alternative and emerging legal services providers, to drive efficiencies in the way they deliver services to clients. Corporates are demanding transparency and predictability of the cost of legal services, in turn putting firms under considerable economic pressure.
An area where the use of AI has become de facto in legal is e-discovery. Today, firms wouldn’t think about undertaking a litigation discovery exercise without an AI-based e-discovery tool. That aside, up until recently there has been a blurred view of AI. Some tools that were historically badged as AI - document automation, for example - in reality represent quite old technology, which is potentially why this area has, thus far, seen the most interest in AI adoption among firms.
The big opportunity today
Now with a better understanding of this technology, firms are seeing great opportunities for the adoption of AI-based technologies in areas such as due diligence, M&A and document review – and finding success. With digital transformation initiatives well underway in corporates across industries, organisations need assistance to review and analyse their entire legal contract landscape for insight that will guide future business operation. For this, they are turning to their legal services providers, who in turn are embracing AI.
Broader economic and regulatory factors, such as Brexit and LIBOR, are also driving AI adoption among firms. For instance, corporates are having to revisit employee contracts to make changes so that they accurately reflect new locations of employment, job titles and descriptions, and other such contractual agreements. Similarly, with financial institutions under pressure to transition away from LIBOR to new reference rates by the end of 2021, adoption of AI presents the only time-efficient and cost-effective way to perform this wholesale repapering and remediation exercise. Banks need to remediate hundreds of thousands of LIBOR contracts within a very short timeframe, and firms are innovatively using AI to aid such projects.
As yet an untapped opportunity
Legal is inherently a knowledge-based profession, as lawyers and consultants trade on their individual expertise. But increasingly, lawyers are being asked to become team players and be more collaborative in the way they deliver legal advice. So, while it’s imperative that firms lock down confidential information and intellectual property, they also need mechanisms to share that knowledge internally. AI technology offers firms the potential to cleverly and securely unlock that ‘knowledge’ for business and competitive advantage. As yet, only a few firms are exploring AI for true, analytical knowledge management, but this is an untapped opportunity for legal services providers to explore.
It must be said that there’s a lot of noise in the legal sector about robot judges and lawyers, chat bots for litigation, and so on. Often these arguments are inaccurate and misrepresented, which isn’t helpful. Legal services are designed and offered to help solve problems, be they business-related or personal. Lawyering is about trusted consultation and hence will always remain a ‘human’ activity. AI technology is merely empowering these advisors with data so that they have the benefit of historical insight – in the form of best practices, successes, failures, situational understanding – in order to facilitate informed decision-making in an efficient and human way.
Steve Haighway, COO Europe at IPsoft, talks about the rise of the hybrid workforce:
Many large enterprises worry that Artificial Intelligence (AI) is too complex, time-intensive and costly to implement and deliver business value. Just finding the right AI use case, some believe, is a daunting prospect, and failure to meet deployment deadlines means delayed ROI. These are understandable concerns for business decision-makers. However, many of these perceptions are outdated, and such concerns are no match for today’s AI.
As more AI use cases emerge, the hybrid workforce (human and AI collaboration) takes shape across various industries. 2020 is poised to be the year people realise AI is not going to take their jobs but will make their jobs less mundane and more satisfying. This is through the rise of Digital Employees – which represent the next great chapter in human-machine collaboration. By combining an intuitive conversational interface with back-end integrations into enterprise systems, these solutions empower all users — regardless of technical proficiency — to easily access information and services.
This trend will put pressure on enterprises to demonstrate that they’re ready to offer reskilling and new job opportunities for employees impacted by AI implementations. Employees, meanwhile, can stay ahead of any AI-enabled changes in their companies by identifying how they can adapt and change their roles and skills for the Future of Work.
Staff who traditionally spend more time improving existing processes than handling day-to-day tasks will be happy to learn they’ll also have more interesting work to do. Not every task will be able to be fully automated: Digital Employees can learn enough of a process to automate bulk requests, but they will still need to ask for human assistance where automation has taken a process as far as it can go.
Most enterprise AI deployments previously required hours of meetings and a broad team of deployment specialists to get a use case up and running. Now, the process is as simple as downloading a game onto your iPhone. You’re now able to log onto a marketplace where Digital Employees are arranged according to skills. You select the ones you need for your business and “interview” the AI-powered worker before cloud-sourcing your new employee. If the employee meets your company’s standards, the solution will self-configure and integrate with your company’s cloud-based servers and systems — meaning ROI comes in days and weeks, not months or years.
Most businesses are aware of the benefits and opportunities of AI. However, the best way for businesses to facilitate positive experiences is to clearly communicate how AI will change worker roles in the short- and long-term. By making it clear that you’re not hiring Digital Employees to remove a workforce, but rather to augment it, you’ll immediately address your workers’ biggest fears.
There is still enormous potential for businesses to reap greater value from artificial intelligence (AI). Many attempt to fit AI-based solutions into traditional organisational structures or think too narrowly about how to take maximum advantage of it. Other organisations are stuck in their comfort zone and rely on intuition and subjective decision-making based on individual experience.
There are critical changes business leaders must make to their organisations to effectively scale AI.
Some business leaders also consider AI an “unguided missile.” They are concerned that advanced analytics, for example, without human intervention will make inaccurate decisions for their business. Leaders who have based past decisions on gut instinct will likely never fully trust a machine. However, analytically minded leaders have grown up relying on data, AI and machine learning (ML) and will embrace the potential of these technologies.
With any new technology there are growing pains as kinks are worked out and teams learn how to use and then optimise the potential of the technology. There have been articles about Alexa recording home conversations and sending them to random people, for example. However, it’s highly unlikely brands and retailers will ignore the opportunity to reach people in their homes because of a few early glitches in voice technology.
Another reason apprehensions about AI exist is because many information-based workers are concerned about job security. Currently, in most businesses, there are many roles focused on the preparation of data and information for downstream, senior decision makers. The challenge and the opportunity is to transform the existing role of information workers, from preparing data and information for others to creating algorithms and automation that produces accurate and actionable recommendations for immediate utilisation, instead of just information for further interpretation.
A current joke among data scientists is: “If it’s built in Python it’s ML (machine learning); if it’s built in PowerPoint it’s AI.” Businesses have been talking a big game about industry advances through increasing automation, but the reality is that AI is still far more notional than functional. In fact, recent articles have shown that one of the most widely hyped advantages of AI, increasingly personalised and targeted advertising, barely performs better than the non-targeted kind, and that marketing spend in general confuses correlation with causation: punters didn’t buy because we gave them a coupon or showed them a specific ad; they got a coupon or saw a specific ad because they were already likely to buy.
Beyond that, we’ve seen endless stories about AI discriminating against black patients when allocating health care resources, against the disabled in screening job applicants by video interview, and against women in determining credit limits even for married couples with identical credit scores and other financial information. Nobody wants to be next in the headlines for this kind of discrimination, yet businesses and governments seem endlessly seduced by the promise of AI, despite extremely dubious rates of success and the immense cost of building and maintaining these systems in a market where data scientists command immense salaries.
So where is it working? There are some real success stories in reducing time to completion and freeing up humans from repetitive tasks in various industries and predicting failure points in complex systems. Examples include predictive maintenance for manufacturing and field equipment, predicting load balancing for complex systems like telecommunications infrastructure, improvements to transcription and translation software, and automating process flows like billing and payments, producing documents, and search and categorisation for massive amounts of paperwork. In other words, it works best in environments where there is low risk of human rights violations because it’s working with abstract concepts like inventory or infrastructure rather than making decisions about people.
As discussed in my recent position paper Stemming Sinister Tides: Sustainable Digital Ethics for Evolution, AI does have the potential to actually mitigate bias in many of the disastrous examples above, but at the moment it continues to replicate existing societal biases because that’s all we have to train it on. In a way, many of these scandals are simply pointing out in procedural terms the biases that we humans have introduced against one another. It’s up to everyone involved in designing and building AI, from business leaders to software engineers, not to accept this state of affairs and allow it to be encoded into the fabric of our digital lives, to shrug our shoulders and say that the AI is only doing what we tell it to, but instead to push for a better world for all, online and off.
Despite the hype, comparatively few businesses are reaping success from digital transformation projects. Darren Birt, Director at FHL Cloud Solutions, shares his views on why these projects are failing and why companies should carefully assess their key challenges before investing in new technologies.
With all the noise currently around digital transformation projects and new technologies many CIOs are likely to be considering their next strategic IT choices. However, despite considerable outlay on technologies that promise to digitise and optimise core business processes, we still see many organisations drowning in complexity caused by ill-advised technology investments that do little to solve their fundamental business challenges.
We’ve seen businesses experience rapid growth through expansion or acquisition with multiple reporting systems that they are yet to consolidate; or those running disparate accounting systems put in place to manage different countries’ import/export requirements or tax regimes. These organisations ultimately have little visibility of their organisation-wide operations and are therefore unable to make effective strategic decisions, or even tactical ones. And for all those businesses where visibility is obscured by rapid expansion, we see just as many whose plans are constrained by legacy business systems or outmoded processes.
The right technology choices can provide businesses with significant advantages, but before leaping in and making a significant investment, CIOs need to fully and rigorously evaluate their organisational challenges and consider where new technologies can add value.
Getting it right from the outset is essential. According to data from management consultancy McKinsey, in spite of the growing number of digital transformation projects underway in businesses across the globe, the success rate of these projects is disappointingly low: only 14% of business executives claimed that their digital transformation projects had delivered sustained performance improvements, and just 3% reported successfully sustaining business change.
The sad truth is that many businesses have been putting resource, energy and millions of pounds into new technologies − without really considering how they will resolve their business-wide operational challenges. Limited time, resource and budgets, and a failure to grasp the challenges that come with business change, are also key contributors to IT project failures.
Identifying the challenge and solution
To avoid a costly mistake, and before an expensive IT buying decision is made, CIOs need to assess how innovative solutions can help them solve common issues such as business-wide visibility on operations, scaling for rapid growth, or the eradication of inefficient, ineffective and costly processes.
CIOs or business managers are right to see technology as the answer to many of these issues, but it’s how and where it is deployed that matters. The real business value of any IT investment lies in the enhanced and optimal operation of its processes and business owners need to build a clearer understanding of exactly how their organisations operate.
The starting point for any business looking to make a change needs to be an evaluation of their current processes: consider if they are optimal and if not, how to re-engineer them for better performance.
The areas for improvement highlighted by this evaluation process immediately pinpoint where new technologies can be applied to successfully deliver significant business value. These new-found insights enable businesses to use technology and improvement processes to gain better visibility of their operations, allowing them to make informed decisions, boost responsiveness to customers and drive profitability.
The focus needs to be on getting to the root of the issues that need addressing within your business and finding a technology partner who can advise on how to solve your specific challenges such as making process efficiencies; or even opening up revenue streams in new territories.
Two examples show the benefits from organisations taking this deeper approach to identifying and evaluating their strategic challenges. One UK firm with multiple overseas subsidiaries identified the need to consolidate disparate accounting systems and invested in a new single platform; this has not only reduced resources needed for financial reporting but also delivered better visibility of operations and much improved insights to management based on data. In the second case, through a deeper examination of core business processes and its customer needs, a fast-growing business realised that replacing its current systems with a single system would enable it to rapidly set up an overseas sales operation that achieved easier compliance with local sales taxes and import controls while delivering better customer service and improved overall management information.
Without documenting core processes and building a clear picture of operational needs and areas for improvement, today’s IT innovations are no more likely to help a business achieve key goals than their current technology set up. Business owners or managers need to be wary of the hype around digital transformation and engage with experts who can help them review, document and realise true visibility of their business operations and identify where technology can be applied for improvement.
A decision on new technologies should not be taken lightly, and certainly not based on the promise of business transformation: understanding your processes and operational inefficiencies and applying technology to improve and transform these is the real key to effective decision making around IT and ensuring the investment delivers as promised.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 5.
AI-driven credit scoring – the benefits for SMEs and emerging market companies
The traditional credit scoring process is outdated, and remains reliant on a small number of accounting entries. Tradeteq’s Michael Boguslavsky believes technology is the key to achieving greater transparency and rigour in the credit scoring process, minimising the risks associated with global trade flows and most importantly, opening up access to trade finance for SMEs and emerging market companies:
Artificial intelligence (AI) is a very popular buzzword these days. Generally, it refers to the use of computers and computer aided systems to help people make decisions, or make decisions for them. It usually relies on large volumes of data or sophisticated models to help understand the best ways to make sense of all the information and draw intelligence.
In trade finance, AI is particularly helpful in analysing quantitative data; there are usually a large number of repetitive small transactions. The repetitive nature of trade finance means that there is a lot of non-traditional data at our disposal.
This means AI-driven models can be very efficient for data analysis and revealing intelligence relating to small companies that traditional trade analysis tools cannot cover.
AI-driven supply chain analysis for SMEs
Trade finance as a business is only beginning to be digitalised and inefficiencies therefore remain, but that is changing very rapidly. A group of institutions, including banks, pension funds, institutional investors, funders and trade associations, came together earlier this year to form the Trade Finance Distribution Initiative (TFD Initiative).
An important component of the Initiative is bringing new levels of transparency to a market that is opaque and paper-based. This allows institutions to identify attractive financing opportunities where it previously may not have been possible.
Beneficiaries of this include micro, small and medium-sized enterprises (MSMEs) and corporations based in emerging markets, where credit providers simply don’t have enough data, accounting or trade information to make sound decisions.
Tradeteq’s AI technology creates credit scoring models to analyse the history of a company and their transactions. It uses vast amounts of public and non-public data, including data on each company in the supply chain as well as each receivable. From this, we can create a sophisticated evidence-based, credit scoring model.
Tradeteq also allows a company to receive early warning signs when a supplier or counterparty is in distress or at risk of not fulfilling credit or trade requirements. Tradeteq’s algorithms predict the effect on each business to ensure the risk of interrupted trade flow is minimised.
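As a rough illustration only (Tradeteq’s actual models are proprietary and far more sophisticated), an early-warning signal over receivables history of the kind described can be sketched as a simple rule; the record format, thresholds and company names below are invented for the example:

```python
from statistics import mean

# Hypothetical invoice records: (counterparty, days_late, paid).
# Names and thresholds are illustrative assumptions, not Tradeteq's model.
invoices = [
    ("acme_ltd", 2, True), ("acme_ltd", 5, True), ("acme_ltd", 41, False),
    ("blue_co", 0, True), ("blue_co", 1, True), ("blue_co", 3, True),
]

def distress_signals(records, late_threshold=30, nonpayment_limit=0.2):
    """Flag counterparties whose payment behaviour suggests distress."""
    by_party = {}
    for party, days_late, paid in records:
        by_party.setdefault(party, []).append((days_late, paid))
    flags = {}
    for party, rows in by_party.items():
        avg_delay = mean(d for d, _ in rows)
        nonpayment_rate = sum(1 for _, p in rows if not p) / len(rows)
        flags[party] = (avg_delay > late_threshold
                        or nonpayment_rate > nonpayment_limit)
    return flags

print(distress_signals(invoices))  # → {'acme_ltd': True, 'blue_co': False}
```

A production system would learn such thresholds from data across the whole supply chain rather than hard-coding them, which is where the AI comes in.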
For larger international banks, this can create more trust between them and smaller corporations, including those further down the supply chain where trade finance banks typically cannot provide coverage.
Overcoming the data challenges
Even though the process seems simple, substantial challenges remain due to the availability and reliability of the data. The traditional approach to credit scoring relies heavily on company accounts and the data within them, which can be out-of-date. This is a major barrier and prevents many small companies from accessing trade finance.
A lot of financial organisations are gradually collecting their historical data and merging it with current operations to create a single source. I don’t think that this has been very successful so far, but we welcome the strong effort to put data into a reasonably uniform format.
The second challenge is the legal aspect of accessing the data once it is available and retrievable. When looking at the cross-jurisdictional legal issues, there are still things that need to be resolved in order to ensure compliance with all local and international laws.
The third challenge, which is probably the easiest to resolve, is modelling accurately once the data has been validated. This is where AI can become particularly useful.
Closing the trade finance gap for MSMEs
Once the data is available in its entirety, counterparties or funders can use AI to observe patterns in a small company’s trade and payment history, including non-payments, then look at the history of comparable companies in the supply chain to identify and assess undue risks.
As it stands, this cannot be done efficiently, at speed or at scale. AI can help funders make more confident trade predictions for companies, opening up trade finance access for many small companies that would otherwise not have had it, and reducing the trade finance gap.
How is artificial intelligence driving innovation in financial research? ask Rowland Park and Simon Gregory, Co-Founders of Limeglass:
Despite innovation in many sectors of finance, the development of financial research has lagged far behind other areas. This seems counterintuitive when we consider that research forms the bedrock of any corporate or trading decision-making and has a huge impact on a company’s performance and profitability.
Yet many market participants rely on outdated processes such as scrolling through an email inbox or using 'Control+F’ in documents to try and find relevant paragraphs. The inefficiency of these methods is exacerbated by the sheer volume of reports that financial institutions create and receive every day. The result is information overload, with key pieces of information missed, and ultimately, potential loss of opportunity in the market.
How can AI maximise financial research?
In order to realise the value in research assets, research producers must have a way of zeroing in on the relevant details in the library of documents. In the traditional provision of text documents, this is not possible, and therefore the reports are of limited value.
Artificial intelligence (AI) offers a solution for the publishers of financial research, and consequently for their clients who read and use the reports.
An AI platform can scan documents quickly, tagging words and phrases, and identify paragraphs for cross-referencing. Rich Natural Language Processing complements this by analysing text to identify where synonyms are being used and can draw connections between associated topics or phrases. This can then be organised into an asset specific taxonomy, enabling users to very quickly access the right level of detail on any given topic in their entire library of research.
For the best results, the technology should be expanded upon with human expertise. When it comes to identifying what subjects and terms are linked together, an algorithm can produce a statistical answer, but it takes human expertise to be able to add the additional layer of subtlety which makes a technology like this as incisive as possible.
This means human experts should be involved in initially defining the taxonomy of tagged terms, while also having regular input to grow and develop the taxonomy as additional market issues emerge. New topics appear constantly, each with a raft of phrases and associated synonyms. For example, ‘coronavirus’, the general term for a viral infection, has been given the specific name ‘COVID-19’ for the global outbreak which began in 2019, and this in turn is written in many variant forms.
In any scenario, there may be multiple synonyms that need to be linked in a taxonomy for which a research team provides invaluable knowledge.
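As a rough sketch of how taxonomy-driven tagging with synonym handling might work (the topics, synonyms and substring matching here are invented for illustration, not Limeglass’s implementation):

```python
# Illustrative taxonomy mapping canonical topics to synonym sets.
# Both the topics and the synonyms are invented examples.
TAXONOMY = {
    "covid-19": {"covid-19", "coronavirus", "covid19"},
    "libor transition": {"libor", "risk-free rates", "sonia"},
}

def tag_paragraph(text, taxonomy=TAXONOMY):
    """Return the canonical topics whose synonyms appear in the text."""
    lowered = text.lower()
    return {topic for topic, synonyms in taxonomy.items()
            if any(term in lowered for term in synonyms)}

para = "Markets weighed the Coronavirus outbreak against SONIA adoption."
print(tag_paragraph(para))
```

Tagged paragraphs can then be indexed by topic, so a reader searching the research library for ‘COVID-19’ also surfaces paragraphs that only ever say ‘coronavirus’.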
Personalisation through AI
For financial research creators, the customisation of reports by surfacing key paragraphs and providing relevant information will lead to a better service to clients. The ability to break down a report into its component parts, via AI, enables the analyst to provide not only specific information that the reader is interested in, but also other paragraphs with associated information.
This personalises research with both a macro view of the topic as well as a granular level of detail of specific issues. Moreover, such personalisation means that the recipient no longer has to wade through a mass of long documents to find what they want or risk missing out on key pieces of information.
Following the recent resurgence of the market and the development of advanced AI solutions, organisations are faced with a unique opportunity to reimagine the way they manage their physical assets.
However, despite this revolution, it seems that many major oil and gas companies are not making the most out of these new advances in AI, nor the multiple sources of data available to them across their operations.
Although organisations often collect vast amounts of data, it’s clear that in the oil and gas industry at least, many struggle to make use of it. To put this into context, one industry study found that less than 1% of the data generated by 30,000 sensors on an offshore oil rig was actually used for decision-making.
Our own research revealed that 71% of oil and gas asset managers still rely on just a single source of data to analyse their asset performance and risk management. This is a missed opportunity to maximise production, reduce waste, eliminate unnecessary downtime, and reduce the risk of a containment or safety incident.
The way companies with big infrastructure (e.g. chemicals, oil and gas, power, transport) typically inspect their assets to avoid failures and breakdowns has, until recently, followed a time-based approach. However, thanks to AI, 3D digital twins and machine learning, actionable insights can now be derived from multiple sources of data with speed.
Using AI in this way has been proven to lead to cost savings of 10% to 40%, yet only 18% of companies have adopted this approach according to those we surveyed. The slow pace of adoption, in spite of the efficiencies these innovations bring, sits somewhat at odds with the commonly held view that industry needs to pursue operational excellence to maintain growth.
If organisations can begin to draw on multiple sources of data, they’ll be able to derive previously unprecedented levels of insight from their assets. In time this would allow businesses to reinvigorate their maintenance approach, improving productivity and uptime.
While the majority of organisations are yet to transform their asset performance management systems, interest in emerging technologies is growing rapidly. Our research revealed the greatest interest is in big data (25%) and AI and machine learning (23%), while many are also excited by the possibility of 3D digital twins (19%).
Anecdotally, however, many organisations are reluctant to adopt new systems due to the required training and skill sets they can demand. Yet, there are now platforms available, including Lloyd’s Register’s AllAssets, that have been built with ease of implementation in mind, allowing oil and gas companies to rapidly modernise their approach to managing physical assets.
Advanced technology solutions are giving organisations a unique opportunity to reimagine their asset environment. It’s our view that it is time for organisations to move on from their legacy systems and enjoy a new age of operational success.
The gap between the realisation of value from deployments of AI based technologies and the hype around what today’s AI’s can achieve, is still very discernible across both the public and private sectors, according to Edward Charvet, Chief Strategy Officer, Logicalis:
At Logicalis we wanted to understand the extent to which AI is being used by businesses across the globe, how significant this value gap currently is and how fast it is closing. Our 2019 Global CIO study questioned over 800 CIOs from around the world on the extent to which they see potential in this technology, if they are embracing it today and if so, are they realising benefits as a result.
According to our survey, 41% of CIOs have deployed an AI based technology solution in their business. This number has doubled since our survey just one year ago. This supports the wider market commentary around AI deployment and underwrites the inroads that AI based technologies are making in aligning to business needs within the enterprise. The survey goes on to reinforce that this trend will continue, as 60% of respondents believe AI is going to be one of the technologies to have a significant impact on their businesses over the next two years.
And yet today the value gap is clear, as the results of our survey show. Innovative technology leaders remain vocal about the potential of AI. Many embrace the importance of investing ahead of the value realisation curve, to ensure the maximisation of the efficiency advantages on offer. Yet it would be naïve to ignore that a high proportion of respondents are struggling to realise the business benefits that they were promised. In our survey, almost half (47%) of respondents who have deployed an AI-based technology say they are still waiting to see significant value across different business departments; this is material.
Students of the history of AI development will recognise the value gap as a precursor to the AI winters of the past. However, Logicalis believes that some key metrics that sit around the current vocalisation of the value gap, indicate that the likelihood of an AI fall from grace is small.
The first significant point is that where AI deployments are working, they are working well. Many forget that today’s AIs are narrowly defined, goal-centric applications and, by definition, if the context of the requirement is not extremely well defined, then the potential to deliver value will be limited. Logicalis has helped clients realise significant value within pattern-rich data environments, such as in security managed services, through the judicious application of AI-based technology solutions supporting an improved understanding of threat behaviours.
Secondly, the AIs of today are carving out a reputation for success within the field of optimisation, where the complexity of the business need is so significant that only an elegantly defined, goal-centric AI can offer any form of solution. Multisite distribution and collection services, such as waste bin collection or home delivery, are not only a reality today as a result of AI, but the efficiency of the solutions is improving all the time as the underlying neural networks evolve in response to the growing volume of data inputs.
Combine these realities with the continued progress made around compute capacity and data processing speeds, and every business embracing AI today can have both the confidence of knowing that value extraction at a material level is achievable and the base case for deployment, the context and deployment rationale, is rapidly becoming clearer. This is allowing businesses to specify needs that can be addressed by AI with ever more assurance.
The survey results imply that organisations aren’t yet aligning a tight requirements specification to the practical constraints of today’s AI-based technologies, so that the business benefits show themselves quickly. This is the true gap that underpins the perception of a ‘lack of value’. As Architects of Change™, Logicalis sees its role as the trusted business and technology advisory partner to its clients, ensuring this foundational understanding is in place before our clients move forward with an AI-based technology solution.
As cloud computing continues to grow, cloud management systems keep on generating huge amounts of data. To provide the best possible customer experience and to manage cloud costs effectively, in-depth analysis of this data needs to take place. Yet, clear insights cannot be drawn from these huge volumes of data using traditional rule-based systems. This is where AI is being utilised to analyse these enormous data sets in order to help enterprises get valuable and in-depth insights about their systems.
Used in this way, AI facilitates better alerting, stronger monitoring of availability and greater identification of the root cause of failure events. AI systems can predict outages, help provide proactive infrastructure management, and ensure better service availability.
Another area where AI is having an impact is in IT service desks. Unlike cloud management where the bulk of data is machine generated, most service desk data is generated by people. The manual processing of this data can slow down productivity, especially when it comes to responding to service desk tickets. Enterprises are benefitting from a powerful form of AI called natural language processing (NLP). When used in chatbots, NLP is boosting service desk productivity and response times.
NLP has the ability to recognise and understand the context of each service desk ticket as they are submitted. It can then automatically assign the ticket to the most suitable technician for resolving the ticket based on similar interactions that have taken place. This not only accelerates ticket response and resolution times but makes it easier for service desk agents who can focus their energies on other tasks.
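A minimal sketch of similarity-based ticket routing, assuming a small history of resolved tickets (the ticket texts, team names and bag-of-words matching are invented stand-ins for a production NLP model):

```python
from collections import Counter
from math import sqrt

# Hypothetical history of resolved tickets: (ticket text, assignee).
history = [
    ("cannot connect to vpn from home office", "networking_team"),
    ("vpn tunnel drops every hour", "networking_team"),
    ("laptop screen flickering after update", "hardware_team"),
    ("replace broken keyboard on laptop", "hardware_team"),
]

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def route_ticket(text, past=history):
    """Assign a new ticket to whoever resolved the most similar past one."""
    query = Counter(text.lower().split())
    best = max(past, key=lambda item: cosine(query, Counter(item[0].split())))
    return best[1]

print(route_ticket("vpn keeps disconnecting"))  # → networking_team
```

Real service desk NLP goes well beyond word overlap, of course, modelling intent and context, but the routing principle is the same: match a new ticket against how similar tickets were resolved before.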
As AI becomes increasingly important across different aspects of managing IT infrastructure, enterprises must work to adapt to the natural changes that occur when implementing any new technology. While AI is enabling decision automation across the stack, it is important that businesses ensure their existing process hierarchy also factors in the decisions reached from the AI system.
Decision-making has traditionally been a question of yes or no for enterprises. Yet AI has introduced an element of probability to decision-making processes. Today, many enterprise monitoring systems issue alerts when there is trouble with a particular server, whereas an AI-powered monitoring system might advise that there is a 60 percent chance a particular service will fail in the next hour. The impact of these advisories can be factored into the organisation's IT workflows.
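As a sketch of how such a probabilistic advisory could be folded into a traditional yes/no workflow, the probability can be mapped onto tiered actions. The thresholds and actions below are illustrative assumptions, not industry standards:

```python
def advise(failure_probability, horizon_minutes=60):
    # Map a probabilistic prediction onto discrete workflow actions.
    # Threshold values here are purely illustrative.
    if failure_probability >= 0.8:
        action = "page on-call engineer and begin failover"
    elif failure_probability >= 0.5:
        action = "raise a ticket and drain traffic pre-emptively"
    elif failure_probability >= 0.2:
        action = "watch: increase monitoring frequency"
    else:
        action = "no action"
    return f"{failure_probability:.0%} chance of failure within {horizon_minutes} min: {action}"

print(advise(0.6))  # a 60% advisory triggers a pre-emptive ticket
```

The point is that the organisation's process hierarchy must define, in advance, what each probability band means operationally, rather than treating every alert as binary.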
Ensuring successful adoption of AI across the enterprise comes down to designing and modifying existing hierarchies to make them flexible enough to accommodate probabilistic decisions across various workflows in the organisation. Enterprises adopting AI will need to be ready to quickly adjust existing workflows in order to reap its benefits.
Why data migration costs time and budget for three quarters of software implementations.
By Neil Martin, commercial director at Qbase.
To the average person in the street, data migration may well seem like an irrelevant cog in the corporate world; something they don’t need to know about, let alone concern themselves with. That is, until that seemingly meaningless piece of IT jargon starts to cause personal pain. That’s exactly what happened to TSB customers in 2018, when a massive-scale migration of customer data from an old system to a new one went awry. This initially gave customers access to other people’s confidential account information, then it resulted in them being locked out of their accounts for days, and in some cases weeks, while the problem was fixed.
Of course, the pain felt by those individual customers is just one page in a catalogue of errors, but the business consequences for TSB have been much greater. Immediately afterwards, the regulators came calling, staff were put under immense pressure, and the negative brand impact across social media left a lasting bad taste. Since the failed data migration, the bank has witnessed management changes, a legal challenge, further IT issues and, in December 2019, just 18 months after the event, it announced 82 branch closures. Whether these more recent challenges are related to the very public data error or not, it is fair to assume it will have been a contributory factor. According to Sky News, the debacle “sparked hundreds of thousands of customer complaints and led to TSB reporting a pre-tax loss of more than £105m last year.”
Large corporates are challenged by their need to move data from one system to another more than ever before due to legacy system upgrades or mergers, as in the case of TSB for example. This is exacerbated by the fact that technology adoption is advancing exponentially, requiring critical data to be moved more frequently. The stakes are rising and so are the risks. And, it’s often the crucial job of getting data from A to B where errors occur.
Looking at all the data migration projects Qbase has been engaged in over the past five years, in 63% of cases we were brought in due to a failed implementation. What this suggests is that roughly three out of four software implementations fail to plan properly for data migration and suffer as a result. An interesting correlation is that, according to Merkle Group, 63% of all CRM platform adoptions fail at the first attempt.
Data migration is a strategic consideration for any software implementation and should be part of the initial planning process. There are four main reasons why things go wrong.
Leaving it in the hands of the SI
We find that around four out of every five new software implementations involve an external Systems Integrator (or Solution Implementor). And, in most cases, the data migration process has been allocated to the list of SI responsibilities. While SIs are platform experts, they often lack the knowledge to successfully manage a data migration. In situations where an SI will be handling the migration, it’s crucial to ensure their methodology is completely watertight and that data isn’t treated as something that should merely be uploaded to see what happens.
Inadequate project scoping
Poor scoping crops up time and time again as the reason behind failed projects. In our experience it is usually a case of the requirements not being thought through and laid out in full, or misinterpretation of what those requirements mean from an implementation perspective. The crux of the issue is usually that either initial analysis of the situation, or the process of engaging stakeholders properly in order to get the facts straight from the beginning, has failed.
Setting unrealistic expectations
Effective data migration can take the same amount of time as the actual platform development yet, in many cases, it’s viewed as a quick switch. Anyone who says otherwise is either trying to grease the wheels of a deal or doesn’t know what migration involves. Third-party consultants and internal IT teams need to set expectations with the project sponsors and the board. It may make for a longer implementation timescale, but it’s likely that this time will be far shorter than the one faced if the project fails and has to be put back on the rails.
Failure to overcome resistance
Resistance is an inevitable part of change, especially when people believe the change is not in their best interest. Unless those resistors are tackled, the ability to migrate data effectively will always be flawed. The best way to illustrate this is through example. The last failed migration Qbase was brought in to rescue was floundering because the SI had let the client dictate that their new platform should follow the same processes as the old one they had used for years. They wanted the benefits of the new system but with the comfort blanket of the processes they’d spent years fine-tuning. This led to the new application being forced to carry out the old processes, compromising the data model and leading to the patching of data and processes. Resistance must be addressed at the scoping stage.
As far as data migration is concerned, the failure to plan is tantamount to planning to fail. TSB is just one high profile casualty of this, but experience shows that around three quarters of all IT projects have issues resulting from poorly planned and handled migration. Organisations need to balance their focus on the platform and the data.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 6.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date?
Peter Gray, Senior Vice President, Advanced Technology Group: Sport – NTT Ltd – offers the following thoughts:
Artificial Intelligence (AI) is already embedded in the business world, just as it is already embedded within our daily lives. Different people choose to define AI in different ways, but broadly we can consider AI as machines acting with some form of intelligence. Capabilities such as machine learning, natural language processing, image recognition, optimisation, problem solving etc. are all forms of AI that we can see commonly in action today as we use our smartphones and an increasingly connected digital ecosystem. They guide us to our next destination, recommend a song we might enjoy, or use facial recognition to grant us access to our phone or computer.
AI in the enterprise is similarly becoming embedded within business systems, processes and productivity applications. Often this will not be in the form of a project to “implement AI”, but through the continual process of business improvement across all areas of the business, or driven by a need to transform the business in order to adapt and compete in a rapidly changing world. Some of the key areas AI is playing a defining role in the enterprise today include:
Together with ASO, the organisers of the Tour de France, NTT has been working to revolutionise the fan experience and engage an increasingly digital audience. Using AI together with IoT, we are able to make race predictions, provide live data-driven insights, and provide services to fans by the roadside, such as the estimated time of arrival of the peloton. All of these contribute to the ability of ASO to provide new digital services to their fans.
Another great example of AI in action is in the rapidly evolving space of cyber security. Using AI filtering through vast quantities of data we are able to assess and respond to potential cyber threats in real time, something that humans simply could not do.
Are the majority of companies comfortable with the technology and how it can be used, and already using it? Or, are many still unclear as to exactly what it is and what it offers to the business?
I believe almost every enterprise is now using some form of AI in its day-to-day business operations as it becomes embedded within the tools and technologies that we all use daily. The vast majority of organisations would also recognise the importance of AI and its impact on their future business strategies; however, a much smaller, though growing, number of organisations are actively working to develop strategic AI capabilities.
There are many challenges in developing working AI solutions. Access to the required skills and expertise is a key barrier. It is a complex field that requires a multi-skilled, cross-functional team with the right business, technology and data science skills, some of which are difficult to find in the market.
Availability and access to the required data can also be a challenge. Organisations that have a clear data strategy and solid data platforms and pipelines will be much better positioned to deliver business benefits with AI, allowing it to be fully integrated into business systems and processes. The biggest challenge for most organisations is in developing the culture of using AI and the insights it can produce to actually transform the business. Businesses are used to using financial data to drive strategic decisions, but are not yet used to using operational, organisational and customer data to rethink how their business operates. Entrenched business models, organisational hierarchies and business processes can be hard to change.
We would argue that it is important to get started on your AI journey, building skills and experience, and testing new ideas. Increasingly, technology vendors, including the major cloud platforms, are providing tools and cloud services that make this process simpler. Select real but achievable use cases, and start with the simpler supervised learning techniques that are easier to evaluate and explain before moving on to more complex techniques such as unsupervised learning.
There is a lot of hype and discussion surrounding how artificial intelligence (AI) is changing the business world. In fact, we’ve only just begun to scratch the surface of what AI can deliver across fields, including governance, risk, and compliance (GRC).
Today, organisations are beginning to explore how AI can help them anticipate and prevent financial fraud; or, simplify compliance by automatically condensing lengthy and complex regulatory tomes into short, actionable insights; or, minimize the time taken to respond to a security threat or breach. According to Capgemini research, 69 per cent of organisations believe they will not be able to respond to security threats without AI.
One of the exciting potential use cases of AI in GRC is the implementation of chatbots in the front line. These conversational interfaces can help business users flag observations on potential risks, anomalies, and deviations. Through a casual conversation with the user, the bots can capture information on control weaknesses or gaps, as well as deficiencies in internal processes. These insights can then be automatically routed to the risk and compliance functions in the second line of defence for further analysis and investigation. It’s a simple, yet effective way of enabling the front line to actively participate in risk management without needing to undergo intensive training on risk tools and terminologies.
Next on the horizon for those working within GRC will be how to govern the use of AI bots – in other words, enable ‘GRC for AI’. This idea of ensuring responsible, ethical AI is steadily gaining ground as organisations and governments recognise the importance of building technology that serves humanity and not the other way around.
AI can be easily misused or overused, even when the initial intention is good. For example, flawed facial recognition, deepfake voice attacks, gender-skewed credit, and biased recruitment tools are just some of the many AI-related risks that we are seeing today.
One of the reasons for these risks is that often, the people creating the AI algorithms are far away from the boardrooms. So, the board and leadership team may be defining their risk appetites around AI very clearly, but if this information isn’t filtered down properly to the front lines through the appropriate checks and controls, that’s when AI risks such as creator biases become an issue.
It is critical to have a well-coordinated approach to AI governance that includes training employees on the risks of new technologies. While businesses are currently exploring the countless possibilities of AI and big data, they must also ensure that the development and deployment of algorithms, data and AI are based on an ethical and considered approach.
The opportunity to automate manual tasks and gain new insights to support business decision-making is set to have a hugely positive impact on organisations. These newly formed insights will play a crucial role in changing the clock speed of business and freeing up workforces to focus on higher value activities.
While many organisations are still in the early adoption phase, we’re already seeing some impressive results. In the insurance industry, for example, one company is now using AI capabilities to discern which of its customers are most likely to renew their premiums, before automatically contacting them via SMS. This company has witnessed a 60 percent engagement rate and 30 percent renewal rate from that text message bot campaign alone.
Despite these promising early successes, there are challenges to overcome before AI reaches its full potential. According to research conducted by CompTIA, only 29% of businesses are regularly using AI. This low rate of deployment indicates companies are struggling to get to grips with AI, and whilst there are many ethical considerations, connectivity challenges are also undoubtedly a factor. Many organisations approach AI with the unrealistic expectation that they’ll be able to just plug it in and it will begin delivering the desired benefits. However, the siloed data stores and lack of integration between enterprise applications that exist in many organisations severely hinder AI’s ability to influence the digital ecosystem around it, rendering it an expensive yet fundamentally limited tool.
Getting the most value out of AI is about more than just knowing the right questions to ask, it’s about being able to connect AI engines to the right sources of data. To do so, organisations need a much more fluid approach to connectivity; one that allows them to decouple very complex systems and turn their technology components into flexible building blocks. They ultimately need to become more composable, so that AI can be plugged in and out of any data source or capability that can provide or consume the intelligence it creates.
This can best be achieved with an API strategy orchestrated in an application network, which allows organisations to easily connect any application, data source or device via a central nervous system of sorts, where data can flow freely. This central nervous system, or application network, is how the ‘brain’ of AI can plug into a business’ digital ecosystem to consume its data and then provide valuable insights and actions.
UK enterprises are gradually getting to grips with the technology and are experimenting with various forms of AI, according to Chris Greenwood, Managing Director, UK & Ireland at NetApp:
A study by the Aberdeen Group found that users of predictive analytics (a form of ML-driven propensity modelling) are twice as likely to identify high-value customers and market a product or service to them successfully. This adds value and enables businesses to find, engage and convert new customers at a faster rate with the constantly evolving tools that AI offers.
The AI-embedded devices with which businesses are leading—virtual assistants, banking biometrics and self-driving cars—are only getting smarter. But with a combination of a lack of available talent and short-term, siloed strategy, truly transformational AI use cases have so far been limited.
To be considered as AI-ready, businesses should demonstrate they can confidently tap into growing data sources with virtually unlimited, non-disruptive scalability and bandwidth to feed, train, and operate data hungry machine learning and deep learning applications. What’s more, while HPC AI systems typically operate locally on-premises, the supporting infrastructure needed to access potentially disparate and far-afield data nodes must also be fit for purpose.
Enterprises must not only have a data fabric strategy that allows these to come together without friction, but also a tailored, hybrid system architecture where each solution is carefully selected for each use case. Successfully establishing ourselves as leaders in AI in the long term would see the UK become a hugely attractive investment prospect in a highly fraught trade environment. Effective artificial intelligence could boost productivity, create altogether new, disruptive services, and empower UK businesses, post-Brexit, to better compete at a global scale.
“Nuxeo recently launched research with UK Financial Services (FS) workers which included questions about the value and impact of artificial intelligence. Although two-thirds said AI could transform the sector and more than half stated that the creative use of AI made organizations more attractive employers, a full 58% also said their firms lack sufficient AI talent and expertise.”
“There are very similar challenges with AI in the wider business world. Despite appreciating the potential of advanced technologies like AI - and in some cases already using them in small pockets of the organisation - businesses are generally not well positioned to fully leverage all of the benefits AI can deliver.”
“For all the hype around AI, and the general excitement about its potential, it has arguably added the most value so far to businesses in perhaps less glamorous, but still highly valuable, ways. Some of the most widely used AI applications tend to centre around improving search outcomes and automating routine back-office processes. Using AI for such purposes is not a bad thing – it adds value, addresses specific operational pain points and helps improve productivity by freeing up time spent on back-office tasks to spend more productively elsewhere.”
“But AI can deliver much more than this. Our survey revealed that transforming customer service delivery is a key focus for AI ambitions, and AI can also be used to extract insight from customer data, information that can be used to deepen customer understanding, mitigate business risk, enhance the customer experience and, ultimately, drive digital transformation.”
“In this era of content and data, AI can play a key role in delivering more value from both existing and new information. Given that each organisation’s content and data requirements are unique, AI can predict, classify and enrich content to surface the information that is most important to a business. Beyond utilizing common, but generic AI services, most organisations will benefit from developing their own, custom and business-specific AI models that will better serve their unique business needs and data requirements.”
“13% of our research respondents believe their organisation’s inability to adopt AI quickly enough is one of the main challenges it faces in 2020, so it’s something that needs to be addressed sooner rather than later. People can also be resistant to change so it’s important not to jump in feet first with AI. A gradual migration path from the ‘old’ world to the ‘new’ is the best way forward: a managed modernisation journey, which helps address the most acute pain points and delivers quick wins from AI, without incurring new risk or disrupting the organisation with too rapid a change. AI can be genuinely transformative, but a thoughtful approach to modernisation will produce better results over time.”
With Data Centre World around the corner, we speak to Riello UPS Managing Director Leo Craig about what promises to be an even busier show than normal for the uninterruptible power supply manufacturer.
This year’s Data Centre World marks a big moment for Riello UPS?
Well, Data Centre World is always our biggest and best show of the year, but 2020 is shaping up to be that little bit extra special. The event coincides with the official UK launch of our new and improved Multi Power modular range.
We’ve expanded the series with two new power modules designed to meet the needs of smaller data centres. They’re both 2U in height and offer 15 kW or 25 kW of power. These new modules complement the 42 kW module that’s proved one of our best-selling products in recent years.
And to house these two new power modules, we’ve also introduced a dedicated new cabinet (PWC X) ideal for space-restricted installations. It measures 60 cm wide and 120 cm high, compared to the 2-metre tall chassis we use with the 42 kW modules.
What’s the thinking behind these new power modules and a smaller cabinet?
Modular UPS systems are increasingly the go-to option for high-end data centres thanks to their combination of performance, scalability and efficiency.
Our extension to the Multi Power range enables smaller data centres and similar mission-critical applications to enjoy these benefits too.
The new cabinet holds up to five of the modules. That’s up to 60 kW N+1 if you’re using the 15 kW modules, or 100 kW N+1 for the 25 kW versions, in a single frame. You can also parallel up to four cabinets together for extra power.
There’s still the 42 kW modules for higher-power installations. The standard Multi Power cabinet can take seven modules, so that’s 252 kW N+1. And again, parallel up to four cabinets to deliver maximum power of 1,008 kW N+1 in a single UPS system.
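The capacity figures quoted above follow a simple rule. As an illustrative sketch (assuming, consistent with the numbers given, that each cabinet retains one redundant module for N+1):

```python
def usable_capacity_kw(module_kw, modules_per_cabinet, cabinets=1):
    # N+1 redundancy per cabinet: one module in each cabinet is held in
    # reserve, so usable power is (modules - 1) * rating, per cabinet.
    return (modules_per_cabinet - 1) * module_kw * cabinets

print(usable_capacity_kw(15, 5))     # → 60   (five 15 kW modules, N+1)
print(usable_capacity_kw(25, 5))     # → 100  (five 25 kW modules, N+1)
print(usable_capacity_kw(42, 7))     # → 252  (standard Multi Power cabinet)
print(usable_capacity_kw(42, 7, 4))  # → 1008 (four paralleled cabinets)
```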
Efficiency and reducing energy consumption are always hot topics – does the new Multi Power tackle this?
One of the major benefits of modular UPS is the inherent scalability they offer data centre operators. This has the knock-on benefit of cutting energy waste.
Not too many years ago, UPSs tended to be oversized at initial installation to allow for future expansion, which meant they often ran inefficiently on low loads.
Modular systems enable the UPS to closely mirror the load requirements at installation. Then when data centres need more power or battery autonomy, operators can simply plug in extra power modules or add cabinets in parallel.
With the potential for more than 1 MW of power plus redundancy in a single modular UPS system, such “pay as you grow” scalability future-proofs your power protection needs without wasting floor space and electricity, while it’ll also cut down on unnecessary air conditioning costs.
Thanks to its high-performing, specially designed components, the Multi Power delivers 96.5% operating efficiency in double conversion online mode. But it also features a special mode that provides high efficiency (up to 95%) even when running loads as low as 20-25%.
Energy Saving Mode sees the UPS run in online mode with the inverter continuing to support the load, so protection isn’t compromised. But depending on the load level, the Multi Power’s microprocessors only activate the necessary number of modules to power the load.
All the other power modules are idle with the inverter closed and charger switched off. This saves unnecessary power use and maximises system efficiency.
Inactive power modules can remain in this energy-saving status for a maximum of 15 hours. Then each one swaps with one of the active modules. This makes sure the power modules share the load reasonably equally, which ensures components age at a similar rate.
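The module-activation and rotation logic described above can be sketched roughly as follows. This is a simplified illustration assuming a 42 kW module rating and one redundant module; the actual Multi Power firmware is of course more sophisticated:

```python
import math

MODULE_KW = 42        # rating of each power module (illustrative)
MAX_IDLE_HOURS = 15   # a module may not stay idle longer than this

def modules_needed(load_kw, module_kw=MODULE_KW, redundancy=1):
    # Activate only enough modules to carry the load, plus N+1 redundancy;
    # the rest idle with inverter closed and charger off to save energy.
    return math.ceil(load_kw / module_kw) + redundancy

def rotate(active, idle, idle_hours):
    # Once a module has idled for MAX_IDLE_HOURS, swap the longest-idle
    # module with an active one so all modules age at a similar rate.
    if idle and idle_hours >= MAX_IDLE_HOURS:
        active, idle = active[1:] + [idle[0]], idle[1:] + [active[0]]
    return active, idle

print(modules_needed(60))  # → 3 (two modules to carry 60 kW, plus one redundant)
active, idle = rotate(["M1", "M2"], ["M3", "M4"], idle_hours=15)
print(active, idle)        # M3 joins the active set, M1 takes a rest
```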
Data centres can be large-scale power users. Does the industry need to do more to cut its power consumption?
It would be unfair to accuse data centres of not taking their environmental responsibilities seriously. The industry's made great efforts to improve the efficiency of cooling, air conditioning and power systems, to name just three.
But the demand for data centre processing is constantly growing. All that processing and storage doesn’t come for free. It needs lots of electricity.
There’s also a growing feeling that the much-vaunted benefits of Moore’s Law are starting to level out. So much so that a recent Uptime Institute report reveals nearly two-thirds of data centre IT power (65%) carries out just 7% of the actual processing. This is because of ageing equipment and efficiency improvements plateauing.
We also can’t afford to ignore another major trend. The way our electricity network generates power is going through rapid change. 2019 was the first year where more electricity came from zero-carbon sources like renewables than fossil fuels such as coal and gas (48.5% versus 43%).
With challenging net-zero targets to meet, this development is surely irreversible. But such reliance on unpredictable wind and solar means National Grid has to be smarter in the way it balances supply with demand and maintains a stable frequency.
Demand side response (DSR), which incentivises electricity users to store power and shift usage from busy to off-peak periods, will become increasingly influential.
We believe this provides data centre operators, many of whom currently have annual electricity bills upwards of £1 million, with an opportunity to rethink the role of their backup power.
All data centres deploy uninterruptible power supplies to protect against downtime and ensure a constant flow of clean electricity.
But in reality, how often are they really needed? More often than not, a UPS can be an essential but ultimately underutilised and expensive asset.
That’s why we’re working on an exciting pilot project that turns a UPS into a ‘Virtual Power Plant’ that takes part in demand side response without impacting on overall resilience.
Can you explain more about this ‘Virtual Power Plant’ concept?
We’re working in partnership with the energy trading arm of RWE, the biggest electricity supplier in Germany. Our Master+ solution involves one of our UPSs fitted with a special rectifier to enable bidirectional flow of electricity to and from the network.
This smart grid-ready UPS system is backed by premium lead-acid or lithium-ion batteries, along with RWE’s intuitive battery monitoring and communications software. This is crucial as it allows for real-time analysis and two-way communication with the grid.
We divide the UPS’s battery capacity into two distinct roles. There’s a part that purely provides emergency standby power in case of any disruption to the mains. But there’s also a ‘commercial’ section which stores electricity and feeds into the DSR scheme Firm Frequency Response, which helps maintain grid frequency within the required safe range of 49.5-50.5 Hz.
So you get the traditional safety net that a UPS offers plus the opportunity to reduce network charges and sell any surplus energy back to the grid.
If there is a power cut, any energy still stored in the ‘commercial’ part of the batteries automatically tops up the main backup to lengthen runtime.
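A simplified sketch of how the two battery sections might behave is shown below. The capacity split and control thresholds are illustrative assumptions, not the actual Master+ implementation:

```python
RESERVE_KWH = 400     # emergency standby portion (illustrative split)
COMMERCIAL_KWH = 600  # portion traded into Firm Frequency Response

def ffr_action(grid_hz, commercial_kwh):
    # Firm Frequency Response, simplified: discharge when frequency sags
    # below the safe band, charge when it rises above, hold inside 49.5-50.5 Hz.
    if grid_hz < 49.5 and commercial_kwh > 0:
        return "discharge to grid"
    if grid_hz > 50.5:
        return "charge from grid"
    return "hold"

def on_mains_failure(reserve_kwh, commercial_kwh):
    # On a power cut, any energy left in the commercial section
    # automatically tops up the emergency reserve to lengthen runtime.
    return reserve_kwh + commercial_kwh, 0

print(ffr_action(49.3, COMMERCIAL_KWH))    # → discharge to grid
print(on_mains_failure(RESERVE_KWH, 250))  # → (650, 0)
```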
But doesn’t using batteries in this way put the data centre at risk?
That’s always been the main stumbling block to more data centres taking part in DSR. When uptime is the number one priority, you can understand why operators are reluctant to use the ultimate insurance policy of their UPS and batteries for something that isn’t its primary role.
That UPS could be all that’s standing between your data centre going offline or not.
But unless you’ve installed hugely expensive battery monitoring and management systems, it’s really difficult to assess the condition of traditional sealed lead-acid batteries effectively. Hand on heart, can you be 100% sure they’ll kick into action when called upon?
Whereas with the Master+ model, battery monitoring is mandatory. That way you know exactly when there’s any cell deterioration. It identifies any blocks that need replacing before it’s too late.
So rather than undermining system resilience, we believe smart grid-ready solutions such as ours actually improve overall reliability.
So you’ve covered system resilience. Are there other ways data centres can benefit from demand side response?
Well, two positives to mention right at the start are that RWE subsidises the cost of the more expensive premium batteries, which makes the upfront cost more manageable. It also takes on any risks associated with trading on the energy market.
It’s tricky to put an exact cost saving as everything is project-dependent, but so far, the tests at our two pilot sites – one at RWE’s HQ in Essen, Germany, the other here in the UK – have proved hugely positive.
Projecting the results onto a data centre with a typical 1 MW load and batteries to provide 10 minutes autonomy plus 1 MWh of ‘commercial’ energy storage, the upfront capital costs are roughly a fifth lower compared against a standard UPS. That’s primarily due to the subsidised battery costs.
Then ongoing operating and maintenance costs are lower too, as the monitoring software reduces the number of time-consuming manual service visits. Throughout the 10-15 year lifespan of a typical UPS, these savings would add up to tens of thousands of pounds.
Operators can also save up to £6,000 per MW every year through reduced grid network charges.
Our Technical Services Manager Jason Yates and Dario Hernandez from RWE will be explaining more about the Master+ concept at a special seminar on the first day of Data Centre World (Wednesday 11 March).
They’ll be sharing some of the results so far from our pilot plants and outlining what the potential benefits are for data centre operators.
I’d urge any attendee interested in finding out more to check out their session from 10:45-11:10 in the Facilities and Critical Equipment arena.
What else are you expecting from Data Centre World?
We’re anticipating plenty of discussion about the rollout of 5G and what the implications might be for the data centre industry. For instance, there’s likely to be an increased emphasis on edge computing, which might go hand in hand with a growth in containerised or micro data centres.
From a Riello UPS perspective, as well as the Multi Power launch and the smart grid seminar, we expect our team to be as busy as always. Data Centre World is always a fantastic chance to catch up with some familiar faces and meet plenty of new contacts too.
We’re also offering attendees at the show a chance to win a 55-inch smart TV worth £1,500. We can’t reveal too much at the moment, but if you’re coming to Data Centre World it’s definitely worth popping over to our stand D920 to find out more!
Riello UPS is exhibiting at the upcoming Data Centre World event on stand D920. The team will be on hand to discuss its award-winning range of uninterruptible power supplies and industry-leading maintenance support. This year’s show coincides with the official UK launch of its new and improved Multi Power modular UPS range.
Also at the show, Riello UPS’s Technical Services Manager Jason Yates and RWE’s Dario Hernandez will be leading a special seminar showcasing the company’s Master+ smart grid-ready UPS and energy storage solution. Join Jason and Dario in the Facilities and Critical Equipment arena from 10:45-11:10 on Wednesday 11 March.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 7.
The capabilities of Artificial Intelligence in strengthening Identity Management, by Jesper Frederiksen, VP and GM EMEA, Okta:
With only 37% of global enterprises having implemented AI in some form (according to Gartner), there remains some way for businesses to go in their adoption. However, this figure represents 12% growth on the previous year, highlighting that AI is becoming increasingly embedded in organisations’ approaches. For the role of cyber-security, and in particular identity management, this is a positive sign of things to come.
Today’s wealth of data and computing power enables security providers to create tools to mitigate and prevent identity and privacy attacks. At present, many businesses remain at risk due to poor password security, software vulnerabilities, human error and the abuse of access and privileges. Verizon’s 2019 report found that compromised and weak credentials are cited as the cause for more than 80% of data breaches.
AI, on the other hand, is a step towards better security, enabling the creation of predictive models that differentiate between ‘good,’ ‘bad,’ and ‘normal’ behaviour. This makes these models intelligent enough to proactively halt bot attacks and identify malicious activity. In particular, AI-driven pattern recognition can detect threats and viruses faster than previously imagined, enhancing businesses’ defences. As data volumes grow, the speed of intelligent predictions will only improve as AI models begin tapping into this wealth of intel.
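To make this concrete, here is a minimal sketch (purely illustrative, not Okta's implementation) of how a baseline of ‘normal’ behaviour might flag anomalous activity, using a simple z-score over a user's historical login counts:

```python
from statistics import mean, stdev

def classify_logins(history, today):
    """Label today's login count against a user's baseline.

    history: list of past daily login counts for one user
    today:   today's count
    Returns 'normal', 'unusual', or 'suspicious' based on how many
    standard deviations today's count sits from the historical mean.
    """
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        sigma = 1.0  # avoid division by zero for perfectly flat baselines
    z = abs(today - mu) / sigma
    if z < 2:
        return "normal"
    if z < 4:
        return "unusual"
    return "suspicious"

# A user who normally logs in a handful of times a day:
baseline = [3, 4, 5, 4, 3, 5, 4]
print(classify_logins(baseline, 4))    # a typical day
print(classify_logins(baseline, 40))   # possible credential abuse
```

Real products use far richer behavioural models, but the principle is the same: learn what ‘normal’ looks like from the data, then flag sharp deviations for investigation.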
In the case of biometric identity, AI is being widely used within facial recognition, fingerprints and iris scans. It is becoming more widespread on personal and work devices, while enterprises are also deploying their own biometric security measures. However, trust in biometrics is still holding its deployment back. Okta’s Passwordless Future Report found that 86% of respondents had reservations about sharing their biometric data, showing that work needs to be done to make employees comfortable with its use, and to provide more transparency about how securely biometric data is stored.
In the current security landscape, security teams are often under-budgeted, meaning many organisations have few security analysts on hand to verify and contain threats, leaving them open to longer dwell times and large-scale breaches. Many teams do not have AI specialists in-house, forcing them to outsource for the expertise they need. In light of this, the arrival of productised security automation and orchestration tools cannot happen soon enough.
Dawn Dent – Managing Associate – Oliver Wight EAME, offers the following thoughts:
Organisations have habitually turned to new technology in the workplace, in a bid to stay ahead of the competition and to continue to satisfy customers, efficiently and profitably. Of course, many companies have the financial muscle to invest in the latest technology but often do so without sufficient planning and preparation, or crucially making the necessary investment in the staff who will be using and managing these systems.
Now business leaders are eyeing the latest generation of technology innovations - think artificial intelligence and machine learning - as a new opportunity for their companies to realise their full potential in demanding consumer markets. However, future success is rooted in the lessons of the past. Before jumping on the latest technological bandwagon or picking up the phone to the systems provider, companies need to first determine which new systems they really stand to benefit from and how they want to use them. A one-size-fits-all mentality can lead to a downfall, and for decision-makers to assume technology is the new panacea for all their problems is a Groundhog Day faux-pas.
It is people who determine the success of your systems. They can't simply be told to use AI; they need to be given enough time and education to take the benefits on board and to understand ‘what’s in it for them.’ Only then will they start to engage with it willingly.
So a leadership team considering investing in AI could do worse than to start by having discussions with team leaders, supervisors and the HR department, who can shed light on a number of factors. What education and training need to be provided for staff to understand the potential of the new technology and how it can be applied to improve processes? Leaders will also have to acknowledge that they may need to accommodate a smooth transition from long-standing practices, and that the transition may result in a short-term loss of performance.
It’s a good idea to find out what people already know. And it’s free! Bring staff who have some experience of these new technologies together as a focus group to discuss the successes and failures of technological implementations at their previous organisations: what was good about them, and what could have been done better?
The question of whether and when to invest in new systems should be considered within the context of Continuous Improvement. Involve people in designing and modifying the processes to leverage the benefits of the new systems and enhance business performance – and ensure that this is done before the systems are implemented.
It is an undeniable truth that the latest technology, such as analytics, AI and machine learning, presents real opportunities for businesses. However, business leaders have to recognise that if technologies are truly going to be optimised in the workplace of the future, then the workforce of the future will have to receive the education it needs for new ways of working.
Rachel Roumeliotis, Vice President of Content Strategy at O’Reilly, talks about the need for the right talent and the best data:
“It is fair to say that artificial intelligence (AI) is everywhere. Newspapers and magazines are littered with articles about the latest advancements and new projects being launched because of AI and machine learning (ML) technology. In the last few years it seems like all of the necessary ingredients – powerful, affordable computer technologies, advanced algorithms, and the huge amounts of data required – have come together. We’re even at the point of broad acceptance of this technology among consumers, businesses, and regulators alike. It has been speculated that over the next few decades, AI could be the biggest commercial driver for companies and even entire nations.
“However, with any new technology, the adoption must be thoughtful both in how it is designed and how it is used. Organisations also need to make sure that they have the people to manage it, which can often be an afterthought in the rush to achieve the promised benefits. Before jumping on the bandwagon, it is worth taking a step back, looking more closely at where AI blind spots might develop, and what can be done to counteract them.
“AI maturity and usage has grown exponentially in the last year. However, considerable hurdles remain that keep it from reaching critical mass. To ensure that AI and ML are representative of the masses and can be used in a safe way, organisations need to adopt certain best practices.
“One of these is making sure technologists who build AI models reflect the broader population. Both from a data set and developer perspective this can be difficult, especially in the technology’s infancy. This means it is vital that developers are aware of the issues that are relevant to the diverse set of users expected to interact with these systems. If we want to create AI technologies that work for everyone, they need to be representative of all races and genders.
“As machine learning inevitably becomes more widespread, it will become even more important for companies to adopt and excel in this technology. The rise of machine learning, AI, and data-driven decision-making means that data risks extend much further beyond data breaches, and now include deletion and alteration. For certain applications, data integrity may end up eclipsing data confidentiality.
“We will need to learn to live with the premise that there will not always be a perfect answer when using AIs. These AI systems learn from the data that are fed into them, so results are specific to each use case. There may never be a time when we always get the right results. With no precise checklists for overcoming these situations, we must learn to adapt and provide new training and education platforms. These new training platforms will be vital in allowing AI to become representative of all races and genders over the next few years. The talent pool is only set to grow; the challenge remains to ensure it becomes even more diverse.
“As AI and ML become increasingly automated, it’s essential that organisations invest the necessary time and resources to get security and ethics right. To do this, enterprises need the right talent and the best data. Closing the skills gap and taking another look at data quality should be their top priorities in the coming year.”
Oleg Rogynskyy, CEO/founder, People.ai, on the need to collect data now:
“During the last year, we have seen Artificial Intelligence (AI) cross the chasm from tech-hype to the mainstream. In the business world we have seen many applications start to have functionalities that are truly based on AI and machine learning algorithms, and not just user input rules. In terms of what has actually been achieved with AI, the results vary from sector to sector. In the financial services sector for instance, organisations have been implementing customer facing AI platforms and algorithms for several years. Yet, in contrast, the healthcare industry is a lot further behind in terms of the sophistication of AI technologies used.
One of the reasons for this varying level of uptake is explainable AI: that is, the ability of AI and machine learning algorithms to justify the decisions they have made. In highly regulated industries, such as healthcare, this is of paramount importance. Over the next few years, we should expect the explainability aspect of AI to become more of a focus for developers and to dominate the AI landscape for businesses, as it enables users to understand the technology, act responsibly and establish a strong and efficient partnership between humans and machines.
We are starting to see enterprise companies especially become more comfortable with AI. AI solutions are being implemented that focus on automating and augmenting specific business processes, making the process more efficient in the long run and enabling the employee behind the process to make better decisions. The AI algorithms applied to Customer Relationship Management (CRM) platforms are great examples of this. Previously, sales teams would have to spend hours each day manually inputting data into the system. Now, AI solutions can automate this time-consuming data-capture process and deliver real-time, actionable insight to the sales rep. This helps sales reps to drive their pipeline by spending more time engaging with prospects and customers.
Only 60 Fortune 500 companies from the ‘60s still exist today. Many companies died out because they missed the opportunity to implement automation – something they could’ve caught up on and fixed. With AI, you have to start early. Every piece of data that every employee is emitting today, every piece of digital exhaust, is what companies will need tomorrow. If they aren’t collecting it today and their competitor is, they will quickly die out”.
Businesses around the world and in every sector will be completely revolutionised by Artificial Intelligence (AI) within the next five years - or so we keep hearing.
Realistically, however, AI has the same rate of development and growth as any other advanced technology. It is, unfortunately, constrained by the availability of people to research, fund and implement the technology and to iron out issues in the field.
Perhaps more so than other technologies, AI relies on a large number of people using it and feeding it vital data so that it can learn and progress. This means that its growth is indeed exponential: the more AI applications are used in real-world scenarios, the faster AI can progress. But the reality is that real-world limitations, such as concerns over the power of AI and data privacy, are still pervasive.
These concerns also mean that AI’s progress in the business world often has two sides. Significant advances in AI’s ability to manage human tasks, for example, can be seen as both a positive and a negative depending on your perspective.
An AI application that gives employees in-call guidance on customer service, care or sales calls in enterprises, for example, might be seen as a troubling development by some people. In reality, though, this is very much in use in businesses already, and AI’s ability to recognise (but not replicate) human emotions allows call operators to navigate sensitive conversations more smoothly, while also providing training in customer care best practice more quickly.
The more AI programmes can analyse the data of enterprises, connected devices or internet sources, the more they can learn about business processes and consumer habits, thus improving operational efficiency and aiding sales and marketing initiatives. These benefits can only be felt if businesses embrace the technology wholeheartedly and understand the value, as well as the limitations, of AI in a business context.
Appreciating that AI must be fed a large amount of appropriate data, for example, is crucial to getting real benefits from the technology. Training an AI on every piece of information in a company, or restricting the amount of data available to it, will not yield useful results.
Curiosity conquers all
Many companies are now overcoming fears about AI and recognising that while it is a powerful technology, it is only as powerful as we can train it to be as humans. This means there is still a long way to go before the majority of companies are comfortable with AI and understand it sufficiently to see real business benefits.
As AI becomes far more commonplace in a business context, as well as in our daily lives, fears and misunderstandings about the technology will surely be overcome by curiosity. As proven success stories begin to make the headlines - instead of speculation about the risks of self-driving cars - the real-world benefits of AI may live up to the hype after all.
Last November, Forrester predicted that the public cloud market would grow to $299.4 billion in 2020. Given this, it shouldn’t be surprising that businesses want to capitalise on the benefits of the cloud computing model by developing cloud-native apps. Doing so promises extremely flexible applications which can be developed and deployed at a more rapid pace than ever before.
By Erica Langhi, Senior Solutions Architect, Red Hat.
Cloud-native app development allows organisations to build and run new apps based on cloud principles. This approach helps organisations build and run flexible, scalable and fault tolerant apps which can be deployed anywhere. Going cloud-native allows an organisation to iteratively transition their old services towards containerised, service-based, and API-driven platforms overseen by DevOps automation.
Cloud-native development is about responding to change with speed, elasticity, and agility. This response is achieved through more frequent deployments that significantly reduce the lead time to react to change.
Cloud-native app development is a model that applies across development and deployment of a product or a service. This requires a transition to many new practices, with one of the most challenging transitions to pull off being to build a DevOps culture.
A DevOps culture is one where there exists a set of practices and processes that unify development and operations. Developers and operations have a shared understanding and responsibility for their work, which results in a far more integrated model to creating and deploying applications.
Evolving a DevOps culture and practices can be difficult for an organisation. Changing what your personnel do on a day-to-day basis is, by its very nature, going to be disruptive for an organisation. This is especially the case when one is combining the culture, procedures, and workflow of two separate departments into one new one with a different structure, mission, and set of processes.
Despite the time and effort taken to adopt DevOps, IT leaders know that the long-term benefit of making the transition far outweighs the initial difficulties of the transition. This evolution will inevitably take time, but once complete, the result will be shorter development times and applications that are more flexible for your organisation.
Evolving Modular Architecture
Another key part of cloud-native development is adopting a modular architecture, which sees the functions of applications broken down into independent modules. There is no single path to a more modular architecture. One path is through adopting what is called a microservices-styled architecture, which sees applications broken down into their smallest parts, independently of one another. This is well-suited for cloud-native apps as they share the same principles of elasticity and modularity.
However, these architectures are difficult to implement and maintain. Most microservices success stories see a monolith broken down into a microservices architecture. This is because analysing the existing monolith could help developers to see how the ‘parts’ of an application work together, which helps inform where the boundaries of microservices should be.
Those existing monolithic applications cannot simply be thrown away, since many of them have been dutifully and efficiently running a function for an organisation for many years. The focus should be on iterating towards cloud-native principles in these applications, rather than forcing them to comply with a standard, and on identifying which parts of the monolith can be exposed via APIs. After that, one can begin to break monolithic applications down into microservices or miniservices.
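The idea of putting a facade in front of a monolith's capability before breaking it out can be sketched in a few lines. This is a hypothetical illustration: the pricing functions and routing table below stand in for real services and HTTP calls.

```python
# Sketch of incrementally extracting a capability from a monolith.
# All names here are illustrative, not a real codebase.

def monolith_price_quote(sku, qty):
    # Legacy pricing logic still living inside the monolith
    return {"sku": sku, "total": qty * 9.99}

def extracted_pricing_service(sku, qty):
    # Stand-in for an HTTP call to a newly extracted pricing microservice
    return {"sku": sku, "total": round(qty * 9.99, 2)}

# Capabilities that have been migrated so far; everything else
# continues to be served by the monolith.
ROUTES = {"pricing": extracted_pricing_service}

def quote(sku, qty):
    """Facade: prefer the extracted service, fall back to the monolith."""
    handler = ROUTES.get("pricing", monolith_price_quote)
    return handler(sku, qty)

print(quote("A-100", 3))
```

Because callers only ever see the facade, each capability can be moved behind an API and then out of the monolith without a risky big-bang rewrite.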
Automation and Self-Service Infrastructure
Manual IT tasks are a great source of lost time for development and operations teams. Automating away this work can free up time and resources among your team, meaning software is developed and deployed far quicker. Hence, embracing IT management, automation and tools that create procedures to replace labour-intensive manual processes is an essential part of a cloud-native strategy.
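As a small, hypothetical illustration of replacing a manual process with automation, a routine config-drift check against an approved baseline could look like this:

```python
# Hypothetical example: automate a manual config-drift check by
# comparing each server's settings against an approved baseline.

BASELINE = {"ntp": "pool.ntp.org", "tls": "1.2", "log_level": "info"}

def drift_report(servers):
    """Return {server: {setting: (expected, actual)}} for any drift."""
    report = {}
    for name, settings in servers.items():
        diffs = {k: (v, settings.get(k))
                 for k, v in BASELINE.items() if settings.get(k) != v}
        if diffs:
            report[name] = diffs
    return report

fleet = {
    "web-01": {"ntp": "pool.ntp.org", "tls": "1.2", "log_level": "info"},
    "web-02": {"ntp": "pool.ntp.org", "tls": "1.0", "log_level": "debug"},
}
print(drift_report(fleet))
```

A check like this, run on a schedule, turns an error-prone manual service visit into a report that engineers only act on when something has actually drifted.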
A similar idea runs through pushing for reusability when undertaking cloud-native application development. Recreating the same capabilities again and again – such as caching services, workflow engines, or integration connectors – can be a big waste of time. Modern middleware leverages containers and microservices, and developers should take advantage of it: the capabilities it provides have already been optimised and integrated into the underlying container-based infrastructure.
This philosophy extends to the infrastructure you use. Self-service and on-demand infrastructure helps teams quickly create consistent environments, which allow developers to focus on building applications without the time usually spent preparing infrastructure for projects. Containers and container orchestration technology help simplify access to the underlying infrastructure of an application, and provide life-cycle management for applications across different environments – whether they be private clouds, public clouds, or data centres.
Transitioning to a cloud-native development strategy isn’t something that can be done overnight. Rather, it should be seen as a gradual process and also as a learning experience. With few exceptions, such as startups, most organisations have established intricate IT environments that use on-premise applications, along with cloud platforms and services. It’s also not realistic for most organisations to unify all their systems and platforms into a single architecture, at least not all at once. While you can immediately start using a cloud-native platform for your greenfield applications, transitioning your existing applications is a long process.
The key to a successful cloud-native transition is to go about it in small and incremental steps. Begin by migrating monoliths and applications to the cloud, either on-premise or off-premise. Then containerise workloads, including monoliths, and establish container orchestration platforms. Then, look at those monoliths and assess which can be broken down into microservices or serverless function calls.
As you run more services, monoliths, or third-party applications on the cloud platform, begin to take advantage of cloud services like integration, business process automation, and API management. These will take advantage of the opportunities afforded by the cloud platform.
After that, it’s a matter of continually learning and improving. As your team becomes ever-more familiar with the changes involved with going cloud native, you can adjust your processes to help improve the efficiency of your developers and software alike.
With a vacillating European backdrop, businesses are racing to digitise, modernise and optimise. To retain their competitive edge and keep up with changing customer demands, they need to be nimbler than ever. Service transformation is a crucial step to achieving this – but how can businesses get there faster? The answer is already in their hands, and it is the greatest tool at their disposal: data.
By Anurag Bhatia, Senior Vice President and Head of Europe at Mphasis.
Advancing technologies can automate the process of harnessing increasingly bigger and richer data sets, to unearth actionable insights and underpin growth. The key to not only implementing meaningful service transformation – but accelerating it – is to future-proof business models by understanding the underlying data and applying analytics.
The shifting landscape – adapt or lose ground
European companies are ramping up efforts to adapt to the new digital age, or lose out to more agile players. The European Commission’s investment proposal, unveiled in 2018, featured €2.7 billion specifically to support a thriving business landscape by boosting the region’s supercomputing and data processing capabilities.
However, despite Europe accelerating at a faster pace than the US on service transformation, we are still lagging by over a year in terms of embracing technology-led transformation. AI and, more importantly, machine learning and deep learning techniques are moving into the mainstream. They are now becoming foundational to the process of mining data to set the framework for service transformation and deliver measurable outcomes.
There is encouraging momentum in the analytics space, with UK businesses projected to spend £24 billion on advanced analytics this year. That’s double the investment figure of just three years ago. However, to achieve transformational success, this investment must be accompanied by a more fundamental attitude shift and greater data literacy.
Transformation starts from within
Data analytics is a powerful intelligence tool that draws a holistic map of an organisation to uncover in-depth insights into the effectiveness of your current offering. It also allows you to target your business objectives and the needs of your customer base more precisely, facilitating quicker, more accurate decision-making. Let’s explore further why data should inform each component of service transformation…
Driving operational efficiencies and agility by streamlining processes. Service transformation entails working simultaneously from two sides to shrink the core. The first part is moving away from legacy systems to cut down on inefficiencies and time wasted on manual effort. Through analytics you can also identify the applications that are best suited to automation and modernisation, taking a business priority based view. This sets the groundwork and generates the budget for the other half of the equation – introducing next-generation systems that are better architected and on the cloud.
Improving the bottom line. Data unlocks the knowledge required to pin-point exactly where to direct your attention to add value. This means you can act fast to seize opportunities and get the most bang for your buck with investments. Take the UK accountancy firm BDO, for instance, which is directing more investment into robotics and data analytics so that employees can provide an even higher level of specialised consultancy on complex queries. This is an effective way to differentiate yourself and maximise the bottom line.
Elevating customer experience and service. Organisations cater to an increasingly disparate range of end users who want greater personalisation. Hence, an enhanced customer journey should be a top priority in service transformation. Consider the likes of Lemonade, the US player now breaking the mould in European insurtech by utilising machine learning to analyse behavioural and historical data to give customers a seamless, customised experience.
Improving security and risk management. Data analytics enables you to mitigate the risks that you can realistically manage, and foresee how macro developments could affect your business. Helped by advancing machine learning and deep learning techniques, companies can now generate data and use predictive analytics to identify risks and take timely preventative action.
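On the first of these components, streamlining processes, one simple and purely illustrative way to take a business-priority-based view of which applications to modernise is a weighted score. The factors and weights below are assumptions for the sake of the sketch, not a standard methodology.

```python
# Illustrative only: rank applications for modernisation using a
# weighted score. Factors and weights are assumed, not prescribed.

WEIGHTS = {"business_priority": 0.5, "manual_effort": 0.3, "cloud_fit": 0.2}

def modernisation_score(app):
    """Weighted sum of an application's 1-10 factor ratings."""
    return sum(app[k] * w for k, w in WEIGHTS.items())

apps = [
    {"name": "billing",  "business_priority": 9, "manual_effort": 8, "cloud_fit": 6},
    {"name": "intranet", "business_priority": 3, "manual_effort": 4, "cloud_fit": 9},
]

# Highest score first: the best candidates for automation/modernisation
ranked = sorted(apps, key=modernisation_score, reverse=True)
print([a["name"] for a in ranked])
```

In practice the factor ratings would come from the analytics described above rather than being entered by hand, but the principle of ranking by business value is the same.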
Addressing the principal challenges
Most businesses already collect and store enormous amounts of data but the majority are still not unlocking its full benefits through strategic analysis. What’s stopping them?
Lack of understanding. Many don’t know where to start with service transformation or analytics, let alone both. The difficulty is not so much in adopting the latest tools but in deciding the questions that you want to answer, and the minimum viable data that you need to do so. Work backwards to define your commercial and operational performance goals – from there, the right questions and metrics become apparent.
Overcoming data silos. Data that is hard to access, fragmented or stored in formats that are incompatible with each other poses the chief hurdle to becoming data-driven. You must first engender collaboration and transparency across departments and stop duplicating efforts. Instead of restricting responsibility to the leadership team, encourage organisation-wide commitment and education. Without this, you compromise your ability to deliver a high-quality service to end users.
Legacy systems. Last year, it was reported that the Bank of England spends 33.6% more on IT than any other department in the public sector, largely because it has not updated its systems. Many businesses worry that replacing outmoded systems can be cumbersome, expensive or disruptive. Actually, it costs more to remain stationary than to embark on service transformation. Over time, legacy platforms will not be supported by newer systems, will pose a higher security breach risk, and will not offer the flexibility and scalability to accommodate expansion. Keep in mind that current analytics technology does not necessarily entail complicated or lengthy installation.
European data regulation. More stringent personal privacy laws are shining a spotlight on data management policies. The most notable are the EU’s GDPR and the UK’s Data Protection Act 2018, both of which carry substantial cost and reputational consequences for non-compliance. Of particular significance to your data strategy are the requirements for data minimisation and concerns around automated profiling. Rather than hindering service transformation, this offers a rare chance to review your data and how you process it. Use this to cultivate best practice and customer trust.
Cybersecurity. A recent survey reveals that UK chief executives see cybersecurity as one of their top concerns for the year ahead. With the migration of existing data to the cloud, the use of data from both internal and external sources, and the prevalence of IoT, the risk factor climbs and threatens your transformation progress. There’s much debate surrounding public vs private cloud (look at the Capital One breach, partly blamed on public cloud adoption). A hybrid cloud can offer the best of both worlds, combining the upside of cloud with the ability to store your most sensitive data elsewhere. Outsmarting increasingly sophisticated hackers requires real-time analytics; the self-learning and pattern recognition properties of AI can spot suspicious activity and minimise damage.
There is excellent scope for service transformation to accelerate in the European landscape thanks to rapidly maturing technologies. However, this bright future relies on a greater understanding of how to make your data work for you. The opportunities and challenges outlined above are interconnected, emphasising that service transformation is not a linear programme and requires a multi-layered approach. They say “knowledge is power”. The knowledge that you need to build a strategy to triumph in a tough and crowded marketplace is already embedded in your data.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 8.
Why today’s AI should stand for ‘Augmented Intelligence’ not ‘Artificial Intelligence’
Artificial Intelligence is plastered everywhere, but there are few signs of genuine ‘intelligence’ within today’s technology. Jim Preston, VP EMEA at Showpad, explains why today’s AI should stand for ‘Augmented Intelligence’, and where organisations can make the most of it: their sales teams.
You can’t walk ten feet without bumping into something ‘AI’ today. Everything seems to be AI-powered in the technology world, but with the possible exception of Boston Dynamics’ robot dog, none of these developments (thankfully) look like the autonomous robots that Star Wars and The Terminator promised us.
This is largely because what we define as artificial intelligence today is actually still the evolution of what we called big data and analytics a few years ago. Machines are getting smarter, but they aren’t ‘intelligent’. They’re better at thinking in a programmed fashion, but they’re still unable to do it independently.
However, this doesn’t mean that today’s concept of AI is wrong – it’s simply a misnomer. It’s more accurate to talk about ‘augmented intelligence’ rather than ‘artificial intelligence’, and it’s human intelligence that’s being augmented. Sifting through reams and reams of numbers, figures and data generally isn’t our strong point – but computers can handle it easily.
According to Gartner, Augmented Intelligence is all about a machine’s ability to “enhance cognitive performance. This includes learning, decision making, and new experience”. It’s all about assisting people, not replacing them. It’s not about entirely simulating ‘independent intelligence’, but rather helping people to make decisions by providing better data analytics.
One of the most interesting applications of this is in the sales industry.
Consider this: during the last two decades, the internet has been democratising information. For a buyer, this means that information needed for purchasing products and services can be found online. In fact, Forrester estimates that 61% of business customers prefer to do all of their product research online, and 67% prefer not to interact with a salesperson for information at all!
This is bad news for salespeople, with many customers not even giving them the chance to provide information. Similarly, when customers do get in touch, staff often find themselves facing well-informed customers who are already familiar with many of the facts and figures that salespeople have up their sleeves.
This is exactly where ‘augmented intelligence’ comes in. Rapid analytics and insight tools can examine the content that prospects are looking at and analyse patterns. Senior buyers will need a different ‘pattern’ of content to junior buyers. Similarly, buyers in the financial services industry may have less room to flex because of industry regulations such as Sarbanes-Oxley, which means that sellers will have to provide more content about their own organisation, in addition to compelling sales content.
While an experienced salesperson will often have an unconscious understanding of these trends and needs, augmented intelligence systems can analyse them effortlessly and provide content in advance, based on reliable data. This can take a lot of the intensive brainwork away from the process, but can also help to bring new salespeople up to speed quicker.
Ultimately, this kind of AI support helps salespeople provide a better experience for buyers, allowing them to do what they do best – sell!
AI is not just about automation, says Justin Silver, PhD, Data Science Manager & AI Strategist at PROS:
‘In many industries, artificial intelligence (AI) is perceived as a threat to jobs. While this is true for some roles, in many cases AI will augment human work. Businesses that lag in adoption of AI will be left in the dust by competitors who are able to effectively leverage it.
Businesses today understand the importance of AI in their digital transformation. Perhaps the main obstacle for prolific adoption of AI in the business world today is a lack of understanding of how it can be best used to drive direct value for the company. AI can only drive value if it is applied to a well-defined business problem. Rather than jumping to implement an AI solution, businesses need to devote adequate time and focus to clearly define the use case, determine how success will be measured, and assess the data that will be available. Depending on the objective, AI can target improved profitability, customer experience, or efficiency.
Pricing is the most powerful lever a business can pull to impact profitability. AI can be used to automate pricing processes and drive more profitable pricing. AI-based price optimisation enables companies to quickly offer a competitive price point so that prospective customers don’t feel the need to look elsewhere. Making the most of market pricing data means that AI can dynamically price products at a competitive and profitable price in real-time, rather than risking human error doing it manually.
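To make the principle concrete, here is a minimal sketch of AI-based price optimisation; the demand curve and all figures are invented for illustration. Given a model (fitted from historical win/loss data) of how likely a customer is to buy at each price, the optimiser searches for the price that maximises expected profit.

```python
# Minimal sketch of price optimisation: pick the price that maximises
# expected profit under a fitted demand model. The logistic demand
# curve and its parameters below are purely illustrative.
import math

def win_probability(price, ref_price=100.0, sensitivity=0.08):
    """Logistic demand model: the further above the reference market
    price we quote, the less likely the customer is to buy."""
    return 1.0 / (1.0 + math.exp(sensitivity * (price - ref_price)))

def optimal_price(cost, ref_price=100.0):
    """Grid-search the price that maximises expected profit,
    (price - cost) * P(win at that price)."""
    best_price, best_profit = cost, 0.0
    for cents in range(int(cost * 100), int(ref_price * 2 * 100)):
        price = cents / 100.0
        profit = (price - cost) * win_probability(price, ref_price)
        if profit > best_profit:
            best_price, best_profit = price, profit
    return best_price

price = optimal_price(cost=60.0, ref_price=100.0)
```

In a real deployment the demand model would be learned continuously from live market and transaction data rather than hard-coded, which is where the AI comes in.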
In many industries, consumers are continuing to seek a more tailored buying experience, including more self-service options. That’s why it’s important to enable AI-based product recommendations to customers online. Personalised offers transform static eCommerce websites into an intuitive experience that can increase customer engagement and satisfaction and boost purchase likelihood.
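As an illustration of how such recommendations are learned from past transactions, a recommender can be as simple as counting which products are bought together; the products and baskets below are invented, and production systems use far richer models.

```python
# Toy co-occurrence recommender: products frequently bought together
# with items already in the basket are suggested first.
from collections import Counter
from itertools import combinations

def build_cooccurrence(transactions):
    """Count how often each ordered pair of products co-occurs."""
    counts = Counter()
    for basket in transactions:
        for a, b in combinations(sorted(set(basket)), 2):
            counts[(a, b)] += 1
            counts[(b, a)] += 1
    return counts

def recommend(basket, counts, top_n=3):
    """Score candidate products by co-occurrence with the basket."""
    scores = Counter()
    for item in basket:
        for (a, b), n in counts.items():
            if a == item and b not in basket:
                scores[b] += n
    return [item for item, _ in scores.most_common(top_n)]

history = [["laptop", "mouse"], ["laptop", "mouse", "dock"], ["mouse", "dock"]]
counts = build_cooccurrence(history)
```

Calling `recommend(["laptop"], counts)` suggests the accessories most often bought alongside a laptop in the invented history above.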
AI is not just about automation. The defining characteristic of AI is a learning loop that drives improvement of the system over time. The learning loop enables AI to drive business value through outcomes like more profitable pricing and product offers that lead to more sales and customer satisfaction.’
The state of AI in the business world - Matthew Walker, VP EMEA & GM at Resolve:
To thrive in today’s technology-driven business landscape, organisations must embrace digital transformation. However, these initiatives often lead to increased IT and infrastructure complexity. Combined with expectations for perfect performance and faster service delivery, IT teams find themselves under immense pressure. Fortunately, disruptive new technologies, such as AI, can lend a helping hand.
Microsoft research indicates that businesses that have deployed AI technology are outperforming other organisations by 5% on productivity, performance, and business outcomes. This increase in throughput not only has a positive impact on the bottom line, but it also improves morale as the workforce can participate in more rewarding and creative tasks that truly impact the business.
For organisations to effectively manage increasing infrastructure complexity amidst budget cuts and a growing skills gap, they must embrace technology to solve the IT conundrum and usher in the next phase of digital transformation for the IT department itself. Introducing AI to IT operations offers tremendous potential for under-resourced IT teams to regain control and truly do more with less, while improving across a wide variety of KPIs.
Artificial intelligence for IT operations (AIOps) applies machine learning (ML) and advanced analytics to gather and analyse immense volumes of data and quickly identify existing or potential network performance issues, spot anomalies, and pinpoint the root cause. Going one step further and combining AIOps with intelligent, cross-domain automation allows IT operations to automatically predict, prevent and fix issues autonomously, improving performance and service quality in addition to cost savings and greater efficiency.
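The anomaly-detection step can be sketched in a few lines. Real AIOps platforms use far more sophisticated ML models, but the principle of flagging deviations from a learned baseline is the same; the latency figures below are invented.

```python
# Minimal sketch of anomaly detection on a metric stream: flag samples
# that deviate sharply from the rolling baseline (a rolling z-score).
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=20, threshold=3.0):
    """Return indices of samples more than `threshold` standard
    deviations away from the rolling baseline."""
    baseline = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(samples):
        if len(baseline) >= 5:  # need a few samples before judging
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append(i)
        baseline.append(value)
    return anomalies

# Service latency in ms; the 250ms spike is the incident to catch.
latency_ms = [20, 21, 19, 20, 22, 21, 20, 19, 250, 21, 20]
spikes = detect_anomalies(latency_ms)
```

In an AIOps pipeline, each flagged index would feed the next stage: root-cause analysis and, where combined with automation, an automatic remediation runbook.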
Despite the clear benefits of AIOps to businesses, some people are fearful of its adoption. Employees may worry that AI will make their roles redundant, but this is far from the truth. With AIOps and automation taking care of many of the repetitive and menial operational tasks, IT employees have room to redevelop their roles into something more transformative and fulfilling, driving innovation and growth. For businesses to truly achieve digital transformation success, employees across the enterprise must embrace AI and its myriad applications and champion the benefits as the way forward.
Expect a dramatic effect on the global job market, says Samuel Leach, Director, Samuel and Co Trading:
Robotics, 3D printing, artificial intelligence, 5G wireless networks, and other technologies provide us with machine capabilities never before possible and will dramatically alter the global job market. Plenty of industries that were once a staple of the economy have evolved; now, tasks that are routine, repetitive and require high levels of attention to detail are giving way to automation.
It is now imperative that people at any stage in their career review their skills to prepare for the future. At Samuel and Co Trading we already use some of the emerging technologies and believe we need to keep our employees skilled in these areas. Intelligent machines struggle with creativity, social intelligence, and complex perception and manipulation, which makes these the most in-demand skills for the future. These skills need to be developed, which will take time and practice.
The key to surviving and thriving is to embrace these advances, and then invest in the resilience, adaptability and skills of your workforce. Leadership, complex problem-solving and innovation will be at the forefront of skills required. Artificial Intelligence (AI) is a technology that is seamlessly changing the way we live, move, interact with each other and shop.
At Samuel and Co Trading we use algorithmic trading with machine learning (ML). Most applications of algorithmic trading happen behind the closed doors of investment banks or hedge funds. Trading very often comes down to analysing data and making decisions fast. ML algorithms excel at analysing data, whatever its size and density. The only prerequisite is enough data to train the model, and that is something trading has in abundance (market data, current and historical). The algorithm detects patterns that are usually difficult for a human to spot, reacts faster than human traders, and can execute trades automatically based on the insight derived from the data.
The algorithm takes the price movement of the index and predicts a corresponding move in the individual stock (e.g. Apple). The stock is then bought (or sold) immediately, with a limit order placed at the predicted level in the hope that the stock reaches that price. In investment finance, a large portion of time is spent doing research. New machine learning models increase the available data around given trade ideas.
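A hedged sketch of the approach just described, with invented figures: fit the stock's sensitivity to the index (its beta) from historical returns, then use today's index move to predict the stock's move and set a limit order at the predicted level.

```python
# Sketch of index-to-stock prediction: estimate beta by least squares,
# predict the stock move from today's index move, and derive a limit
# price. All returns and prices below are invented for illustration.

def fit_beta(index_returns, stock_returns):
    """Least-squares slope of stock returns on index returns."""
    n = len(index_returns)
    mx = sum(index_returns) / n
    my = sum(stock_returns) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(index_returns, stock_returns))
    var = sum((x - mx) ** 2 for x in index_returns)
    return cov / var

# Historical daily returns (illustrative only).
index_hist = [0.010, -0.005, 0.007, -0.012, 0.004]
stock_hist = [0.015, -0.008, 0.010, -0.018, 0.006]

beta = fit_beta(index_hist, stock_hist)
index_move_today = 0.01               # index is up 1% today
predicted_stock_move = beta * index_move_today
last_price = 300.00
limit_price = round(last_price * (1 + predicted_stock_move), 2)
```

A live system would of course fit richer, non-linear models on much larger datasets and manage order execution and risk; this only shows the shape of the idea.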
Natural Language Processing (NLP) is a type of machine learning model that can process data in the form of human language. A product recommendation layer can be added on top, allowing an assistant to recommend products or services based on the exchanges between the algorithm and the human user.
Chatbots are already used in finance and banking. The bot is given access to users’ transactional data and uses NLP to detect the meaning of the request sent by the user. Requests could relate to balance enquiries, spending habits, general account information and more. The bot then processes the request and displays the results.
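The request-handling loop might be sketched as follows. Production chatbots use trained NLP models rather than the keyword rules shown here, and the account data and phrasing are invented for illustration.

```python
# Toy banking chatbot: detect the intent of a message, then answer
# from the user's (invented) transactional data.

INTENT_KEYWORDS = {
    "balance":  ["balance", "how much", "funds"],
    "spending": ["spent", "spending", "outgoings"],
}

def detect_intent(message):
    """Crude stand-in for an NLP intent classifier."""
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "unknown"

def answer(message, account):
    intent = detect_intent(message)
    if intent == "balance":
        return f"Your balance is {account['balance']:.2f}."
    if intent == "spending":
        total = sum(account["transactions"])
        return f"You have spent {total:.2f} this month."
    return "Sorry, I didn't understand that."

account = {"balance": 842.17, "transactions": [12.50, 30.00, 7.25]}
reply = answer("How much is in my account?", account)
```

A trained NLP model replaces `detect_intent` in practice, generalising to phrasings the keyword table has never seen.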
In my opinion, AI technology will continue to expand, and companies that adopt AI will improve their operations, marketing, sales, customer experience, revenues and the overall quality of their deals.
Craig Stewart, CTO at SnapLogic on overcoming some key barriers to entry:
The most successful companies of the last decade have all leveraged AI to streamline business processes and operations, power their decision making, and deliver next generation insights about their employees and customers. However, whilst there is some broad familiarity with what AI is, businesses are struggling to overcome some key barriers to entry.
One of the major reasons that companies are not comfortable initiating AI projects is the growing AI skills gap. Recent advancements have put a premium on talent skilled in data and the techniques and systems used to acquire, integrate, prepare, manage, and interpret data, a skill set for which demand outstrips current supply. In order to address the AI skills gap, businesses must first learn to cultivate collaborative relationships with the academic community, as opposed to treating it like a transactional one-stop shop for talented new hires. The data-capable talent of the future can benefit from this symbiotic business-academic partnership – aspiring data scientists can deploy their knowledge in a real-world setting, gaining experience that they otherwise wouldn’t get in an academic setting and, in turn, businesses gain access to the top tier of technological talent.
In addition to closer academic collaboration, companies must cultivate a data culture, with employee upskilling and retraining offered where needed, where every employee within the organisation understands the company’s data strategy, the value of data to their organisation’s success, and their role in using data to achieve their day-to-day work goals. Until businesses address the data and AI skills gap, they will struggle to truly harness the power of this next generation technology.
Another key barrier to AI adoption for businesses is data availability, which often severely delays and even halts enterprise AI projects. Siloed or disconnected data can mean that machine learning engines lack the raw source information they need to thrive. To avoid this, businesses must ensure they equip themselves with the tools to properly manage, move, and integrate data. Similarly, data of poor or inconsistent quality can also be damaging to a business, as recommendations may be made on faulty evidence. Lastly, technology aside, individual departments are sometimes averse to sharing data with other teams in the organisation; this reluctance must be overcome if companies are to leverage all of their data and finally achieve a single source of the truth.
Due to these significant hurdles to AI adoption, many businesses have turned to third-party SaaS software that has machine learning at its core. This helps to align process and operational optimisation with companies who have the expertise to weave AI technology into the services that they offer. Today, companies can partner with a range of SaaS providers to marry AI with a variety of business processes, from HR to finance to customer experience. This is likely to be the way the majority of enterprises fast-track their engagement with AI and use it to further their businesses objectives.
With the increasing application of the Internet of Things, the impending global rollout of 5G and more enterprises increasing their need for SaaS and IaaS applications and infrastructure services, digital data is going into overdrive.
Ciarán Flanagan from ABB looks at how colocation and multi-tenant data centers can achieve faster deployments and realize their assets quickly through modularization.
We depend more and more on data, computing power and connectivity. Data centers are at the heart of our connected world and it has never been more important that they are built and run reliably and efficiently to maximize value for customers and data users.
By 2022, the colocation market is expected to exceed USD 60 billion, with demand for cloud capacity and repositioning of IT expenditure being the key drivers of growth in the market. (1)
North America, Europe and Asia continue to be the strongest players in the retail and wholesale colocation segment. But emerging markets such as India are now offering viable and alternative locations for providers as space, rising costs and reliable power are at a premium in developed markets.
And the pace of change is not going to slow down – in fact, it is going to accelerate as more and more enterprises choose retail and wholesale colocation providers to reduce the overall cost of IT and move from on-premise data centers to rent space, racks and cages, or entire rooms and facilities for their IT equipment.
With such rising demand for colocation data centers, builders and system integrators need to enable simpler and faster deployments and expansions.
Quicker and leaner lead times in enterprise contracts mean that when a colocation data center wants to expand its current operations by approximately 30MW of load capacity, it needs to guarantee that the facility will be ready on time and within budget.
So, how do colocation data centers guarantee timeliness and budget control without compromising on quality standards and security protocols? To stay ahead of the curve, operators need to review alternative options in the design, build and management of colocation projects.
The Need for Speed
Data center owners are now looking to achieve build cycles of between six to nine months for new builds, compared to traditional cycles of 12 to 18 months. As such, modular and scalable builds help owners to achieve aggressive time to market goals more efficiently.
With complicated data center projects that have short delivery timeframes, a pre-engineered product package or prefabricated skid solution, built in an offsite facility can present a viable alternative to traditional brick and mortar construction.
Modular system solutions which feature prefabricated eHouses and skid-mounted power substations including switchgear, transformers, ancillaries and other electrification components in one, offer flexibility, a higher level of safety and integration of intelligent technology, and importantly greater power reliability.
Modular data centers offer a more compact, timely, scalable and convenient method of deploying data capacity and a standardized power infrastructure to where it is needed.
Data center modularization also offers significant benefits in terms of build and testing as much of the build and pre-commissioning work is taken away from a construction site. Costly delays are prevented through parallel stage builds instead of sequential build procedures seen on construction sites.
Projects can be fast tracked, and risks are further mitigated with offsite testing being undertaken in a controlled environment thereby protecting the system integrity and providing peace of mind for operators. This type of approach offers the same level of rigor, consistency and quality as achieved in a traditional data center build, with program timelines being unaffected by onsite testing and engineering works.
Building Blocks to Success
Colocation companies looking to generate return on investment quickly should also consider pre-engineered building block solution architecture.
These building block systems leverage the benefits of modular solutions together with proven pre-engineered designs to reduce project cycle times, meeting the need for speed and reliability of today’s colocation market.
This is a shift from a traditional “system plus system” approach to an optimized design which provides better economic value through higher utilization of assets while maintaining the required system reliability. The designs have been proven and developed from complex custom solution architectures previously delivered and deployed.
The proven design reduces engineering time by up to 80 percent, and standard designs deliver manufacturing efficiencies which reduce testing and manufacturing time as well as late delivery risk.
GIGA Data Centers
ABB recently deployed a block system approach to a data center in Mooresville, NC for GIGA Data Centers, to develop the critical power distribution design and help GIGA reduce total cost of ownership and speed to market.
Based in the US, GIGA Data Centers aims to use modular data center technology to bring hyperscale efficiency to the colocation market. They believe companies seeking data center colocation should have access to the same high levels of efficiency and flexibility at a competitive cost.
ABB and GIGA jointly developed the system design specifications as well as the sequence of operations for input and output switchboards. The solution comprised low-voltage switchboards, dry-type transformers and uninterruptible power supplies (UPS) to support the IT server and network equipment infrastructure. This approach created a more compact and efficient data center that met all of GIGA’s requirements and delivered outstanding power distribution and protection performance.
Shaping the future
There is no doubt that the need for speed will continue to shape colocation data center development as demand for enterprise data and services grows. Emerging technologies like autonomous vehicles, “massive” 5G and smart cities will drive the need for colocation at the “edge” and for near real time performance and further efficiency.
Partnering with experienced contractors is important to evaluate current and future business needs and to apply the same high-quality procedures and build skills across the data and smart power architecture. This, together with incorporating digitalization, can deliver safe, robust and modular data centers which flex and grow with demand, while also delivering faster return on investment for colocation data centers.
To read the whitepaper in full visit: https://campaign.abb.com/l/501021/2020-01-22/qsmskw
(1) “Data Center Construction Market Due to Grow with a CAGR of Almost 9% During the Forecast Period, 2019–2023 – ResearchAndMarkets.com.” Business Wire, Research and Markets, 10 Oct. 2019. https://www.businesswire.com/news/home/20191010005553/en/Data-Center-Construction-Market-Due-Grow-CAGR
In recent years, cloud computing has become popular in the workplace due to its simplicity and ability to access centralised services but with more devices connecting to the cloud, traditional centralised networks are beginning to drown. Edge computing has been introduced to help solve this problem, with more sectors now relying on Edge to play a pivotal role in everyday tasks.
By Ondřej Krajíček, Chief Technology Strategist, Y Soft.
With its ability to foster near real-time speeds and capability of having the data and processing close to the server, Edge will further enhance the applications used in industries like healthcare and manufacturing, improving the way they operate and function.
Edge has the unique ability to push applications, data and services away from centralised nodes to the logical extremes of a network. While some organisations are struggling even to understand the concept and its applications, others are steaming ahead in terms of implementing the newest technology by transforming business processes and decisions. By 2022, it is thought that the Edge market will grow to $6.73 billion.
Many drivers are contributing to the rise of the Edge, from a growing load on cloud infrastructure to an increase in the number of different applications. The growth of the Edge market is encouraged by the dawn of ubiquitous connectivity delivered by emerging standards, such as Wi-Fi 6, 5G or long-range/low energy networks and the availability of a wide range of applications. It’s clear that this technology is here to stay and is set to have a dramatic impact on all industries.
So, with all of this technology available and the rise of 5G connectivity, what are the future uses of Edge Computing and what will it mean for the workplace?
Perhaps the most obvious application of Edge Computing is autonomous vehicles. Generating a huge quantity of data, connected autonomous vehicles (CAVs) are a prime example of how applications may benefit from the Edge, with data processed closer to where it is created, such as at the motor, generator, pump and sensors. The need for constant connection and reduced latency is apparent: imagine what would happen if a CAV lost connection whilst on the highway. According to Gartner, by 2020 50% of motor vehicle manufacturers will apply advanced analytics to CAV data to identify and correct defects.
Edge Computing can improve existing manufacturing processes by making them more intelligent and autonomous while providing responsiveness and agility. By increasing reliability, providing real-time insights, and minimising failure and the costs of storage and bandwidth, Edge can improve the efficiency and use of IoT in manufacturing.
From gathering and analysing local patient data, to patients wearing devices that can diagnose certain conditions, healthcare is embracing technology enabled by Edge Computing. As resources become more stretched, providers are looking to technology to deliver services to patients. For example, wearable devices that diagnose or monitor conditions can reduce the number of GP appointments required, and given that appointments are often in short supply, this is invaluable.
By using Edge Computing for print jobs, organisations can securely solve issues of lag and bandwidth cost while ensuring continuous business-critical operations. Latency is minimised because printing is processed on-site, and total bandwidth is reduced. Print jobs are processed locally, with all data remaining on the company’s network, so only metadata is sent over the network. With continuous connectivity, there is no risk of downtime, data loss or printing errors.
These are just a few examples of the applications of Edge Computing and as the technology continues to rise in the coming years, we will see more inventive applications of the technology and see many processes improving and speeding up as a result.
The future of the Edge will come with many apps, but with printing used across industries such as healthcare, manufacturing and education, the impact of this change will indeed be killer. Are you ready to take your print management to the Edge?
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 9.
The symbiotic relationship between AI and data
By Simon Field, Field CTO, Snowflake Inc:
Organisations are increasingly aware of the importance of AI; however, there is still a widespread lack of understanding of how exactly AI deployment can benefit a company’s operations and business processes. Just 23% of businesses have incorporated AI into processes and product offerings, with the majority failing to capitalise on the opportunities afforded to them by AI and machine learning technology.
Despite this lack of deployment, AI is not something organisations should neglect. AI exists and flourishes due to its symbiotic relationship with big data, whereby the former can require large amounts of data to build its intelligence. But the opposite is also true. With such vast volumes of data being created each minute, AI is a great enabler to sift through data and extract the most compelling insights to maximise business value.
A big sticking point in the wider rollout of AI initiatives within business is often due to a lack of strategic direction in terms of its deployment. The first step is to identify what problems within the company AI could solve or support. Once this has been established then strategic leads can formulate a set of metrics to determine what needs to be achieved.
Furthermore, companies must gather and utilise high-quality, accurate data to enjoy the full benefit of AI. Without sufficient volumes of quality data as input, AI algorithms will inevitably be unable to learn and make predictions. The more precise the data, the more accurate the predictions. Data scientists typically spend 80% of their effort sourcing and deriving the input data for AI projects. Gathering, cleaning and maintaining data centrally can significantly increase the productivity of AI initiatives, by providing a quality, catalogued source of approved data for use in AI models.
Adopting the cloud is another major factor in successfully utilising AI and machine learning. Cloud technology is fundamental to the advancement of AI because it enables businesses to ingest the vast amounts of raw data being produced, which will only increase as more companies begin capitalising on real-time data. Building AI models can require huge computational processing for both data preparation and training, for which the cloud’s elastic and pay-per-use economic model is ideally suited. By accessing this compute and data effectively, AI is able to thrive and deliver on its true cognitive promise. It’s no surprise that 49% of companies that have deployed AI today are using a cloud-based service.
Despite the fact many companies are not properly deploying AI technology, success stories within the business world are tangible. Monzo is using AI to deliver on its customer-led approach by allowing customers to discover places in their location based on financial transactions. AI software also streamlines its support service network by developing a system that recommends to its customer support agents how best to respond to certain customer queries.
The retail sector has been a major beneficiary of AI as it has used the technology to completely reimagine the business-to-consumer relationship. Major UK retailers, such as River Island, are now implementing AI at the heart of their operations. The fashion retailer is successfully using AI software to improve its stock management and ability to forecast sales figures in order to anticipate and fulfil changes in demand. By using AI, retailers can personalise the shopping experience for their consumers by better understanding consumer trends, either when shopping online or in-store, providing an optimal multi-channel experience.
Just as the retail industry has, organisations across all sectors will find true value in harnessing the power of AI to unlock unrivalled experiences and knowledge about their customers. Data is now plentiful, and without the right skills and AI tools in place, organisations will be left with vast volumes of untapped data, which ultimately hold a treasure trove of new insights.
Leon Adato, Head Geek, SolarWinds, comments:
“AI is still a point of differentiation across the IT industry. The tech giants are the frontrunners with cloud service providers like Amazon Web Services and Microsoft Azure in a bidding war to become the developer’s preferred cloud computing choice. The key here? Both are investing heavily to provide frictionless AI adoption by offering it (and its concomitant sub-systems) as a built-in service on their platform. And the competition only gets fiercer as Google continues to nip at their heels.
The rest of the IT world is not far behind. In fact, a survey by SolarWinds shows that one in four IT managers across the UK in small, midsize and enterprise companies cite AI as their biggest priority right now. It won’t happen tomorrow (let’s be realistic, the on-the-ground initiatives in most companies today involve ongoing cloud migrations, containerisation of essential systems, and the miniaturisation of essential motions into functions-as-a-service), but sooner than we may be willing to believe (or admit), AI will stop being a weapon to be brandished against the competition by bleeding-edge frontrunners and will fast become a real capability necessary for survival.
The benefits of the analytics and predictive capabilities enabled by machine learning (ML) and AI are far-reaching, automating many manual, labour-intensive tasks that often require little human input. As organisations push to leverage business data in new ways, it’s the IT professional’s job to implement and manage that data responsibly.”
The state of AI in the business world – a perspective from Alex Tarter, Chief Cyber Consultant & CTO at Thales:
Hackers are gaining ground, according to many executives responsible for cyber security, and organisations are leaving themselves open to attack. As businesses increasingly adopt and expand their digital transformation strategies, they are sharing huge quantities of data with external organisations and integrating new technologies and Internet of Things (IoT) devices into their core operations. This has increased the attack surface for hackers to aim for as they attempt to break in and steal that precious data.
Constantly under threat, security professionals are increasingly bogged down in detection and response processes that are too slow and not fit for purpose, putting a strain on human resources. In this constant game of cat-and-mouse, defenders are looking to Artificial Intelligence (AI) and Machine Learning (ML) algorithms to help them get ahead of attackers. Whilst the number of cyber-criminals continues to rapidly escalate, AI augments the role of security professionals by identifying malicious activities within massive data sets and enables businesses to scale their teams more effectively.
As a second line of defence, ML can be used to combat hackers in two ways. Firstly, unsupervised ML algorithms are able to find anomalies within more generic data sets, helping security experts to identify new threats, while supervised algorithms are fed ‘trainer’ data, which enables the AI to become more efficient at detecting specific cyber-attacks over time. However, it’s clear that AI and ML are not going to solve the problem by themselves, and businesses shouldn’t see these as the ‘silver bullet’ to be left to their own devices. Automation should be used to aid, rather than replace human intelligence. Humans ultimately have the expertise to add context to the data that AI and ML process, helping us decipher what is and isn’t malicious. Sometimes anomalies may be just that – anomalous.
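The contrast between the two approaches can be sketched on invented network-traffic features (packets/sec, bytes/packet). The supervised model learns from labelled examples; the unsupervised one flags anything far from the bulk of the data. Real systems use proper ML libraries, so this shows only the shape of each idea.

```python
# Supervised vs unsupervised detection, in miniature. All traffic
# samples below are invented two-feature points.
import math

def centroid(points):
    """Component-wise mean of a set of points."""
    return tuple(sum(c) / len(points) for c in zip(*points))

def dist(a, b):
    return math.dist(a, b)

# Supervised: labelled training data -> nearest-centroid classifier.
benign = [(10, 500), (12, 480), (9, 520)]
attack = [(900, 60), (950, 55), (880, 64)]
c_benign, c_attack = centroid(benign), centroid(attack)

def classify(sample):
    return "attack" if dist(sample, c_attack) < dist(sample, c_benign) else "benign"

# Unsupervised: no labels -> flag samples far from the overall centre.
def is_anomaly(sample, data, factor=3.0):
    c = centroid(data)
    avg = sum(dist(p, c) for p in data) / len(data)
    return dist(sample, c) > factor * avg

label = classify((920, 58))
```

As the article notes, neither output should be trusted blindly: a flagged anomaly may be just that, anomalous rather than malicious, which is where the human analyst adds context.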
AI won’t provide an immediate and significant winning advantage for defenders but refining ML algorithms to react with automatic or semi-automatic responses to specific threats will bolster human teams facing a scarcity of resources and expertise. We’re currently experiencing a massive skills shortage within the technology and cyber security industries – a situation that’s only going to get worse as more digital processes demand constant expert attention. So, any chance to automate must be seized by businesses to help scale teams efficiently while providing further layers of security and protection.
AI in translation, some thoughts from Alan White, Business Development Director, The Translation People:
“For those who don’t have a lot of experience of the translation industry, it could be assumed that free, online translation tools are the pinnacle of technology. While this kind of tool is suitable for quick, literal translations, it is only illustrative of the most basic options available. There is a whole range of other, more sophisticated programmes available, offering greater security and efficiencies. And yes, there are huge technological advancements being made with artificial intelligence (AI) to make translations and communicating in different languages easier and more cost-effective – but the real power of this tech only comes when it’s used alongside people.
“Businesses who need large files translated may think they can simply run the text and data through a piece of software, but to achieve a professional and highly accurate translation, you need to make sure there is a person involved throughout the process.
“The tech that’s available can be trained and retrained over time with input from the user – who will be a skilled and experienced translator. By nature, each machine translation engine becomes customised to its client – for example, it will ‘learn’ the client’s style guide and preferred ways of translating terms – meaning the more we translate for a particular business, the greater the efficiencies we achieve over time. By using tech and people together in translation, we’re able to efficiently turn the text into an emotive, humanised piece of content, which reflects the nuances, culture, colloquialisms and jargon relevant to the person receiving the copy.
“For example, the tech we use at the Translation People is designed and used in such a way that it gets better and better over time. Our team of experienced translators review the language translations produced by machine technology at each key stage of the project. They input rules and edits, which are incorporated into the software, making adaptations which help it ‘learn’ how to translate more effectively over time. This learning, memory and reapplication is made bespoke to each client depending on their own unique style guides, terminology, sector, and target country, meaning next time we translate text for them, it’s done in a more time- and cost-efficient way.
“Using AI alongside humans is vital in the translation industry. Anyone who has had even just a small amount of experience of using a free translation tool when on holiday, for example, will realise that there’s a lot more to language than simply swapping one word for another. Depending on which language we’re translating from and to, there are multiple factors we have to consider. Whether it’s tone, rhythm, or more creative needs for crafting the perfect messages in marketing or literature, there are so many different subtleties that may get lost with AI and machine translation alone. In fact, in some situations – for example very creative projects like marketing messages – tech as it currently stands just isn’t appropriate at all and we need a person to carry out the project from start to finish.
“AI does offer huge benefits to the translation industry. It helps us to translate more quickly, and we can turn around higher volumes in a shorter period of time, which can provide huge cost savings for businesses. But in order for the tech to work as effectively as possible, we still need humans to check the output and train the machines so they evolve and improve over time – especially when the languages we all speak and the cultures we live and work in have so many wonderful subtleties and artistic merits that need to be considered.”
Digital transformation has been on the agenda of business leaders and IT departments alike for some time now. And for good reason. The promise of improved operational efficiencies, faster time to market and better employee collaboration is too great for many to ignore. According to recent research, 70% of companies already either have a digital transformation strategy in place or are working on one.
By Nick Offin, Head of Sales, Marketing and Operations, Dynabook Northern Europe.
However, the road to a digital-first workforce isn’t an easy one and although digital transformation efforts are underway in most organisations, many common challenges remain. The top barriers range from insufficient budgets and cybersecurity concerns to legacy systems and the adoption of new technologies. Mobility and the rise of mobile working, which go hand in hand with digital transformation, also pose a challenge.
Despite these potential challenges, for many businesses – whether they operate in retail, finance or manufacturing industries – it’s a matter of transform or be left behind. Tackling the hurdles to digital transformation and embracing digital change has never been more important. So, how can CTOs and IT departments overcome these common challenges and successfully transform their business into a smarter organisation?
Overcoming a lack of budget
Digital transformation obviously doesn’t come without extra investment. It’s therefore unsurprising that a common roadblock on a business’ digital transformation journey is often a lack of funds. For businesses who are lucky enough to receive greater IT budgets, much of the attention is placed on emerging technologies like AI and IoT. However, digital transformation isn’t just about shiny new technologies that transform business and operational models, it’s about change taking place at all levels – right down to the technologies that employees are using on a day-to-day basis.
Hardware remains an integral part of any IT strategy and therefore should be part of any organisation’s IT modernisation efforts. Completely overhauling an employee device strategy with new devices is an expensive and time-intensive process. Not to mention day-to-day device lifecycle management, which adds significant complexity.
To overcome this, many businesses are considering other purchasing options, such as PC-as-a-Service (PCaaS), as part of their digital transformation strategy. PCaaS, which can encompass everything from mobile devices to desktop PCs, is an Opex (operational expenditure) subscription-based model which often includes services such as purchasing, configuring, managing, refreshing and retiring devices. This means an organisation pays a monthly rate for the use of a vendor’s devices and additional services, rather than buying hardware outright (referred to as Capex). Businesses can benefit from updated technology, whilst being able to amortise device costs over time, as well as scaling up and down depending on need.
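The Capex-versus-Opex trade-off described above can be made concrete with a toy comparison. All figures here are invented for illustration; real PCaaS rates bundle services whose value isn't captured in a device price alone:

```python
# A toy Capex-vs-PCaaS comparison under the model described above.
# All prices are invented for illustration only.
device_price = 900.0    # buy outright (Capex), services managed in-house
monthly_rate = 30.0     # assumed PCaaS subscription per device (Opex)
months = 36             # a typical refresh cycle

capex_total = device_price          # ignores in-house lifecycle-management cost
pcaas_total = monthly_rate * months # purchasing/config/refresh bundled in

print(f"Capex over {months} months: GBP {capex_total:.0f}")
print(f"PCaaS over {months} months: GBP {pcaas_total:.0f}")
```

The headline numbers favour Capex, which is why the article's point matters: the Opex model is attractive for the bundled lifecycle services, smoothed cash flow and ability to scale, not because the raw totals are lower.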
Surmounting security concerns
The PCaaS model also provides a potential solution to another common barrier to digital transformation – security. PCaaS can encompass data backup, recovery and remote wiping, giving a business peace of mind that, if a device is stolen, damaged or suffers a cyber-attack, valuable corporate data can be either wiped or recovered.
As more companies embrace digital transformation and with it mobile working strategies, this raises concerns over keeping sensitive data safe. In fact, cybersecurity concerns are a primary worry for business leaders, hindering potential organisational transformation. In a recent survey, 60% of respondents said that exposure of customer data was the largest security concern, closely followed by cybercriminal sophistication (56%) and increased threat surface (53%).
Overcoming the growing sophistication of cybercriminals is no easy task. However, as with any IT project, it’s fundamental that businesses consider security from day zero and equip themselves with the right protective technologies to ease concerns. Devices which include advanced biometric features and hardware-based credential storage capabilities provide a strong first line of defence.
Other security measures such as zero client solutions go even further and help nullify data-related threats by withdrawing sensitive data from the device itself. With information stored away on a central, cloud-based system, these solutions protect against unsolicited access to information if a device is lost or stolen. This is especially useful for mobile workers looking to access data outside of the office or on the move.
Leaving legacy systems behind
A new wave of technologies such as AR, 5G and IoT have been the catalyst for business-wide digital transformation efforts. However, reliance on legacy systems is posing a major bottleneck for many organisations looking to build a digital-first business. The problem is that, exciting as these new technologies are, they create vast amounts of data, and many organisations are struggling to scale their IT capabilities efficiently to cope with this increase in demand.
So, what’s the answer for organisations who don’t have the resources to simply ‘rip and replace’ infrastructure? Edge computing solutions enable organisations to resolve this challenge, while at the same time creating new methods of gathering, analysing and redistributing data and derived intelligence. Processing data at the edge reduces strain on the cloud so users can be more selective of the data they send to the network core.
Digital transformation will undoubtedly remain a major part of business conversations for many years to come. Whilst it seems we’re on the way to a digital-first future, not all organisations have arrived in the digital age and there are still several challenges to overcome. With digital-first companies expected to be 64% more likely to meet their business goals, the benefits for those embracing digital transformation are clearer than ever – they’ll just need to overcome a few stumbling blocks along the way.
Everyone's talking about the potential for artificial intelligence in the enterprise, but what's actually been achieved to date? Are the majority of companies comfortable with the technology and how it can be used, and are many already using it? Or, are the majority of organisations still unclear as to exactly what it is and what it offers to the business? DW asked a range of experts for their thoughts, and there was no shortage of responses, and plenty of food for thought. Part 10.
By Anjali Sohoni, Senior Decision Scientist at Mastek.
Retailers need to adapt to rapid changes in consumer attitudes and the marketplace. Every aspect of retail operations has the potential to make or break the customer experience. AI technologies are breaking down barriers and making it imperative for retailers to adopt a globally competitive retail business model. The power of machine learning has made it possible for businesses to continuously scrutinise customer behaviour data and generate alerts when the time is right for the next best action.
FMCG (Fast Moving Consumer Goods) companies are already rich with data. A successful transition to AI can be achieved through a gradual approach and the right combination of people, processes and tools. By scaling up AI efforts, companies can glean valuable and actionable business insights. With data-driven intelligence at their disposal, organisations can strengthen four key pillars crucial to delivering a next-level retail experience – the ability to understand their customers, smart merchandising, hyper-relevant marketing and optimised operations – all centred on the products and services their customers crave.
1. Customer lifetime management
Derive factors that influence purchase behaviour
Purchase decisions are now influenced by a variety of factors that were not previously envisaged. Customer segmentation based on behaviour now takes into account not just frequency, recency and monetary value, but also diversity of purchases, willingness to experiment with a new product, seriousness and profitability (amount bought versus amount returned), among other variables. These aspects map differently to different product categories.
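The extended segmentation variables listed above can be derived from an ordinary transaction log. A minimal sketch, assuming a simple table with invented column names (not a real retail schema):

```python
# Illustrative computation of the extended segmentation variables described
# above. The transaction table and column names are assumptions.
import pandas as pd

tx = pd.DataFrame({
    "customer": ["a", "a", "b", "b", "b"],
    "day":      [3, 40, 10, 11, 12],       # days since first observation
    "category": ["shoes", "bags", "shoes", "shoes", "shoes"],
    "spent":    [50.0, 80.0, 20.0, 25.0, 30.0],
    "returned": [0.0, 0.0, 0.0, 25.0, 0.0],
})

today = 45
features = tx.groupby("customer").agg(
    recency=("day", lambda d: today - d.max()),  # days since last purchase
    frequency=("day", "count"),                  # number of purchases
    monetary=("spent", "sum"),                   # total spend
    diversity=("category", "nunique"),           # distinct categories bought
)
# 'Seriousness'/profitability: amount returned relative to amount bought.
features["return_ratio"] = (
    tx.groupby("customer")["returned"].sum() / features["monetary"]
)
print(features)
```

Each row of `features` is one customer, ready for a clustering or scoring step; in practice the same aggregation would be run per product category, since the article notes these aspects map differently across categories.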
AI can help bring clarity to what really drives the customer experience. It can also help determine production quantities and identify potential consumers.
Track and prevent customer attrition
Customer behaviour can be tracked to glean insights into loyalty. Creating customer analytical records and detailed journey maps helps identify which customers have deviated from their usual behaviour and what the next best action should be.
Customer look-alike mapping aids in quickly gauging the time to action and in identifying the action that is most likely to achieve the objective.
Drive higher revenues per customer by running targeted cross-sell and up-sell
Customers are willing to share their data if they enjoy a higher quality shopping experience in return. By offering a differentiated shopping journey, retailers and brands can demonstrate that the relationship is authentic.
Past purchase behaviour of similar customers can be combined with product features and pricing to derive a propensity model for future needs.
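One common way to realise the propensity model just described is a logistic regression over behaviour and price features. This is a sketch under assumed, invented data, not a production cross-sell model:

```python
# A minimal propensity-model sketch for the cross-sell idea above:
# logistic regression combining past behaviour with product pricing.
# The features and data are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
past_spend = rng.uniform(0, 100, n)   # behaviour of similar customers
price = rng.uniform(5, 50, n)         # price of the candidate product
# Ground truth for the toy data: high spenders buy, high prices deter.
bought = (past_spend / 100 - price / 50 + rng.normal(0, 0.2, n) > 0).astype(int)

X = np.column_stack([past_spend, price])
model = LogisticRegression().fit(X, bought)

# Estimated propensity for a high-spend customer and a cheap product:
p = model.predict_proba([[80.0, 10.0]])[0, 1]
print(f"purchase propensity: {p:.2f}")
```

Ranking customers by this propensity score is what lets the targeted cross-sell and up-sell campaigns focus spend on the customers most likely to respond.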
Pricing plans and promotions to optimise specific KPIs
To remain profitable, optimal pricing is everything. There are inherent trade-offs in balancing volume growth against return on investment. A combination of metrics has to be modelled so that both objectives can be pursued in unison.
Being able to simulate scenarios and measure probable impact of business decisions can be achieved through detailed modelling of available data and algorithms that learn from that data.
Creating customer-centric assortments
Good assortments which give ‘instant gratification’ to customers are at the heart of retail. They yield better sell-through, lower the chances of markdowns and help optimise inventory store by store. Understanding customer demographics, behaviours, tastes and millions of transactions requires a mix of data analytics and machine learning.
Edge sensors can be deployed to gather in-store data points such as duration for which a customer looked at a certain item, and if the item was picked up from the shelf. Scalable AI solutions can anticipate customer behaviour and replicate market conditions to run complex analysis on large datasets to help decision-making.
AI can help retailers deliver outstanding customer experience and meet their financial goals.
Retailers need to see, at an aggregate level, who is moving around in cities and where they are going, so that they can market products and services more effectively to consumers and identify the prime locations for new stores. Retailers are also tapping into location data to find out more about the potential customers walking past their storefronts every day.
AI leverages open data sources to analyse the interplay between demographic factors of a certain location and the success of their operations in that area.
Market Mix Modelling (MMM)
An effective MMM takes into account marketing spend across the product base and its impact on revenue. With a sharp increase in customer touch points, there is a corresponding rise in the data that needs to be analysed. Without ironing out the collinearities and correlations in the data, such analysis can become inadequate.
Powerful AI algorithms can aid ‘what-if’ analysis and highlight correlations in the data.
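The collinearity problem flagged above is easy to check for before fitting any market-mix regression. A minimal sketch with invented channel data (the channel names and the 0.8 threshold are assumptions):

```python
# Checking for the collinearity problem mentioned above before fitting
# a market-mix model. Channel data is simulated for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
tv = rng.uniform(10, 100, 52)               # weekly TV spend, one year
social = tv * 0.5 + rng.normal(0, 2, 52)    # social spend tracks TV closely
search = rng.uniform(5, 40, 52)             # search spend is independent

spend = pd.DataFrame({"tv": tv, "social": social, "search": search})
corr = spend.corr()
print(corr.round(2))

# Flag highly correlated channel pairs that would make a regression's
# per-channel coefficients unstable and hard to interpret.
pairs = [(a, b) for a in spend for b in spend
         if a < b and abs(corr.loc[a, b]) > 0.8]
print("collinear pairs:", pairs)
```

When such pairs appear, the usual remedies are to drop or combine the offending channels, or to use a regularised model, so that the revenue contribution attributed to each channel is meaningful.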
Personalised Assistance in Real-time
According to a Salesforce report, 70% of consumers say a company’s understanding of their individual needs influences their loyalty. Retailers are now equipped with the next level of personalisation where they are able to provide just-in-time assistance and proactive care.
AI can identify potential customers based on purchase history and promote focused offerings for a brand. When such a customer comes into the vicinity of a store, a message with an offer can be sent to their mobile phone.
Improving Adoption of Loyalty Programs
Loyalty programs seek to go beyond registration and the occasional cashing in of offers. Inactive memberships can signal a breakdown in the relationship. Using demographic look-alike mapping, targeted efforts can be made to convert these into active memberships by sending the right offers at the right time.
Retailers can now obtain a holistic view of their supply chain covering raw material acquisition, production and last-mile delivery by adopting intelligent, data-driven processes. This perspective is essential not only for stock forecasting and managing customer expectations but also for a shared common understanding among all the members of the chain.
Advanced analytics can predict demand spikes, identify bottlenecks and reduce supply shortages for dozens of products.
Digital processing of invoices requires extracting textual information from images of invoices. Invoice formats may vary depending on the issuer.
A cloud-based, OCR-enabled service can help expedite information extraction and significantly improve SLA compliance.
Artificial intelligence has been getting a lot of attention lately, and rightly so. New techniques and advances in technology now make wide use of AI possible across all major industries and sectors. But being successful with AI requires more than selecting the right technology. Knowing this, organisations are scrambling to hire or develop the right talent and establish suitable structures to make effective use of AI. And if you look for it, you’ll find plenty of advice on “organising for AI”.
But there is something missing in much of this advice: How to organise for analytics in general. It wouldn’t be as much of a concern if these basics were already in place in most organisations, but they aren’t. Decision makers in large enterprises still must contend with missing data, overlapping and inconsistent reports, and “information” presented as perplexing stacks of numbers without even the most basic visualisation applied. Besides, why would you want to put the burden on decision makers to decide whether the analytics they need are “advanced” or “regular”?
In a properly designed analytics function that serves the entire enterprise, there is – and always has been – a place for advanced techniques. While it does make sense to put together a task force to assess the potential value of AI, it would be a mistake to assume that this team should ensconce itself as yet another permanent silo. It would be much better to thoughtfully align the capability with existing functions while simultaneously addressing a long-standing need to better serve the organisation with information and analysis of all kinds.
Whether AI provides the impetus or not, here are a few things to do when creating an analytics team.
Define the mission of the team. The purpose of the team should be comprehensive without going too far. The team should be responsible for offering reporting and analytics along with associated business insights throughout the organisation. However, it should not be responsible for establishing (de facto or official) production data resources, unless you are willing to take this responsibility away from the existing IT department. You should also be clear about departments that are to be served and any departments that will be responsible for their own reporting and analytics. Overlaps and conflicts can last for years – better to address them right at the beginning.
Identify an existing team that could be transformed into an analytics team. In almost every case, if you look, you’ll find an organisation that is already offering reporting and analytic services to other departments. There’s usually such a group in the finance function, for example. Even if they are primarily focused on reporting financial information, they usually venture beyond that core and into factors that affect financial performance. In some cases, groups that just happen to have good analytic skills – such as marketing, operations research, or strategy functions – generously offer those capabilities to other areas outside of their group. In other cases, there is an IT group that goes beyond technical enablement and into business analysis. Often, you’ll find all of the above. Even if one of these groups is not selected to become the primary enterprise analytic team, you’ll need to solicit their buy-in and participation and, again, avoid overlap.
Define and fill the role to lead the team. Whatever the actual title, the role to lead the team would essentially be serving as a Chief Analytics Officer. So, this role could be labelled as such, or the responsibilities could be assigned to an existing Chief Data Officer, possibly renamed to a Chief Data and Analytics Officer. If neither a CAO nor CDO is currently in place, it’s probably best to create one combined role at least initially, then split the responsibilities if that seems right after some experience. The role could report into the CFO, CIO, COO, or even directly to the CEO, depending primarily on the mission and organisational influence of the executive.
Define the team structure. Even though you’ll often hear that “there is no one right way”, I believe in this case, there is. Any other choice is just a concession to political realities. However, there are variations on the theme. Any internal function, like an analytics team, that offers services to other areas within the organisation should have a centralised core for shared capabilities and representation for each department served. The departmental component of the analytics team can either report straight line to the central group and dotted line to the served department (my preference), or straight line to the served department and dotted line to the central group – or a mix of the two. One of the shared capabilities in the central group should be advanced analytics, including artificial intelligence. Of course, if specific departments require full-time dedication for advanced capabilities, that should be provided as well.
Define the relationship of the team to other functions. I already mentioned the need to coordinate with IT for production data deployment, but there are other important relationships to consider. For example, the analytics team should work with IT business liaisons and associated application teams to ensure that there is one and only one funnel for production application deployment. While the analytics team should develop and deliver production analytics – whether one-off studies or institutionalised report distribution – IT should be responsible to deploy custom or packaged application software solutions, even if they contain an analytic component. There isn’t a hard line here, and a good working relationship between the liaison and the lead analyst assigned to the same department will be crucial. Keep in mind that when any advanced capability – including artificial intelligence – demonstrates repeatable value, it’s likely that software companies have already packaged that value into an off-the-shelf application, so it’s important for both groups to understand the broad range of possible ways to solve a problem – avoiding the hammer-nail syndrome.
Develop or recruit the needed skills. If your kids ever wonder why they need to take math in school, just show them the latest salary figures for a capable data scientist. Of course, it’s not so much the mechanics of mathematics that matter, but the ability to use logical thinking to leverage the most appropriate techniques to solve problems. These abilities, combined with good communication skills, are the key. You do not have to participate in the frenzy to hire a team of seasoned data scientists. With the right base capabilities and mindset, the technical part can be learned. And if there is an effective division of responsibility between IT and the analytics team, as noted earlier, the data scientists (and other analysts) won’t have to spend most of their time collecting and managing data. Placing excessive data management burden on data scientists inflates the number of analysts needed for the team and increases the breadth of skill requirements for the role, making it much more difficult to staff.
I often encounter resistance to recommendations like these. There are various reasons for this – always understandable, but ultimately, I believe, political. Or maybe it’s better to call it organisational inertia. I think that’s why it’s so tempting to treat AI as an add-on to the organisation. Dealing with legacy is tough business. Making changes to deep-rooted organisational structures requires a lot of finesse and, let’s be honest, at least a little bit of force. But if you’re willing to address analytics with the breadth and seriousness that it deserves, you will be rewarded with a world class capability that has meaningful impact on a broad range of opportunities, unconstrained by a narrow – if valuable – focus on AI alone.
In this article Ian Bitterlin, of Critical Facilities Consulting a DCA Corporate Partner, looks at the Data Centre Cooling in the past and provides his views on the current position.
The cooling of ICT hardware in data centres has a history stretching back to the mid-50s, when IBM set the requirements for temperature and humidity in machine rooms, resplendent with hairy brown raised floor and orange walls. Those limits were 21°C ±1K and 45-55% RH, and they were set for very good reasons: if the temperature varied outside those limits (or too rapidly) the magnetic tape-heads produced read/write errors, whilst, for humidity, if the air was too dry the punch-card sorting machines created enough static discharge to wake Frankenstein’s monster, but too damp and the cards absorbed moisture, swelled up and jammed the mechanism. It is worth noting that the limits were in no way related to the computing hardware, not least because that was liquid-cooled with chilled water at 6°C/10°C flow/return – it’s funny how what goes around comes around. It is a testament to our industry and its highly conservative behaviour that those limits are still occasionally seen in specifications today, even though the magnetic-tape and punch-card hardware (as seen in Billion Dollar Brain and Hidden Figures) is long dead.
However, the IBM specification did not say ‘where’ the temperature and humidity should be measured – ignoring the fact that a load increases the air temperature and reduces its relative humidity – and so the only safe place to measure for a performance test was at the return point of the cooling system. Thus was spawned the ‘cold’ data centre: with no airflow management around Big Blue, you had to pick the point that would not invite any argument, and ‘close control’ air conditioning was born. This resulted in 6°C flow chilled water pushing 15°C cold air into the raised floor, with 50% bypass air and a 21°C return air temperature to the CRAC. Expensive to buy, run and service, but very reliable if you tended to its needs of drive-belt, air-filter, humidifier and condensation drain. It’s another story, but the power was 208V/441Hz, so-called 400Hz, copied from aircraft supplies; when Amdahl departed IBM, he introduced the air-cooled 60Hz machine and the era of odd power and liquid-cooling was soon over.
There was little progress in standardisation, and little agreement between ICT OEMs, in the years between the IBM Planning Manual and the late 90s, but then manufacturers identified the need to provide standardisation across the industry and, in 1998, a Thermal Management Consortium was formed. This was followed in 2002 by the creation of a new ASHRAE Technical Group, TG9-HDEC (High Density Electronic Equipment Facility Cooling), to help bridge the gap between equipment manufacturers and facilities. In 2003 this became a technical committee (TC 9.9, Mission Critical Facilities, Technology Spaces and Electronic Equipment), and ‘Thermal Guidelines for Data Processing Environments’ was its first publication.
The first, and some might agree the most important, change was to define ‘where’ the temperature was to be measured – the server inlet – which immediately changed the supply air from 12°C to 21°C. Since then ASHRAE TC9.9 (populated by the ICT OEMs) has continually widened the envelope of temperature and humidity in both ‘Recommended’ and ‘Allowable’ ranges for at least four classes of hardware – with Class 1 being the sort of kit that will find itself in the data centre. As you move up in Class (2, 3, 4 etc.) the Allowable range for temperature and humidity widens even further, but one thing is common to all – there is almost no need for humidity control as long as the air does not condense on the hot equipment.
Everyone can buy a copy of the Thermal Guidelines and look up the ranges, but here it is only worth touching on the most conservative case: Class 1 hardware used in the Recommended range. This is, today, 18-28°C non-condensing at the load input and far (far) away from the precision 21°C ±1K and 45-55% RH at the CRAC return. So, the warm, if not hot, data centre has been born.
But we live in a funny world driven by money and paranoia, and the large majority of operators will still only use the Recommended range from the ‘previous’ ASHRAE edition. Only the hyperscale operators (and not all of those, including the largest) push the boundary to achieve ‘cutting edge’ energy saving, whilst most colocation providers know their customers, act conservatively and only save energy as long as there are no provable risks.
And there ARE risks. They are all documented in the Thermal Guidelines. If you get a combination of higher temperature, elevated humidity and dirty air you will get accelerated corrosion – and if you don’t refresh your hardware every 2-3 years some of it will start to rot and fail. Even running at elevated temperatures will shorten hardware life – electronics don’t last as long when hot as when warm. The result is that the data centre industry is highly conservative and weighs the value of the load being protected (about £180 of business value for each kWh burnt) against a potential saving of 40% of the energy when that energy costs £0.12-0.15 per kWh – a tiny saving compared to the value of the load. If you are asking ‘why do people like Google and Facebook take risks?’, it is because the load is not all that critical and they don’t charge us for their services – they get their revenue by selling our data.
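The value-versus-saving arithmetic above is worth making explicit. The per-kWh figures come from the article; the annual consumption is an assumed workload size for illustration:

```python
# The cost-versus-value trade-off described above, made explicit.
# The per-kWh figures are from the article; the workload size is assumed.
energy_price = 0.15        # GBP per kWh (upper end of the quoted range)
business_value = 180.0     # GBP of business value per kWh burnt
saving_fraction = 0.40     # potential cooling energy saving

kwh = 1_000_000            # illustrative annual consumption
saving = kwh * energy_price * saving_fraction
value_at_risk = kwh * business_value

print(f"energy saved:    GBP {saving:,.0f}")
print(f"value protected: GBP {value_at_risk:,.0f}")
print(f"saving is {saving / value_at_risk:.4%} of the protected value")
```

At these figures the energy saving is a small fraction of one per cent of the business value riding on the load, which is exactly why conservative operators decline the risk.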
So, the choice of cooling system is NOT the primary question. The primary question must be ‘what Class of hardware will I buy, how often will I refresh it and how tightly will I control its environment?’ All available systems ‘work’, and you can buy them to work in the UK with a pPUE from as low as 1.04 (indirect adiabatic, for which you need a special building and a rural location) to a multi-storey city-centre chilled water system with a pPUE of better than 1.35 if you design it right. The second question should be ‘will I have partial load?’ – but that is for another day…
Matteo Mezzanotte, PR, Communication and Content
Submer Immersion Cooling
What we refer to as Data Centres today started out as computer rooms in the 1960s, built to house the first mainframe computers. While computers and servers evolved quickly over the years, an evolution of the Data Centre itself – and more specifically its cooling – was not really necessary, as the evolution in computing also brought significant innovations that allowed (apart from a few notable exceptions such as the CRAY 2 in 1985) denser computing while continuing to cool with air and fans.
Cooling a Data Centre typically takes a lot of energy (around 40% of the total, depending on location and design) and a lot of planning. Traditionally, there are three different ways to cool a Data Centre – air-based cooling, liquid-based cooling and a hybrid of the two – and within these categories are a couple of different methods.
Among the air-based cooling systems, we have:
Liquid Cooling: Back to the Future
When it comes to liquid-based cooling systems, it is necessary to distinguish between water, synthetic fluid and mineral coolant.
Water-cooled racks (otherwise known as rear-door chillers) have water flowing alongside the racks, but never actually touching the servers. This solution works well, but there is still the legitimate concern of water leaking onto the servers and components and compromising the integrity of the IT Hardware, not to mention the fact that the chilling is done with compressors and uses considerable energy. Direct liquid cooling mostly refers to water or other liquids delivered directly to a cold plate mounted only on the CPUs and GPUs, dealing with around 70% of the heat. With rear-door cooling and direct liquid cooling, it is normally advised to have a secondary form of cooling, such as a traditional CRAC unit, to handle the excess heat in the room.
Liquid Immersion Cooling means that entire servers (switches, etc.) are completely submerged in a synthetic or mineral fluid designed to deal with 100% of the heat while preserving the IT Hardware (no particles being blown at it). It is the most efficient method of saving energy when combined with dry coolers (with or without adiabatic functionality), which use only ambient temperatures to lower the temperature of the warm water in the secondary cooling loop.
Liquid Immersion Cooling isn’t a new technology: it has been widely used since the 1940s to cool high-voltage transformers. In the 1960s, IBM developed the very first direct liquid cooling system. In the 1980s, with the advent of metal oxide semiconductors, liquid cooling became less of a priority: this new way of packing in multiple transistors didn’t generate heat in proportion to their number, as the size of the transistors – and therefore their voltages – was drastically reduced.
Today, cooling systems are definitely more complex and high-tech than 4-5 years ago, as cooling has had to catch up with the demands for denser computing and in part the beginnings of Moore’s law breaking down.
Immersion Cooling has come back into the spotlight (somewhat analogous to the case of the electric car) and in 2019 we started to collect evidence that it will represent the most efficient cooling solution in the years to come and become mainstream in the not-so-distant future. Not everybody is fully convinced that Immersion Cooling can represent a viable solution – a typical reaction to anything different from the current way of doing things. Submer (amongst others) is invested in Liquid Immersion Cooling and is gathering increasing support for its vision of this technology as the present and future of next generation Data Centres.
Immersion Cooling for Data Centres
Just like the containment method improved the cold aisle/hot aisle system, the Immersion Cooling method designed by Submer can be seen as an improvement on the water-cooled racks system. The real revolution achieved by Submer (and the one that banishes all doubts and fears about having your IT Hardware in direct contact with a liquid) is the use of a specific dielectric (non-electrically-conductive), non-toxic, environmentally friendly, non-flammable fluid, called SmartCoolant, which allows the heat produced by the IT Hardware to be dissipated while preserving and protecting it. This method goes against the grain of traditional air-based cooling systems and, considering the extremely high densities that can be achieved and the energy saved, it can be said with some confidence that it represents the future of Data Centre cooling.
Immersion Cooling is more efficient than any air-based or water-based system. According to Data Center Frontier, it is 1,400 times more effective than traditional air cooling. With air-based systems, fans need to be blasting 24 hours a day, even when it isn’t needed (the old, unsolved question of “idle” Data Centres). With Immersion Cooling, you just don’t need fans anymore, and this means 15%-25% less power consumption than traditional methods. The typical Data Centre or colocation provider will see up to a 50% decrease in overall electricity cost by adopting a Data Centre Immersion Cooling solution.
Taken together, the electrical costs of the internal cooling fans and of chilling the ambient air around the servers often exceed the power used in productive computing.
To maximize return on investment (ROI) while ensuring the full potential of IT equipment and thus guarantee high quality results, Data Centres need to adopt a better cooling strategy than air-cooling (The Green Grid suggests a range of 15-25kW/rack as the limit for air cooled racks “without the use of additional cooling equipment such as rear door heat exchangers”).
Now, it is true that implementing a completely new cooling system can seem difficult. Some people in the industry might even question whether the change in method makes financial sense, since lowering one's costs isn’t the only thing to consider. All these doubts and concerns are legitimate. That’s why it is essential to raise awareness and educate people about the real positive impact of Liquid Immersion Cooling (LIC) technology for business and for the environment.
A Tsunami of Data
The global data centre market size is poised to grow by USD 284.44 billion during 2019-2023, progressing at a CAGR of more than 17% during the forecast period, according to a new report by Technavio.
The new digital trends transforming the social, urban and economic landscape, together with advanced computer modelling and big-data applications, all have something in common: they require an ever-increasing amount of data to be processed. In this scenario, it is obvious that Data Centres are destined to play a fundamental role in the future (near and distant) of our societies. It is equally necessary to find sustainable and smart ways to address the needs of Data Centres, which are growing exponentially with no signs of slowing down.
“We have a tsunami of data approaching.”
Anders S.G. Andrae (Senior Expert Life Cycle Assessment at Huawei Technologies)
Processing all that data in a smart way has thus become a competitive and economic imperative in order to develop new products and services, maximise marketing efforts and bring greater efficiencies to the business. Nevertheless, Data Centre systems generate heat loads far above those of traditional server-based applications – drastically increasing the costs of cooling the equipment and of the space needed to operate it effectively, beyond the costs of the systems themselves, and lowering the return on invested capital (ROIC).
The “Social” Data Centre
We live in a deeply connected world that will become hyper-connected. This transition is inevitable and cannot happen without Data Centres. Given the digital growth rate projections for those areas of the world where Internet access has only started to become widespread, it is reasonable to think that our society will rely more and more on Data Centres. For this reason, the sector must find alternative solutions that promote sustainable innovation (renewable energies, open source solutions, circular economy, efficient software and hardware design, etc.) and avoid “cannibal”, uncontrolled growth.
Big players such as Apple, Facebook, Amazon, Google and Microsoft have promised, or have already started, adopting policies to take further action on climate change or to become carbon negative. It is definitely a first step (and maybe not always in the right direction, as denounced by Greenpeace), but there’s a long road ahead, and it would also be advisable to start teaching some “digital sobriety” to ICT users.
A Three-pillar Strategy
Economy, Environment and Society are the three pillars around which the (r)evolution of the IT sector will be built.
Too often, TCO and ROI are the primary measures by which Data Centre managers determine their success. It would be advisable to start thinking of a Data Centre as a sort of orchestra, where all the elements are in dialogue with each other, each contributing to the performance. Seen this way, adopting energy efficiency solutions (software and hardware design, liquid cooling solutions, smart buildings, circular economy, etc.) would lead to substantial savings on electricity and water consumption. These savings could then be used to invest in more IT equipment, which would translate into greater IT Hardware density and, consequently, improved performance.
Last but not least, reducing electricity and water consumption and implementing circular economy best practices would minimise the impact on the environment, with clear reputational benefits for any company.
A Green Immersion Cooling Data Centre
Submer is proud to announce the installation – at BitNAP Data Centre – of 10 SmartPod units delivering up to 500kW of heat dissipation. This is the first sustainable Liquid Immersion Cooling cluster in Spain to rely on renewable energy as a secondary cooling system, as BitNAP is part of the district heating and cooling (DHC) system operated by Ecoenergies. The smart thermal energy network designed by Ecoenergies produces – among other things – cold water used by BitNAP as a secondary cooling loop. This smart grid – powered by a biomass plant that uses residues from Barcelona’s parks and gardens, together with gas and electricity – reduces fossil fuel energy consumption by 67,000 MWh and CO2 emissions by 13,412 tonnes per year. The installation at BitNAP is a perfect example of the three-pillar strategy applied to the Data Centre industry, and the first of many examples to come of real symbiosis between Data Centres, renewable energies, smart technology and a truly viable industrial application.
Paving the Way Towards Next Generation Data Centres
Submer Technologies was founded in 2015 by Daniel Pope and Pol Valls to answer a precise need and intuition: the necessity to tackle the Data Centre business from a new angle, creating highly efficient solutions to pave the way towards next generation Data Centres.
In the last 10 years, we have witnessed a major evolution in the Data Centre industry, from ICT equipment rooms to cloud Data Centres. In the next 10 years, we believe we will witness a revolution.
Submer is changing how Data Centres and supercomputers are built from the ground up, to be as efficient as possible and to have little or even positive impact on the environment around them (reducing their footprint and their consumption of precious resources such as water).
We believe that “innovation” doesn’t have to contradict “sustainability”. We believe there is a better, smarter way to stay cool. And we also believe that the DCA represents the perfect “agora” in which to share our ideas, our concerns and our hopes about next generation Data Centres, and a privileged spot from which to keep the media, governments and the public informed of the vital role of Data Centres in the society of the future.
In the United States, e-waste accounts for 2% of waste and 70% of total toxic waste.
As demand for data collection, storage and exchange multiplies at an unprecedented rate, experts are concerned that the energy consumption of data centre infrastructure is becoming an increasing drain on the world’s energy resources.
In an era where, on an annual basis, data centres in the United States alone consume the entire output of 34 of the world’s largest power stations, methods of conserving and reducing the energy input of some of the globe’s most socially and economically critical buildings are more important than ever.
Here, Tim Bound, Director at Transtherm Cooling Industries, explores the evolving role of ambient cooling technologies in keeping energy costs low in data centre applications.
By 2020 the UK will be the largest single market in Europe for data centres, and analysts predict this will cost the industry up to £7 billion per year in energy alone. Disclosure of data centre energy and carbon performance metrics is now driving change in a sector where the potential energy savings could comfortably sit in the hundreds-of-millions.
There are a number of ways to deliver power savings in data centres, from optimising renewable energy sources, to updating the physical infrastructure of the building, investing in modern server technology with lower heat emissions, or carefully specifying the right cooling equipment.
Why is ambient cooling technology increasing in popularity?
Historically, data centres have been cooled using compressor or refrigerated technology, often with an adiabatic cooler installed to dissipate the heat generated by the compressor. The performance of this conventional chilling technology has serviced the mission critical data centre sector well thus far, but change is definitely afoot.
There is a distinct shift in popularity from traditional compressor-based chilling methods to far more energy efficient ambient cooling technologies, with a particular focus on adiabatic solutions due to their retrofitting capabilities compared to other ambient systems such as direct evaporative or heat exchanger technology. Why? The two main drivers for change are a surge in investment in the construction of new or extended data centres, plus a significant improvement in the operating parameters of server technology.
The global data centre construction market is estimated to grow from $14.59bn in 2014 to $22.73bn by 2019, at a Compound Annual Growth Rate (CAGR) of 9.3%. This thriving market growth is down to many companies transforming traditional facilities into mega data centres and others planning to build new ‘monster sites’ in the coming years.
With modernisation, comes state-of-the-art server technology which is capable of withstanding higher temperatures whilst maintaining optimised performance and reliability.
The modernisation of legacy data centre infrastructure, the increased construction of new ultra-modern sites and the widespread adoption of more temperature-resilient servers have driven a desire for compressorless cooling, with operators choosing the energy efficiency benefits of ambient cooling technologies instead.
Why is this happening now?
Historically, ambient cooling equipment has been unable to cool data centres to within the right temperature range, which created an industry reliant on refrigerated or compressor-led solutions. Now, thanks to more adaptive build methods and the marginal, but vital, increase in server temperature resilience, the temperature parameters of data centres have risen to a range which ambient cooling technology can comfortably achieve.
In other words, instead of cooling the compressor, adiabatic equipment is now directly cooling the data centre equipment itself. This is reducing capital expenditure (CAPEX) by eliminating an entire chiller from many new build specifications.
Why is this a more energy efficient solution?
Adiabatic and other ambient cooling equipment achieves far higher Energy Efficiency Ratios (EER) than compressor-based chillers. Take the example of a typical compressor chiller, which will consume 1kW of energy for every 3-4kW of cooling delivered.
Compare this to an equivalent sized adiabatic solution, which for the same 1kW of energy consumed, will deliver up to 75kW of cooling.
In terms of EER, this difference translates to an approximate EER of 4 (4kWth/1kWe) for a conventional compressor solution, and an impressive EER of around 75 (75kWth/1kWe) for the equivalent ambient solution. This, of course, translates to substantial energy and cost savings for energy-hungry mission critical environments.
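The EER comparison above can be turned into a quick sizing calculation. This is a minimal sketch using the figures quoted in the text (EER of 4 vs 75); the 500kW cooling demand is a hypothetical example, not from the article:

```python
# EER = cooling delivered (kWth) per unit of electrical power consumed (kWe).
def electrical_power_kwe(cooling_demand_kwth: float, eer: float) -> float:
    """Electrical power needed to deliver a given cooling load at a given EER."""
    return cooling_demand_kwth / eer

COMPRESSOR_EER = 4.0    # ~3-4 kW of cooling per kW consumed (upper end, from text)
ADIABATIC_EER = 75.0    # up to 75 kW of cooling per kW consumed (from text)

demand_kwth = 500.0     # hypothetical cooling demand

print(f"Compressor chiller: {electrical_power_kwe(demand_kwth, COMPRESSOR_EER):.0f} kWe")
print(f"Adiabatic cooler:   {electrical_power_kwe(demand_kwth, ADIABATIC_EER):.1f} kWe")
```

On these assumed figures, meeting the same 500kW load takes 125kWe with a compressor chiller but under 7kWe with an ambient solution, which is where the headline savings come from.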
Can legacy data centres still benefit from ambient cooling?
Older data centres which house older style servers will still rely on compressor or refrigeration chillers to dissipate enough heat from the data centre, and maintain an internal building environment which meets the operational temperature thresholds of dated server technology.
That said, given that most leading ambient coolers are installed externally, retrofitting additional cooling technology to supplement an existing compressor solution is a highly viable option for many older data centres looking to maximise geographical assets and weather patterns to deliver free cooling during certain seasons.
If an older data centre retains its legacy infrastructure but is refitted with state-of-the art servers which can operate at higher temperatures, adiabatic cooling equipment can be retrofitted to the same inlets as older equipment.
As the data centre industry evolves and grows at a rapid pace, energy conservation focused industry leaders like Facebook have already maximised ambient and free cooling technologies within their data centres. As we move towards a future even more reliant on the storage and exchange of data, the rest of the industry is expected to follow suit.
For more information on ambient cooling technologies for data centres, talk to Transtherm Cooling Industries, or visit www.transtherm.co.uk.
Transtherm Cooling Industries Ltd is family owned and operated in Coventry, UK, and has been pioneering the technological advancement of air blast cooler, adiabatic cooler and pump set systems for commercial and industrial applications since 1989. Operating across industries such as automotive, power generation, HVAC and data centres, Transtherm provides advanced, cost effective, energy-efficient cooling solutions.