By the time you read this, the UK may or may not have obtained some clarity over its future outside, or still inside, the European Union. Whatever the outcome of the General Election, however, it is not immediately obvious how a result for the right or the left will actually provide real direction and certainty in the foreseeable future. Leaving Europe still requires the negotiation of what would seem to be a massively complicated trade agreement. Remaining in Europe would appear to require another referendum, and so months of further delay.
For many businesses contemplating a digital future, a similar lack of confidence seems to be the order of the day. Rushing headlong to embrace the ‘amazing’ potential of the various technologies which underpin digital transformation may well be fraught with as many pitfalls as benefits – not least because many of these technologies (like a final Brexit deal) need quite some time before they become both easily available and affordable (5G, for example).
While remain may still be an option for the UK, keeping the same IT infrastructure and expecting to remain competitive is not an option for almost any business. That said, a political remain deal will almost certainly come with some fundamental changes, either between the UK and the rest of the EU countries and/or for all within the EU community. In other words, whether it’s politics or technology, doing nothing is impossible, but what to do is not that easy to determine.
2020 will see some of the new technologies and ideas begin to gain significant traction but, sorry to be a doom and gloom merchant, it will probably be characterised by continuing confusion for many, if not all, businesses, which know they need to do something but are not sure quite what, where or when.
All I can say is, make sure you do your homework before you do make changes, and, as with the politicians, don’t believe the hype or fake news!
Happy Christmas to one and all.
New study finds that 74% of European organisations are struggling to introduce digital tools and practices to improve decision making.
The majority (74 per cent) of businesses across the UK and Europe are still facing major obstacles trying to introduce new digital tools and practices to support their decision making, according to a new report.
The primary barrier, according to 57 per cent of European C-Suite leaders, is data fragmentation. Siloed data results in a lack of visibility across business processes, and in turn a poor decision-making culture.
These difficulties are compounded by the finding that 65 per cent of leaders feel significant pressure to deliver a successful digital transformation strategy. If they cannot act and react fast enough to dynamic market conditions and innovation opportunities, they open themselves up to critical risks.
Domo, a business-to-business cloud software company, partnered with IDC to poll 375 employees across multiple industries to understand how they are using modern digital technologies to improve decision making and deliver value across the business.
The study also found that C-Suite leaders find it difficult to create a holistic view of their business. This restricts leaders in taking an integrated approach to digital transformation, as they struggle to connect the dots between varying stakeholders. Key barriers include:
“Leaders are facing unprecedented amounts of pressure from technological, social and regulatory forces. It’s no longer about driving digital transformation in vertical stacks, but rather horizontally across the entire enterprise,” said Ian Tickle, Senior Vice President and General Manager for EMEA at Domo. “However, when data is stuck in silos across the business it hinders new and efficient ways of working, and ultimately stalls digital transformation.”
The research predicts that by 2029, 75 per cent of European organisations will be completely digitally transformed, or what IDC defines as a ‘Digital Native Enterprise’. It is clear, however, that in order to achieve this, the right processes need to be put in place - plans and predictions can’t be realised without effective business decision-making.
Most firms have not fully embraced the digitisation of business processes, products and services. For example, only 4 per cent of ‘digital laggards’ have integrated tools such as Domo to harness the power of their data and accelerate time-to-value, compared with 67 per cent of ‘digital leaders’, who as a result are able to react quickly and accurately enough to cope with dynamic market conditions.
Tickle summarises: “True Digital Transformation isn’t just about creating a ‘digital outside’ to your organisation: leaders must consider every business element, investing in reinventing their data, expertise and workflows to create advantage. Having effective business decision-making tools that harness all the data available is the key foundational step to advance any digital transformation journey.”
Human-machine collaboration the new normal for successful businesses
Researchers find that organisations need content intelligence to advance digital transformation.
A new survey from leading research firm IDC has revealed the true extent to which software robots support humans in the workplace. The IDC whitepaper, Content Intelligence for the Future of Work, sponsored by ABBYY, indicates that the contribution of software robots, or digital workers, to the global workforce will increase by over 50% in the next two years. These results, from a survey of 500 senior decision-makers in large enterprises, illustrate a fundamental shift to a future of work dependent on human-machine collaboration.
“A growing number of employees will find themselves working side-by-side with a digital coworker in the future as technology automates many work activities,” commented Holly Muscolino, Research Vice President of Content and Process Strategies and the Future of Work at IDC. “Think human and machine. The human-machine collaboration is not just the future of work, but it is the new normal for today's high-performing enterprises.”
It is not just mundane, repetitive jobs like data input that new digital colleagues will help human workers complete in the years ahead. The growth of machine learning (ML) through human-centric artificial intelligence (AI) means robot assistants will also help employees make better decisions. In most cases, these technologies enhance rather than replace human capabilities. For example, the survey found that the use of technology to evaluate information will grow by 28% in two years, and that 18% of activities related to reasoning and decision making will be performed by machines.
IDC forecasts that the intelligent process automation (IPA) software market, which includes content intelligence and robotic process automation (RPA), will grow from $13.1 billion in 2019 to $20.7 billion in 2023. Since many of the repetitive processes and tasks that are well suited for automation by RPA are document and content centric, content intelligence technologies frequently go hand-in-hand with RPA in intelligent process automation use cases. Automation initiatives will also be enabled by process intelligence, a new generation of process mining tools providing complete visibility into business processes – the critical foresight needed to improve the success of an IPA project.
Over 40% of survey respondents have experienced a notable increase in customer satisfaction and employee productivity by deploying content intelligence technologies as part of their digital transformation strategy. Additionally, more than a third of respondents saw an improvement in responsiveness to customers, new product or revenue opportunities, increased visibility and/or accountability, or increased customer engagement.
“The IDC survey proves that automation can and should be human-centric, augmented with artificial intelligence,” said Neil Murphy, VP Global Business Development at ABBYY. “Ethical, responsible automation will create a more productive, happier future where human workers can focus on higher-level, creative and socially responsible tasks, and customers get better experiences with faster service. Businesses that are early-adopters of incorporating content intelligence within their automation platforms will gain a significant competitive edge.”
Other key findings:
Manufacturers plan to invest more than ever in smart factories but challenges in scaling must be overcome.
A new study from the Capgemini Research Institute has found that smart factories could add at least $1.5 trillion to the global economy through productivity gains, improvements in quality and market share, and better customer services. However, two-thirds of this overall value is still to be realized: efficiency by design and operational excellence through closed-loop operations will make equal contributions. According to the new research, China, Germany and Japan are the top three countries in smart factory adoption, closely followed by South Korea, the United States and France.
The report, entitled “Smart Factories @ Scale”, identified the two main challenges to scaling up: IT-OT convergence, and the range of skills and capabilities required to drive the transformation, including cross-functional capabilities and soft skills in addition to digital talent. The report also highlights how the technology-led disruption towards an ‘Intelligent Industry’ is an opportunity for manufacturers striving to find new ways to create business value, optimize their operations and innovate for a sustainable future.
Key findings of the study, which surveyed over 1,000 industrial company executives across 13 countries, include:
Organizations are showing an increasing appetite and aptitude for smart factories: compared to two years ago, more organizations are progressing with their smart factory initiatives today, and one-third of factories have already been transformed into smart facilities. Manufacturers now plan to create 40% more smart factories in the next five years and to increase their annual investments by 1.7x compared to the last three years.
The potential value add from smart factories is bigger than ever: based on this potential for growth, Capgemini estimates that smart factories can add anywhere between $1.5 trillion and $2.2 trillion to the global economy over the next five years. In 2017, Capgemini found that 43% of organizations had ongoing smart factory projects, a figure that has shown a promising increase to 68% in two years. 5G is set to become a key enabler, as its features will give manufacturers the opportunity to introduce or enhance a variety of real-time and highly reliable applications.
Scaling up is the next challenge for Industry 4.0: despite this positive outlook, manufacturers say success is hard to come by, with just 14% characterizing their existing initiatives as ‘successful’ and nearly 60% of organizations saying that they are struggling to scale. The two main challenges to scaling up are:
·The IT-OT convergence - including digital platforms deployment and integration, data readiness and cybersecurity - which will be critical to ensure digital continuity and enable collaboration. Agnostic and secure multilayer architectures will allow a progressive convergence.
·A range of skills and capabilities, in addition to digital talent, will be required to drive smart factory transformation, including cross-functional profiles such as engineering-manufacturing, manufacturing-maintenance, and safety-security. Soft skills, such as problem solving and collaboration, will also be critical.
According to the report, organizations need to learn from high performers (10% of the total sample), which make significant investments in the foundations - digital platforms, data readiness, cybersecurity, talent, governance - and take a well-balanced “efficiency by design” and “effectiveness in operations” approach, leveraging the power of data and collaboration.
Jean-Pierre Petit, Director of Digital Manufacturing at Capgemini said: “A factory is a complex and living ecosystem where production systems efficiency is the next frontier rather than labor productivity. Secure data, real-time interactions and virtual-physical loopbacks will make the difference. To unlock the promise of the smart factory, organizations need to design and implement a strong governance program and develop a culture of data-driven operations.”
“The move to an Intelligent Industry is a strategic opportunity for global manufacturers to leverage the convergence of Information Technology and Operational Technology, in order to change the way their industries will operate and be future ready,” he further added.
Mourad Tamoud, EVP, Global Supply Chain Operations at Schneider Electric said: “Through Schneider Electric’s TSC4.0 Transformation - Tailored, Sustainable & Connected 4.0 - a sustainable and connected journey which integrates the Smart Factory initiative, we have created a tremendous dynamic. We started with just one flagship pilot several years ago and, towards the end of 2019, we have over 70 Smart Factory sites certified, with external recognition by the World Economic Forum. By training our managers, engineers, support staff and operators, we have equipped them with the right knowledge and competences. In parallel, we have also started to scale this experience across the organization through a virtual network to achieve such a fast ramp up.”
“This is only the beginning - we will continue to innovate by leveraging internally and externally our EcoStruxure™ solution - an IoT enabled, plug and play, open architecture and platform - and use the latest best practices in the digital world,” he further added.
Two thirds (67%) of businesses say that driving collaboration between security and IT ops teams is a major challenge.
Research released by Tanium and conducted by Forrester Consulting has found that strained relationships between security and IT ops teams leave businesses vulnerable to disruption, even with increased spending on IT security and management tools.
According to the study of more than 400 IT leaders at large enterprises, 67% of businesses say that driving collaboration between security and IT ops teams is a major challenge, which not only hampers team relationships but also leaves organisations open to vulnerabilities. Over 40% of businesses with strained relationships consider maintaining basic IT hygiene a challenge, compared with 32% of those with good partnerships. In fact, it takes teams with strained relationships nearly two weeks longer to patch IT vulnerabilities than teams with healthy relationships (37 business days versus 27.8 business days).
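As a quick sanity check on those patching figures, a minimal sketch (the "nearly two weeks" assumes a five-day working week, which is an inference from the reported numbers rather than something the study states):

```python
# Patch-time gap between strained and healthy security/IT-ops teams,
# using the business-day figures reported in the Forrester study.
strained_days = 37.0   # business days to patch, strained relationships
healthy_days = 27.8    # business days to patch, healthy relationships

gap_days = strained_days - healthy_days   # 9.2 business days
gap_weeks = gap_days / 5                  # 5 business days per working week

print(f"{gap_days:.1f} business days ≈ {gap_weeks:.1f} working weeks")
```

A 9.2-business-day gap is just under two working weeks, which is where the headline claim comes from.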
The study also found that increased investment in IT solutions has not translated to improved visibility of computing devices and has created false confidence among security and IT ops teams in the veracity of their endpoint management data.
Increased investment without improved visibility
In recent years, there has been a considerable investment in security and IT operations tools, as well as an increased focus at the board level on cybersecurity. According to the study, 81% of respondents feel very confident that their senior leadership/board has more focus on IT security, IT operations and compliance than two years ago.
Enterprises who reported budget increases said they have seen considerable additional investment in IT security (18.3%) and operations (10.9%) over the last two years, with teams procuring an average of five new tools over this same time period.
Misplaced confidence leaves firms vulnerable
Despite the increased investment in IT security and operational tools, businesses have a false sense of security regarding how well they can protect their IT environment from threats and disruption. 80% of respondents claimed that they can take action instantly on the results of their vulnerability scans and 89% stated that they could report a breach within 72 hours. However, only half (51%) believe they have full visibility into the vulnerabilities and risks and fewer than half (49%) believe they have visibility of all hardware and software assets in their environment.
The study also showed that 71% of respondents struggle to gain end-to-end visibility of endpoints and their health, which can lead to consequences such as poor IT hygiene, limited agility to secure the business, vulnerability to cyber threats and poor collaboration between teams.
Chris Hodson, EMEA CISO at Tanium, said: “IT security is increasingly a boardroom-level issue and businesses have accordingly started to invest much more in shoring up their defences. Yet there’s a prevailing misconception that investing in multiple point solutions is the most comprehensive way to prepare for cyberthreats. In fact, quite the opposite is true. Having multiple pieces of cybersecurity software is helping to cement these internal tensions between IT operations and security teams, as well as contributing to this increasingly siloed approach to security, which further leaves the business vulnerable.
“Our research suggests that increased investment in IT solutions has not translated to improved visibility of computing devices and has created misplaced confidence among security and IT ops teams. The truth is that companies that rely on a gamut of point solutions, but lack complete visibility of their IT environment, are essentially basing their IT security posture on a coin flip.”
Unified endpoint solutions allow firms to operate at scale
A unified endpoint management and security solution – a common toolset for both security and IT ops – can help address these challenges. In the study, IT decision makers stated that a unified solution would allow enterprises to operate at scale (59%), decrease vulnerabilities (54%), and improve communication between security and operations teams (52%).
IT decision makers also say that a unified endpoint solution would help them see faster response times (53%) and have more efficient security investigations (51%), while improving visibility through improved data integration (49%) and accurate real-time data (45%).
According to the Forrester study: “IT leaders today face pressure from all sides. To cope with this pressure, many have invested in a number of point solutions. However, these solutions often operate in silos, straining organisational alignment and inhibiting the visibility and control needed to protect the environment. Using a unified endpoint security solution that centralises device data management enables companies to accelerate operations, enhance security, and drive collaboration between Security and IT ops teams.”
The four largest European colocation FLAP markets of Frankfurt, London, Amsterdam and Paris are set for a record-breaking finish to 2019, with a supercharged last quarter, according to figures from CBRE, the world's leading data centre real estate advisor.
CBRE analysis shows that there was 38MW of take-up and 63MW of new supply across the four markets during Q3. London and Frankfurt were particularly strong on the demand side, and Amsterdam was responsible for nearly 50% of the new supply. Building on this strong performance, CBRE forecasts that market activity in Q4 will double that of Q3 to create a record year.
CBRE forecasts that 70MW of new take-up will be added to the FLAP market total in the final quarter, pushing the full year to beyond 200MW. This would be the first time on record that the four markets have breached 200MW of take-up in a single year.
According to CBRE, a further 150MW of new supply will be brought online in Q4. The new capacity represents nearly 50% of all the new capacity in the four markets during 2019 and will equate to a 23% growth in total market size during the year.
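Taken at face value, the percentages above imply some rough totals for the FLAP market. A back-of-envelope sketch, treating "nearly 50%" and "23%" as exact (they are not) and reading the 23% growth as driven by the year's full new supply:

```python
# Implied market figures from CBRE's Q4 2019 supply forecast.
q4_new_supply_mw = 150     # forecast new supply in Q4
q4_share_of_2019 = 0.50    # Q4 is "nearly 50%" of 2019's new capacity
annual_growth = 0.23       # 2019's new capacity = 23% market growth

total_2019_supply = q4_new_supply_mw / q4_share_of_2019   # ~300 MW for the year
market_entering_2019 = total_2019_supply / annual_growth  # ~1,300 MW installed base

print(f"Implied 2019 new supply: {total_2019_supply:.0f} MW")
print(f"Implied market size entering 2019: {market_entering_2019:.0f} MW")
```

These derived totals are illustrative only; CBRE's report does not state them directly.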
Mitul Patel, Head of EMEA Data Centre Research at CBRE commented:
“These record levels of development underway in the major European markets are creating challenges. The availability of freehold land in popular data centre hubs offering proximity to large amounts of HV power and fibre routes, such as Slough in the UK and Schiphol in Amsterdam, is highly constrained. The effect of these barriers to entry is that data centre developers are either choosing to locate in new, sometimes unproven, locations or are competing aggressively on price for land opportunities.
“Despite cloud providers driving market activity, enterprise demand for colocation remains consistent across the major markets. CBRE analysis shows that in the four years from 2016, there has been an average of 43MW of enterprise take-up per year across the four FLAP markets. As enterprise companies continue to utilize colocation footprints as part of their hybrid IT architecture, we expect this to remain consistent.”
Global study reveals rising customer expectations are outpacing adoption of AI tools that drive efficiency and customer experience.
LogMeIn has published the results of a global study conducted in partnership with Ovum to understand how support agents are faring in the age of ever-rising consumer expectations. The findings reveal that the vast majority of surveyed agents believe that the technology tools provided to customer-facing employees are not evolving as quickly as their needs are.
Today’s customers expect agents to have increasingly detailed knowledge of products, services, and company policies so they can achieve first contact resolution (FCR). However, the reality is that only 35% of agents say this is possible, as the majority (57%) do not have any AI tools, and more than half (53%) do not use a knowledge base.
The study surveyed 750 customer-facing employees, customer experience (CX) managers, and content managers in seven countries across North America, EMEA and APAC, and found that agents in physical locations are worse off than their counterparts in contact centres. Only 30% of field agents have AI tools compared to 44% in contact centres, and the unpleasant result for customers is that 20% of interactions require a call-back and 13% get transferred.
“We know that there is a direct correlation between agent frustration and customer discontent, and 85% of customer-facing employees expressed a very high degree of frustration because they can’t meet customer expectations,” said Ken Landoline, Principal Analyst, Ovum. “As the study highlights, all customer support employees need to be better equipped to meet rapidly growing customer expectations. Employees want to step up but are hampered by mediocre training and outdated, inefficient tools. Clearly this needs to change, or customer loyalty and revenue will ultimately suffer.”
AI’s Untapped Potential
AI-powered knowledge management tools offer a powerful solution to customer-facing employees working inside or outside of the contact centre who need instant access to company information, and 56% of surveyed knowledge base users are either extremely or very satisfied with them. Knowledge bases reduce the amount of time it takes to find information (66% think they are easy to search) and serve as a single source of truth for employees across teams and departments. LogMeIn, who commissioned the study with Ovum, is already working closely with companies to help reduce internal escalations by up to 30% with Bold360’s Advise solution, which leverages AI-powered knowledge management.
AI deployments for customer service and support also go beyond knowledge management. The majority are for automating routine tasks (60%) and assisting agents in real time (50%), followed by AI for customer journey mapping. Seventy-five percent of agents have a feedback system in place to advise management of issues they are facing during the course of their workday, and one-third utilise automatic pop-ups that recommend helpful next-best actions.
Yet, AI adoption is still in early stages. The majority of managers who participated in the survey are still formulating their AI strategy (38%) or only have an early-phase strategy in place (28%). The dominant approach to implementation is to put ad hoc point solutions in place for a few selected use cases (44%).
“A lack of tools for customer service agents creates a vicious circle: staff can’t meet customer expectations which creates employee frustration, turnover, and of course, a poor customer experience,” said Ryan Lester, Senior Director of Customer Engagement Technologies at LogMeIn. “Even though powerful technologies like AI-based knowledge management tools can reverse the trend, adoption is slow and it’s hurting these organisations. Poor customer experiences have a negative impact on sales and repeat business, so this is a pressing issue that businesses need to address at the highest levels.”
A vast majority of enterprises worldwide have adopted multi-cloud strategies to keep pace with the need for digital transformation and IT efficiency, but they face significant challenges in managing the complexities and added requirements of these new application and data delivery infrastructures, according to a global survey conducted by the Business Performance Innovation (BPI) Network, in partnership with A10 Networks.
The new study, entitled ‘Mapping The Multi-Cloud Enterprise,’ finds that improved security, including centralised security and performance management, multi-cloud visibility of threats and attacks, and security automation, is the number one IT challenge facing companies in these new compute environments.
Among key survey findings:
“Multi-cloud is the de facto new standard for today’s software- and data-driven enterprise,” said Dave Murray, head of thought leadership and research for the BPI Network. “However, our study makes clear that IT and business leaders are struggling with how to reassert the same levels of management, security, visibility and control that existed in past IT models. Particularly in security, our respondents are currently assessing and mapping the platforms, solutions and policies they will need to realise the benefits and reduce the risks associated with their multi-cloud environments.”
“The BPI Network survey underscores a critical desire and requirement for companies to reevaluate their security platforms and architectures in light of multi-cloud proliferation,” said Gunter Reiss, vice president at A10 Networks. “The rise of 5G-enabled edge clouds is expected to be another driver for multi-cloud adoption. A10 believes enterprises must begin to deploy robust Polynimbus security and application delivery models that advance centralised visibility and management and deliver greater security automation across clouds, networks, applications and data.”
The study finds that some 38% of companies have or will reassess their current relationships with security and load balancer suppliers in light of multi-cloud, with most others still undecided about whether a change in vendors is needed.
Benefits and Drivers of Multi-Cloud
IT and business executives point to a number of benefits of multi-cloud environments, and to the business and technology forces driving their move into them.
The top four drivers for multi-cloud:
The top four benefits for multi-cloud:
Security Tops IT To-Do List
Respondents report facing a long list of challenges in managing multi-cloud compute environments, with security at the top of their agenda.
The top four challenges for multi-cloud:
The top four requirements for improving multi-cloud security and performance:
The top four security-specific solution needs:
New survey highlights how aligning cloud transformation with core business planning leads to success.
Most organisations worldwide (93%) are migrating to the cloud for critical IT requirements, but nearly a third (30%) say they have failed to realise notable benefits from cloud computing, largely because they have not made their adoption plan a core part of their broader business transformation strategy, according to the first Unisys Cloud Success Barometer™. The research, which surveyed 1,000 senior IT and business leaders on the impact and importance of cloud in 13 countries around the world, including the UK, Germany, Belgium and the Netherlands, found a strong correlation between cloud success and strategic planning.
Cloud Commitment is Key
The study found that, when cloud transition is core to business strategy, there is a more dramatic improvement in organisational effectiveness: 83% of this group said organisational effectiveness has improved since moving to the cloud. In contrast, amongst those who say the cloud is a minor part of their business strategy, just 30% say organisational effectiveness has improved since the move.
Commitment through investment also produced positive results. Once organisations have experienced the benefits, they then continue to invest further in the cloud. Four-fifths (80%) of those who plan to spend substantially on their cloud computing in 2020 have seen their organisational effectiveness change significantly for the better.
Kevin Turner, Digital Workplace Strategy Lead, Unisys, said: “Our findings show that the majority of organisations are approaching their use of cloud computing from a tactical perspective, and whilst tactical moves can be very powerful, taking a broader strategic view and integrating with core business planning will allow cloud adoption to deliver greater results. Committing to the cloud with a considered approach, ensuring best practice supported by a robust methodology, is imperative to leveraging the cloud to meet your objectives.”
The Future is Multi-Cloud
Meanwhile, only 28% of organisations have embraced multi-cloud solutions, indicating there is more opportunity yet to reap business benefits, especially as multi-cloud users see the cloud as essential to staying competitive: 42% have been impacted by a competitor who leverages cloud innovations. By choosing multiple cloud providers, businesses can take advantage of the best parts of each provider’s services and customise these to suit the needs – and expectations – of the organisation. A partial transition, by contrast, could limit the benefits of the cloud.
“Multi-cloud represents the future of cloud computing, and for obvious reasons. Organisations that adopt multi-cloud strategies can design applications to run across any public cloud platform, expanding their marketplace power,” said Turner. “Additionally, a multi-cloud strategy helps organisations gain greater sovereignty over their data, spread their risk in case of downtime and increase the business's negotiating leverage – as well as offering cost savings by allowing businesses to shop rates for different service needs from multiple vendors.”
Cloud Pros and Cons
Nearly three in four (73%) senior business leaders say the benefits of cloud computing outweigh the barriers, and 66% have seen their organisational effectiveness significantly change for the better through the adoption of cloud computing. The top expected cloud benefits were: improved security (64%), reduced costs (50%), higher staff productivity (40%), improved agility to meet demand (40%) and delivering better customer experience (40%).
The majority of business leaders said migration to the cloud had met or exceeded their expectations for security (77%), improving the supply chain (75%) and driving innovation (74%). The areas that fell short of our respondents’ expectations were: improved staff productivity (32%), increased revenue (32%), managing or reducing cost (35%) and reducing headcount (38%).
“Closing the gaps on these business results – including revenues, costs, productivity, innovation and organisational effectiveness – requires more than a ‘lift and shift’ transition of IT applications and infrastructure to the cloud. It requires changing the way companies work to better suit customers and staff, and changing their attitudes to digital innovation,” continued Turner. “Strategic planning along with security, scalability, realistic timelines and upskilling staff are all key to successful cloud implementation.”
Nutanix has announced the findings of its second global Enterprise Cloud Index survey and research report, which measures enterprise progress with adopting private, hybrid and public clouds. The new report found enterprises plan to aggressively shift investment to hybrid cloud architectures, with respondents reporting steady and substantial hybrid deployment plans over the next five years. The vast majority of 2019 survey respondents (85%) selected hybrid cloud as their ideal IT operating model.
For the second consecutive year, Vanson Bourne conducted research on behalf of Nutanix to learn about the state of global enterprise cloud deployments and adoption plans. The researcher surveyed 2,650 IT decision-makers in 24 countries around the world about where they’re running their business applications today, where they plan to run them in the future, what their cloud challenges are, and how their cloud initiatives stack up against other IT projects and priorities. The 2019 respondent base spanned multiple industries, business sizes, and the following geographies: the Americas; Europe, the Middle East, and Africa (EMEA); and the Asia-Pacific (APJ) region.
This year’s report illustrated that creating and executing a cloud strategy has become a multidimensional challenge. At one time, a primary value proposition of the public cloud was substantial upfront capex savings. Now, enterprises have discovered that there are other considerations when selecting the best cloud for the business, and that a one-size-fits-all cloud strategy doesn’t suit every use case. For example, while applications with unpredictable usage may be best suited to public clouds offering elastic IT resources, workloads with more predictable characteristics can often run on-premises at a lower cost. Savings also depend on a business’s ability to match each application to the appropriate cloud service and pricing tier, and to remain diligent about regularly reviewing service plans and fees, which change frequently.
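The cost trade-off between predictable and unpredictable workloads can be made concrete with a toy model. All the rates below are illustrative assumptions, not real vendor prices; the structural point is that on-premises capacity must be sized for the peak, while elastic cloud capacity is billed on average usage.

```python
# Hypothetical cost model comparing on-premises vs. public cloud for two
# workload profiles. All prices are illustrative assumptions only.

ONPREM_HOURLY = 0.08      # amortised cost per server-hour on-premises
CLOUD_HOURLY = 0.12       # pay-as-you-go cost per instance-hour in the cloud
HOURS_PER_MONTH = 730

def monthly_cost(avg_utilisation, peak_servers):
    """Return (on-prem cost, cloud cost) for a workload.

    On-premises capacity must be provisioned for the peak, so its cost
    scales with peak_servers regardless of utilisation. Cloud capacity
    is elastic, so its cost scales with average utilisation.
    """
    onprem = ONPREM_HOURLY * HOURS_PER_MONTH * peak_servers
    cloud = CLOUD_HOURLY * HOURS_PER_MONTH * peak_servers * avg_utilisation
    return onprem, cloud

# A steady workload runs its servers near flat-out: on-prem wins.
steady = monthly_cost(avg_utilisation=0.9, peak_servers=10)
# A bursty workload averages 20% of its peak: cloud wins.
bursty = monthly_cost(avg_utilisation=0.2, peak_servers=10)

print(f"steady: on-prem ${steady[0]:.0f} vs cloud ${steady[1]:.0f}")
print(f"bursty: on-prem ${bursty[0]:.0f} vs cloud ${bursty[1]:.0f}")
```

Even this crude sketch shows why the report finds that matching each application to the right venue and pricing tier matters more than any blanket cloud policy.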
In this ever-changing environment, flexibility is essential, and a hybrid cloud provides this choice. Other key findings from the report include:
“As organisations continue to grapple with complex digital transformation initiatives, flexibility and security are critical components to enable seamless and reliable cloud adoption,” said Wendy M. Pfeiffer, CIO of Nutanix. “The enterprise has progressed in its understanding and adoption of hybrid cloud, but there is still work to do when it comes to reaping all of its benefits. In the next few years, we’ll see businesses rethinking how to best utilise hybrid cloud, including hiring for hybrid computing skills and reskilling IT teams to keep up with emerging technologies.”
Datrium has released findings from its industry report on the State of Enterprise Data Resiliency and Disaster Recovery 2019, which assesses how organizations are implementing disaster recovery (DR) to protect their data from attack or disaster.
Findings suggest that organizations are growing more concerned about the threat of disaster as a result of ransomware, human error, power failure and natural disaster. The heightened threat of ransomware is particularly concerning for the enterprise data centre, with nearly 90% of companies considering ransomware a critical threat to their business, and this is driving the need for DR. The research also found that the public cloud is increasingly being considered as a DR site. The cloud offers greater ease of use and cost-efficient DR, solving several pain points that are holding organizations back from responding to DR events, including the complexity of DR products and processes as well as high associated costs.
The State of Enterprise Data Resiliency and Disaster Recovery 2019 study was developed to identify what DR solutions businesses currently utilize, their confidence in those solutions and how effective the solutions are at helping businesses return to normal operations following a disaster, in addition to the key capabilities IT teams consider when evaluating and selecting a DR solution to create their DR plan.
"This research confirms that ransomware is one of the biggest concerns for IT managers today," said Tim Page, CEO of Datrium. "This threat is significantly driving people to reevaluate their DR plans. It's no surprise that more than 88% of respondents said they’d use the public cloud for their DR site if they could pay for it only when they need to test or run their DR plans.”
Ransomware is Plaguing the Enterprise Data Centre
As the nucleus of an enterprise, the data centre must be protected from the threat of disaster or deliberate attacks. Half (50.4%) of all organizations surveyed have recently experienced a DR event, with ransomware reported as the leading cause.
Traditional Approaches to DR Put Organizations at Risk
Traditional DR approaches are lacking. Two significant challenges faced by more than half of organizations who experienced a DR event in the past 24 months were the difficulty of both failover to their DR location and failback.
The top three reasons why failback was difficult included difficulty with format conversion, the amount of time required to failback and challenges in understanding changes in the system since failover.
Pay-for-Use DR in the Public Cloud is in High Demand
The industry norm today is to have physical sites for DR; however, the industry is shifting toward DR in the public cloud. The vast majority (88.1%) of respondents said they would use the public cloud as their DR site if they only had to pay for it when they need it. The most common approach to DR, according to more than half (52.7%) of respondents, is having more than one physical DR site.
"As companies weigh the best defence options against ransomware, they should consider the benefits of using the public cloud for their DR sites. They can counter the high costs of traditional DR solutions and free their IT staff to focus on revenue-generating initiatives. The public cloud offers lower management overhead and costs along with lower risk and higher reliability in testing and executing DR plans," added Page.
Nearly one in four respondents (23%) stated that their organization is not responding to DR events as effectively as it could be. The top three considerations holding organizations back from responding to DR events include: 1) the complexity of DR products and processes, 2) high associated costs and 3) lack of staff skilled in managing DR.
Growing Threat of Disaster Increases DR Budgets
Given the growing threat of disasters, DR budgets are significantly increasing.
“It's our mission to bring enterprise-grade DR to every IT team by cutting cost and complexity with an on-demand failproof model leveraging VMware Cloud on AWS,” said Page. “Today, Datrium also announced new Disaster Recovery as a Service capabilities that provide VMware users, both on premises and in the cloud, a reliable, cost-effective cloud-based disaster recovery solution with the industry’s first instant Recovery Time Objective (RTO) restarts.”
Research from Ensono and Wipro, conducted by Forrester, has found that 50 per cent of enterprise organisations are expanding or upgrading mainframe versus 69 per cent who are expanding or upgrading implementation of cloud. The research shows that 88 per cent of organisations are adopting a hybrid IT approach and 89 per cent state that adoption includes a hybrid cloud strategy.
Such is the focus on hybrid cloud uptake that one in four enterprise businesses cite internal pressures as a key motivator for exploring new cloud technologies.
Of the enterprise organisations still relying on mainframe, more than a quarter (28 per cent) are refactoring a portion of their apps to take advantage of new cloud technologies such as serverless (74 per cent) and containers (90 per cent).
However, the research also highlighted that companies are less likely to migrate finance and accounting workloads wholesale, preferring to migrate (refactor) only a portion of the workload. According to the respondents, the barriers to cloud migration include substantial costs, potential business disruption during migration, and security and compliance concerns. Refactoring enables organisations to move parts of workloads to whichever cloud is most appropriate.
The high pace of expansion taking place in the global healthcare industry and adoption of advanced technologies is boosting the market for healthcare AR VR devices. The healthcare industry has shown increasing adoption of AR VR technologies and incorporated these in routine operations.
Global Healthcare AR VR Market: Snapshot
Virtual reality enables a person to experience and interact with a 3D environment, whereas augmented reality overlays digital information in the form of sound and graphics.
These technologies are in much demand in hospitals, clinics and diagnostic centers for medical training and research. The market is expected to grow exponentially given people’s growing preference for advanced technologies. The healthcare industry is also evolving into an advanced sector, providing better services to treat severe health issues and deliver advanced health-related assistance. The incorporation of advanced technologies is also a result of growing competition among market players: top companies are vying with one another to introduce the latest technologies first, helping them stay ahead in the race. According to the report’s forecasts, the global healthcare AR VR market was valued at over US$ 600 Mn in 2018 and may exceed US$ 15,000 Mn by the end of 2026, growing at an exceptionally high CAGR of 49.1% over the forecast period 2018 – 2026.
Healthcare AR VR Market: Competitive Landscape
A number of companies stand out in the global market, most of them leading the way in large investments, acquisitions and mergers. This research report includes a brief profile of each of these leading companies and their strategic planning for the coming years. Some of the key players included in the report are Samsung Electronics Co. Ltd, Google Inc., DAQRI LLC, Oculus VR, LLC, Magic Leap, Inc., ImmersiveTouch, Inc., FIRSTHAND TECHNOLOGY INC. and HTC Corporation.
The research report highlights the forecast for the healthcare AR VR market based on key factors impacting the market and also the trends that are prevailing in the market. There has been a growth in the adoption of smartphones among individuals due to the increasing usage of digital technologies in healthcare AR VR such as surgery planning, medical treatment and others. This has increased revenue contribution to GDP especially in developing regions such as Africa, APAC, and MEA. North America is expected to hold the highest revenue share of over US$ 5,000 Mn by the end of 2026 in the global healthcare AR VR market. Furthermore, SEA and other APAC regions are also expected to be the most lucrative regions for companies looking for emerging opportunities in the market.
However, it is yet to be seen whether the developing regions will be capable enough to hold a decent share of the healthcare AR VR market in the coming years.
But IT should beware of overcoming AI-wash only to fall victim to ML-sprawl
Bryan Betts, Freeform Dynamics Ltd.
We hear and read a lot about AI and machine learning (ML) these days, but outside a few core applications, such as autonomous vehicles or decision-support systems, how much use is it really seeing? How well is it doing in industry, for example? After all, the huge volumes of data generated within the modern factory, or by connected products and devices, would seem like a natural fit for ML’s data-sifting capabilities.
Yet in a recent study of AI plans and perceptions carried out by Freeform Dynamics, just over half of the IT professionals surveyed said their organisations were not currently prioritising AI investments (Figure 1). The reasons most commonly cited for this were excess hype and concerns over AI’s relevance and readiness.
The perceptual part of the problem is made worse by the tendency of some marketing people to indulge in ‘AI-washing’. They label almost anything automated as AI or AI-enabled, even when it simply has a few rules embedded in it – and of course when ML is just one component of what might one day turn into artificial intelligence.
Indeed, when most people refer to AI they are actually referring to ML, or perhaps to deep learning (DL), an ML subset that’s especially fashionable at present. Sometimes they excuse the conflation by labelling the thinking machines of the future as “general AI” and ML as “narrow AI”, although in some ways that’s just a differently-coloured coat of AI-wash.
This is gradually changing, with the term AI becoming more acceptable in certain usages. An example is virtual assistants or intelligent agents, especially where the AI is also able to act autonomously – remedying an insecure network port or S3 bucket, say. The key factor seems to be that these assistants combine multiple AI subsets, such as ML plus natural language processing and automated planning.
Making ML measurable
In any case, there is more to AI resistance than just an allergy to hype. Our study also revealed concerns around how to measure return on investment, how to scope, specify and cost the AI platform, and of course the availability of the necessary skills.
In addition, AI/ML is very much a multi-disciplinary matter, with dependencies that go way beyond IT. It’s essential therefore to also have business users, operational staff and other engineering disciplines involved right from the very start, from evaluation, budgeting and planning, through to implementation and ongoing operations.
Yet most of our survey respondents said it was a challenge to get all the relevant stakeholders and disciplines or departments working together. Just 11% of those with experience of AI initiatives said they had the full involvement of those required, while 37% said their stakeholders were “not at all” working together effectively.
The one area in our study where it flipped around to AI acceptance – to some degree, at least – was in manufacturing, where a slim majority of respondents (53%) said they were already prioritising AI investment. If you include those who agreed that they ought to be prioritising AI investment, as well as those already doing so, the totals rise from 60% for all sectors to 72% in manufacturing alone.
One of the biggest areas for AI/ML today is intelligent automation. This is often presented as a necessity: think about areas such as network security, or predictive maintenance in a factory, for instance. There is simply too much data now, too many alerts coming in, for a human to process it all. In applications such as these, ML can not only act as a filter but it can also take action, assuming that action would also be automatic on the human’s part.
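The "filter, then act" pattern described above can be sketched as follows. This is a toy illustration: the alert fields, thresholds and the scoring function are all invented assumptions, with the scoring function standing in for a trained ML model.

```python
# Minimal sketch of ML-style alert triage: score each alert, then either
# act automatically, escalate to a human, or suppress it as noise.
# The score() function stands in for a trained classifier; all fields
# and thresholds here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Alert:
    source: str
    failed_logins: int
    bytes_out_mb: float

def score(alert: Alert) -> float:
    """Toy anomaly score in [0, 1]; a real system would use a trained model."""
    s = min(alert.failed_logins / 50, 1.0) * 0.6
    s += min(alert.bytes_out_mb / 500, 1.0) * 0.4
    return s

def triage(alerts, act_threshold=0.8, review_threshold=0.4):
    """Route each alert: auto-remediate, queue for a human, or suppress."""
    actions = []
    for a in alerts:
        s = score(a)
        if s >= act_threshold:
            actions.append((a.source, "auto-remediate"))
        elif s >= review_threshold:
            actions.append((a.source, "human-review"))
        else:
            actions.append((a.source, "suppress"))
    return actions

alerts = [
    Alert("host-a", failed_logins=60, bytes_out_mb=900),  # clearly malicious
    Alert("host-b", failed_logins=30, bytes_out_mb=250),  # borderline
    Alert("host-c", failed_logins=1, bytes_out_mb=5),     # background noise
]
print(triage(alerts))
```

The human only sees the borderline case; the clear-cut cases are handled or discarded automatically, which is precisely how ML copes with alert volumes no analyst could process manually.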
One thing our research into AI has shown though is that this only tells part of the story. For a start, the technology we want (or need) to automate must actually be capable of automation. It is something of a self-reinforcing loop – the application of technology to a process also enables it to be automated, and as the process’s speed and complexity ramp up, that automation becomes pretty much essential (Figure 2).
AI as an agent of change
And then there is automation and ML not just as ways to keep up with an ever-faster world, but as creators of opportunity. By automating the routine, we should – in theory, at least – free up human and mental resources for new tasks. Certainly, past industrial revolutions have subsequently driven innovations and changes in culture and society that went far beyond their initial industrial impact.
It remains to be seen if intelligent automation will indeed amount to a new industrial revolution, as some believe it will. Even if it does, will it be as disruptive and all-embracing as the changes brought by steam power and mass production? For instance, the shopper of today is used to clothing and fashion being largely disposable. Yet in pre-industrial times, the effort required to grow fibres on a sheep or plant, then manually spin, weave, sew and dye them, meant that your clothing could be the most valuable thing you owned.
It is quite possible that we are in the position of our medieval ancestors, trying to imagine a world in which a garment, far from being something that you leave in your will, costs the same as a pint of beer – and can be discarded as readily. That means we need to plan for a future that we cannot see – not just for the things that we know that we need more information on (the known-unknowns), but for the “unknown unknowns”.
For example, we know that while previous industrial revolutions destroyed some jobs, they created others. But can we assume that will always be true, and even if it is, will the new jobs be of equal value, status and skill – will master weavers once again be replaced by mere machine-minders?
Taking advantage of AI today
Whatever changes may follow in the future, our survey respondents clearly see opportunities in using AI today, to help both staff and customers, and to improve operational efficiency (Figure 3).
Yes, there is the problem of simply knowing how and where to start, and then what to expect. And yes, there is the challenge of avoiding AI bias, which can creep in in very simple ways – for example by using training data that reflects the current state of affairs when that itself is biased.
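How easily bias creeps in from training data can be shown with a deliberately tiny example. The records and the "model" below (a per-group rate memorised from history) are entirely invented; the point is that a model trained on skewed historical decisions simply reproduces the skew.

```python
# Toy illustration of training-data bias. Historical decisions approved
# group B less often than group A; a naive "model" that learns each
# group's approval rate inherits that skew unchanged.
# All records here are invented for illustration.

from collections import defaultdict

history = [
    ("A", 1), ("A", 1), ("A", 1), ("A", 0),
    ("B", 1), ("B", 0), ("B", 0), ("B", 0),
]

def learn_rates(records):
    """'Train' by memorising each group's historical approval rate."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in records:
        totals[group] += 1
        approvals[group] += approved
    return {g: approvals[g] / totals[g] for g in totals}

rates = learn_rates(history)
print(rates)  # the learned model reproduces the historical skew
```

No malicious intent is required: the data honestly reflects "the current state of affairs", and that is exactly the problem.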
AI tools and technologies are increasingly available in packaged forms, however, and it’s useful to remember that they are just tools and technologies. For any technology to win adoption it has to be trustworthy: people don’t need to understand how it works, but it needs to be explainable, fair and accessible. AI is no different.
The one caveat is to start thinking now about strategy and architecture – if you’re not already doing so. Something we’ve seen in our research time and again is that although the barrier to acceptance is high, once a new technology has proved itself in one application it tends to find others pretty fast. So unless you get reusable processes and platforms in place first, you could end up in a disjointed landscape of incompatible silos and stacks. We’ve already had VM-sprawl and cloud-sprawl – let’s not have ML-sprawl next!
IT spending in EMEA will return to growth at $798 billion in 2020, an increase of 3.4% from 2019, according to the latest forecast by Gartner, Inc.
“2020 will be a recovery year for IT spending in EMEA after three consecutive years of decline,” said John Lovelock, research vice president at Gartner. “This year declines in the Euro and the British Pound against the U.S. Dollar, at least partially due to Brexit concerns, pushed some IT spending down and caused a rise in local prices for technology hardware. However, 2020 will be a rebound year as Brexit is expected to be resolved and the pressure on currency rates relieved.”
Spending on Devices Still Down While Enterprise Software Spending Keeps Rising
In 2019, EMEA spending on devices (including PCs, tablets and mobile phones) is set to decline 10.7% (see Table 1). Higher prices — partly due to currency declines and a lack of new “must have” features in mobile phones — have allowed consumers to defer upgrades for another year. Devices spending will not rebound in 2020, but instead fall by 1.3%, as both businesses and consumers move away from spending on PCs and tablets.
After 2019, the communications services segment will achieve long-term growth despite fixed-line services in both the consumer and business spaces declining every year through 2023. While mobile voice spending is flat — mainly due to price declines — mobile data spending increases 3% to 4% per year, which is keeping the overall communications services market growing in 2020.
Enterprise software will remain the fastest-growing market segment in 2020. EMEA spending for enterprise software will increase 3.4% and 9.2% in 2019 and 2020, respectively. Software as a service (SaaS) will achieve 14.1% growth in 2019 and 17.7% in 2020.
Table 1. EMEA IT Spending Forecast (Millions of U.S. Dollars)
[Table showing 2019 Growth (%) and 2020 Growth (%) by segment, including Data Center Systems; figures not reproduced]
Source: Gartner (November 2019)
Regulatory Compliance Fuels Spending
The complex geopolitical environment across EMEA has pushed regulatory compliance to the top of the priority list for many organizations. EMEA spending on security will grow 9.3% in 2019 and rise by 8.9% in 2020.
“Globally, security spending is increasing and being driven by the need to be compliant with tariffs, trade policies and intellectual property rights. In EMEA, privacy and compliance concerns, further driven by GDPR, take precedence,” said Mr. Lovelock.
The U.S. is leading cloud adoption and accounts for over half of global spending on cloud, which will total $140.4 billion in 2020, up 15.5% from 2019. In terms of cloud spending, the U.K. holds the No. 2 position behind the U.S., devoting 8% of its IT spending to public cloud services – a total of $16.6 billion in 2020, up 13.2% from 2019. In EMEA, overall spending on public cloud services will reach $57.7 billion in 2020, up from $50 billion in 2019.
Gartner predicts that organizations with a high percentage of IT spending dedicated to the cloud will become the recognized digital leaders in the future.
“Organizations in Europe, regardless of industry, are shifting their balance from traditional to digital — moving toward “techquilibrium,” a technological balancing point that defines how digital an enterprise needs to be to compete or lead,” said Mr. Lovelock. “Not every company needs to be digital in the same way or to the same extent. This move towards rebalancing the traditional and digital is clearly visible amongst EMEA companies."
Even as digitalization is becoming pervasive among EMEA organizations, these organizations remain vulnerable to “turns,” forces that create uncertainty and pressure for their CIOs. Ninety percent of EMEA organizations have faced a turn in the last four years, according to Gartner Inc.’s annual global survey of CIOs. With 2020 likely to be more volatile than 2019, CIOs must help their organizations acquire the capabilities needed to win when the next turn arrives.
“Simply being digital isn’t really going to cut it anymore. Forty-one percent of EMEA CIOs are already running mature digital businesses, up from 35% last year. It’s the coming turns that are the problem, not digitalization,” said Andy Rowsell-Jones, vice president and distinguished analyst at Gartner. “No one is immune from economic, geopolitical, technical or societal turns, which are likely to be more common in 2020 and beyond. These turns can take different forms and can disrupt an organization’s abilities in many ways.”
The 2020 Gartner CIO Agenda Survey gathered data from more than 1,000 CIO respondents, including 383 in 40 EMEA countries, and all major industries. The EMEA sample represents nearly $1.7 trillion in revenue and $23 billion in IT spending.
Turns Weaken an Organization’s Ability to Build Fitness
The nature and severity of the turns reported by CIOs varied widely. In EMEA, they included adverse regulatory intervention (41% of CIO respondents), organizational disruption (40%) and severe operating cost pressure (40%). “Those and other severe scenarios such as IT service failure or natural disaster have affected how organizations do business, and in most cases are making new things harder,” said Mr. Rowsell-Jones. “The real trick is how well organizations weathered these challenges.”
Mr. Rowsell-Jones added that the problem with a turn is that, once you’re in one, it disrupts your ability to respond. Thirty-six percent of EMEA CIOs reported that turns had handicapped them in bringing new business initiatives to market, pushing out time to value and ultimately reducing their success.
Turns also weaken an organization’s ability to build its “fitness” muscles. Thirty-four percent of EMEA CIOs said they were behind in the race to attract the right talent, 30% were suffering slowed or negative IT budget growth, and 32% said funding for new business initiatives had dried up. “No matter what it is, prepare for a turn before you attempt to go around it,” said Mr. Rowsell-Jones.
Fit Organizations Emerge Stronger From Turns Than Fragile Organizations
In the global survey, Gartner separated CIO respondents’ organizations that had suffered a severe turn into two groups — “fit” and “fragile” — based on business outcomes. Gartner looked at over 50 organizational performance attributes to determine what differentiated the two groups.
Results were bundled into three themes — alignment, anticipation and adaptability — which indicated simple shifts in leadership priorities. For example, in times of crisis, leaders of fit organizations ensure that the organization stays together while it shifts to a new direction. They actively search for emerging trends or situations that require change. They take calculated risks but trust the core business capabilities. Leaders in fragile businesses are measurably worse at these things.
The differences between fit and fragile organizations do not end with leadership attitudes. Internal institutions and process matter too. For example, disciplined IT investment decisions and having a clear strategy were cited as key areas of alignment by fit organizations, but much less so by fragile organizations. The survey found that 53% of global fit organizations are likely to have a flexible IT funding model to respond to changes, compared to 43% in EMEA. In addition, having a clear and consistent overall business strategy ranks as one of the most distinctive traits of fit organizations. Sixty-seven percent of the global fit CIOs said their organization excels at this; so, too, did 63% of EMEA CIOs.
A robust relationship with the CEO is also a strong differentiator. A strong CEO relationship helps CIOs learn about a change in business strategy as it occurs. In EMEA, 43% of CIOs are reporting directly to their CEOs. This demonstrates that EMEA CIOs are aligning their purpose and direction with business executives.
EMEA CIOs Are Prioritizing Cybersecurity, RPA and AI for 2020
Fit IT leaders use IT to gain competitive advantage and help organizations anticipate changes ahead of time. Similar to the global fit CIOs, a large percentage of EMEA CIOs have already deployed or will deploy cybersecurity, robotic process automation (RPA) and artificial intelligence (AI) in the next 12 months.
Adaptability is the IT organization’s chief responsibility. “In a fit organization, IT leaders turn the IT organization into an instrument of change,” said Mr. Rowsell-Jones. “Sixty percent of global fit organizations rate the clarity and effectiveness of IT governance very highly. Similarly, 51% of EMEA CIOs rank their organization’s IT governance as effective or highly effective. IT governance can coordinate resource allocation in times of disruption, which offers survival benefits to an IT organization.”
“Digital is no longer the road to competitive advantage,” said Mr. Rowsell-Jones. “How organizations flex as the environment changes and how CIOs organize to deal with turns will dictate their success in the future, especially in 2020.”
Worldwide Public Cloud revenue to grow 17% in 2020
The worldwide public cloud services market is forecast to grow 17% in 2020 to total $266.4 billion, up from $227.8 billion in 2019, according to Gartner, Inc.
“At this point, cloud adoption is mainstream,” said Sid Nag, research vice president at Gartner. “The expectations of the outcomes associated with cloud investments therefore are also higher. Adoption of next-generation solutions are almost always ‘cloud-enhanced’ solutions, meaning they build on the strengths of a cloud platform to deliver digital business capabilities.”
Software as a service (SaaS) will remain the largest market segment, which is forecast to grow to $116 billion next year due to the scalability of subscription-based software (see Table 1). The second-largest market segment is cloud system infrastructure services, or infrastructure as a service (IaaS), which will reach $50 billion in 2020. IaaS is forecast to grow 24% year over year, which is the highest growth rate across all market segments. This growth is attributed to the demands of modern applications and workloads, which require infrastructure that traditional data centers cannot meet.
Table 1. Worldwide Public Cloud Service Revenue Forecast (Billions of U.S. Dollars)
[Table showing revenue by segment: Cloud Business Process Services (BPaaS), Cloud Application Infrastructure Services (PaaS), Cloud Application Services (SaaS), Cloud Management and Security Services, and Cloud System Infrastructure Services (IaaS); figures not reproduced]
BPaaS = business process as a service; IaaS = infrastructure as a service; PaaS = platform as a service; SaaS = software as a service
Note: Totals may not add up due to rounding.
Source: Gartner (November 2019)
Various forms of cloud computing are among the top three areas where most global CIOs will increase their investment next year, according to Gartner. As organizations increase their reliance on cloud technologies, IT teams are rushing to embrace cloud-built applications and relocate existing digital assets. “Building, implementing and maturing cloud strategies will continue to be a top priority for years to come,” said Mr. Nag.
“The cloud managed service landscape is becoming increasingly sophisticated and competitive. In fact, by 2022, up to 60% of organizations will use an external service provider’s cloud managed service offering, which is double the percentage of organizations from 2018,” said Mr. Nag. “Cloud-native capabilities, application services, multicloud and hybrid cloud comprise a diverse cloud ecosystem that will be important differentiators for technology product managers. Demand for strategic cloud service outcomes signals an organizational shift toward digital business outcomes.”
More than 740,000 autonomous-ready vehicles to be added to global market
By 2023, worldwide net additions of vehicles equipped with hardware that could enable autonomous driving without human supervision will reach 745,705 units, up from 137,129 units in 2018, according to Gartner, Inc. In 2019, net additions will be 332,932 units. This growth will predominantly come from North America, Greater China and Western Europe, as countries in these regions become the first to introduce regulations around autonomous driving technology.
Net additions represent the annual increase in the number of vehicles equipped with hardware for autonomous driving. They do not represent sales of physical units, but rather demonstrate the net change in vehicles that are autonomous-ready.
“There are no advanced autonomous vehicles outside of the research and development stage operating on the world’s roads now,” said Jonathan Davenport, principal research analyst at Gartner. “There are currently vehicles with limited autonomous capabilities, yet they still rely on the supervision of a human driver. However, many of these vehicles have hardware, including cameras, radar, and in some cases, lidar sensors, that could support full autonomy. With an over-the-air software update, these vehicles could begin to operate at higher levels of autonomy, which is why we classify them as ‘autonomous-ready.’”
While growth in autonomous-driving-capable vehicles is forecast to be rapid, net additions of autonomous commercial vehicles remain low in absolute terms when compared with equivalent consumer autonomous vehicle sales. The number of vehicles equipped with hardware that could enable autonomous driving without human supervision in the consumer segment is expected to reach 325,682 in 2020, while the commercial segment will see just 10,590 (see Table 1).
Table 1: Autonomous-Ready Vehicles Net Additions, 2018-2023
Source: Gartner (November 2019)
Lack of Regulation Inhibiting Autonomous Vehicle Deployment
Today, there are no countries with active regulations that allow production-ready autonomous vehicles to operate legally, which is a major roadblock to their development and use.
“Companies won’t deploy autonomous vehicles until it is clear they can operate legally without human supervision, as the automakers are liable for the vehicle’s actions during autonomous operation,” said Mr. Davenport. “As we see more standardized regulations around the use of autonomous vehicles, production and deployment will rapidly increase, although it may be a number of years before that occurs.”
Sensor Hardware Costs a Limiting Factor
By 2026, the cost of the sensors needed to deliver autonomous driving functionality will be approximately 25% lower than in 2020. Even with such a decline, these sensor arrays will remain prohibitively expensive, meaning that through the next decade advanced autonomous functionality will be available only on premium vehicles and vehicles sold to mobility service fleets.
“Research and development robo-taxis with advanced self-driving capabilities cost as much as $300,000 to $400,000 each,” said Mr. Davenport. “Sophisticated lidar devices, which are a type of sensor needed for these advanced autonomous vehicles, can cost upward of $75,000 per unit, which is more than double the price of your average consumer automobile. This puts higher-level autonomous vehicle technology out of reach for the mainstream market, at least for now.”
Public Perceptions of Safety Will Determine Growth
Vehicle-human handover safety concerns are a substantial impediment to the widespread adoption of autonomous vehicles. Currently, autonomous vehicle perception algorithms are still slightly less capable than human drivers.
“A massive amount of investment has been made into the development of autonomous vehicle perception systems, with more than 50 companies racing to develop a system that is considered safe enough for commercial use,” said Mr. Davenport. Gartner predicts that it will take until 2025 before these systems demonstrate capabilities that are an order of magnitude better than human drivers.
To accelerate this innovation, technology companies are using simulation software powered by artificial intelligence to understand how vehicles would handle different situations. This enables companies to generate thousands of miles of vehicle test data in hours, which would take weeks to obtain through physical test driving.
“One of the biggest challenges ahead for the industry will be to determine when autonomous vehicles are safe enough for road use,” said Michael Ramsey, senior director analyst at Gartner. “It’s difficult to create safety tests that capture the responses of vehicles in an exhaustive range of circumstances. It won’t be enough for an autonomous vehicle to be just slightly better at driving than a human. From a psychological perspective, these vehicles will need to have substantially fewer accidents in order to be trusted.”
By 2023, digital transformation spending will grow to more than 50% of all ICT investment from 36% today; largest growth in data intelligence and analytics.
International Data Corporation (IDC) has unveiled IDC FutureScape: Worldwide Digital Transformation 2020 Predictions. In this year's DX predictions, IDC highlights the critical business drivers accelerating DX initiatives and investments as companies seek to effectively navigate business challenges, compete at hyperscale, and meet rising customer expectations.
In an IDC FutureScape Web conference held today at 12:00 pm U.S. Eastern time, IDC analysts Bob Parker and Shawn Fitzgerald discussed the ten industry predictions that will impact digital transformation efforts of CIOs and IT professionals over the next one to five years and offered guidance for managing the implications these predictions harbor for their IT investment priorities and implementation strategies. To register for an on-demand replay of this Web conference or any of the IDC FutureScape Web conferences, please visit https://www.idc.com/idcfuturescape2020.
The predictions from the IDC FutureScape for Worldwide Digital Transformation are:
Prediction 1 – Future of Culture: By 2024, leaders in 50% of G2000 organizations will have mastered "future of culture" traits such as empathy, empowerment, innovation, and customer- and data-centricity to achieve leadership at scale.
Prediction 2 – Digital Co-Innovation: By 2022, empathy among brands and for customers will drive ecosystem collaboration and co-innovation among partners and competitors that will drive 20% collective growth in customer lifetime value.
Prediction 3 – AI at Scale: By 2024, with proactive, hyperspeed operational changes and market reactions, artificial intelligence (AI)-powered enterprises will respond to customers, competitors, regulators, and partners 50% faster than their peers.
Prediction 4 – Digital Offerings: By 2023, 50% of organizations will neglect investing in market-driven operations and will lose market share to existing competitors that made the investments, as well as to new digital market entries.
Prediction 5 – Digitally Enhanced Workers: By 2021, new future of work (FoW) practices will expand the functionality and effectiveness of the digital workforce by 35%, fueling an acceleration of productivity and innovation at practicing organizations.
Prediction 6 – Digital Investments: By 2023, DX spending will grow to over 50% of all ICT investment from 36% today, with the largest growth in data intelligence and analytics as companies create information-based competitive advantages.
Prediction 7 – Ecosystem Force Multipliers: By 2025, 80% of digital leaders will devise and differentiate end-customer value measures from their platform ecosystem participation, including an estimate of the ecosystem multiplier effects.
Prediction 8 – Digital KPIs Mature: By 2020, 60% of companies will have aligned digital KPIs to direct business value measures of revenue and profitability, eliminating today's measurement crisis where DX KPIs are not directly aligned.
Prediction 9 – Platforms Modernize: Driven both by escalating cyberthreats and needed new functionality, 65% of organizations will aggressively modernize legacy systems with extensive new technology platform investments through 2023.
Prediction 10 – Invest for Insight: By 2023, enterprises seeking to monetize benefits of new intelligence technologies will invest over $265 billion worldwide, making DX business decision analytics and AI a nexus for digital innovation.
According to Shawn Fitzgerald, research director, Worldwide Digital Transformation Strategies, "Now in its fourth annual installment, our digital transformation predictions mark the next set of inflection points and related consequences executives should evaluate for inclusion into their multi-year planning scenarios. Direct digital transformation (DX) investment is growing at 17.5% CAGR and expected to approach $7.4 trillion over the years 2020 to 2023 as companies build on existing strategies and investments; becoming digital-at-scale future enterprises. Organizations with new digital business models at their core are well positioned to successfully compete in the digital platform economy."
Worldwide spending on AR/VR to reach $18.8 billion in 2020
Worldwide spending on augmented reality and virtual reality (AR/VR) is forecast to be $18.8 billion in 2020, an increase of 78.5% over the $10.5 billion International Data Corporation (IDC) expects will be spent in 2019. The latest update to IDC's Worldwide Augmented and Virtual Reality Spending Guide also shows that worldwide spending on AR/VR products and services will continue this strong growth throughout the 2019-2023 forecast period, achieving a five-year compound annual growth rate (CAGR) of 77.0%.
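The figures above all rest on compound annual growth rate (CAGR) arithmetic. As a quick sanity check on the quoted numbers, the standard CAGR formula can be sketched in a few lines of Python (the figures used are IDC's as quoted above; everything else is just illustrative):

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` periods."""
    return (end / start) ** (1 / years) - 1

# IDC's quoted figures: $10.5 billion in 2019 growing to $18.8 billion in 2020
one_year_growth = cagr(10.5, 18.8, 1)      # roughly the 78.5% increase cited

# A 77.0% CAGR compounded over the four year-on-year steps from 2019 to 2023
projected_2023 = 10.5 * (1 + 0.77) ** 4    # implied 2023 spending, in $ billions
```

Applying the formula confirms the internal consistency of the release: a single-year jump from $10.5 billion to $18.8 billion is growth of roughly 79%, and a 77% CAGR sustained from 2019 would put 2023 spending in the low hundreds of billions.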
Worldwide spending on AR/VR solutions will be led by the commercial sectors, whose combined share of overall spending will grow from less than 50% in 2020 to 68.8% in 2023. The commercial industries that are expected to spend the most on AR/VR in 2020 are retail ($1.5 billion) and discrete manufacturing ($1.4 billion). Fifteen industries are forecast to deliver CAGRs of more than 100% over the five-year forecast period, led by securities and investment services (181.4% CAGR) and banking (151.9% CAGR). Consumer spending on AR/VR will be greater than that of any single enterprise industry ($7.0 billion in 2020) but will grow at a much slower pace (39.5% CAGR). Public sector spending will maintain a fairly steady share of overall spending throughout the forecast.
"AR/VR commercial uptake will continue to expand as cost of entry declines and benefits from full deployment become more tangible. Focus is shifting from talking about technology benefits to showing real and measurable business outcomes, including productivity and efficiency gains, knowledge transfer, employee safety, and more engaging customer experiences," said Giulia Carosella, research analyst, European Industry Solutions, Customer Insights & Analysis.
Commercial use cases will account for nearly half of all AR/VR spending in 2020, led by training ($2.6 billion) and industrial maintenance ($914 million) use cases. Consumer spending will be led by two large use cases: VR games ($3.3 billion) and VR feature viewing ($1.4 billion). However, consumer spending will account for only a little over one third of all AR/VR spending in 2020, with public sector use cases making up the balance. The AR/VR use cases forecast to see the fastest growth in spending over the 2019-2023 forecast period are lab and field (post-secondary) (190.1% CAGR), lab and field (K-12) (168.7% CAGR), and onsite assembly and safety (129.5% CAGR). Seven other use cases will also have five-year CAGRs greater than 100%. Training, with a 61.8% CAGR, is forecast to become the largest use case in terms of spending in 2023.
Hardware will account for nearly two thirds of all AR/VR spending throughout the forecast, followed by software and services. Services spending will see strong CAGRs for systems integration (113.4%), consulting services (99.9%), and custom application development (96.1%) while software spending will have a 78.2% CAGR.
"Across enterprise industries, we are seeing a strong outlook for standalone viewers play out in use case adoption. Enterprises will drive much of these high-end headset adoption trends. In the consumer segment, more affordable viewer models for gaming and entertainment purposes will see the broadest industry adoption," said Marcus Torchia, research director, Customer Insights & Analysis.
Of the two reality types, spending in VR solutions will be greater than that for AR solutions initially. However, strong growth in AR hardware, software, and services spending (164.9% CAGR) will push overall AR spending well ahead of VR spending by the end of the forecast.
On a geographic basis, China will deliver the largest AR/VR spending total in 2020 ($5.8 billion), followed by the United States ($5.1 billion). Western Europe ($3.3 billion) and Japan ($1.8 billion) will be the next two largest regions in 2020, but Western Europe will move up into second position by 2023. The regions that will see the fastest growth in AR/VR spending over the forecast period are Western Europe (104.2% CAGR) and the United States (96.1% CAGR).
IDC's Managed CloudView 2019, a global primary research-based study sampling 1,500 buyers and nonbuyers of managed cloud services, highlights how enterprise expectations of managed service providers (SPs), along with their ecosystem of public cloud provider partners, are shifting and will drive fundamental changes both in how buyers consume cloud services and in how providers position their business models to meet customer needs for these services.
"Enterprises continue to see tremendous value in utilizing managed SPs for managed cloud services to support transformation to cloud and provide multicloud management capabilities that help to orchestrate and manage across a broad array of hyperscalers and SaaS provider partners, the full range of cloud options (private, public, hybrid), and across the lifecycle of services, while supporting new innovations, critical business processes and industry requirements," said David Tapper, vice president, Outsourcing and Managed Cloud Services at IDC. "However, a combination of changing customer perceptions and expectations, technological innovation, and pressures emerging from coopetition between managed SPs and their ecosystem partners of hyperscalers appear to be creating a tipping point for which managed SPs need to clearly assess their market position and what their long-term roles will be in optimizing their opportunities for managed cloud services."
Key findings from IDC's worldwide Managed CloudView 2019 study include the following:
· Changing buyer view of the roles of managed SPs and hyperscalers. Enterprise expectations of managed SPs and hyperscaler partners (public cloud providers) are changing, with managed SPs viewed as meeting transformation, strategy, and multicloud requirements, and public cloud providers as becoming more "strategic" partners meeting critical cloud service needs (e.g., easy integration, availability, rapid provisioning of applications, etc.)
· Shift in sourcing strategies toward public cloud providers. Firms expect that they will look to increase the number of public cloud providers they use in the future while 68% of firms worldwide will consolidate their portfolio of managed SPs.
· Need to support innovation using PaaS capabilities. Enterprises indicate significant use of managed SPs to support innovative capabilities involving development of cloud-native applications and use of open source and containers with PaaS (platform as a service) reaching upwards of 50% of enterprise application portfolios by 2024.
· Premiums for certified use of public clouds. Nearly all enterprises indicate willingness to pay a premium for using managed cloud services for public clouds that are certified by the public cloud provider, with 32% of firms willing to pay a premium from 21-40% and 28% from 41-60%.
· Preference for cloud business model in provisioning managed cloud services. Firms expect managed SPs to utilize the business model of public cloud/SaaS providers with 82% indicating a need for managed SPs to own their own cloud platform to be successful and 94% indicating that managed SPs need to offer SaaS capabilities.
IDC's IaaSView 2019, a primary research-based study sampling 1,500 global cloud Infrastructure-as-a-Service (IaaS) customers worldwide, highlights that existing applications continue to lead initial adoption of cloud infrastructure services for a majority of enterprises. While the trend is consistent with 2018, there was a marked increase in specific areas in 2019, such as the percentage of respondents reporting that their use of public cloud IaaS was led by scaling existing applications, which was 45% in 2018 compared to 62% in 2019. This highlights the value of a consistent enterprise application hosting environment for businesses to increase their use of cloud infrastructure services.
"Compared to 45% in 2018, over 60% in 2019 report that initial public cloud adoption was primarily for scaling of existing enterprise applications," said Deepak Mohan, research director, Cloud Infrastructure Services at IDC. "This growth reflects both the rapid expansion of public cloud IaaS usage across enterprise IT customers and use cases and the importance of the transition path for existing applications and data as they move into a public cloud IaaS environment."
Other key findings from IaaSView 2019 include the following:
· Over half of the respondents reported that their decision to use public cloud IaaS was driven by factors other than cost, including access to cloud native frameworks, new technology services like AI/ML, and access to value-added services to manage their resources.
· The workloads for which enterprises are using public cloud infrastructure services are roughly evenly split between commercial off-the-shelf software and custom applications.
· A majority (58%) of respondents anticipate moving their commercial off-the-shelf software to a SaaS-based consumption model within the next five years.
· 78% of respondents report an increase in percentage of infrastructure budget allocated to public cloud IaaS.
· 42% of the respondents reported using one primary public cloud IaaS provider along with multiple secondary public cloud IaaS providers, and 19% report using multiple primary public cloud IaaS providers.
· 48% report that they have applications in one public cloud that regularly communicates with applications in a different public cloud.
· 52% report that they currently have a hybrid cloud infrastructure in place, which is actively used for optimal placement of workloads across infrastructure options.
"Public cloud is an increasingly critical component of the IT infrastructure mix at enterprises," said Ashish Nadkarni, group vice president, Infrastructure at IDC. "IDC's recent investments in new market intelligence programs like IaaSView underscore our recognition of the growth in importance of public cloud infrastructure services for enterprises, and our continued commitment to provide timely insights into enterprise buyer sentiments around the evolving infrastructure market."
London, late November, with the festive season just around the corner – what better time or location to recognise the technology achievements of key IT industry companies?
With regular MC Paul Trowbridge once again called away to watch Lewis Hamilton take his victory lap in Abu Dhabi, Mike Evans stepped up to the podium for the 10th edition of the SDC Awards, organised by Angel Business Communications, publisher and organiser of the Digitalisation World portfolio of digital publications and events. Major thanks are due to the main sponsors of the evening, with special thanks to headline sponsor CTERA, drinks sponsor Navisite and entertainment sponsor Lightbits, who ensured respectively that the overall event was a major, successful celebration and recognition of IT innovation; that liquid refreshment got us off to a thirst-quenching start; and that the comedian and casino ensured an enjoyable, memorable evening away from the awards themselves.
Following the bountiful drinks reception, 200+ attendees sat down to an excellent three-course dinner, followed by some top-class comedy courtesy of acclaimed comedian Jimmie McGhie. The serious business of the night followed, with 27 awards being handed out by award sponsors and Angel staff alike, and the evening ended with some high-stakes casino entertainment, where, fittingly, the only thing lost or won was virtual currency!
The full list of award winners, with accompanying pictures, follows below:
The importance of proactive performance monitoring and analysis in an increasingly complex IT landscape. Digitalisation World launches new one-day conference.
The IT infrastructure of a typical organisation has become much more critical and much more complex in the digital world. Flexibility, agility, scalability and speed are the watchwords of the digital business. To meet these requirements, it’s highly likely that a company must use a multi-IT environment, leveraging a mixture of on-premise, colocation, managed services and Cloud infrastructure.
However, with this exciting new world of digital possibilities comes a whole new level of complexity, which needs to be properly managed. If an application is underperforming, just how easily can the underlying infrastructure problem be identified and resolved? Is the problem in-house or with one of the third party infrastructure or service providers? Is the problem to do with the storage? Or, maybe, the network? Does the application need to be moved?
Right now, obtaining the answer to these and many other performance-related questions relies on a host of monitoring tools. Many of these can highlight performance issues, but not all of them can isolate the cause(s), and few, if any, of them can provide fast, reliable and consistent application performance problem resolution – let alone predict future problems and/or recommend infrastructure improvements designed to enhance application performance.
Application performance monitoring, network performance monitoring and infrastructure performance monitoring tools all have a role to play when it comes to application performance optimisation. But what if there was a single tool that integrated and enhanced these monitoring solutions and, what’s more, provided an enhanced, AI-driven analytics capability?
Step forward AIOps. A relatively new IT discipline, AIOps provides automated, proactive (application) performance monitoring and analysis to help optimise the increasingly complex IT infrastructure landscape. The four major benefits of AIOps are:
1) Faster time to infrastructure fault resolution – great news for the service desk
2) Connecting performance insights to business outcomes – great news for the business
3) Faster and more accurate decision-making for the IT team – great news for the IT department
4) Helping to break down the IT silos into one integrated, business-enabling technology department – good news for everyone!
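To make benefit (1) a little more concrete: at the heart of AIOps-style proactive monitoring is statistical baselining of metric streams, so that deviations are flagged before users notice them. The toy Python sketch below is purely illustrative (the function name, window size and sample data are all hypothetical, and real AIOps platforms use far more sophisticated models):

```python
from statistics import mean, stdev

def anomalies(samples, window=5, threshold=3.0):
    """Flag metric samples that deviate from a rolling baseline.

    A toy stand-in for the baselining an AIOps platform automates: each
    point is compared against the mean and standard deviation of the
    preceding `window` samples, and flagged if it lies more than
    `threshold` standard deviations from that baseline.
    """
    flagged = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(samples[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Steady ~20ms application response times with one spike at index 7
latency_ms = [20, 21, 19, 20, 22, 20, 21, 180, 20, 19]
print(anomalies(latency_ms))  # [7] – the spike is flagged automatically
```

The value an AIOps platform adds on top of a detector like this is correlation: tying the flagged spike back to a specific storage array, network path or application change, which is what turns monitoring data into faster fault resolution.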
AIOps is still in its infancy, but its potential has been recognised by many of the major IT vendors and service and cloud providers and, equally important, by an increasing number of end users who recognise that automation, integration and optimisation are vital pillars of application performance.
Set against this background, Angel Business Communications, the Digitalisation World publisher, is running a one-day event entitled AIOps – Enabling Application Optimisation. The event will be dedicated to AIOps as an essential foundation for application optimisation, recognising the importance of proactive, predictive performance monitoring and analysis in an increasingly complex IT landscape.
Presentations will focus on:
Companies participating in the event to date include: Bloor Research, Dynatrace, Masergy, SynaTek, Virtana, Zenoss.
To find out more about this new event – whether as a potential sponsor or attendee – visit the AIOps Solutions website: https://aiopssolutions.com/
Or contact Jackie Cannon, Event Director:
Tel: +44 (0)1923 690 205
DW talks to Krzysztof (Kristof) Franek, Founder and CEO of Open-E, about the company’s data storage technology and product portfolio.
1. Please would you provide some background on the company – when and why formed, key personnel etc.?
Open-E was established on September 9, 1998. Since 2000 we have offered RAID controllers from 3ware, an American start-up, and these products sold very well. At one point, together with our customers, we started to think about what could be done to improve sales. After consultations we hit on the idea of creating software that would help to manage data on Linux servers with 3ware RAID controllers. And that's how, at the end of 2001, our journey with mass storage began.
Currently the company provides enterprise-class data storage software that keeps your business secure and protected from any kind of threat. Our products offer best-in-class reliability and ROI, and of course are very flexible. We have solutions for any data storage usage scenario and for companies of any size.
At the moment, the products made by Open-E are used all over the world in over 100 countries, including Fortune 500 customers.
2. And what have been the key company milestones to date?
As I mentioned earlier, the very beginning was marked by RAID controllers. The idea for data storage solutions started to develop in 2001. At first I treated it all more like a hobby, but it later turned out that the client response was very positive and that we were getting more and more requests for additional features and improvements. So we launched Open-E NAS, which enabled almost any PC server to be transformed into a professional file server. This was further expanded when we added iSCSI support to our products – I should mention that we were one of the first companies on the market to join NAS and iSCSI within one product. But to cut a long story short, we soon launched Open-E DSS and later the Open-E Data Storage Software V6 (DSS V6), a unified file and block storage management OS that provided NAS, iSCSI, InfiniBand and Fibre Channel SAN functionality. Next there was Open-E DSS V7, which is still being sold and is highly appreciated on the market.
The biggest milestone, I believe, was the launch of Open-E JovianDSS in 2014, which is now our flagship product. However, we do not rest on our laurels, as we regularly launch new updates for Open-E JovianDSS in order to meet the market's ever-growing demands – for example, adding Fibre Channel support in 2018, and Split-Brain protection features and an Ethernet Cluster option in 2017. You could say each new release of Open-E JovianDSS is a milestone in its own right.
3. Please can you provide an overview of the company’s flagship JovianDSS product, with particular reference to how it distinguishes itself in what’s quite a busy market?
Our product is far more cost-effective than those offered by our competitors. Not only does it offer best-in-class features for data protection and data management, but Open-E JovianDSS is also highly flexible and compatible with a wide range of hardware products on the market. It is also equipped with a scriptable CLI/API that eases configuration and data management. What is more, the ZFS-based Open-E JovianDSS software-defined storage provides solutions for data storage, backup and disaster recovery within one application – a real bargain, considering how universal the product is.
4. In more detail, how does the product help to address backup and restore challenges?
Open-E JovianDSS is equipped with an On- & Off-site Data Protection feature for multiple native backups. This built-in feature allows users to create consistent snapshots and replicate them asynchronously to local and/or remote backup locations, so there is no limit to how many backup destinations our users can define. There are also clones – writable versions of snapshots – that in some cases help to organize your data. This way users have constant access to previous data versions and the option to restore them whenever the need arises, which can be a great help in the case of, for example, a ransomware attack.
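The snapshot/clone/rollback model described here can be illustrated conceptually. The Python sketch below is not Open-E's actual interface (the class and method names are hypothetical), just a minimal model of how point-in-time snapshots, writable clones and rollback interact:

```python
import copy

class SnapshottedStore:
    """Toy key-value store modelling the snapshot concepts described above.

    A conceptual illustration only – real snapshot implementations (e.g.
    in ZFS) use copy-on-write rather than full deep copies.
    """
    def __init__(self):
        self.data = {}
        self.snapshots = {}

    def snapshot(self, name):
        # A consistent point-in-time copy of the current state.
        self.snapshots[name] = copy.deepcopy(self.data)

    def rollback(self, name):
        # Restore a previous version, e.g. after a ransomware attack.
        self.data = copy.deepcopy(self.snapshots[name])

    def clone(self, name):
        # A writable copy of a snapshot; edits leave the snapshot intact.
        return copy.deepcopy(self.snapshots[name])

store = SnapshottedStore()
store.data["report.doc"] = "quarterly figures"
store.snapshot("nightly-backup")
store.data["report.doc"] = "ENCRYPTED BY RANSOMWARE"
store.rollback("nightly-backup")
print(store.data["report.doc"])  # "quarterly figures"
```

The key property is that each snapshot is immutable once taken: however the live data (or a clone) is subsequently modified, every named snapshot remains available as a restore point.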
5. And problems around business continuity?
Nowadays a lot of companies struggle to find a solution that would allow them uninterrupted access to data. I'd say business continuity is one of the elements that every good data storage software has to provide. Our answer to this problem – Open-E JovianDSS – offers dozens of options for architecting high availability environments with iSCSI, Fibre Channel (FC) and NFS, SMB (CIFS) client protocols. These options allow users to set up High Availability Load-Balanced Storage Clusters that ensure reliability and redundancy through failover in the case of a server crash. Our software helps users build their storage environment flexibly, according to individual business needs. A system can be based on two nodes, each with its own hard drives attached directly to the node, or with a connection by means of shared JBODs over SAS or FC, or even with two nodes and a common set of direct attached hard drives within one server enclosure.
6. And, of course, cloud storage is now a major issue?
Honestly speaking, we don't consider cloud storage to be an issue for Open-E. On the contrary, we believe it can be used as a complementary service to our own products: you can use Open-E JovianDSS as a backend solution for your cloud, or launch a JovianDSS instance in the cloud. It does not matter whether you are an enterprise, SMB or SOHO user building your own reliable Private Cloud, or an MSP/Service Provider delivering a Public Cloud for your clients – Open-E JovianDSS Data Storage Software meets the demanding requirements of building a secure and efficient cloud environment. You won't be disappointed.
7. And then there’s the emphasis on disaster recovery?
Yes, Open-E JovianDSS is a product that will help your business function without interruptions. It keeps you safe from natural disasters thanks to the advanced functionality in On- & Off-site Data Protection: you can easily restore your data from the same server (e.g. from one of multiple snapshots through the Snapshot Rollback function), from a local backup server (data can be recovered with a minimum of downtime by temporarily using a backup server as the data server while the primary server is being restored), as well as from off-site locations. We provide solutions for a variety of scenarios.
8. Finally, storage centralisation and virtualisation/software-defined storage?
Well, Open-E JovianDSS is data storage software that can be combined with VMware vSphere ESXi. It is a cost-effective, flexible and scalable solution for virtualization that offers the highest performance and data efficiency. Open-E JovianDSS-based virtualized storage setups with High Availability and/or a sophisticated hyper-converged infrastructure offer the best value and quality.
9. And high availability is more critical than ever?
I would say yes, since companies need constant access to their data, 24 hours a day, 7 days a week, and every moment of downtime means losses. The reliability and redundancy offered by Open-E JovianDSS will keep you safe from all kinds of harmful events. By using the Open-E JovianDSS Advanced Metro High Availability Cluster Feature Pack, you can create high availability for two server nodes with storage mirrored over Ethernet, using storage at each location (Dual Storage).
As cluster communication and data mirroring between nodes work over Ethernet, the nodes can be located far from each other as a (stretched) metro storage cluster – up to 50 miles (80 km) in the case of a point-to-point fibre optic connection, or even further when using an additional switch between the nodes.
We also offer the Open-E JovianDSS Standard HA Cluster Feature Pack with shared storage, which enables high availability for two server nodes connected to shared storage with one or more JBODs.
10. Moving on to company plans, how does Open-E work with its technology partners?
We cooperate with leading technology hardware and software vendors. Our alliances include technology industry leaders such as Intel, VMware, Toshiba, Microchip/Microsemi/Adaptec, Broadcom/LSI, ATTO and AIC, Supermicro and many others. We also work with emerging vendors to deliver support in our software for upcoming technologies. This allows our OEM and system integrator partners to customize and build the affordable, scalable and secure storage systems of their customers’ choosing.
11. And how does the company work with the Channel?
We support our partners and vice versa. You can also see Open-E at various events and conferences. Despite our large network of partners, we make every effort to treat each of them individually, according to their specific needs. Our authorization is also a guarantee of compatibility, quality and the trust we place in our partners. We offer four levels of partnership: every partner starts as an Authorized Partner and, as our co-operation grows, Open-E offers Silver, Gold and Platinum partnership levels, each with an increased set of benefits and requirements.
We also offer Open-E Certified Engineer Trainings to share our knowledge, plus the Open-E Certified Server program, which enables hardware vendors to verify that their components work seamlessly with Open-E products. Open-E Certified Storage Servers are tested, benchmarked and certified by Open-E. This way, customers are able to use solutions that require exceptional redundancy and security without compromising performance. Trainings are not offered to engineers only; sales professionals can be trained as well. We organize Open-E Certified Sales Professional Trainings, during which we present our products and technologies to participants and show how to sell more effectively.
12. Does Open-E have any plans to expand either geographically and/or into any specific vertical markets?
We are currently focusing on delivering the best possible products and solutions on the existing markets. Expanding geographically and any other way is a natural result of our activity and indeed, we are growing, developing and working on many other projects. Stay tuned!
13. And what can we expect from Open-E in terms of technology developments into 2020?
We will continue the development of our main product, Open-E JovianDSS. As for the rest of our projects, you will need to wait a little; they are works in progress, and we are sure they will be something outstanding.
14. Finally, would you be able to share a recent customer success story?
Yes. The most recent success story involves e-shelter, a company known for developing and running high availability data centers, which needed a storage solution known for its reliability and performance. For this task, Toshiba Electronics Europe GmbH, an e-shelter partner, chose Open-E JovianDSS, which was installed on August 30, 2017. Since then, excluding a planned shutdown in November 2018 for an Open-E JovianDSS update, not a single disk failure or downtime has occurred.
Apart from that, we have success stories from various markets: security services providers, publishers, video post-production, vehicle providers, educational institutions and many more. Please visit our website www.open-e.com to read more success stories from our partners and customers.
Digital transformation is essentially finding new ways to leverage data within an enterprise, to drive efficiency, revenue and business insight. Now a standard for modern business, digital transformation could be worth $18 trillion in additional business value, according to analyst house IDC.
By John Western, Regional Vice President, Europe, LucidWorks.
Every organisation has a unique pool of data at its disposal. However, these pools also face challenges from within organisations, such as the inability to manage, integrate and analyse colossal amounts of data, resistance to change and a lack of technologies to enable change. In the long term, failure to address these shortcomings and form a successful digital transformation strategy can have a detrimental impact.
Polaroid is a prime example of a company that failed to adapt to changing market needs, and as a consequence, went from being valued as a $3 billion company in 1991, to filing for bankruptcy in 2001, eventually selling its entire brand and assets. Its decision to prioritise its primary business of selling instant film, and half-hearted effort at digital transformation, prevented it from cashing in on the success of digital images.
So, what are the key considerations for an organisation in forming an effective digital transformation strategy?
Insights Derived From Data Direct Business Change
Digital transformation thrives on insights derived from data, which in turn provide a new perspective on business activity and allow the creation of new business models. Insights are distilled from data in context; they back up a hypothesis. Data is the raw material.
Digital transformation is powered by data-driven insight, which helps inform new perspectives on business activity, and which subsequently aids the development of new business models. So, are you systematically seeking and creating new forms of insight? In order to promote digital transformation, the insights dimension requires more data, more tools for analytics, quicker cycle times for analysis, and further automation of data preparation and engineering tasks. All this combines to form the raw materials of insights.
Awareness of Business Opportunities
Applying data insights to a larger segment of the enterprise allows firms to form a better perspective of the world, in addition to an awareness of new business challenges and opportunities. These insights can form the foundation for richer, more complex data models, allowing firms to identify how events are linked and the choices they make.
What types of awareness are essential in your digital transformation programme? To generate awareness, organisations need better tools for modelling and exploring data, in addition to the interactive and automated analysis of it. Digital customer data is the raw material that creates human awareness, and also creates the foundation for autonomous systems.
Optimisation Creates Business Value
Digital transformation programmes are iterative. Insights drive awareness, and awareness drives new business models. These should be implemented in stages, ideally with a small project to assess the product/market fit of the new digital model. Optimisation is integrating this process into a larger system, until the central business model driving the digital transformation expands to all relevant situations.
What optimisation will be needed as your digital business model expands? Digital transformation often demands products to constantly evolve. The challenge is to move quickly and in the right direction. Organisations need a strong data foundation to help them analyse and monitor the usage of a product, in addition to strong product management to facilitate collaborative thinking on how to evolve the product.
Automation Enables Scalability
While optimising processes could help a business grow, automation ensures scalability in the longer term. Regardless of industry, a prerequisite of the digital age is that businesses are required to represent themselves digitally. Companies often find that full automation is difficult, as it requires a complete remodelling of business processes into a brand-new architecture.
So, what is your strategy for creating an automation architecture for your digital business model? Automation used to be only about APIs and coding, and these still matter. Increasingly, however, automation is powered by machine learning algorithms that can identify patterns which humans can't. The functional landscape for automation needs to include as much coverage of the product by APIs as possible. This allows scripts, more advanced programmes and machine learning systems to manage the behaviour of the product or support system. Instrumentation should be integrated at all levels, to support feedback loops that enable learning and problem-solving.
Scalable Architecture Grows
The scalability of digital operations is a core component of digital transformation. It allows enterprises to continue building on iterative successes achieved and makes optimisation possible.
Scalable architecture can be built in stages, too. We often think of scalability as architecture that can expand, and even contract, when needed. For example, Amazon Web Services offers auto-scaling, which adjusts capacity upwards when you need more performance and winds down when you need less.
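The expand-and-contract behaviour described above can be sketched in a few lines. This is an illustrative toy, not AWS's actual implementation: the function name, thresholds and bounds below are all invented, but they capture the target-tracking idea, in which desired capacity scales with the ratio of an observed per-instance metric to its target, clamped to configured bounds.

```python
import math

def desired_capacity(current, metric_value, target, min_size=1, max_size=20):
    """Target-tracking scaling sketch: grow or shrink capacity so the
    per-instance metric (e.g. average CPU %) converges on the target."""
    if current == 0:
        return min_size
    desired = math.ceil(current * metric_value / target)
    # Never scale beyond the configured fleet bounds.
    return max(min_size, min(max_size, desired))

# Average CPU at 90% against a 60% target on 4 instances: scale out to 6.
print(desired_capacity(4, 90, 60))   # 6
# Load drops to 20%: scale in to 2.
print(desired_capacity(4, 20, 60))   # 2
```

The same arithmetic drives both directions, which is why auto-scaling can wind capacity down as naturally as it adds it.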
It is important to consider whether your existing architecture is scalable for the current stage. Are there signs it is wearing out? Although it is not mandatory to have the architecture for scalability at the start of an initiative, it is important to always plan for growth.
Ultimately, the hallmark of a successful digital business is the ability to remain agile even as the business grows; being able to renew itself, adapt and change simultaneously with its environment. As organisations today operate in a constantly changing technology environment, agility is the key to operating successfully; leveraging new data and insights to facilitate growth and stability.
Machine Learning Can Jump-Start Your Strategy
Creating a new digital strategy is not easy, and creating the wrong digital strategy can be costly. One fundamental key to making sure you’re on the right track is to continuously look for and create new insights across your organisation, data, documents, and systems.
To capture these insights requires more data and more analytics for data preparation. Manual data preparation alone can take months, and lead to wasted time and resources. And given that most automation requires rules and cleanly labelled datasets, this can become problematic.
However, machine learning techniques like clustering and classification that rely on algorithms to group similar pieces of data together can help reduce the manual burden of preparing the data needed to plan, execute, and measure your digital transformation.
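As a toy illustration of that clustering idea, here is a minimal k-means in plain Python that groups similar records automatically. The data and seed centroids are invented, and a real pipeline would use a library such as scikit-learn; the point is only that "grouping similar pieces of data together" is mechanical once a distance measure exists.

```python
import math

def kmeans(points, centroids, iterations=10):
    """Minimal k-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its members."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [
            tuple(sum(dim) / len(members) for dim in zip(*members))
            if members else centroids[i]
            for i, members in enumerate(clusters)
        ]
    return centroids, clusters

# Two obvious groups of "similar" records, e.g. tickets scored on two features.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
centroids, clusters = kmeans(data, centroids=[(0.0, 0.0), (5.0, 5.0)])
print(clusters[0])  # the low-valued group
print(clusters[1])  # the high-valued group
```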
Developing a winning digital transformation strategy can be challenging. Having crucial insights is imperative, and having the right tools to find those insights is fundamental to moving your business forward. And, make no mistake, many companies today are weighing up whether they can harness the people and data needed to transform into the next big thing, or whether they'll be left uncompetitive, with diminishing revenues, and ultimately failing.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 1.
Multi-cloud environments have been a hot topic for the last year. Already, businesses have been realising the benefits of a vendor-agnostic approach, which not only minimises costs but gives them the freedom to innovate. However, there are a couple of aspects of operations which will be key in ensuring multi-cloud remains viable for enterprises in the long-term.
Despite the freedom which comes with a vendor neutral ecosystem, orchestrators haven’t yet overcome the headache associated with migrating workloads between these different cloud infrastructures. The past year saw major cloud players like IBM making acquisitions to address this, but as yet, they haven’t found a successful solution. Over the next year, this will be a priority for enterprises looking to remove the bottlenecks in their CI/CD pipeline. Organisations will invest in services which can help them harness a multi-cloud ecosystem, by supporting fast deployment, scalability, integration and operational tasks across public and private clouds.
Another piece of the puzzle will be in observability and monitoring across clouds. To ensure operations are maintained across the entire ecosystem and that they are fulfilling the workloads, the components for observability must be in place. This becomes complex in a multi-cloud infrastructure, where the same level of visibility and governance must be applied across instances. 2020 will be the year public cloud providers start to put these projects together, and already we are seeing the first instances of this with the likes of Google Anthos.
2. Unicorn start-ups will begin repatriating workloads from the cloud
There has been a lot said about cloud repatriation of late. While this won’t be a mass exodus from the cloud - in fact quite the opposite, with public cloud growth expected to increase - 2020 will see cloud native organisations leveraging a hybrid environment to enjoy greater cost savings.
For businesses starting out or working with limited budgets, which require an environment for playing around with the latest technology, public cloud is the perfect place to start. With the public cloud, you are your own limit and get immediate reward for innovation. But as these costs begin mounting, it’s prudent to consider how to regain control of cloud economics.
Repatriating workloads to on-premise infrastructure is certainly a viable option, but it doesn't mean we will start to see the decline of cloud. As organisations pass each new milestone in the development process, repatriation becomes more and more of a challenge. What we will likely see is public cloud providers reaching into the data centre to support this hybrid demand, so that they can capitalise on the trend.
3. Public cloud providers will be subject to increased security standards
The US Department of Defense’s decision to award the 10-year contract for its JEDI project to Microsoft will prove to be a watershed moment, serving as a trigger for more government agencies to move applications and unify information in the public cloud. The lure of major Federal spending will drive other cloud providers to compete in this multi-billion dollar space.
One of the biggest impacts will be the need to raise security and compliance standards in the public cloud. Government bodies work to extremely high requirements, which will now be placed on cloud providers and will have a spillover effect on the sector as a whole. This will include higher standards for how hybrid environments are architected and the need for a complete data separation between public cloud and on-premise environments. It will also encourage a move away from the outsourcing model as organisations will seek to build up their in-house cloud skills to meet requirements.
While this will primarily impact the US cloud market, it will also have ripple effects for other markets. The hyperscale providers are global in nature and so will be required to adjust their policies and practices for jurisdictions such as the post-Brexit United Kingdom, where there will be new standards around data protection and data separation from non-UK entities.
4. Greater level of network automation through AI/Machine learning
The state of artificial intelligence and machine learning (AI/ML) in business has matured from a nebulous vision into tangible deployments. Companies are now giving a much heavier focus to AI/ML and are reorganising their IT and business operations to cater for the trend. We're observing this first hand through Kubeflow, where we see scores of startups and established enterprises joining every day to explore what they can do with AI/ML and how they can make deployments easier.
One specific area that’s already being enhanced by AI is in networking. We’re working with several IT and telecoms companies in this area that want to build better networks and gain far deeper insight into how those networks are being used – across everything from optimising power consumption through to the automation of maintenance tasks. In 2020 we will see the focus around AI/ML in the networking space get bigger than ever as more and more case studies emerge.
5. Kubernetes will no longer be seen as the silver bullet
Kubernetes has become an integral part of modern cloud infrastructure and serves as a gateway to building and experimenting with new technology. It’s little surprise that many companies we observe are doubling down on the application and reorienting their DevOps team around it to explore new things such as enabling serverless applications and automating data orchestration. We think this trend will continue at strength in 2020.
On a more cautious note, we may also see some companies questioning whether Kubernetes is really the correct tool for their purposes. While the technology can provide tremendous value, in some cases it can be complex to manage and requires specialist skills. As Kubernetes is now commonly being used for production at scale, it becomes increasingly likely that users encounter issues around security and downtime. As a result of these challenges, we can expect the community will mature and – in some cases – come to the viewpoint that it might not be right for every application or increase the need to bring in outsourced vendors to aid with specialised expertise.
6. 5G will enable ‘Netflix for gaming’
Mobile gaming has grown at a ferocious pace over the past decade to become a mainstream phenomenon. The global video gaming industry last year was worth nearly $138 billion, of which more than $70 billion was in mobile gaming. In 2020 we will see this trend accelerate further with the expansion of 5G, which offers the robust connectivity, low latency and bandwidth required to host heavy graphical content. This will enable the ‘streaming as a service’ model to flourish in gaming, giving users access to a curated set of gaming applications on a subscription basis, all at the touch of a button.
Where 5G really makes a difference in this instance is in freeing up compute power that would normally be required to download gaming applications. The popularity of cloud gaming (or gaming on demand) applications such as PlayStation Now and Shadow was accelerated by rapid developments in GPUs. In mobile, this same phenomenon will be made possible by 5G as an edge use case. An early example is Google's Stadia, due to go live in November of this year, which allows browser-based streaming.
We expect mobile gaming will become a far bigger trend in 2020 as the large telco providers roll out their 5G networks. This in turn will trigger a greater focus on open source as a means for building resilience and scalability into their cloud-based 5G core.
Craig Tavares, Head of Cloud, Aptum Technologies, says that new innovations for cloud technology are on the horizon, and as a continually evolving platform that seeks to meet scalability, accessibility, and security demands, the development of new and existing cloud features is something to look forward to moving into 2020.
These features include automated security scanning of cloud environments, data transfer acceleration through edge presence, application discovery, mobile access and control of multi-cloud infrastructure, along with new innovations in cloud platform management.
Data is at the forefront of any modern company's progress, driven by the increased need for accessibility to empower employees. One of the best ways organisations can create a larger ecosystem is through true hybrid cloud architectures and operational models. The technologies used in these models enable better mobility, interoperability, scalability and transparency. Large organisations have already started adopting them in order to increase the interoperability between applications, data and platforms; one example is the need for interconnectivity among enterprises operating on a range of different cloud platforms. We can expect smaller companies to emulate their larger counterparts in the near future, which means more companies will be leveraging cloud on-ramp services and emerging technology like software-defined WANs to connect disparate on-prem and public cloud environments.
The increased diversity, quantity and overall utilisation of data is the main driver behind the cloud revolution. Accompanying the sheer abundance of data and the creation of new SaaS applications is the expansion of hyperscale data centers. As many companies feel their infrastructure cannot match their pace of growth to suit future demands without incorporating significant changes, hyperscale infrastructure offers an alternative view of the future by lifting the burden of significant investment in infrastructure. SaaS requirements to support large numbers of mobile devices will inevitably drive companies toward hyperscale data centers, but it is not always easy to move workloads between on-premise servers and scalable data centers, especially on demand. Look for technologies that make this interaction seamless, such as applications deployed in containers with Kubernetes for automation, and more comprehensive data optimisation management tools.
The overall rise in the number of users and devices accessing the cloud, paired with the volume of data at their fingertips, significantly increases the need for cloud security. This will continue to be an important focus in the coming year because, whilst cloud is an enabler for meeting global mobility expectations, it inherently creates numerous vulnerabilities. Consequently, the number of bespoke cloud security products will need to increase to mitigate these threats. Key to this is an organisation's understanding of its own weaknesses, in order to provide the best tailored response; in doing so, a better allocation of resources to secure an IT environment becomes more viable. Technologies incorporating machine learning and AI will increasingly be used to automate infrastructure security scanning, remediate issues and even apply security hardening policies.
James Harvey, EMEA CTO, Cisco AppDynamics believes that we’re in the ‘Era of the Digital Reflex’. There’s a business-critical need to meet customers’ increasing digital experience expectations in 2020:
“Any outage, inconvenience or poor digital customer experience will cost UK businesses dearly in 2020. As we enter a new decade, we also enter the ‘Era of the Digital Reflex’ – our use of digital services and applications has evolved to become an unconscious extension of human behaviour, and in turn, consumer expectations about the performance of applications and digital services have sky-rocketed.
74 per cent of UK consumers are less tolerant of problems with digital services than they were two years ago, and over half switch suppliers when performance issues do occur (according to the recent App Attention Index Report). This highlights just how unforgiving customers have become.
With 5G network transformations in full swing, next year will see consumer expectations escalate further. Consumers will expect faster download speeds, better video quality and immersive experiences in any location and on any device. Poor performance is no longer acceptable. As businesses develop even more intuitive digital experiences to meet these demands, they must also be ready to deal with the increased back-end application complexity. Application performance monitoring, machine learning and AI are critical to solving this level of complexity, enabling businesses to gain visibility into a performance issue, insight into the root cause of the anomaly, and the ability to take immediate action and address performance issues before they impact the customer.
The ‘Era of the Digital Reflex’ will bring great opportunities to brands that invest and innovate in their digital customer experience. For those that fail to do this, 2020 will be a tough year.”
Aron Brand, CTO, CTERA, talks about 2020 and the new era of edge computing for enterprise IT:
2020 will mark a notable shift in enterprise IT as a new era of edge computing dawns. The first-generation model of centralised cloud computing and storage has now run its course, and most of the new opportunities for enterprise data management reside at the edge.
Consider that a growing volume of enterprise data is created in branch offices, on mobile devices and by IoT-enabled smart devices. Gartner estimates that in five years, 75% of data generated and processed by enterprises will exist at the edge rather than in the traditional centralised datacentre or cloud. Such data growth outside the datacentre is the new reality, and it is creating a need for enterprises to deploy computing power and storage capabilities at the network edge, or in other words, edge computing.
Edge computing combines a cloud service located at a datacentre (often called the “core”) with an edge device near the end user that is capable of autonomously satisfying a portion of the application functionality. For example, take your voice-based home virtual digital assistant from Amazon or Google. The voice processing for all of these devices takes place, believe it or not, in the cloud. The edge compute for these devices is limited to “wake words” that tell the device to send speech-to-text conversions to the cloud for processing.
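The core-versus-edge split in the assistant example can be sketched as follows. All the names here are invented for illustration: `cloud_transcribe` is a stand-in for the real cloud API, and a real device would run a small acoustic model rather than comparing strings. The point is the division of labour: the edge runs only the cheap wake-word gate, and audio reaches the cloud only once that gate fires.

```python
def cloud_transcribe(audio_frames):
    """Stand-in for the heavy speech processing done in the cloud (the 'core')."""
    return " ".join(audio_frames)

class EdgeAssistant:
    """Edge side: a lightweight wake-word gate; everything else goes to the cloud."""
    WAKE_WORD = "hey_device"

    def __init__(self):
        self.listening = False
        self.buffer = []
        self.cloud_calls = 0

    def on_frame(self, frame):
        if not self.listening:
            # Cheap local detection -- the only compute done at the edge.
            if frame == self.WAKE_WORD:
                self.listening = True
            return None
        if frame == "<silence>":                  # utterance finished
            self.listening = False
            self.cloud_calls += 1
            text = cloud_transcribe(self.buffer)  # heavy lifting in the cloud
            self.buffer = []
            return text
        self.buffer.append(frame)
        return None

assistant = EdgeAssistant()
for f in ["noise", "hey_device", "turn", "on", "lights", "<silence>"]:
    result = assistant.on_frame(f)
print(result)                  # "turn on lights"
print(assistant.cloud_calls)   # 1 -- the cloud was contacted only once
```

Shifting more of `cloud_transcribe` onto the device itself is exactly the "more powerful edge compute" the next paragraph anticipates.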
More powerful edge compute would make these devices even more desirable, offering snappier responses, continual availability and higher security levels. With those benefits, it won't be long before edge computing becomes mainstream in consumer products. Nor is it surprising that enterprises are already investing in this kind of edge computing.
Edge computing will continue to grow and will likely become mainstream in enterprise IT in 2020. As enterprises from all verticals adopt edge computing to accelerate their digital transformation, edge-to-cloud architectures that manage data centrally while making it instantly available to users at the edge will be a key enabler for business success. By offering low latency, reliable access to files, and cloud-scale economics, edge-to-cloud file services can revolutionise the way enterprises manage their valuable data assets.
A common misconception around the customer journey is that it’s a process of A to B. In reality, requirements and circumstances often change at many points along the way – much like a car trip with unexpected traffic and road closures. This is why, as IT requirements get more complex, AI for IT Operations (AIOps) is becoming the future of IT management.
By Lee James, CTO, EMEA at Rackspace.
That’s certainly the view of Gartner, which predicts that by 2020, approximately 50 per cent of enterprises will use AIOps technologies together with application performance management (APM) – up from 10 per cent in late 2018.
Take GPS app Waze as an example. It provides live directions and traffic alerts while the philosophy of AIOps is to enhance IT operations through machine learning, analytics, and big data. Where Waze looks at the thousands of other cars on the road and incorporates user-submitted updates in real-time, AIOps can proactively and often pre-emptively detect incidents and correlate events across ecosystems.
AIOps is a far cry from the ‘robots are taking over’ scenario that tends to play out in the popular imagination. It is about using multi-layered technology platforms to make operations smarter and free up resources. To put it simply, it presents two key opportunities for businesses. Firstly, it enables faster management of IT issues, which in turn reduces the scope for reputational risks. It also drives customer centricity by helping businesses understand where customers would like them to innovate.
Navigating reputational risk
Managing cloud services is no straightforward feat. It is a challenging and resource-intensive process. Picture this: a warning sign pops up and the IT team has to get together to decide what to do about it. In the meantime, the business’ service is down and customers are filling the gap with a competitor’s service.
In today’s landscape of 24/7 delivery and ever-increasing customer expectations, customers are willing to voice their opinion across different platforms including social media if something is amiss in their experience. This means reputation and service delivery are more closely linked than ever before, and reputational damage can happen in a matter of seconds.
To mitigate the risk of reputational damage, businesses need to be able to act immediately, have teams understand what's happening across different services, and quickly gather and analyse feedback from a growing number of platforms. Many businesses have introduced a level of automation in an attempt to achieve this, such as a chatbot advising when the next customer representative is available. But this is effectively just a quick-fix solution, as humans will still need to be involved to develop a resolution.
AIOps comes into its own here, with its ability to automatically detect, diagnose and, in many cases, remediate service issues in real time, codifying knowledge from previous incidents to find and fix future ones quickly and accurately, with far less effort, and ultimately protecting the business's revenues and reputation.
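The detect-and-codify loop can be sketched in miniature. This is an illustrative toy, not any vendor's AIOps engine: anomalies are flagged with a simple rolling deviation test, and the "codified knowledge from previous incidents" is just a lookup table mapping known incident signatures to remediation actions.

```python
import statistics

# Codified knowledge from previous incidents: signature -> remediation action.
RUNBOOK = {
    "latency_spike": "restart overloaded service instance",
    "error_burst": "roll back last deployment",
}

def detect_anomalies(samples, window=5, threshold=3.0):
    """Flag samples that sit more than `threshold` standard deviations
    from the mean of the preceding `window` samples."""
    incidents = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1e-9  # avoid divide-by-zero
        if abs(samples[i] - mean) / stdev > threshold:
            incidents.append((i, "latency_spike"))
    return incidents

# Response-time samples in ms; one obvious spike a human might miss in a flood.
latencies = [101, 99, 100, 102, 98, 100, 101, 480, 99, 100]
for index, signature in detect_anomalies(latencies):
    print(f"sample {index}: {signature} -> {RUNBOOK[signature]}")
```

A production system would correlate many metric streams and learn its signatures rather than hardcode them, but the shape is the same: detect, match against prior incidents, act.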
Driving customer experience
When it comes to business data, there’s a difference between background noise and the messages customers want a business to hear. Keeping on top of customer feedback and understanding how much should be closely listened to is becoming increasingly complex, with more data types and sources to track.
With AIOps, businesses not only receive the right information but have faster access to it, enabling them to make better decisions. It can be used to correlate data across millions of customer journeys to identify patterns, developing a clearer understanding of how customers would like services delivered and opportunities to enhance the customer experience.
Much like Waze uses smart technology to alert users to traffic incidents and map the best possible route in real time, AIOps allows businesses to navigate issues and creates scope to improve at every juncture. It’s clear that Waze has transformed the way users travel and similarly, AIOps will transform IT management and allow IT teams to deliver real benefits back to the business in terms of revenue, reputation management, and customer happiness.
Digital companies that have created platform-based business models such as Facebook, Uber, Amazon, Airbnb, etc. have become tremendously successful because they understand what their customers want and how they behave. Today consumers are more accustomed to the new enhanced “digital way of life” and less tolerant towards user experiences that do not offer interactivity, convenience and compelling pricing.
By Srikar Reddy, CEO, Sonata Software, and Sridhar Vedala, Head of Digital Business, Sonata Software.
The key to these successful digital businesses is their platforms that are open, scalable, connected and intelligent. They are using technology to channel consumer frustration into inspiration by offering them an opportunity to create, exchange and engage with fellow participants. Harnessing data from their platforms, digital businesses are disrupting existing business models.
The non-platform companies have realised the far-reaching implications of the platform business models and some are defending their market share by creating their own platforms. The charge is led through “digital transformation programs” but, surprisingly, most companies still struggle to defend their turf and transform themselves. While most leaders are now aware of what needs to be done, major challenges occur during the implementation and integration of platform business models into the existing organisation.
Generally, the fundamentals and vast opportunities offered by digital platforms are well understood. As part of their “digital transformation” initiatives, companies need to set priorities around better customer understanding, enhanced proximity, exceptional customer experience and new service/product offerings.
Beyond technology, the leaders also understand the need for transforming their organisations and relinquishing obsolete business practices, processes and models. They are transforming their organisations, bringing in agility, speed, flexibility and innovation akin to their tech counterparts.
So why is success still elusive? Armed with the realisation of limitless possibilities of digital platforms and their own aspirations for change, the transformation journey should have been relatively easy. Astonishingly, most companies fail to successfully translate their aspirations into reality fast enough.
The answer lies in organisational “myopia”. It is the rendition of the organisation’s bold objectives into “business as usual” initiatives. Most companies create end-to-end processes with clearly defined roles and responsibilities to ensure that strategy, product/service development, distribution and sales are delivered. It is this very operating model that fosters a short-sighted mindset and a culture of compliance to rigid processes and rituals focused on established ways of working.
The moral justification for these rituals is usually deeply rooted in success stories stemming from previous periods of growth. This makes it all the more difficult to propose new alternatives, however logical and successful they may be elsewhere.
Traditional companies that now want to build similar digital businesses based on platform models face the difficult situation of running two different business models and processes concurrently; they must run the existing declining business model while simultaneously conceiving and growing a new platform-based business. Even a company like Apple has faced a similar dilemma with its music business. The entry of Spotify and Google into the music industry has completed the shift towards a subscription-based model. Apple had no option but to give up its iTunes business in favour of Apple Music, its subscription business.
Why is it so difficult to implement what is common knowledge? The main issue, in a nutshell, is the effect of this mandated myopia, which thwarts the effective implementation of new platform-based businesses.
Rather than forcing the integration of successfully incubated platform concepts into the existing organisation, many successful transformations concentrate on ring fencing the old business and letting new business models and leadership expand. The key to success is the identification and enablement of people who have the vision and can translate an idea into business.
The market now has many successful examples of such transformations. When Microsoft acquired LinkedIn, it could have been integrated into the overall portfolio as just another HR solution. Instead, LinkedIn was left to keep its own culture and promote digital business models closer to its own core.
In the end such defining decisions will be what it takes to prevail in the digital era!
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 2.
How AI and automation will change the way we use technology in 2020
By Mathew Lodge, CEO at Diffblue.
Predicting the future is no easy task, and even experts can focus on the wrong trends. But the future is already here; it’s just unevenly distributed. A select few organisations are already making use of the AI-powered tools of tomorrow, but 2020 is likely to bring broader innovation in the way organisations use artificial intelligence and software automation technology.
AI will continue to move beyond simple automation
This year, we’ve seen continued strong growth in cloud computing and AI as more businesses have embarked on their digital transformation journeys. Still, Gartner has estimated that by 2021, demand for application development will grow five times faster than tech teams can deliver. Ironically, software has automated nearly every business process except the writing of software itself.
In the past few years, this has started to change. Just as software ate the world, AI is eating software from many different angles. No-code/low-code solutions have been around for years, but there’s a new generation of platforms and approaches that allow non-coders (analysts, for example) to create work that previously would have required a programmer. AI tools that can write code are eating traditional software from the top down. Companies in this space include DataRobot and H2O.ai.
Bottom-up AI, which improves developer productivity, is now here as well. TabNine released a machine-learning-driven auto-completion tool that suggests likely code completions derived from learning over large open source code repositories. In November, Microsoft released a similar tool for its Visual Studio Code IDE.
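TabNine’s and Microsoft’s models are far more sophisticated, but the core idea of learning likely completions from a corpus of code can be sketched simply. Everything below (the toy corpus, the bigram counts) is purely illustrative and is not how any of these products actually work:

```python
# Toy illustration of corpus-driven code completion: learn bigram frequencies
# over code tokens, then suggest the most frequently seen next token.
from collections import Counter, defaultdict

corpus = [
    "for i in range ( n ) :",
    "for item in items :",
    "for i in range ( 10 ) :",
]

# Count, for every token, which tokens have followed it in the corpus.
bigrams = defaultdict(Counter)
for line in corpus:
    tokens = line.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        bigrams[prev][nxt] += 1

def suggest(prev_token):
    """Return the most frequent continuation seen after prev_token, or None."""
    counts = bigrams.get(prev_token)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("in"))     # prints "range"
print(suggest("range"))  # prints "("
```

Real tools replace the bigram counts with deep models trained over millions of repositories, but the workflow is the same: observe what programmers usually write next, then offer it as a completion.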
At Diffblue, we’ve developed Cover, a tool that analyses Java code and uses AI to write unit tests that run quickly to find regressions early in the development cycle. The key benefit of these tests is the freedom to adopt Continuous Integration and deliver better quality software, faster. This code no longer needs to be written and maintained by developers, because the unit tests for the next commit can always be regenerated from the current version of the application.
Investment in AI skills training will grow
The current AI adoption landscape is full of companies that would love to use the newest technology, but lack the resources. Recent research from Deloitte found that although 82% of large businesses in the UK are pursuing some form of AI initiative, only 15% can be considered ‘seasoned’ or mature AI adopters. The rapid advancement of technology we are experiencing today is leaving organisations across many sectors struggling to attract and retain talent with the necessary skills.
To address this, a number of universities, organisations and institutes are investing heavily in developing new AI talent. In October 2019, the UK government announced a £370 million package of government and industry investment into 14 universities and 200 businesses, including the NHS and Google, as well as training for 200 AI PhD students across 16 new centres for doctoral training. The government is also investing £13 million in Masters programmes to help develop careers in AI, and a £10 million fund for scholarships to help those from underrepresented communities access AI and data science education.
Businesses are also getting involved: earlier this year, Microsoft announced the goal of training 15,000 new AI professionals by 2022. In 2020, there will likely be more of this type of investment as more businesses see the need to shrink the AI skills gap.
Automation will play a bigger part in the creation of secure code
In the last year, a third of businesses reported cyber security breaches. The biggest data breaches have happened as a result of compromised code, and although infrastructure can also be attacked, the applications themselves are a much larger surface area for attackers. Automation can provide businesses with more eyes to monitor for vulnerabilities and breaches without having to dedicate additional developer time to this task.
Semmle, for example, is a tool that analyses code so that it can be queried like a database and crowdsources vulnerability signatures, making it possible to automate large-scale signature checking over code; it’s the first step along a path to automated remediation. Semmle was acquired by GitHub (now part of Microsoft) for an undisclosed sum in September. Although there are several tools in this space, having one of them integrated into the world’s most popular source code service is significant, and it will be interesting to see what GitHub and Semmle do together.
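Semmle’s QL is a purpose-built query language, so the snippet below is not QL. It is only a toy Python illustration of the general idea of querying code like a database, using the standard `ast` module to find calls to `eval`, a classic vulnerability pattern:

```python
# Toy "query over code" example: parse source into a syntax tree, then filter
# the tree for a pattern of interest (here, calls to the eval builtin).
import ast

SOURCE = """
x = eval(user_input)
y = len(user_input)
"""

tree = ast.parse(SOURCE)

# The "query": every Call node whose callee is the bare name "eval".
hits = [node.lineno for node in ast.walk(tree)
        if isinstance(node, ast.Call)
        and isinstance(node.func, ast.Name)
        and node.func.id == "eval"]

print(hits)  # [2] -- the eval call sits on line 2 of SOURCE
```

Production tools go much further, modelling data flow between such calls and crowdsourced vulnerability signatures, but the database-style query over a parsed representation of the code is the common foundation.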
As the threat landscape continues to expand and hackers become more sophisticated with their attacks, there will be greater pressure for businesses to ensure their applications are protected against vulnerabilities that could leave them open to cyberattacks. As a good first step, the British government has announced plans to invest £36 million into making the UK a world leader in tackling cyber threats, so expect to see progress on this front over the next year.
Tackling the business challenges of tomorrow
While the rise of data, the shortage of coding and AI skills, and growing cyber threats are not new challenges, the ways businesses tackle them are changing. Leading-edge activity in AI and software gives us a glimpse of where these approaches might end up, and will no doubt shape how organisations work with tech in 2020.
Tom Conklin, CISO, Druva, looks at the evolutions of ransomware:
Ransomware is going to follow soft targets that have vulnerable systems. This may be small companies that have unpatched systems. My guess is that as more companies adopt cloud services and connect on-premises networks to the internet, you’ll see more ransomware where on-premises systems are not patched or properly secured, and in places where cloud accounts are being misconfigured by the customer. Cloud vendors generally adopt a shared responsibility model, and it’s important that those adopting cloud solutions understand where their responsibilities lie.
The security industry expects the number of attacks and the amount of ransom payments to continue increasing at double-digit annual growth rates. We expect to see more targeted attacks rather than broad, high-volume attacks.
Emerging cyber threats
Whilst obvious and seemingly old school, phishing continues to be a major threat for the public and corporate entities alike. The financial and reputational impacts of these attacks can be huge, so we need to work on educating the public on how to spot a phishing email and how to report it. By understanding the threats we may be subject to, we can better prepare and educate ourselves to deal with them.
On a corporate level, I expect to see these sorts of attacks become much more sophisticated. Instead of simple one-off emails, I expect to see socially engineered attacks that slowly work on building trust before compromising a system.
Stephen Manley, Chief Technologist, Druva, says that Cloud and SaaS will power the democratisation of AI:
There is a common misunderstanding that aggregating reams of data magically creates insights. After people have built massive data swamps, they are surprised when answers don’t spark to life and come crawling out of the primordial data ooze. Without experts in AI or machine learning, customers see no value beyond what they could compute with spreadsheets. Furthermore, the aggregated data is all risk and no reward: who knows what might leak?
In 2020 SaaS vendors and cloud providers will democratise analytics. SaaS vendors aggregate and set up data in the cloud, so users can leverage the cloud provider’s simplified AI and ML tools. Business users will now be able to reap the rewards of analytics without building an expensive data lake or getting an advanced degree in AI. Cloud and SaaS laid the groundwork with early adopters in 2019; as the infrastructure becomes easier and more self-tuning, the usage of AI will grow exponentially in 2020.
2020 will be the year of the mainframe (model)
Next year will mark the death of developing applications for “dedicated infrastructure.” Developers will not build new applications to run on virtual machines with flash storage. Instead they will create containerised and serverless applications which start on-demand, load data from object storage into persistent memory, execute, and then release all the resources. If that makes cloud sound like mainframe, it should. The reborn centralised model eliminates the inefficiency of legacy server and storage systems.
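As a minimal sketch of the pattern described above (load data on demand, execute, release everything), here is an illustrative serverless-style handler. The object store is simulated with an in-memory dict standing in for a real cloud object service, and the key names are invented:

```python
# Stand-in for an object storage service (e.g. an S3-style bucket); in a real
# deployment this lookup would be a network fetch into persistent memory.
FAKE_OBJECT_STORE = {
    "datasets/orders.csv": "order_id,amount\n1,30\n2,45\n3,25\n",
}

def handler(event):
    """Serverless-style entry point: stateless, started on demand.

    No state survives between invocations; everything the function needs is
    loaded from object storage, and all resources are released on return.
    """
    raw = FAKE_OBJECT_STORE[event["key"]]            # load from object storage
    rows = [line.split(",") for line in raw.strip().splitlines()[1:]]
    total = sum(int(amount) for _, amount in rows)   # execute
    return {"total": total}                          # release on return

print(handler({"key": "datasets/orders.csv"}))  # {'total': 100}
```

The mainframe comparison holds: compute is rented per invocation against centrally held data, rather than idling on a dedicated server with its own flash array.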
The hardware trends leave traditional flash storage arrays without a role. Persistent memory will become mainstream and deliver application I/O performance. Meanwhile, object storage prices will continue to plummet, so they can efficiently store the application dataset. Traditional storage, optimised neither for performance nor capacity, will have no place in well-designed applications.
The legacy “dedicated hardware” model is dead… it just doesn’t know it yet. The mainframe (as cloud) rises again!
As we sit on the precipice of the fourth industrial revolution, Nick Offin, Head of Sales, Marketing and Operations, dynabook Northern Europe, predicts some of the technologies set to dominate the year ahead:
To date, many of the forecasts around 5G have been in relation to investment and launches. However, next year this will change. With several telco companies officially launching the technology and 5G-compatible phones coming into the mainstream, consumers will start to see 5G ‘in action’ and will realise the possibilities of this new technology. But 5G won’t just mean a faster download of your favourite Netflix show; the shift from 4G to 5G will change just about everything across multiple industries. Telecom experts are going so far as to herald 5G's arrival as the advent of the fourth industrial revolution. In fact, 5G – with its enhanced capacity, connectivity, speeds and minimal latencies – will be the catalyst for IoT adoption. Other technologies predicted to springboard off 5G include cloud and edge computing, wearables, and 8K technology – to name a few.
The advent of 5G will see wearable technology continue to reach even more sectors. One such industry is the emergency services sector. Decision-makers within the police, fire and ambulance services are beginning to recognise how they can best use wearable devices to enhance the mobile productivity of workforces, improve first-responder safety and deliver better patient care. While wearables remain in their infancy within the Blue Light sector, over the next few years we’ll see a growing appetite around use case testing and experimentation. Those who are already testing out wearable technology are continually uncovering more potential use-cases.
Edge computing has gained significant traction in recent years. However, if 5G, IoT and wearables are to be adopted at the rate predicted, it will require ‘the edge’ to remain central to enterprise operations. The value of edge computing comes in its ability to provide secure and powerful computing at the periphery of the network, reducing operational strain and latency. In 2020 and beyond, mobile edge computing will act as the gateway for even more IoT solutions to be used across the professional world. In the same way that laptops and smartphones created a new environment for office workers, mobile edge computing will do the same for mobile and field workers.
Another key technology predicted to benefit from the connectivity provided by 5G is 8K technology. While TV broadcast and photography are obvious applications in the consumer world, 8K will also dramatically impact other aspects of our lives, from advanced facial recognition and surveillance to remote medical diagnosis and mining operations. In 2020 and beyond, 8K technology will be part of key discussions in multiple industries.
Those companies that are able to embrace the concept of digital transformation are able to achieve new levels of efficiency and agility, in many cases completely revolutionising their operational structure or even their core business drivers. Conversely, companies that fail to adapt risk being left behind as their customers seek cheaper and more responsive alternatives.
Accordingly, digital projects are increasingly dominating the business agenda. Vodafone’s Digital, Ready? survey found that 79 percent of UK business leaders believe digital transformation to be a key priority for their company. Further, more than half of UK businesses said they planned to spend £100k on digital transformation over the next two years, with that climbing to over £1m for companies with more than 5,000 employees.
The challenge of the digital disconnect
However, while business leaders are keen to demonstrate that they take digital transformation seriously, the same can rarely be said for their employees. In fact, recent research from Cherwell found that the average UK worker is unlikely to actually know what the term even means.
This research, conducted by YouGov amongst employees at 500 businesses with 50 or more employees, found that 57 percent of employees did not know the correct meaning of digital transformation. Of those, 20 percent did not even guess what it might be, while 37 percent got the concept wrong, with many, for example, suggesting that it might mean moving to a paperless office.
Putting aside familiarity with the latest business strategy buzzwords, almost every respondent to Cherwell’s survey also had a poor opinion of their company’s ability to adopt digital technology. Some 91 percent said they would not describe their organisation as a ‘digital innovator’ when it came to implementing new technology, with 64 percent believing that their business only took on new tech once it was already widely used in the market. Further, 38 percent stated their employer is ‘poor’ at handling technological change.
There are two potential factors in the disconnect between business leaders and employees. The first is that most companies are keen to be seen as innovators and will gladly talk the talk, but many are failing to put their talk into action and fully invest in digital projects. The second factor is that even those companies that have launched effective digital transformation activities are failing to tell their employees what this means, and what it means for them.
Why is it important to engage on digital?
It’s poor practice to make any kind of change to business processes without consulting relevant staff and giving them the chance to voice their opinion. People are generally resistant to unexpected change by default, even when the result will be a net positive for them. When it comes to digital transformation, many projects will be changing how the business operates on a fundamental level, so it’s even more important to have engagement and buy-in from the workforce.
It’s worth noting that reliable access to IT resources is now often the most important element of the workplace, so a poor IT experience can have a huge impact on productivity and morale. Cherwell’s research found that 29 percent of workers feel a slow response time to fixing their machines was one of the most frustrating things about the IT team at their company.
Without proper engagement and education, there is always a risk of staff simply ignoring new processes or finding their own workarounds to continue working in their old familiar way. This will almost certainly have a negative impact on the digital transformation project’s success and ROI and can also lead to serious operational problems as tasks are duplicated or missed entirely.
Putting the user experience first
Ensuring user engagement with digital transformation needs to start all the way at the beginning of a project. All digitalisation efforts should be focused on user experience – including both employees and customers where relevant. While it is easy to get carried away with implementing the latest digital solutions on the market, technology should always be regarded as a tool, not an objective in and of itself. Project planning should be focused around the user experience and should ideally include representation and direct engagement with those who will be affected.
In many cases, the task of changing user behaviour and establishing adoption of new technology can be more challenging than the technical implementation itself. Organisations need to be thinking about user awareness all the way through the process, rather than trying to rush it once the solution is ready to go. Companies will often overlook this crucial stage and simply assume users will get on with things, so it can be helpful to think about internal users as though they were external customers.
Launching an awareness campaign can help to get employees on board with the project, providing them with important information about what is going on and how it will affect them. Staff should be kept in the loop as the project progresses so that they are aware of important dates and potential disruption. Considering that our survey found more than half of employees lack a clear understanding of the concept of digital transformation, organisations could also consider covering some of the basics unless they are confident in their workforce’s level of digital awareness.
The level of activity involved in marketing the digital project will depend on the company’s size and maturity, ranging from emails and posters to training and discussion sessions.
Introducing the digital leader
The most effective way to ensure that transformation efforts proceed smoothly is to establish a digital leader to take ownership of projects. Not only will they need to oversee engagement efforts with the workforce, but they will also need to help establish buy-in from senior leadership and the board. The company’s CIO is an ideal candidate for this position as they should already be well-versed in bridging the gap between the board and IT, as well as delivering a top-down approach for users. Organisations with a higher level of digital maturity and access to resources may look to establish a Chief Transformation Officer specifically for overseeing digitalisation activity.
Paving the way for a smooth digital journey
With companies now devoting £100,000s to digital transformation, no organisation can afford for their digital projects to fail. Even more important than the invested budget is the fact that successful digital transformation is now essential for a company to remain competitive.
Those companies that overlook the importance of employee engagement will risk being left behind as they wrestle with inefficient processes, bad adoption rates, poor ROI and expensive delays.
However, by taking a user-first approach and ensuring that the workforce is kept engaged and informed throughout all digital projects, organisations will lay the groundwork for delivering a fast and smooth digital journey.
Deceiving the enemy into believing one is stronger than one actually is, manipulating them into taking self-defeating actions or tricking them into believing the costs of a military raid outweigh the spoils of victory, are tactics repeatedly used in warfare throughout history.
By Carolyn Crandall, Chief Deception Officer at Attivo Networks.
The objectives of deception are to derail the attack, confuse attackers, and motivate them to disengage or reconsider whether to attack at all when confronted by an opponent who seems more formidable than they first appear.
Such a strategy applies equally to the cybersecurity world. While some adversaries are highly funded nation-state attackers, many threat actors are simply opportunists. They prefer to prey on targets they think are weak or offer an easy path to a pay-out. This wide variety of attackers is increasingly driving organisations to turn to deception techniques. The aim is to confuse threat actors so they can no longer trust what they see or the information their attack tools feed them. The idea is to increase the complexity of the attack to the point where attackers cannot easily advance, and ultimately leave empty-handed.
Deception technology essentially booby traps the network so that attackers can no longer tell real from fake and, in turn, end up making mistakes that reveal their presence. Advanced deception technologies can go as far as detecting based on the mere act of an attacker’s observation and feeding them false data that manipulate their future actions in favour of the defender.
Deceiving the deceivers
Most attempts to infiltrate an organisation’s network follow a predictable attack lifecycle. The Cyberattack Lifecycle provides a process flow for this, and the MITRE ATT&CK Framework does an excellent job separating this into 12 major steps of an attack. The first phase is initial reconnaissance, where the attacker gathers publicly available information on the target and formulates an attack strategy. The next step is the initial infection, where attackers compromise a system inside the network. Once inside, they move on to the next phase, establishing a foothold. This phase is where they install back doors, remote access tools, and other mechanisms to return to the infected system whenever they want. They then move to the persistence cycle, composed of the following stages – escalate privileges, internal reconnaissance, move laterally, and maintain a presence. They continue this cycle until they find the data or target they are seeking and complete their mission.
In an environment with deception, attackers gain a misleading picture from the start. When cybercriminals first enter an IT system, they steal higher-access credentials (escalate privileges). A favourite way for a threat actor to do this is by taking locally stored credentials and targeting Active Directory (AD). The AD represents the keys to the kingdom, containing all the credentials attackers need to give them the freedom of the network. With modern AD deception, organisations can hide real information and prevent attack activity targeting AD account information by non-disruptively altering what an attacker sees and providing options to create a false AD server environment.
None the wiser, attackers then start snooping around to try to get the lay of the land (internal reconnaissance). Their goal is to create a virtual map that shows where the assets – devices, servers, applications, files, and folders – are, as well as how they might access them. With deception, instead of gaining the information they need to advance (move laterally) and exploit systems to burrow deeper into the network (maintain a presence), they now encounter decoys, deceptive mapped drives, and various lures on endpoint devices, so they are directed away from actual high-value target assets and into a deception environment. At the same time, behind the scenes, the deception network notifies the security team that there is an infiltrator on the network, records their activities, and activates incident response.
In the final phase, attackers attempt to complete their mission by exfiltrating data. With decoy documents, the attacker is enticed to steal data that looks appealing, but in fact, holds zero value. Its sole purpose is to give security professionals insight into what information the threat actor is searching for, as well as how the data gets taken and to where attackers send it.
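As a toy illustration of the decoy-document idea, the sketch below seeds a document store with appealing-looking fakes and flags any access to them. The file names and the alert mechanism are invented for illustration and do not reflect any vendor’s product:

```python
# Hypothetical decoy documents: names chosen to look valuable to an attacker,
# but holding no real data. Any access is a high-fidelity signal of intrusion,
# since legitimate users have no reason to open them.
DECOY_IDS = {"cfo_passwords.xlsx", "merger_plans_2020.docx"}

alerts = []  # stand-in for notifying the security team / incident response

def read_document(doc_id, store):
    """Serve a document; silently raise an alert if it is a decoy."""
    if doc_id in DECOY_IDS:
        # A real platform would also record the attacker's activity and note
        # where the stolen data is sent.
        alerts.append(f"ALERT: decoy document accessed: {doc_id}")
    return store.get(doc_id, "")

store = {
    "cfo_passwords.xlsx": "fake data of zero value",
    "q3_report.docx": "legitimate content",
}

read_document("q3_report.docx", store)      # normal access: no alert raised
read_document("cfo_passwords.xlsx", store)  # trips the trap
print(alerts)
```

The asymmetry is the point: the defender spends almost nothing planting the decoy, while the attacker must now treat every enticing file as potentially instrumented.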
Make them distrust what they see and their tools
With a cyber deception platform containing a built-in sandbox, it is not uncommon for the threat actor to realise they have been led down a rabbit hole only after they have spent considerable time and resources on data gathering and attempts to move around the network. Once aware they have stumbled into a trap, attackers face the dilemma of either starting again or simply giving up and moving on. The decision to resume the attack is now taken with far more consideration, as they realise that what they see and the tools they rely on no longer provide reliable data. This uncertainty, combined with the time already lost, changes the economics of the attack and can serve as a deterrent from continuing. Much as with a burglar alarm on a house, the attacker now knows the defenders are prepared; and although they saw no sign announcing an attack dog, they now know that at least one further line of defence lies in wait to stop them and take a bite out of their attack.
To sum up, for organisations that want not only to defend against cyber criminals but also to deter them from coming back, deception techniques are a win-win. They efficiently confuse and slow down attackers, reduce dwell times and further protect a company’s most valuable assets. Disillusionment and disappointment are not high on an adversary’s list, and although there is no such thing as a silver bullet in security, deception can definitely wreak havoc on criminal opponents.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 3.
With compliance, ethical and sustainable sourcing requirements being a non-negotiable part of modern digital transformation initiatives, 2020 promises to be a year where organisations are focused on driving even more visibility into their operations and supply chains.
To achieve this, the focus will be on the technology, software and service investments that will help them gain better access to internal and external data, make sense of this data and use it in a way that can improve insight into critical performance and compliance aspects while driving profitability.
In this context, there are three key developments I expect to come to the fore in 2020:
1. Digital transformation will drive AI uptake across businesses – but some projects will struggle due to a lack of data
AI will be a key driver in understanding data correlations and identifying patterns to provide more visibility into compliance, anomalous behaviour and areas like supplier selection, all contributing to increased supply chain visibility. With new pools of data created by their digital transformation efforts, companies will increasingly look for AI applications that can automate existing processes, surface insights that drive better decision making and ensure staff can focus on creative or intelligence-led tasks. In areas where a deep pool of data exists, AI will redefine how assets such as contracts interact with humans and digital systems and participate in decision making and autonomous behaviour. However, many projects will encounter challenges where these data pools lack the quantity, quality or variety needed to train AI algorithms and deliver meaningful results. As a result, IDC predicts the AI market will grow at a CAGR of just 37% from 2017 to 2022.
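As a minimal illustration of the kind of pattern detection described, the sketch below flags supplier invoice values that deviate sharply from historical norms. All figures are invented, and real systems would use far richer models than a z-score test:

```python
# Simple statistical anomaly detection: flag values more than `threshold`
# sample standard deviations away from the historical mean.
from statistics import mean, stdev

def anomalies(history, new_values, threshold=3.0):
    """Return the new values that look anomalous against the history."""
    mu, sigma = mean(history), stdev(history)
    return [v for v in new_values if abs(v - mu) > threshold * sigma]

# With a reasonable pool of historical invoice amounts, the outlier stands out.
history = [100, 102, 98, 101, 99, 103, 97, 100, 102, 98]
print(anomalies(history, [101, 250]))  # [250]
```

The data-quantity caveat shows up even here: with only two or three historical points, the standard deviation estimate is meaningless (and undefined for a single point), a miniature version of the problem AI projects hit when their training pools lack quantity, quality or variety.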
2. Blockchain will really start to shake things up
Use of blockchain to solve supply chain sustainability and compliance challenges will move from proofs of concept and pilot efforts to fully-fledged deployments in 2020, driven by large global brands with well-defined supply chains. The projects that get funded will be those with proven returns, such as provenance tracking, contract compliance, supplier diversity and ethical sourcing. Separately, these same companies, along with others with more fluid supply chains, will begin building out consortiums to identify industry-wide challenges that blockchain technology can address. Early movers will be the food, manufacturing and pharma industries.
3. Corporate Social Responsibility (CSR) and sustainability will become one of the key objectives of digital transformation programs
As companies reimagine their role in society in the face of the Business Roundtable’s recent “Statement on the Purpose of a Corporation” and global social movements like Extinction Rebellion and Fair Trade, we’ll see growing numbers of organisations looking to demonstrate their commitment to CSR. As consumer expectations around the social impact credentials of the products they buy rise, companies like Unilever have already started to introduce changes in order to demonstrate a positive purpose in areas like environmentally friendly, sustainable and ethical sourcing. One example of how this will impact business operations in 2020: organisations will look beyond simply using contracts reactively to punish bad actors, to building their commercial relationships around a more proactive approach of identifying and rewarding partners who help them achieve their CSR and sustainability goals. Consumer packaged goods companies will lead the way, with the consumer technology, energy, government and clothing industries not far behind.
As we enter the new decade, each of these trends will have a major impact on businesses and the way they operate; contributing to an exciting but turbulent time for enterprises where supply chain visibility will be key to delivering competitive advantage while making the planet a better and more equitable place. The availability of meaningful, accurate and actionable data will not only allow organisations to meet regulatory requirements, it will enable them to anticipate and adapt to trends, manage risk, and improve on functions like supplier quality and performance management. The winners in this new world will be those that place investments in data quality, integrity and visibility at the top of their list.
Nick Lantuh, president and CEO, Fidelis Cybersecurity, believes that the ‘platform approach’ to security operations will kick off:
In 2020, organisations will single out the vendors they consider strategic and reconsider the cost-inefficient point solutions that generate limited results. Companies will realise that they must respond to the current threat and business landscape with a clear focus on adopting a ‘platform approach’ to security, where all crucial elements of an organisation's security strategy are integrated into a single piece of software, as opposed to the old-school method of focusing on acquiring ‘best of breed’ products.
CISOs will continue to deepen relationships with boards
In recent years, business leaders have woken up to the financial and reputational importance of having a strong cybersecurity posture. While breaches are becoming increasingly ubiquitous, poor management of them now poses a threat to the CEO’s job, putting board seats at risk too. As a result, the position of the CISO will continue to be strengthened throughout 2020. As they continue to rapidly adapt their approach to be more closely aligned with overall business goals, boards will react by appointing them to key decision-making positions.
Kaspersky researchers have shared their vision on Advanced Persistent Threats (APTs) in 2020, pointing out how the landscape of targeted attacks will change in the coming months. The overall trend shows that threats will grow in sophistication and become more targeted, diversifying under the influence of external factors, such as development and propagation of machine learning, technologies for deepfakes development, or tensions around trade routes between Asia and Europe.
The predictions were developed based on the changes that the Global Research and Analysis Team witnessed over 2019, to support the cybersecurity community with guidelines and insights. These, along with a series of industry and technology threat predictions, will help organisations prepare for the challenges that lie ahead in the coming 12 months.
The abuse of personal information: from deep fakes to DNA leaks
After the numerous personal data leaks of recent years, the volume of personal details now available has made it easier for attackers to perform targeted attacks based on victims' leaked information. The bar has been raised, and in 2020 threat actors will dive deeper, hunting for more sensitive leaks, such as biometric data.
The researchers pointed out a number of key technologies that could lure victims of personal data abuse into attackers' traps. Among them are the publicly discussed video and audio deepfakes, which can be automated and used to support profiling and the creation of scams and social engineering schemes.
Other targeted threat predictions for 2020 include:
“The future holds so many possibilities that there are likely to be things that are not included in our predictions. The extent and complexity of the environments in which attacks play out offer so many possibilities. In addition, no single threat research team has complete visibility of the operations of APT threat actors. We will continue to try and anticipate the activities of APT groups and understand the methods they employ, while providing insights into their campaigns and the impact they have,” says Vicente Diaz, security researcher at Kaspersky.
At the centre of this ecosystem, far from the network's edge, are the largest hyperscale facilities, where data can be stored securely and cheaply at the expense of a slight delay in delivery. For more critical applications and faster service, regional or metro data centres provide a trade-off between the economies of scale possible for the Internet giants and the connectivity speeds required by gaming, social media, video and streaming platforms.
Finally, for the fastest possible applications that require cloud connectivity and ultra-low latency, such as 5G networks, micro data centres at the edge of the network provide the most suitable environment.
Location is crucial
Data centres at the edge are located closest to where the data they process is created or consumed. There are numerous examples of applications that require low latency, such as financial trading systems, where access to real-time data without any delay can mean the difference between a stock rising or falling and substantial investment losses. Here, the IT equipment cannot depend on a network delivering data from afar.
There are of course applications that could be considered less demanding in terms of connectivity requirements, but nonetheless, edge computing sites will function far more reliably and effectively if the data that feeds them travels through local networks, rather than via the cloud. The enormous aggregate of small data transactions driving the Internet of Things (IoT), for example, can cause data congestion, which will impact the system unless that traffic is restricted to where it is needed.
Edge in digital transformation
Due to their pre-integrated nature, rapid speed of deployment and high level of standardisation, micro data centres enable customers to exploit new business opportunities and scale into new geographical locations both quickly and cost-effectively.
For companies that increasingly rely on automation, whether for product creation, digital service delivery or in-store purchasing at retail outlets, edge computing enables businesses to stay competitive and serve consumers faster than ever.
An excellent example of digital transformation in action is the self-service kiosk, which is rapidly becoming a common sight in many fast-food restaurants. As more retail companies adopt digital technologies to transform the customer experience, their need for resilient local compute and IT availability becomes ever more important.
In this scenario, the IT infrastructure must perform with maximum reliability. While the number and geographical spread of such sites makes it unlikely that dedicated technical support personnel will be deployed at every store or restaurant, the critical nature of these environments means they need to be managed in real-time. As such, they require greater levels of visibility, which can only be delivered by next-generation Data Centre Infrastructure Management (DCIM) platforms incorporating AI, data analytics and cloud technologies.
Power and cooling at the edge
At the edge, an inability to rectify faults immediately can cause application downtime with significant impact on the business. Compared with larger regional or central hyperscale data centres, this represents a potential Achilles' heel for many companies dependent on edge architectures. Fortunately, there are technologies and services at hand which can mitigate such circumstances, provided that sufficient plans are first put in place.
For these pre-integrated systems, there are a number of key priorities one must address. They include: power, cooling, racks, physical security, fire protection, environmental monitoring, network connectivity and overall system management.
Power availability must be the primary concern as without it, no service delivery is possible. It is essential that uninterruptible power supply (UPS) systems are in place to provide battery backup to all IT equipment. Depending on the level of protection needed, operators may choose between a single (N), dual-redundant (2N) or single spare (N+1) configuration to provide reassurance in the case of an outage. This will ensure that all critical components, including servers, networking and cooling systems have access to resilient power at all times.
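The difference between these configurations can be illustrated with a simple sizing sketch. This is purely illustrative: the load and module capacity figures are invented, and real UPS sizing must also account for power factor, growth headroom and battery runtime.

```python
import math

def ups_units_required(load_kw: float, unit_capacity_kw: float, config: str) -> int:
    """Number of UPS modules needed for a given redundancy configuration.

    N   - capacity only, no redundancy
    N+1 - one spare module beyond the base capacity
    2N  - two fully independent power paths
    """
    n = math.ceil(load_kw / unit_capacity_kw)  # base modules to carry the load
    if config == "N":
        return n
    if config == "N+1":
        return n + 1          # a single spare tolerates one module failure
    if config == "2N":
        return 2 * n          # duplicate the entire system
    raise ValueError(f"unknown configuration: {config}")

# A hypothetical 9 kW edge rack using 5 kW UPS modules:
for cfg in ("N", "N+1", "2N"):
    print(cfg, ups_units_required(9, 5, cfg))
```

The trade-off is cost against fault tolerance: N+1 survives a single module failure at the price of one extra unit, while 2N doubles the hardware but also survives the loss of an entire power path.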
For applications requiring 24/7 operation, IT managers might consider deploying a backup generator to guard against prolonged power cuts. In this scenario it is beneficial to have a UPS with a bypass function to protect the load from an outage in the event of a power overload or UPS fault. Monitoring is also essential so that replacement of batteries, for example, can take place before they reach the end of their operational life. Lithium-ion cells are rapidly becoming a popular alternative to traditional lead-acid batteries and offer far longer operating lives, but they still require monitoring.
Cooling is the second highest priority when designing a highly available edge application. Servers running at temperatures beyond their specified limits can shut down automatically, so it is essential to deploy a cooling system that best fits the data centre's needs. In many network closets or office environments, the ambient cooling system is designed for comfort cooling, i.e. for humans only, which can present challenges for IT.
Cooling within a pre-integrated edge system must be designed for round-the-clock operation. Therefore, fans should be protected by a UPS and have redundant units in place for fault tolerance. Blanking panels in empty rack spaces prevent hot exhaust air from the back of the rack returning to the air intake and causing hot spots which threaten the operation of IT equipment. Here, the renaissance of technologies such as liquid cooling presents another great opportunity to increase efficiency at the edge.
Edge environments also require greater levels of physical security, to safeguard against human error and unauthorised access to IT systems. Operators should ensure that, where possible, all rooms containing IT equipment are locked and accessible only to essential personnel. Racks themselves should be secured, preferably with biometric devices, monitored with next-gen DCIM and surveillance cameras, and alarmed with sensors that will alert management to any unauthorised access attempts. Simple considerations like secure, wall-mounted racks can also avoid security problems where space constraints are an issue, such as in retail stores.
Other factors that need to be taken into consideration include fire protection, network connectivity and environmental organisation and protection. However, for ease of reference, a full checklist, which covers all essential considerations around edge resiliency can be found in Schneider Electric White Paper #280, “Practical Guide to Ensuring Availability at Edge Computing Sites”.
Ultimately, the availability of an edge site will depend on how well it is designed, managed and operated. Planning is key to minimising downtime and fortunately remote monitoring systems, such as next-gen DCIM software, simplify this for even the most resource-constrained team.
Here sensors located on all essential equipment allow status reports to be delivered securely to a remote management platform, where service personnel can be alerted to any matters requiring urgent attention via smart phones or tablets. In this way, disparate edge data centres can be maintained and serviced in ways similar to regional or hyperscale facilities, ensuring higher levels of availability for the most critical of business applications.
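As a rough illustration of the kind of logic such a remote monitoring platform applies, the sketch below checks incoming sensor readings against thresholds and collects alerts for service personnel. The metrics, limits and function are invented for illustration and do not represent any real DCIM product's API.

```python
# Hypothetical per-metric limits (low, high) for an unattended edge site.
THRESHOLDS = {
    "rack_temp_c": (18.0, 32.0),
    "humidity_pct": (20.0, 80.0),
    "ups_battery_pct": (40.0, 100.0),
}

def evaluate_site(site: str, readings: dict) -> list:
    """Compare one site's latest sensor readings against thresholds and
    return human-readable alerts for anything out of range."""
    alerts = []
    for metric, value in readings.items():
        low, high = THRESHOLDS.get(metric, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(f"{site}: {metric}={value} outside [{low}, {high}]")
    return alerts

# An over-temperature rack triggers an alert; the healthy battery does not.
print(evaluate_site("store-042", {"rack_temp_c": 35.2, "ups_battery_pct": 85.0}))
```

In a real deployment this evaluation would run centrally against telemetry streamed from many sites, with alerts routed to engineers' phones or tablets as the article describes.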
Retailers operate in a digital-first marketplace and that means not just using technology but being alert to the fast-changing buying patterns of digital consumers and organisations. Here Mark Bennigsen, Service Delivery Director at Columbus UK, identifies the four technologies that will enable ‘digitally mature’ retailers to not only cope with accelerated changes in the industry but future-proof their business in this digital ecosystem. He warns that those failing to adapt risk joining the growing list of established brands that have fallen by the wayside.
Let me define ‘digitally mature’ retailers. They have a solid commerce platform that enables new and existing customers to easily find products on the device of their choice – and ensures that promises regarding delivery, pricing and availability can be consistently met. But the advantages of this type of platform are huge for business intelligence. It allows a business to collect insightful customer data, process it faster and adapt to meet evolving customer demands. Here are the four technologies that are essential for retailers to plot their commerce journey and stay ahead of the digital curve going into the next decade.
1. Product Information Management (PIM)
An effective PIM system has the potential to save businesses time, money and energy. PIM enables businesses to collect all information and material used for marketing in a single location and keep customers’ needs top of mind at all times. A PIM system makes a retailer think from the perspective of the customer – considering the type of information customers need during their purchasing journey. This type of solution can then be used to not just gather the data but use it to enrich product information to create and deliver a compelling product experience.
PIM implementations support multiple content types spanning textual materials, product images, videos and more. With the right information in the right place, businesses can improve their product content and category managers can ensure that accurate, timely and high-quality product data is available across all sales channels.
2. Content Management Systems (CMS)
It is essential to create personalised web content that reinforces and improves brand messaging and engagement. An advanced CMS does just this by allowing an organisation to quickly manage and update web content to best align its brand to match buying behaviours.
The journey towards complete personalisation starts with attracting customers via increased SEO efficiency and engaging them through effective seasonal and trending content curation. This converts to increased customer relevancy and personalisation – analytics being a key factor in optimising the journey – which leads onto the creation of content-rich emails that target specific customers.
In this way CMS helps retailers work more intelligently – delivering a comprehensive customer journey and experience from landing pages through to payment and checkout, while also enhancing product and catalogue management.
3. Customer Relationship Management (CRM)
According to Gartner, 81% of purchases will be based on customer experience by 2020, so ensuring that experience is right is essential to future success in the digital marketplace. Modern CRM systems enable retailers to effectively engage with customers throughout their entire lifecycle – from marketing and sales to customer service and advocacy. An effective CRM solution supports the automation of manual tasks, but more importantly it allows businesses to better understand their customer base and provides the opportunity to engage with them.
As the market evolves and new technologies continue to be embraced, businesses must assess how they are interweaving digital tools with sales, marketing and customer service, and a CRM solution will be a key component for any retailer on their commerce journey.
4. Artificial Intelligence (AI)
Retailers can use AI to help them digitally transform and provide their customers with the best experience possible. AI is key to creating a highly personalised customer journey by using customer history to predict future needs and purchases. Powerful, connected algorithms lead to intelligent content suggestions, product recommendations and customer profiling which all help to accelerate business growth. These algorithms can ensure customers are provided with relevant content, while business dashboards can be used to monitor and direct algorithms to ensure their effectiveness. In this way AI can also provide an intelligent solution to basket abandonment as well as a personal touch when it comes to chatbot interactions.
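As a toy illustration of the recommendation idea, purchase-history co-occurrence is one of the simplest signals a retailer can exploit. Real retail AI uses far richer models; the data and function here are invented purely to show the principle.

```python
from collections import Counter

def recommend(history: dict, customer: str, k: int = 2) -> list:
    """Naive co-purchase recommender: suggest the items most often bought
    by customers whose baskets overlap with this customer's, excluding
    items they already own."""
    owned = set(history[customer])
    scores = Counter()
    for other, basket in history.items():
        if other == customer:
            continue
        if owned & set(basket):          # any overlap with our customer
            for item in basket:
                if item not in owned:
                    scores[item] += 1
    return [item for item, _ in scores.most_common(k)]

purchases = {   # hypothetical purchase history
    "alice": ["jeans", "belt"],
    "bob":   ["jeans", "trainers", "socks"],
    "carol": ["belt", "trainers"],
    "dave":  ["hat"],
}
print(recommend(purchases, "alice"))   # trainers score 2, socks score 1
```

Production systems layer browsing behaviour, content features and real-time context on top of this kind of signal, but the dashboard-monitored "connected algorithms" the article describes are, at heart, scoring functions like this one.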
The way consumers interact with brands is changing as the retail market keenly adopts emerging technologies, and businesses must take advantage of AI to drive product innovation and enhance their processes to advance in this digital world.
Sink or swim in today’s digital marketplace
Operating a commerce platform made up of these four technologies will help businesses secure their futures in today’s digital marketplace. Businesses must redefine their market proposition and harness valuable insights generated from emerging technologies to shape their future brand messaging. Those who can adapt their operations to meet ever-changing customer demands and market trends will benefit from increased sales from existing customers, and consistently attract new customers to grow their customer base.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 4.
Predictions for 2020: Developments in AI
As we say goodbye to 2019 and look forward to a new decade, this time of year often provides a time to reflect on the big developments we’ve seen in the past twelve months as well as those that are likely to come in the next twelve.
This year, AI has gained momentum as expected, quickly being recognised as a means of propelling businesses and workers into automating repetitive processes and speeding up innovation.
But what will the developments in this area look like in the year ahead? We’ve spoken to some experts to find out:
Dr Ian Roberts, Chief Technology Officer at Healx:
The future is set to see AI-generated healthcare recommendations extend to include personalised treatment plans. An example of this in practice is mitigating the risk of a person developing a chronic illness by having the foresight to make changes in lifestyle choices ahead of diagnosis. This medical understanding will be informed in part by the patient's own genome, combined with machine learning algorithms. Consumer personal genomics companies such as 23andMe are already helping to inform people of the need to manage their health; this ranges from advising against coffee late at night to flagging elevated risks of dementia and certain cancers. We are currently in the infancy of AI in healthcare; each company drives forward another piece of the puzzle, and once these pieces are fully integrated, the future of medicine will be transformed forever.
James Dean, Co-founder and CEO at SenSat:
Over the last 60 years, the infrastructure sector has barely seen an increase in productivity. This is surprising for such a crucial and expansive sector - with only ‘fishing and hunting’ showing less innovation across the same period. However, in recent years efforts to drive digital transformation are beginning to yield substantial results.
Technologies like AI and Digital Twins have allowed these traditionally laggard industries to drive efficiencies and drastically increase profitability. Recent data suggests AI will increase infrastructure sector revenues by £14bn a year in five years - so we expect its adoption to continue rapidly through 2020.
The data also forecasts that four in ten infrastructure firms who don’t embrace digitisation risk going out of business within the next ten years - so there is a strong incentive for the infrastructure sector to pursue digital transformation in the coming year.
Oliver Muhr, CEO at Starmind:
In 2020, we will see a shift towards Human+ AI. This type of augmented intelligence will empower people in organisations to make better and faster decisions by utilising the wealth of knowledge held by other team members and experts within the organisation. Rather than taking in and retaining all information as equal, true AI will begin to mirror the way the human brain works: digging out what is important and forgetting less meaningful data.
Real learning is based on recognising when something you thought you knew has become obsolete, and in 2020 the most advanced AIs will distinguish themselves through what they learn to forget, rather than simply what they are taught.
Justyn Goodenough, Area Vice President - International at Unravel Data:
The transition of core business processes to the cloud has been the key business-technology trend among organisations for several years now. This is expected to continue as more marginal data workloads are found to make operational sense to move off-premises. We saw in 2019 that business functions like CRM and HR made that transition, and this will accelerate as more big data functions are able to make the jump. That said, another trend that started to develop in the second half of 2019, and which we expect to proliferate in the new year, is AIOps. The fundamental promise of AIOps is to enhance or replace a range of IT operations processes by combining data with AI. Like cloud migration, AIOps not only has the potential to drastically reduce the cost of deployments, it can do so while improving performance. As organisations move into 2020 and review their business processes, it's likely that one or both of these considerations will become a priority.
Javvad Malik, security awareness advocate at KnowBe4, offers the following thoughts:
Prediction 1: Everybody wants to rule the human
The fight for attention will formalise and the battle lines will be drawn on all sides. Social media networks will continue to try and build ‘stickiness’ into their products, while we will see IoT hardware come into the fold with smart speakers, glasses, and similar trying to retain the focus of consumers. And all of this will take place under the shade of big brother and other interested parties looking to sow seeds of distrust and doubt.
Prediction 2: Regulation & surveillance
Regulation will increase, in particular privacy regulations. With GDPR the tip of the spear, we are beginning to see evidence of large fines, and others will follow suit. However, there are signs on the horizon where governments wanting to increase their surveillance capabilities will clash with regulators over what level of privacy and security is acceptable to the individual vs the state.
Prediction 3: Balkanisation of the internet
We will see further balkanisation of the internet and its services. While countries like China have traditionally maintained their own infrastructure, we have seen political issues spill over into the cyber realm, with companies like Kaspersky and Huawei being banned in the US. We will likely see more products and services having to be tailored for local requirements and regulations.
Joe Hughes, CEO and Founder of MTG on the future of IoT:
Advancements in IoT and edge-computing technologies will fuel demand for low power networks, micro-datacentre and micro-cloud environments. To be prepared for these emergent trends and to accommodate new technologies, telecoms and datacentre providers must consider adopting versatile network architectures and product offerings.
IoT and Smart Cities are becoming mainstream as more public and private sector organisations are becoming comfortable with the technology and are beginning to benefit from real-time data and decision-making capabilities.
Early iterations of IoT devices were primitive; with basic functionality and limited processing power – capable of simple measurements and wireless communication, but with limited analysis or processing capabilities. Recent developments in embedded systems technology have greatly improved the performance, power characteristics and capabilities of these devices.
Modern IoT chipsets now feature ultra-low power components, artificial intelligence, encryption, visual imaging and advanced-wireless systems. These enhancements have given rise to the concept of ‘edge computing’, where more advanced forms of analysis and processing are performed on the device itself.
With the processing activity now happening at the network edge, this has reduced the need for high bandwidth links and continual connectivity back to a central datacentre environment.
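That bandwidth saving is easy to sketch: summarising raw readings on the device means only a compact digest crosses the WAN instead of every sample. The window size, metrics and data below are invented for illustration.

```python
from statistics import mean

def aggregate_window(samples: list) -> dict:
    """Summarise a window of raw sensor samples at the edge so that a
    single compact digest is sent upstream rather than every reading."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": round(mean(samples), 2),
    }

# e.g. one minute of temperature readings collected on the device itself
raw = [21.1, 21.3, 21.2, 27.9, 21.2, 21.4]
digest = aggregate_window(raw)
print(digest)   # one small record replaces six upstream transmissions
```

The same pattern scales: an edge node might hold thousands of raw samples locally and transmit only digests, reconnecting to the central data centre opportunistically rather than continuously.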
Edge computing devices will benefit from low-power wireless network access (such as LoRa), as opposed to the relatively power-hungry, high power networks such as 5G.
Given the low power demands of IoT and edge devices, many telecoms operators are looking at a strategy of co-existence: deploying high-bandwidth, dense deployments of 5G sites alongside long-range, ultra-low power wireless technologies such as LoRa, NB-IoT and Sigfox. In the case of IoT, often less is more. The Isle of Man is a great example of how pervasive wireless networks can create opportunities in the economy. The Island was one of the first in the world to deploy 3G and it has near-total coverage for 4G.
The datacentre and IT industries have undergone a technological pendulum swing, with a recent shift towards cloud and centralisation almost emulating the era of the mainframe. As computational power shifts to the network edge, we are seeing datacentre, storage and compute providers follow suit.
Recognising the shift towards edge computing, many datacentres, cloud and hardware vendors have introduced micro-datacentre and micro-cloud solutions. Like their IoT counterparts, these micro-datacentre environments are also being deployed at the network edge; in containers, rooftops and public spaces.
The combination of edge-compute, low power WANs and micro-datacentres has allowed firms to deploy emerging services close to end-users. Gartner identified this trend in 2018, where they recognised that workload placement is now based on business need, not constrained by physical location.
“By 2025, 80% of enterprises will have shut down their traditional data centre, versus 10% today.” [i]
The benefits of edge-compute are not just technical. Because data can be collected, processed and formatted at the network edge, the need to transmit personal data can be eliminated and, in the case of cloud, cross-border data transmission and storage avoided altogether, an important consideration in the age of privacy and GDPR.
The rise of the small and mighty DDoS attack
“This year, we’ve seen overwhelming threats and traditionally large-scale DDoS attacks decrease. While this would normally be cause for celebration, such attacks have been overshadowed by the rise of smaller, more carefully targeted incursions. In 2020, we’ll see this upward trend continue, with intensity and duration replacing brute force and size as key concerns for cybersecurity professionals. Such attacks do not seek to saturate the network link, but instead to degrade or disable specific infrastructures within the target.
“In a bid to understand, identify and diminish these small-scale threats, organisations must reassess the detect and protect measures they already have in place, ensuring that an ‘always on’ DDoS mitigation strategy is deployed. When asked how likely they would be to notice today’s most prevalent smaller attacks, just 28 percent of security leaders answered very likely, with the remaining 72 percent lacking the same confidence.
“With smaller attacks frequently flying under the radar, cybersecurity professionals need to change their approach to security next year, constantly monitoring traffic to ensure threats of all sizes are spotted, managed and fought against. Organisations also need to establish a greater level of understanding as to what exactly they have at risk and therefore where they need to deploy the most protection.
We know DDoS attacks are getting smaller, but we also know size does not always go hand-in-hand with impact – it’s now the attacks we fail to see that have the potential to cause the most damage.”
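One common way to surface such low-volume attacks is to compare per-resource traffic against a learned statistical baseline rather than a fixed saturation threshold. The sketch below is deliberately simplified; production DDoS mitigation uses many more signals than a single request rate.

```python
from statistics import mean, stdev

def flag_anomalies(baseline: list, window: list, z: float = 3.0) -> list:
    """Flag request-rate samples that deviate from the learned baseline.
    Small, targeted attacks may barely move total volume, so per-resource
    rates are compared against a statistical baseline instead of a
    link-saturation alarm."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in window if abs(x - mu) > z * sigma]

# Hypothetical requests/sec to one API endpoint: a steady baseline, then
# a modest but sustained spike that a volume-only alarm would miss.
baseline = [100, 102, 98, 101, 99, 100, 103, 97]
window = [101, 99, 140, 138, 100]
print(flag_anomalies(baseline, window))
```

The point of the example is the 'always on' aspect: the baseline must be continuously maintained per resource, because a 40% rise on one endpoint is invisible in aggregate link utilisation.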
Getting to grips with IoT
“Despite 2019 seeing huge growth in the IoT market, with Fitbit and Alexa sales booming, security protocols for these connected devices have yet to become as mainstream. In fact, fewer than half (47%) of security professionals recently admitted to having a plan in place to deal with attacks on their IoT equipment, even though nine in ten are concerned about future threats.
“In most cases, IoT equipment is still being manufactured with only basic security in mind. While this may not have been such an issue a few years ago, malicious actors are now all too aware of the various entry points they can tap into to infiltrate wider networks. In the last year alone, 48 percent of organisations experienced a cyberattack against their IoT or connected devices. It is crucial, therefore, that businesses understand and identify exactly what is at stake when it comes to the IoT, and build a cohesive security strategy around this.
Next year, as IoT capabilities continue to expand and use-cases span further into our homes and offices, professionals will place a greater focus on deploying more than ‘out-of-the-box’ security for these devices. In fact, recently, 38 per cent of CTOs, CIOs and security execs claimed they are in the process of developing a plan for their IoT security, pointing at a fundamental need to ensure the appropriate controls are in place.”
Cyber criminals are becoming more enterprising than ever before, dreaming up new ways to evade defences and disrupt the status quo. The evolution of technology, software and networks has created both a benefit and a challenge for security practitioners. Cybersecurity is no longer just about “anti-virus” and “network security”.
By Jennifer Ayers, Vice President of OverWatch and Security Response at CrowdStrike.
Threat hunters are specialised teams, trained to hunt for the unknown and stealthy attacks that standard cybersecurity measures miss. This is a dedicated role performed in a full-time capacity. These professionals bring connections, knowledge and judgement that enable them to find the ‘needle in a haystack’ when it comes to cyber attacks. Hunters look for adversaries that masquerade very effectively as legitimate users or administrators, but who might be doing one small thing differently.
It’s a vital role. In the early days of cybersecurity solutions, a security team was obliged to gather all its log and event data together in a security information and event management (SIEM) solution or repository, then search through all of it for nuggets of insight. This left security teams in a reactive mode, overwhelmed by alerts and unable to discover sophisticated adversaries in their environment.
Today, in a next-generation security world of machine learning-powered and cloud-amplified solutions, much of the drudgery of managing alerts and automating basic responses is mechanised, enabling cybersecurity professionals to focus on strategic hunting. Threat hunters can hone the skills of spotting and understanding what aberrant assets and activity look like, investigate and stop incidents, and pass that knowledge on to the organisation so those loopholes can be closed. But what are the key attributes that these threat hunters need to be effective on the job?
Gartner claims that the triggers for proactive threat hunting fall into three categories. First is hypothesis-driven investigation, such as knowledge of a new threat actor's campaign based on threat intelligence gleaned from a large pool of crowdsourced attack data. In these cases, threat hunters will look into these data lakes to find those behaviours within their specific environment.
Second is investigation based on developed hunting leads, which spur hunters to look deeper into a specific system's activities to find potential compromise or ongoing malicious activity. The final trigger is analytics-driven investigation, where hunters pursue individual leads surfaced by advanced analytics and machine learning.
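A hypothesis-driven hunt of the first kind might, in simplified form, look like a targeted query over endpoint telemetry. The event shape, field names and behaviour below are invented for illustration; real hunts run against EDR or SIEM data at vastly greater scale.

```python
# Hypothesis from threat intelligence: an actor abuses Office documents
# to launch scripting hosts. Search process events for that behaviour.
SUSPICIOUS_PARENTS = {"winword.exe", "excel.exe", "outlook.exe"}
SCRIPT_HOSTS = {"powershell.exe", "wscript.exe", "mshta.exe"}

def hunt(events: list) -> list:
    """Return process events where an Office application spawned a
    script host, a behaviour rarely seen in legitimate use."""
    return [
        e for e in events
        if e["parent"] in SUSPICIOUS_PARENTS and e["process"] in SCRIPT_HOSTS
    ]

events = [   # fabricated endpoint telemetry
    {"host": "hr-07", "parent": "explorer.exe", "process": "powershell.exe"},
    {"host": "fin-12", "parent": "winword.exe", "process": "powershell.exe"},
    {"host": "fin-12", "parent": "winword.exe", "process": "splwow64.exe"},
]
print(hunt(events))   # only the Word -> PowerShell event is returned
```

The human judgement the article emphasises comes in next: a hit like this is a lead, not a verdict, and the hunter must decide whether it reflects compromise or a quirk of that particular environment.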
Knowing the ground
Knowing detailed information about normal enterprise process execution and system behaviour is important to the threat hunter. The information required to define what normal looks like takes time and care to acquire, sort and understand.
Even the most minimal trace activities can be used by a threat hunter in order to follow faint suggestions of misdeeds to paint a picture of whether an attack is in progress, or if the behaviour is irregular but does not represent malicious system activity.
This all leads back to one of the key benefits of a threat hunter - they know the quirks of the system and can weed out false positives from the idiosyncrasies of a particular environment. They reduce alert fatigue on the whole business and ensure everyone can play a productive part in the security process.
Knowing the enemy
Threat intelligence shows that many of the key adversaries attacking businesses, whether criminal or nation-state, have been using the same, or very similar, techniques for over a decade. For example, we see adversaries take a low-hanging-fruit approach to accomplish their mission, reusing tactics and infrastructure that have succeeded before. A perennial favourite of attackers also remains: password credential theft. Knowing what the major risks are and taking appropriate action can close many paths into the business.
Security risks and emerging threats are illuminated by government agencies like the UK’s National Cyber Security Centre, and security providers in reports like the Global Threat Report from CrowdStrike which in 2019 detailed the trend of ‘Big Game Hunting’, where eCrime actors combine targeted intrusion tactics with ransomware to extract big payoffs from large organisations.
Threat intelligence is a key part of creating a holistic security posture. Understanding adversary intentions and techniques is important for making the business a less attractive target for cybercriminal and nation-state groups.
The personality of a threat hunter
The threat hunter finds the human adversary and provides the insight and intelligence that enable the wider security organisation to be effective. Security operations and incident response need to form an effective, integrated circle of trust and cooperation.
Threat hunters come from unique and diverse backgrounds, but the one commonality is an endless amount of curiosity and the ability to accept failure every single day. In a world with one per cent true-positive findings, it takes dedication and focus to continue hunting for the unknown.
For the past three years, I have led Machine Learning and Data Science at DataArt, researching the pain points of businesses, proposing technological solutions and carrying out implementation. In my time working with the technology, I have identified seven key rules for smoothing the path of ML.
It is a popular idea in the engineering community to build bespoke solutions for every new problem. Junior data scientists will often truly believe that their highly customised model, built to fit the exact business case at hand, is a better idea than taking a solution off the shelf and adjusting it to the particular case. Well, the big news is - that’s probably true… in one case out of 100.
Cloud providers are rapidly developing ML services, treading the same path that Big Data services did before them. Ten years ago, Big Data was a pricey, easily scalable, fault-tolerant exotic fruit. Now it’s standard. The same goes for open-source instruments. In 99% of cases, you don’t need to invent a new library or a database, because most probably you’ll find something that works already on the market.
Every technical task should bring business value. If you do research, you should always be able to provide an example of a case you’re working on in real life. Usually, if you cannot quantify such a case, there’s no need to spend time on it.
ROI calculation methodology may need to be recalibrated. Compared to classical programming, machine learning is a probabilistic approach that never provides 100% accuracy, so you should always evaluate whether increasing accuracy by 2, 5, 10, or 20 per cent is worth the investment made. The good news is that data science projects sit very close to the business, so you should have a good feel for the return.
Let’s say you have hundreds of people whose work is to extract data from documents; adding 5% to the accuracy of automated extraction may mean millions of dollars per year. But if adding 1% to accuracy costs a fortune, a common situation in computer vision projects, maybe there are more valuable tasks and we can still rely on a human operator.
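To make that trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure (document volume, cost per manual correction, accuracy levels) and the function name `annual_savings` are illustrative assumptions, not numbers from a real project:

```python
# Back-of-the-envelope ROI for an accuracy gain. All numbers are made-up
# assumptions for illustration only.

def annual_savings(docs_per_year, cost_per_manual_fix, old_accuracy, new_accuracy):
    """Savings = documents that no longer need manual correction x unit cost."""
    fewer_errors = (new_accuracy - old_accuracy) * docs_per_year
    return fewer_errors * cost_per_manual_fix

# 2,000,000 documents/year, $2 per manual correction, accuracy 90% -> 95%:
savings = annual_savings(2_000_000, 2.0, 0.90, 0.95)
print(f"Estimated annual savings: ${savings:,.0f}")  # roughly $200,000
```

With these assumed inputs, a five-point accuracy gain is worth roughly $200,000 a year, which can then be weighed against the cost of achieving it.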
Integration itself is not a big deal. But be sure to factor in human perception. If the company has a rule-based system that makes decisions in a way everyone understands, a new solution that uses ML techniques can look like a black box to stakeholders. It’s therefore vital to have a clear migration plan that addresses potential risks.
We work with a business travel company which, among other things, purchases airline tickets. Booking took up to 17 hours, during which time the price could fluctuate. Fluctuation can depend on a number of factors: passenger flow, day of the week, time of day, season, major sporting or cultural events, weather, and so on. The goal was to prove that by analysing historical data with the help of ML, prices could be lowered by several per cent on top of the existing reductions produced by an old but proven rule-based system. But nobody wants to rely on a black box when a turnover of a billion dollars is at stake.
A migration plan was elaborated: first, an ML-based predictor was set up in parallel with the production system, followed by the processing of 10% of purchases in production, then 30%, with relative performance constantly measured. Even once the system is proven in production, at least 10% of tickets will still be purchased via the old rule-based algorithm, so that the model keeps receiving the latest updates from the sales engine and avoids overfitting (the state in which an ML system thinks it already knows everything about the world around it).
It’s very important to demonstrate to all stakeholders that the ML process is very gradual and reversible in case of failure.
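The traffic-splitting step of such a rollout can be sketched in a few lines. This is a minimal illustration, not the company's actual system: `price_with_ml` and `price_with_rules` are hypothetical placeholders standing in for the real pricing engines.

```python
import random

def price_with_ml(ticket):
    """Placeholder for the ML-based pricing path (hypothetical)."""
    return ticket["base_price"] * 0.97  # pretend ML finds a 3% saving

def price_with_rules(ticket):
    """Placeholder for the proven rule-based pricing path (hypothetical)."""
    return ticket["base_price"]

def route_purchase(ticket, ml_fraction, rng=random.random):
    """Route `ml_fraction` of purchases to the ML path, the rest to rules.

    The injected `rng` makes the routing deterministic for testing.
    """
    if rng() < ml_fraction:
        return ("ml", price_with_ml(ticket))
    return ("rules", price_with_rules(ticket))

# Start at 10%, widen to 30% once relative performance looks good, and
# never go to 100% so the model keeps seeing fresh rule-based outcomes.
path, price = route_purchase({"base_price": 500.0}, ml_fraction=0.10)
```

Because the split is just a parameter, widening the rollout from 10% to 30%, or rolling it back to zero after a failure, is a one-line change, which is exactly what makes the migration gradual and reversible.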
Every new version of your ML model is an experiment. It may or may not be successful, which is why you should always have a working CI/CD pipeline so you can roll back to an older version.
However, you should also not forget about proper versioning of the data, the model parameters, and the results of experiments. There are specific tools to help here, such as DVC, a version control system for ML.
It’s easy to get caught up in the routine of research: you start looking into one problem, then find another, and then find yourself in a completely different place that isn’t really important for the product.
Stay focused on your research, don’t get distracted, remember why you started and what the main goal is.
However, if you see a low-hanging fruit that may be valuable for the user - go get it! This is especially likely to happen when you build visualisations to represent the insights found in the data. Formally, it might not be critical at the time, but you never know what could be valuable for the business at some point.
Remember how great it is to receive minor but lovely updates in your favorite apps.
There’s always a complicated and accurate way of solving a problem, with a lot of data wrangling and feature engineering, meaning days of manual routine. But can you keep it simple and deliver 80% of the result with 20% of the effort?
We have learned to apply lateral thinking on top of the data. In one case, we needed to classify processes by business unit in a huge chemical company. We found an alternative to manual input by using publicly available NLP models, converting the processes and departments to vectors and finding the closest pairs.
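The core of that workaround can be sketched in a few lines of Python. The tiny hand-made vectors below stand in for real NLP embeddings (in practice they would come from a pretrained sentence-embedding model), and the department names and function names are illustrative assumptions:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def closest_department(process_vec, departments):
    """Assign a process to the department with the most similar vector."""
    return max(departments, key=lambda name: cosine(process_vec, departments[name]))

# Toy embeddings standing in for real NLP model output.
departments = {
    "logistics": [0.9, 0.1, 0.0],
    "finance":   [0.1, 0.9, 0.2],
}
process = [0.8, 0.2, 0.1]  # e.g. an embedding of "ship raw materials to the plant"
print(closest_department(process, departments))  # logistics
```

The same nearest-pair idea scales to thousands of processes once the vectors come from a real embedding model, which is what replaces the manual labelling effort.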
Once we needed to sort through 20 years’ worth of marketing materials containing Mastercard logos to find which were still current - tens of thousands of documents. At some point we realised that Mastercard’s last logo change happened in 2016. Two hours to label data and train a cloud-based image recognition service, and voila - only the fresh documents were left.
We always lack data or expertise in data science projects, so creativity is an essential asset for building a working solution.
New technology such as ML comes with a set of challenges not thrown up by more established tech. One of the main issues with technology as new as ML is ensuring that proposed solutions don’t end up frustrating management because of the inevitable uncertainty they cause.
ML and data science are a black box for most people, which is why expectation management is more important than it has ever been. Don’t forget to educate people, quantify the results of research against the goals, and think about integration in advance, from both the technical and the human point of view.
And, where appropriate, a combination of AI, heuristic methods and manual routine is fine. When people start building an AI-based solution, there’s often an intent to build something fully automated - an oracle with bells and whistles that can make recommendations with absolute certainty.
You’re lucky if this is possible, but don’t forget that even if ML doesn’t allow you to solve the task completely, it can be a lot of help in preparing the data required to make a decision. Taking the ultimate decision away from ML helps to avoid distrust from industry specialists, who usually prefer the final word to be a human one.
DataArt is a global technology consultancy that designs, develops and supports unique software solutions, helping clients take their businesses forward. Recognized for their deep domain expertise and superior technical talent, DataArt teams create new products and modernize complex legacy systems that affect technology transformation in select industries.
DataArt has earned the trust of some of the world’s leading brands and most discerning clients, including Nasdaq, Travelport, Ocado, Centrica/Hive, Paddy Power Betfair, IWG, Univision, Meetup and Apple Leisure Group, among others. DataArt brings together the expertise of more than 3,000 professionals in 20 locations across the US, Europe, and Latin America.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 5.
1. Health data and AI
2020 will be the year of health data. Everyone is agreed that smarter use of health data is essential to providing better patient care – meaning treatment that is more targeted or is more cost effective. However, navigating through the thicket of consents and rules as well as the ethical considerations has caused a delay to advancement of the use of patient data. There are now several different directions of travel emerging which all present exciting opportunities for patients, for health providers including the NHS, for Digital Health companies and for pharmaceutical companies.
2. Patient benefits – more than just improved healthcare.
In an industry reliant on building datasets, the availability of suitable patients is often a key challenge. Competition for health data is therefore hotting up, and we expect to see further innovation in how digital health providers will identify suitable cohorts and incentivise those patients to grant access to their data, including the use of cryptocurrencies.
3. New cyber security products
The growth of telemedicine has presented acute challenges relating to both data privacy and data security. Healthcare is currently the second biggest target for cybercrime and data breach fines are being robustly enforced by the Information Commissioner's Office under GDPR. As a result, in 2020 we expect to see a focus on cyber security products and practices tailored specifically for the healthcare market.
4. VR and AR for at-home therapies
As VR, AR and remote monitoring technology mature, those enhanced and simulated environments may enable a broader scope of patient exercises and interactions than may otherwise be possible during physical therapy. With devices becoming more affordable, during 2020 we expect an increase in their use in customised at-home care packages.
5. Roll-out of digital health in developing countries
Nearly 41 per cent of people in developing countries have a mobile broadband subscription (UNICEF figures). This provides great opportunities for digital health, particularly the use of cloud computing for data storage and telemedicine for remote communities. We expect the number of projects between providers and NGOs (such as UNICEF) or governments to continue to increase during 2020.
6. Consolidation and collaboration
The variety of digital health technologies will no doubt continue to expand and we expect an increase in the combination of products to offer end-to-end or complementary solutions. For technology developers this will be achieved through increased use of open source environments, in addition to M&A activity and collaborative projects. From the consumer's perspective, this will be achieved through the integration of technologies, platforms and the Internet of Medical Things (IoMT).
Dominik Birgelen, CEO of oneclick, says that business technology in 2019 has seen a year of development and innovation, especially in the modern workplace. But the evolution is still ongoing, and 2020 holds the potential to reveal more undiscovered advances across the industry:
Brexit has without a doubt been at the forefront of many business owners’ minds. With such political and economic uncertainty, there has been increasing pressure on organisations to operate more efficiently whilst reducing the total cost of ownership.
As a result, we have seen more organisations aggregate their workplace tools. By introducing XaaS (Everything-as-a-Service) solutions to the workplace, employers can ensure they’re delivering innovative technology to their employees whilst being cost conscious. The XaaS platform leverages the abstractions that SaaS, IaaS, and PaaS provide, giving secure access to all applications and data from anywhere. However, due to its complex management and infrastructure, Everything-as-a-Service has only recently come to light for many business owners and IT managers. We can therefore expect 2020 to be the year this term is not only explored more but also implemented across more organisations.
As the modern workplace continues to evolve, workers today require multiple digital tools to remain productive and efficient. They expect everything to be available and accessible in one place, and XaaS meets this demand. A unified workspace is a logical, robust, and secure solution for changing workplace requirements, where work increasingly takes place from multiple locations across distributed teams. Unified workspaces provide greater productivity and employee satisfaction. They leverage cloud services, enabling access to be entirely web-based and providing a central portal for the execution of tasks independent of the end device.
In addition to providing employees with the tools they require, the modern CTO will experience a reduction in workload through using such a platform, meaning more attention can be applied to other aspects of the business.
Companies utilising Everything-as-a-service are productively contributing to their growth as a business and as a team. As the cost benefits of the service become more visible over time, these funds can be invested across different parts of the business.
Over time, as businesses begin to report the benefits of implementing such tools in the workplace, Everything-as-a-Service will become better understood and more widely explored. As its visibility continues to increase, I believe it will be a leading technology trend of 2020.
With so much of the world’s critical infrastructure relying on cloud services provided by American monoliths, it’s no surprise that we’re now seeing a shift towards European countries creating their own cloud platforms in order to regain digital sovereignty. This is a trend I expect we’ll see much more of in 2020 and beyond, particularly as European privacy standards like GDPR remain incompatible with US privacy practices such as the Cloud Act. Germany and France have already begun this move with the announcement of the European cloud Gaia-X project, and it will not be long until others follow with their own products designed to end European reliance on US providers.
Moving away from traditional messaging apps
Users are growing tired of the constant scandals, data breaches, and general lack of choice amongst the top messaging providers, and although we’ve seen growth in niche options such as Telegram and Signal, there is real scope for a solution which can provide genuine interoperability, privacy, and freedom from big tech’s walled gardens. Other options are currently in development, such as Chat Over IMAP (COI), which uses existing email infrastructure to deliver a free and interoperable messaging platform. However, while research reveals that three quarters of Britons would like to change messaging providers, it remains to be seen whether there’ll be mainstream adoption of alternative, open messaging apps, or whether users will simply seek out the next big proprietary platform.
Increased adoption of open source
Open source has now firmly entered the mainstream, with IBM acquiring RedHat for a record $34bn earlier this year, and Microsoft finally throwing its weight behind the movement. These deals reveal a changing attitude to the way we want the web to be run, and businesses are increasingly adopting open source for the ease, simplicity, and creativity behind the community-sourced code. Looking ahead, I think we can expect to see real growth in open source adoption within the IoT in particular. As these devices come to perform increasingly complex, challenging, and mission critical tasks across society, devs will need to utilise the vast array of knowledge and expertise within the open source community to meet the challenges of the future.
Low code 2020 predictions from Jennifer Gill, Senior Director, Product Marketing, Pega:
By Mike Campfield, VP of Global Security Programs, ExtraHop.
Tool sprawl is strangling us. In the security boom, enterprises have accumulated far more IT tools than they really need. A 2018 Forrester survey revealed that 55 percent of organisations juggle at least twenty tools between security and operations teams. Another survey, by 451 Research, found that 39 percent of the organisations surveyed were using 11 to 30 monitoring tools to see across their networks.
Amid the chaos, businesses are finding themselves with less speed, efficiency and information than they should have and certainly with less than all these tools promised. In fact, tool sprawl can often hinder security more than help. The hodge-podge of tools offers an uneven view of the network, creates silos of data, slows incident response and adds un-needed costs to IT budgets.
Most enterprises are attempting to move away from this state of affairs: ESG Research reports that 66 percent of enterprises are trying to consolidate their security portfolios.
As a result, many organisations are caught between two distinctly unfavourable outcomes.
The first is the situation they are currently in: Weighed down with tools from a variety of different vendors which don’t integrate and don’t offer them a unified vision of their network.
The second is a monocultural shift: one set of tools which handles the enterprise's security requirements. Still, such an option can be stifling, denying SecOps the ability to choose from best-of-breed tools. Furthermore, if everyone uses the same set of tools, a compromise of one of them means a potential compromise everywhere it's used.
A heterogeneous array of tools is a security asset in itself. Just as in the natural world, it helps to protect against infection. Take the Cavendish banana, which has dominated the world market since the 1950s, accounting for 47 percent of global banana production. It might be popular, but it’s also extremely susceptible to disease due to a stifling lack of genetic diversity. Neither blind permissiveness nor draconian restriction provides the right answer.
So how did we get here? The sudden and rude awakening to the reality of security threats has led to the quick, panicked acquisition of new and shiny tools, of which there seems to be a new one every two seconds. Cybersecurity budgets have exploded, private investment in the sector has boomed in kind, and cyber companies have gotten happy and fat in a market set to reach $103.1 billion this year.
But that’s also meant that enterprises have quickly acquired a tonne of tools, many of which do the same thing, as technology has developed and new threats have arisen to meet it. Mergers and acquisitions have only added to the problem, stitching together IT infrastructures which were built to exist separately.
The advent of the cloud has created its own problems. Most enterprise web traffic - 85 percent of it in fact - is now in the cloud. At the same time, cloud hosts often give customers their own set of tools for viewing and managing that traffic. Unfortunately, those tools often don’t integrate with an enterprise’s tools - only exacerbating the inconsistent image of the network that enterprises already deal with.
One of the real drivers behind tool sprawl is the disconnect between the different departments of modern enterprise IT. SecOps sits in one part of the office, DevOps in another and ITOps in yet another. While each forms a critical part of the enterprise IT infrastructure and their functions often overlap, they rarely communicate. This has implications for the organisational character of these departments as much as for the technology: many of the tools used across different departments have effectively the same function.
For example, ITOps teams rely on a set of tools called Network Performance Monitoring and Diagnostics (NPMD) to oversee the network and maximise its performance. Meanwhile, SecOps teams use a set of tools known as Network Detection and Response (NDR) to look out for threats travelling through the network. Those tools perform similar roles and could provide both teams with a great deal of insight into not just their own roles but each other’s too.
Siloing tools makes coordination harder during incident response and remediation more arduous. The data is there, but the right people can’t get to it because it’s coming to other parts of the organisation first. It hobbles network security, but it also means increasing costs for tools which do much the same thing.
The 2018 Gartner paper Align NetOps and SecOps Tool Objectives with Shared Use Cases recommends that if two departments need access to the same network flows, packets and device configurations, then they should be sharing the tools and instrumentation they need to access them.
Increasingly we’re seeing organisations take advantage of that. That integration provides further benefits still, helping to bridge the communication gaps between the various parts of the modern IT infrastructure. The opportunities this provides mean that teams can collaborate to improve their processes and practices, gaining valuable insight into each other's experience and expertise.
Often, security remains a specialist concern, the preserve of a subset of a subset of the enterprise. But security should be everyone’s problem, and even laymen are critical to the security of an organisation. From cyber-hygiene to actual threat hunting, every part of the IT infrastructure and beyond can be enlisted in defence. Providing people with tools that have them reading from the same playbook is a significant step forward.
Organisations have to look not at the tools themselves but at what the tools are doing for them. A tool-driven approach will only exacerbate the siloing of functions, data and processes. The approach to a security portfolio has to be driven by the data the organisation needs rather than the tools it thinks will provide it. First ask what data you need, then look at which security tools can deliver it.
Consider the data used in detecting threats: Fundamentally, there are just three sources of data for threat detection. Log data is ubiquitous and the lifeblood of any SIEM solution. Endpoint data provides crucial insight into what is happening on a device. Finally, network data provides an invaluable pervasive view of activity in an environment—as well as a method of observation that attackers cannot evade. It would make sense for organisations to consolidate their tools such that they have one solution for each of the data sources, and then make sure those solutions integrate with one another.
Tool sprawl is the product of enthusiasm. As with so many aspects of technological innovation, people often buy in without accounting for what the unfortunate side effects of that enthusiastic acquisition might be in a broader context.
Still, a knee-jerk reaction to tool sprawl threatens to stifle innovation and undermine the state of security across the industry. Enterprises need to change their conception of tools if they want to find a way between the two extremes. They have to start thinking about what they really need out of security tools and, from there, change how the organisation makes use of them. Doing so offers the opportunity to transform not just the effectiveness of a security portfolio but of an entire organisation.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 6.
Reading technology predictions every year can feel repetitive. Particularly when last year’s predictions of AI, Blockchain, IoT and flying cars all carry through as they are technology foundations upon which we build. The exciting bit is how we use these foundations to evolve and uncover new opportunities. That’s the innovative inspiration that keeps us reading predictions year after year.
Reflecting first on the year gone, did we fulfil the predictions made by most senior technology observers in 2019? Did we finally apply artificial intelligence to our big data effectively? Did Nike’s introduction of self-lacing, optimised trainers result in the two-hour marathon barrier being broken?
We certainly made strong progress. AI and machine learning are being applied against many more practical use cases in our everyday lives, from fraud prevention with computer vision to operational efficiencies through increasingly AI driven supply chain and customer demand choreography. Blockchain is rising out of the trough of disillusionment and starting to find relevant applications; some for the good of data democratisation, others just to make the connected world more secure.
But what should we be looking out for or striving towards in 2020? Honestly, it is more of the same, just better and more refined. Here are seven of my 2020 predictions:
1. Entertainment will be reinvented through enhanced experience-specific service architectures that support better multi-modal experience design, including extended reality and super-channel convergence. For example, F1 on Twitch, the leading service and community for multiplayer entertainment, with real-time group in-race gaming.
2. The commoditisation of AI based recommendation engines, such as Netflix’s recommendation engine as a service.
3. DNA computers, or at least quantum computers, will threaten everything we have (only recently) learnt to trust in cyber security.
4. More human-aware environments. Highlighted by Japan’s famous robot hotel, Henn-na, which this year laid off half of its robot staff because they created too much work for humans, providing some confidence and consolation to those worried about being made redundant by technology.
5. Better application of IoT to enhance environments and drive both utility and entertainment experiences will become mainstream.
6. And, as a sub-part of (5), there will be wider adoption of multi-sensory interfaces, which will see the return of MR glasses with integrated voice interfaces as many brands explore how voice can work for them, not just through Alexa.
7. With 2020 being the year of 5G, it will also be the year rich mobile vision and voice interfaces become the norm. The rise of haptic interfaces will start to become less niche as mainstream application stretches out of just entertainment. The blend of human and robotic automation will become more powerful as a result.
8. Bonus…2021 predictions will be written by AI.
2020 trends and predictions from Henry Knight, Head of Product & UX, SmartDebit:
With the increasing sophistication of the IoT and the advancements in data security, there is greater scope in 2020 for making everyday routines, tasks and experiences smarter and more intuitive. One such example, aligned to a longstanding interest of my own in discreet computing, is that of ‘smart spaces’. I expect the exact mechanics will rapidly evolve during the year, but a key theme is the increasingly powerful ‘enmeshment’ of the personal technology we all hold in our hands, specifically mobile devices and even embedded payment chips, interfacing with public spaces such as stores, galleries and open areas to provide meaningful, tailored experiences. It’s a scary but exciting frontier for socio-technological development, where our interests, desires and consumerist tendencies are used to shape the world around us. The question remains open, from thinkers like Douglas Rushkoff, whether this is a development that is for us or against us.
Having a keen interest in the payments space, I note that 2020 is due to welcome Request to Pay as a service. Request to Pay is not a new payment method; instead it is a messaging system designed to overlay existing payment methods such as credit card and Faster Payments. Its broadest intention is to embed a new way, a new etiquette even, into the business/consumer relationship, designed to give payers more control over when and how they pay for goods and services, baking in the idea of paying in part or rejecting payment as a response to a payment request.
The service is optional for organisations collecting payments, and adoption will require a mind shift if it is to succeed. Most likely it will only grow exponentially in usage when payers start to see some advantages and begin questioning why all businesses aren’t adopting more flexible payment mechanisms. That desire may be felt more keenly by certain payer types, such as those on less predictable income streams who require payment flexibility to work with their cashflow.
On one hand, it’s easy to see how certain organisation types can adopt it, like charities who want to offer donors flexibility in their charitable giving rather than causing a hard stop in their generosity altogether. On the other, there’s potentially a dystopian outcome where companies allow their customers to accrue larger and larger debts, so long as they are extracting some kind of payment instead of nothing at all.
Evolving Executives in an Evolving Industry
Leon Adato, Head Geek™, SolarWinds
Today’s executives operate in a world dominated by services. The days of CapEx IT spend and yearly budgeting are behind us. Decision-making occurs not in years and quarters, but with an in-the-moment adaptability allowing executives to remain competitive in modern business environments.
These elements are changing the makeup of C-suites. We’re seeing a shift away from boardrooms dominated by legacy executives as leadership becomes more encompassing of the skillsets needed to achieve success in today’s technology-driven economies. As executives continue to evolve in 2020 and beyond, we can also expect to see a greater convergence between the C-suite and the IT department to identify common ground to achieve modernization, digitization, and transformation.
Technologists and executives historically haven’t understood each other. This lack of understanding has added layers of complexity resulting in expensive misunderstandings for organizations as they seek to modernize. Evolved C-suites will identify this challenge and see it as an opportunity to help bridge the language gap between IT and executives.
Evolved executives will seek mentoring opportunities with tech pros, ultimately resulting in better use of technology for businesses. Changing the dialogue with IT means leadership won’t have to distill technical jargon, and they can more effectively work toward the common goal of ensuring business performance.
Breaking Down Siloes with IT Operations Management
Sacha Dawes, Head Geek™, SolarWinds
Multi-cloud and multi-premises will continue to dominate the IT reality in 2020. The pervasive use of IT across departments within organizations is forcing IT siloes to break down, resulting in tech pros from different teams needing to work together to support business objectives like productivity and performance. Add to this the challenge of managing highly distributed and multi-environment applications and infrastructures, and the path ahead for tech pros can appear increasingly daunting.
As businesses evolve, tech pros are tiring of the proliferation of “best of breed” tools from different vendors and are instead looking toward integrated solutions to give them the ability to manage environments inside their firewall as well as hybrid and public-cloud environments. Visibility needs to be centralized to support modern, multi-cloud, and multi-premises architectures, with an integrated view showing dependencies and relationships between applications and the infrastructure on which they depend. Only then can tech pros not only understand performance metrics across the business but speak a common IT language as they bridge previously siloed functions.
Seeing APM for the First Time in 30 Years
Patrick Hubbard, Head Geek™, SolarWinds
While application performance management (APM) tools have been available since the 1990s, they’re still among the most under-appreciated tools in the technology industry. In 2020, this will change. As companies adopt hybrid cloud, they use dynamic technologies to build new application features. While affording greater flexibility, agility, and scalability day-to-day, these changes to how we deliver applications make traditional monitoring techniques much more difficult (sometimes impossible), while the business expects even greater performance.
We’ve reached an APM crossroads, and in 2020 we have a rare chance to catch up with recent technology changes and regain control of application delivery monitoring. We’ll see the tech pro’s focus expand from infrastructure performance alone to success metrics that also encompass evolving real-world, end-user experiences. For many engineers, this process will start with a simple change in perspective: learning to think like an end user to monitor like a wise admin. A renewed emphasis on implementing the tenets of APM offers organizations a silver lining: it can buy the freedom to innovate. It’s a way to start more meaningful conversations with the CIO, where tech pros can begin to educate business leaders about new ways to make customers happy.
As businesses continue to prioritize customer-centricity and end-user experience in the year ahead (and beyond), they’ll use APM tools to evolve, innovate, and reach broader business goals, ultimately resulting in increased implementation and corresponding skill development.
Back to the Basics: Avoiding Disaster Through Good Cyberhygiene
Thomas LaRock, Head Geek™, SolarWinds
Whether it’s spear-phishing, malware, or ransomware, criminals keep finding new ways to attack and compromise businesses. In 2020, to combat these threats and avoid low-hanging-fruit vulnerabilities, organizations will realize the need to prioritize best practices for good cyberhygiene and training for everyone from entry-level staff to C-suite leaders. This ranges from simple employee best practices, such as leveraging password managers and account takeover prevention tools, to following the 3-2-1 backup rule (three copies of a data set, in two different formats, with one copy stored offsite), to critical end-user training to prevent ransomware and phishing attacks.
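The 3-2-1 rule described above is simple enough to express as a quick self-audit. A minimal sketch, with illustrative names not tied to any particular backup product:

```python
# Toy check of the 3-2-1 backup rule: at least three copies of a data set,
# on at least two different formats, with at least one copy offsite.
def satisfies_3_2_1(copies):
    """copies: list of (media_format, is_offsite) tuples for one data set."""
    formats = {fmt for fmt, _ in copies}
    offsite = any(off for _, off in copies)
    return len(copies) >= 3 and len(formats) >= 2 and offsite

backups = [("disk", False), ("tape", False), ("cloud", True)]
print(satisfies_3_2_1(backups))                             # → True
print(satisfies_3_2_1([("disk", False), ("disk", False)]))  # → False
```

Even a check this crude catches the common failure mode of multiple copies that all share one fate (same media, same site).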
In the coming year, I also expect to see an evolution in the way we think about cybersecurity: moving from more of a hardware security focus to a broader digital security focus, to encompass the swath of personal data existing online and potentially being hacked or breached. We’re already beginning to see security and tech pros’ acknowledgement of this changing landscape with a greater focus on security-based skillsets. The recent SolarWinds® IT Trends Report 2019: Skills for Tech Pros of Tomorrow revealed 54% of respondents considered SIEM and threat intelligence the second-most important technology for career development by weighted rank.
Hackers, viruses, and intelligent malware are all part of today’s hostile cybersecurity landscape, and the adoption of good cyberhygiene will help organizations better prevent and prepare for security threats.
2020: At a 5G Crossroads
Patrick Hubbard, Head Geek™, SolarWinds
As chipmakers and ISPs alike continue along the path to broad 5G adoption, 2020 will prove to be a critical inflection point. While we’ll see certain societal impacts (such as improved application performance and the enablement of new technologies for IoT, collaboration, etc.), we can expect ISP powerhouses like Time Warner to radically shift their offerings while telcos like Verizon continue to invest in 5G equipment.
5G is coming, and although it may seem like the next generation of wireless tech will bring nothing but speed, responsiveness, and the reach needed to unlock the full capabilities of emerging tech trends, in actuality it will introduce unprecedented pain points, many of which have no current solution.
In 2019, we saw smartphone makers like Samsung and ZTE bring 5G handsets to market, but users have only been able to scratch the surface of 5G’s potential, as telcos and networking companies are still building the infrastructure to support broader coverage of this next-generation tech. Consumers running applications on a “5G lite” network (like AT&T’s 5Ge band) may experience faltering connectivity when switching between different speeds, resulting in degraded or unpredictable performance for apps engineered on the assumption of broadly available high-performance networks. At the same time, monitoring applications running on increasingly fragmented networks will become even more important, pushing developers to optimize applications for all connectivity speeds. Being able to measure network performance will also be key to ensuring further 5G infrastructure rollouts meet latency expectations.
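One simple way to reason about whether a network is “meeting latency expectations” is to compare a high percentile of measured latencies against an SLA threshold, so that a single spike on a fragmented network is visible rather than averaged away. A hypothetical sketch:

```python
import math

# Illustrative latency check: the p95 of a set of round-trip samples (ms)
# is compared against an assumed SLA threshold. Sample values are made up.
def p95(samples_ms):
    ordered = sorted(samples_ms)
    idx = math.ceil(0.95 * len(ordered)) - 1
    return ordered[idx]

def meets_sla(samples_ms, threshold_ms):
    return p95(samples_ms) <= threshold_ms

samples = [8, 9, 10, 11, 12, 9, 10, 45, 10, 9]  # one outlier spike
print(p95(samples))             # → 45
print(meets_sla(samples, 50))   # → True
print(meets_sla(samples, 20))   # → False
```

Percentiles rather than averages are the usual choice here precisely because connectivity that flips between speed tiers produces long tails.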
Enterprises Take a Page From the Smart City Playbook
Patrick Hubbard, Head Geek™, SolarWinds
When it comes to digital transformation, there are parallels between what smart cities and enterprises are doing. Smart cities and innovative enterprises alike are using technology as an agent of change. As demand for digital experiences grows in 2020, enterprises will start to realize there’s more to learn from smart cities.
Smart cities use technology to deliver services more effectively and ultimately provide a better living environment for citizens. The same can be said for enterprises looking to untangle their customer’s supply chain or make a physical plant run more efficiently. But cities also face the same challenges of complexity, aging custom applications and overdue infrastructure upgrades, everything mature enterprises deal with, and more. Some may even argue smart cities are already transforming faster than businesses, adopting mobility solutions citizens enjoy and progressing public-private partnerships to drive further innovation. Smart cities are learning, and finally making progress, in areas enterprise businesses are keen to address. The research, experimentation, data, and best practices coming out of smart city efforts can be helpful as businesses chart a course for business transformation and technology modernization.
IT noise can be defined as any piece of information that a first responder has to deal with in order to solve a problem that is affecting their business – and that is not contributing to the understanding or resolution of that issue.
By Guy Fighel, General Manager AIOps & VP of Product Engineering at New Relic.
IT noise is irrelevant information that makes it harder to spot and solve problems. It’s an issue because in an IT operational environment, every second IT is not doing what it’s supposed to means potential revenue loss. Today, modern technologies, such as AIOps, are helping teams reduce operations noise significantly. Over the next decade, experts and innovators will continually push to eliminate the last mile of noise reduction. But how?
Applying intelligence throughout the DevOps cycle
Rather than narrowing your IT approach to one specific aspect of the incident response process, teams should strengthen the relationships between each stage of the process to create a more powerful solution. Focusing only on faster detection, faster understanding, faster response, or faster follow-up is not enough; teams need a comprehensive tool that thinks like their best SREs—from a systems perspective.
Tapping intelligent assistance
Understanding the root cause and determining steps to resolution usually account for the majority of the time between an issue occurring and its remediation. To achieve this, teams need useful context about existing issues, including their classification based on the “Four Golden Signals” (latency, traffic, errors, and saturation) and correlated issues from across an environment.
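As a rough illustration of how such context might be attached, here is a toy classifier that tags alert text with one of the Four Golden Signals. The keyword lists are my own assumptions for the sake of the example, not part of any product:

```python
# Hypothetical triage step: tag raw alert text with one of the Four Golden
# Signals (latency, traffic, errors, saturation) via naive keyword matching.
SIGNAL_KEYWORDS = {
    "latency": ["slow", "latency", "response time"],
    "traffic": ["requests", "throughput", "qps"],
    "errors": ["error", "exception", "5xx", "failed"],
    "saturation": ["cpu", "memory", "disk full", "queue depth"],
}

def classify(alert_text):
    text = alert_text.lower()
    for signal, words in SIGNAL_KEYWORDS.items():
        if any(w in text for w in words):
            return signal
    return "unclassified"

print(classify("p99 response time above 2s"))   # → latency
print(classify("checkout service 5xx spike"))   # → errors
```

A real system would classify from structured telemetry rather than text, but even this crude tagging shows how grouping issues by signal gives responders a starting point for root cause.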
Leveraging smarter tools for creating perfect software
In order to help customers create stellar software, experiences, and businesses, it’s critical to embrace solutions that are easy to connect and configure, work with the tools teams already use, create value throughout the entire observability process, and learn from data patterns and user feedback to get smarter over time. AI is one more step in this journey.
DevOps, SRE, and on-call teams rely on a multitude of tools to detect and respond to incidents. This ever-growing list of tools can pose problems: incident, event, and telemetry data is fragmented, siloed, or redundant, making it harder to find the information needed to diagnose and resolve incidents.
AIOps platforms promise to solve these problems with a centralized, intelligent feed of incident information that displays everything you need to troubleshoot and respond to problems, all behind a single pane of glass. Unlocking this value, though, can require a significant time commitment and workflow shift, potentially costing teams hundreds of hours in integration, configuration, training, and on-boarding tasks.
Delivering operations noise reduction and team augmentation
On-call teams are familiar with noisy alerts triggered by low-priority, irrelevant, or flapping issues. These can lead to pager fatigue, cause distractions, and increase the probability that a critical signal will go unnoticed.
Developing an AIOps system in which AI augments humans means IT teams can achieve an IT noise-free production environment. Via the AIOps system, the user gets a much richer problem description, with details of all the sub-incidents in one single notification, enabling them to more easily identify the root cause of the issue and solve all sub-incidents at once. Once operations understand what is wrong in one incident, it is much easier for them to solve the problem.
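The idea of rolling sub-incidents into one enriched notification can be sketched very simply. Here alerts are correlated naively by service name; the alert shape and values are hypothetical:

```python
from collections import defaultdict

# Minimal sketch of the noise-reduction idea: correlate related alerts
# into one enriched incident per service instead of paging once per alert.
def correlate(alerts):
    """alerts: list of dicts with 'service' and 'message' keys (assumed shape)."""
    incidents = defaultdict(list)
    for alert in alerts:
        incidents[alert["service"]].append(alert["message"])
    return {svc: {"sub_incidents": msgs, "count": len(msgs)}
            for svc, msgs in incidents.items()}

alerts = [
    {"service": "checkout", "message": "latency high"},
    {"service": "checkout", "message": "error rate 4%"},
    {"service": "search",   "message": "queue depth growing"},
]
grouped = correlate(alerts)
print(len(grouped))                  # → 2 incidents instead of 3 pages
print(grouped["checkout"]["count"])  # → 2
```

Production AIOps platforms correlate on far richer signals (topology, time windows, learned patterns), but the payoff is the same: fewer, richer notifications.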
In summary, IT operations teams that want to successfully eliminate IT noise need to apply the techniques above to augment their teams and ultimately have a positive impact on the bottom line.
As an IT leader, hearing “what’s our digital transformation strategy?” may be enough to send a shiver up your spine.
By Tom Needs, COO at Node4.
However daunting it may seem, one of the biggest challenges of digital transformation is the fact that the term has been so widely applied, it’s now something of a cliché, or as Forrester suggests, has “come to mean so many things that it’s almost meaningless.” Nevertheless, it has become such a key driver of eminent change in the global economy that it takes a courageous business to not have a digital transformation plan in place.
Digital transformation is prevalent – with figures from Gartner revealing that 79% of corporate strategists say it is “reinventing their business”. But what it actually means will vary for every organisation, and therefore, mapping the right path to achieving a shared understanding within the business and getting buy-in to a digital transformation plan depends on being clear on this definition, purpose and constraints.
Taking a brief step back from the technology alone makes a difference. As the Enterprisers Project CIO community says, “love it or not, the business mandates behind the term – to rethink old operating models, to experiment more, to become more agile in your ability to respond to customers and rivals – aren't going anywhere.”
By adopting these three core principles, businesses can begin to refine their approach and discover what digital transformation means for them – not for everyone else – and how to find the correct path to go down. That’s no easy task, but here are five useful steps to consider:
Understand that digital transformation is a business transformation
Before you embark on any digital transformation strategy, it’s crucial to start by understanding your key business drivers and strategic priorities. Digital transformation is not a technology box-ticking exercise, and any strategy must focus on business objectives and explain how digital transformation will benefit the business.
More broadly, you also need to gauge what level of change your organisation can effectively resource and enforce. This is necessary because change brought about by digital transformation is so much more than technology upgrades or increased IT investment. Research published by McKinsey illustrates the considerable effort required, arguing that technology is “only one part of the story”. Success depends on some radical activity, from reimagining the workplace and upgrading the organisation’s ‘hard wiring’, to changing the way you communicate. Not only do businesses need to have ‘digital savvy’ leaders, but they also need to build relevant talent and skill sets throughout their organisations.
Engaging with the bigger picture requires a willingness to learn lessons from those who report success, to ensure that digital transformations do not “fall short in improving performance and equipping companies to sustain changes.” This has happened to many a digital project; for example, an organisation may have been unable to sufficiently update entrenched, analogue business processes to support a whizzy new digital customer interface.
Acknowledge the broader business environment
It is important that businesses identify external pressures and challenges covering key areas such as their markets, processes, regulatory environments, competition, and supplier and customer ecosystems. Time and effort should be put into researching where to invest and how digital transformation strategies are being applied in relevant businesses and industries.
This underlines the need to focus digital transformation more broadly than tech procurement, because, as Gartner argues, “the non technological aspects, if not addressed, can mask the depth of organisational transformation required and become serious inhibitors.” Industry inertia, for example, can lead to the failure of digital transformation projects. You might have a shiny new customer-facing process, but if it relies on a partner who can’t support digitally transacting in this way, it will never achieve its promise.
Examine your technology options
IT departments often don’t have the time or skillset to develop a comprehensive roadmap for change and transformation. Add to this the bewildering choice of technology providers out there pushing the digital transformation message, and the resulting complexity of a digital transformation project “can be a killer”.
Before deciding on technology choices, review your current IT estate carefully and map out both current and ideal future states for core infrastructure and applications. There are important decisions to be made about legacy business applications and this map will help you to research the options for modernisation. There is a lot of noise in the industry about being ‘cloud-first’ and for many, the idea is to plan for everything to eventually be in the cloud. However, it’s worth noting that it’s not the only option for legacy applications and some of the first movers into the cloud have not seen the cost and performance benefits they planned for.
Set performance measures
KPIs offer IT leaders the possibility to set a shared understanding of digital transformation within the business and maintain an ongoing discussion. The temptation is to evolve existing IT focused metrics, but with an objective as broad as business transformation there is a good case for starting over. Consider what it means to your organisation to be a digital business; can you demonstrate and measure the impact on customers, speed of operations, data exploitation and the rate of innovation?
There are many digital transformation KPI lists available on the internet, but this is another temptation best avoided. Your KPIs need to be both industry specific and organisation specific. IDC has created a helpful digital scorecard for CIOs thinking about how to best fashion their KPIs. With shared understanding as a goal, your metrics need to interest and engage your organisation’s leadership team. They also need to reflect the fact that digital transformation is a long-term programme.
Find the best partners
It is rare for organisations to embark on the digital transformation journey alone, so tech partnerships can ensure your approach is future-proof and flexible enough to adapt with changing demand and technology. Look for partners who, in Forrester’s words, can ‘define digital transformation’. The right partners should be able to demonstrate an understanding of your business, markets, competitive pressures and opportunities. When you’ve started narrowing down your options, ask for evidence that they can successfully apply technology to deliver genuine transformative change. Otherwise, you risk missing the target by simply moving from one set of technology vendors and service providers to another, without addressing your wider objectives.
Digital transformation carries a mixture of both risk and reward. Whether you have just started your journey, or have already embarked on it, if you’re establishing and evaluating objectives, progress, challenges and benefits, it can completely reinvent your enterprise and pay huge dividends.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 7.
Throughout 2019, technology has continued to have a transformative impact on businesses and communities. From the first deployments of 5G to businesses getting to grips with how they use artificial intelligence (AI), it’s been another year of rapid progress.
From an IT perspective, we have seen two major trends that will continue in 2020. The first is that on-premises and public cloud will increasingly become equal citizens. Cloud is becoming the new normal model of deployment, with 85% of businesses self-identifying as being predominantly hybrid-cloud or multi-cloud today. Related to this are the issues of cybersecurity and data privacy, which remain the top cloud concerns of IT decision makers. In 2020, cyber threats will increase rather than diminish, so businesses must ensure that 100% of their business-critical data can be recovered.
Here are some of the key technology trends that businesses will look to take advantage of and prepare for in the year ahead.
1. Container adoption will become more mainstream.
In 2020, container adoption will lead to faster software production through more robust DevOps capabilities, and Kubernetes will consolidate its status as the de facto container orchestration platform. The popularity of container adoption, or ‘containerization’, is driven by two things: speed and ease. Containers are lightweight, isolated runtime environments that decouple an application from the underlying operating system. With containers, microservices are packaged with their dependencies and configurations. This makes it faster and easier to develop, ship and deploy services. The trend towards multi-cloud means businesses need data to be portable across various clouds — especially the major providers — AWS, Microsoft Azure and Google Cloud. 451 Research projects the market size of application container technologies to reach $4.3 billion by 2022, and in 2020 more businesses will view containers as a fundamental part of their IT strategy.
2. Cloud Data Management will increase data mobility and portability.
Businesses will look to Cloud Data Management to guarantee the availability of data across all storage environments in 2020. Data needs to be fluid in the hybrid and multi-cloud landscape, and Cloud Data Management’s capacity to increase data mobility and portability is the reason it has become an industry in and of itself. The 2019 Veeam Cloud Data Management report revealed that organizations pledged to spend an average of $41 million on deploying Cloud Data Management technologies this year. To meet changing customer expectations, businesses are constantly looking for new methods of making data more portable within their organization. The vision of ‘your data, when you need it, where you need it’ can only be achieved through a robust CDM strategy, so its importance will only grow over the course of next year.
3. Backup success and speed give way to restore success and speed.
Data availability Service Level Agreements (SLAs) and expectations will rise in the next 12 months, whereas the threshold for downtime, or any discontinuity of service, will continue to decrease. Consequently, the emphasis of the backup and recovery process has shifted towards the recovery stage. Backup used to be challenging and labor- and cost-intensive. Faster networks and backup target devices, as well as improved data capture and automation capabilities, have accelerated backup. According to our 2019 Cloud Data Management report, almost one-third (29%) of businesses now continuously back up and replicate high-priority applications. The main concern for businesses now is that 100% of their data is recoverable and that a full recovery is possible within minutes. As well as providing peace of mind when it comes to maintaining data availability, a full complement of backed-up data can be used for research, development and testing purposes. This leveraged data helps the business make the most informed decisions on digital transformation and business acceleration strategies.
4. Everything is becoming software-defined.
Businesses will continue to pick and choose the storage technologies and hardware that work best for their organization, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past. Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. By allowing businesses to create a blueprint of what infrastructure should do, then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites. Software-defined approaches such as IaC and Cloud-Native — a strategy which natively utilizes services and infrastructure from cloud computing providers — are not all about cost though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability — enabling organizations to deploy applications with speed and ease. With over three-quarters (77%) of organizations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.
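The core IaC pattern, declaring desired state as data and letting a reconcile step compute the changes, can be shown without any real provider API. A toy sketch (the resource names are invented for illustration):

```python
# Toy illustration of the IaC principle: desired infrastructure is declared
# as data, and a reconcile step computes what to create or remove.
# No real provider API is used; this is the pattern, not a tool.
def reconcile(desired, actual):
    to_create = sorted(set(desired) - set(actual))
    to_delete = sorted(set(actual) - set(desired))
    return to_create, to_delete

desired = {"web-vm", "db-vm", "cache-vm"}   # the blueprint
actual = {"web-vm", "old-batch-vm"}         # what is currently provisioned
create, delete = reconcile(desired, actual)
print(create)  # → ['cache-vm', 'db-vm']
print(delete)  # → ['old-batch-vm']
```

Tools such as Terraform and Pulumi build on exactly this diff-then-apply loop, which is what makes the same blueprint repeatable across multiple sites.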
5. Organizations will replace, not refresh, when it comes to backup solutions.
In 2020, the trend towards replacement of backup technologies over augmentation will gather pace. Businesses will prioritize simplicity, flexibility and reliability of their business continuity solutions as the need to accelerate technology deployments becomes even more critical. In 2019, organizations said they had experienced an average of five unplanned outages in the last 12 months. Concerns over the ability of legacy vendors to guarantee data availability are driving businesses towards total replacement of backup and recovery solutions, rather than augmentation with additional backup solutions used in conjunction with the legacy tool(s). The drivers away from patching and updating solutions to replacing them completely include maintenance costs, lack of virtualization and cloud capabilities, and shortcomings related to speed of data access and ease of management. Starting afresh gives businesses peace of mind that they have the right solution to meet user demands at all times.
6. All applications will become mission-critical.
The number of applications that businesses classify as mission-critical will rise during 2020 — paving the way to a landscape in which every app is considered high priority. Previously, organizations have been prepared to distinguish between mission-critical and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, the ability to make this distinction becomes very difficult. The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision makers say their business can tolerate a maximum of two hours’ downtime for mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organizations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.
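A back-of-envelope calculation using the per-hour figure quoted above shows what the two-hour tolerance threshold implies for a single mission-critical app:

```python
# Back-of-envelope arithmetic using the report's average cost of lost
# mission-critical data per hour; the scenario itself is illustrative.
COST_PER_HOUR = 102_450  # USD, average per hour of lost mission-critical data

def outage_cost(hours, cost_per_hour=COST_PER_HOUR):
    return hours * cost_per_hour

# Cost implied by hitting the full two-hour tolerance threshold.
print(outage_cost(2))  # → 204900
```

Over $200,000 for a single outage at the stated tolerance makes it easy to see why the distinction between critical and non-critical apps is eroding.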
“Identity theft will take a new direction with the increased use of deep fakes
What has been concerning in 2019 is the increase in identity and credential theft, and I see this becoming much more problematic in 2020. The rapid advancement of deep fake technology is taking identity fraud to a whole new level of online challenge and risk: criminals are not only stealing your digital online identity, but also your digital voice and digital face. This means cybercriminals could have the ability to create an entire digital clone of you. I see this becoming a major problem area in cyberspace, and even more so in political campaigns, as the general public will not have the awareness to distinguish what is real from fake. In today’s internet, data without context is dangerous.
Government use of machine intelligence (typically referred to as artificial intelligence) to be put to the test
In 2020, AI will become an important strategy, with many governments around the world using AI to improve and automate many citizen services; however, acceptable use and limitations of scope will also be applied. This will help determine how much data should be collected, for how long, and for exactly what usage, to limit abuse of such sensitive data. For governments to be successful with AI, they must be transparent with their citizens. We must embrace AI moving forward, but with responsibility and caution.
This year, the use and abuse of IoT devices has risen and doesn’t look to be slowing down as we go into next year. IoT devices differ from computers in that they have a specific purpose and cannot be re-programmed, so organisations need to view and assess the risks specific to the function or task of the device in order to increase security. Organisations, and in particular the manufacturers of IoT devices, will need to adapt their security approach to ensure that these fast-growing endpoints are secure. The new Californian and Oregon IoT legislation coming into effect in January is a step in the right direction, but more must be done. IoT security is about focusing on the risks, not the device.
Cyber awareness is evolving to become more human friendly. We are now seeing a difference in approach to security evolving into company culture. Boards and top-level executives are now learning how to communicate accordingly on cyber security topics, meaning that security teams and their goals are becoming a lot more aligned with the business’ goals.”
Tyler Reguly, manager of security R&D at Tripwire, comments:
“Whether or not it will be is a different question, but 2020 NEEDS to be all about the consumer when it comes to security. The world of end-user electronics and services has created a navigational nightmare for everyone. Personal account breaches and password reuse can put corporations at risk of improved phishing attacks. Smart devices are everywhere, connecting to everything; they provide such a large attack surface that they are a problem in themselves. Only 0.04% of Disney+ accounts saw password disclosure (most likely via password reuse), but I’ve heard from many people that they “won’t use Disney+ because it was hacked.” This type of FUD could put a smaller organization in financial jeopardy. Additionally, websites like IndieGoGo and Kickstarter allow anyone with an idea to fundraise for a new smart device, regardless of how much domain knowledge the creator has. This leads to the creation of many insecure devices that find their way into home networks regularly. Consumers need to be aware of what they are doing and the risks they create for business, for their employers, and especially for themselves.”
Vivio predictions for 2020
Several of the major technology trends are already underway, but 2020 will be the year that many really become part of everyday life.
5G was hyped up throughout 2019 but next year will be when we see it become more widespread, increasing in availability from eight cities to around 50. Not only does this mean it will be more accessible for consumers on an everyday basis, but different industries will be focusing on their 5G offering.
In 2020 the overriding theme will be ‘more, more, more’ with greater customer demand to be connected everywhere and faster than ever before, plus an increasing reliance on mobiles running our daily lives.
Consumers will place value on speed as opposed to the quantity of data and will assume that they can use whatever level of data they want, but speed will be a huge factor.
How we use data and the level of demand will be determined by 5G. This will then have a knock-on effect on the roll out of other technologies such as driverless cars and smart machinery that 5G will enable. Ultimately, the real test will be how companies adapt to that, with changes across a range of industries including healthcare, manufacturing and logistics.
AI is another prolific topic. 2020 will see major steps in the development of AI and its ability to better understand accents, complex conversations and emotions. We will see an increase in voice being used via SMS and instant messaging. This will also mean virtual support agents becoming the go-to for advice, and will lead to increased reliance on voice assistants such as Siri and Alexa.
There has been a lot of talk around AI replacing human jobs; however, research suggests that it will actually create more jobs in education, healthcare and the public sector, where AI can be used alongside workers to assist them with repetitive tasks.
As the world becomes a more connected place, IoT is rapidly growing as people’s daily needs rely on the internet. Now it’s not just about phones, computers and tablets – there’s a whole myriad of connected devices, including everything from your washing machine to your fridge and vacuum cleaner.
As IoT continues to become more widespread, it won’t just be consumers using IoT devices; companies will increasingly adopt smart technologies to save time and money by remotely managing devices and collecting data.
The roll out of 5G will help fuel IoT growth with greater speed and the ability to connect more smart devices simultaneously.
Whilst the cloud isn’t a new technology, it now forms an essential part of business infrastructure. Cloud trends in 2020 will see extensive use of third-party products, an increase in machine learning as a service, edge computing and a rise in hybrid cloud products. The main concern in 2020 will be security; vendors will need to provide a comprehensive range of solutions for dealing with potential threats.
Blockchain & Crypto Currencies
Blockchain and crypto currencies were big news this year. However, 2020 will look beyond this at how the technology can be used across a variety of different industries and practices, including tracking food produce to provide transparency into the supply chain and into the sustainability and freshness of a product, which will ultimately lead to increased customer trust.
The time needed by UK organisations to recover from a data breach is increasing. Senior decision makers surveyed for NTT Security’s 2019 Risk:Value report expect recovery to take nearly 100 days on average, double the figure of a year ago. They also believe they would lose more revenue as a result of a breach, forecasting almost 13% in 2019 versus almost 10% in 2018.
By Azeem Aleem, VP Consulting, NTT.
The need has never been stronger for an organisation-wide, security-first culture, supported by a robust incident response plan and a coherent, well enforced security policy. But companies are failing to make progress towards this – and in some cases are going backwards.
NTT Security analysed all the organisations surveyed, across global markets, awarding positive scores for good cybersecurity practice and negative scores for bad. In both 2019 and 2018 the average score was just +3, meaning that there is nearly as much bad practice as good practice. One third of businesses score less than zero: exhibiting more bad practice than good practice.
The ‘stalling’ of cybersecurity progress is not down to a failure to recognise the scale of cyber risk, or the need to address it.
Cybersecurity threats are top of the agenda for UK business leaders, with cyber attacks, data loss or theft, and attacks on critical infrastructure cited as three of the top five business risks they face. Only ‘economic or financial crisis’ was a greater concern. The picture is the same across almost all markets – including Australia, Germany, France, India, Spain and the US.
Business leaders are also aware of the benefits of implementing strong cybersecurity measures, with 84% believing this will help their business, and 88% believing cybersecurity has a positive role to play in society at large.
So why are organisations’ security postures no better than they were a year ago?
Lack of policy. Only 70% of UK businesses have a formal security policy in place, down 7% from 2018, and of those, only 48% say their employees are fully aware of it. The problem is even more acute in the rest of Europe, with only Spain coming close to the UK (67% have a formal security policy). This drops to 48% in Switzerland and 44% in the Netherlands.
Incident response plans are in place at 60% of UK companies, higher than the global figure of just over half, and ahead of companies in the Netherlands, Germany and Austria by some margin.
Inadequate investment. Budgets are failing to keep up with growing demands on teams, with the percentage of operations spend dedicated to security falling around one percentage point to 16.5%, and 15% of IT spend attributed to security. Organisations in Germany and Switzerland are spending the least on security, at 14% and 12% of IT budget respectively.
Shortage of skills. Businesses still don’t have adequate skills and resources to cope with security threats, with almost half of UK companies admitting this is the case. Globally, the issue is most acute in Singapore (59%), which may in part be due to a competitive jobs market and the country’s increasing attractiveness to cybercriminals.
Insufficient knowledge of regulation. The regulatory landscape has changed in the last few years, but many businesses are not keeping pace. While four in five feel that compliance is important, 13% do not know which regulations they are subject to. More than half in the UK do not believe their company is affected by GDPR. Worryingly, awareness is even lower in a number of European countries, including Benelux (24%), Switzerland (32%), Germany and Austria (36%) and France (37%).
Fear over non-compliance is leading many businesses to consider paying ransoms to hackers: a third of UK executives would rather pay up than invest more in security, up 12% from 2018.
At the root of this cybersecurity paralysis is a lack of strategic leadership.
Passing the buck
Nearly half of UK business leaders believe that cybersecurity “is the IT department’s problem”, and nothing to do with the wider business. This rises to more than half in Switzerland, Sweden and Norway.
Cybersecurity really matters to business leaders. They see it as an enabler, and they’re aware of the risks and the need to manage them – but appear to lack the ability, or perhaps the will, to do so. As a result, many businesses are falling behind cyber criminals as the capabilities of their adversaries advance.
Organisations must act now to address their weak cybersecurity links. Security needs to be a strategic priority – discussed regularly at board level, and integrated into and monitored as part of the overall business risk programme.
Effective cybersecurity policies and incident response plans need to be implemented, communicated to all stakeholders, and tested and regularly reviewed. This requires a comprehensive understanding of the regulations and compliance obligations that apply.
Finally, organisations must plan for change. Threats evolve, and new skills and resources will be needed to combat them. The integration of new technology and digital transformation projects is expanding the threat surface, and hacking campaigns are causing more damage. Unless the design and execution of cybersecurity strategies improves, business risk will continue to escalate.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 8.
As we stand just months away from the new decade, we are presented with the opportunity to look back on the year that was and gaze ahead at what’s to come in the year ahead.
2019 was a year of new technology trends promising to alter our business landscape as we know it. Safe to say not all these trends came fully to fruition but they did lay the groundwork for the coming years and we are excited to see where this takes us.
Bigger the better – hyperscale is here to stay
2019 saw an uptick in smaller data centre facilities. Demand grew for decentralised micro data centres designed for smaller workloads, raising questions around the need for hyperscale data centres in years to come.
But hyperscale is here to stay. In fact, the demand for these large-scaled centralised data centres is set to reach new heights in 2020, with bigger than ever data centres entering the scene.
As digital companies expand into new markets, Asia and Europe will take the lead in terms of new hyperscale builds. Smaller facilities are not commonplace in these regions, where regulations and barriers surrounding new builds have limited the number of facilities providers can erect.
On the flipside, the US, a more mature market where the data centre new-build market is much easier to navigate, will see a slowdown in hyperscale facilities in 2020.
Sustainability is the name of the game
The ball will drop in 2020 when legislators realise that the climate crisis is a real issue.
We will see legislation come into play that requires data centre new builds to fulfil a green power quota. In fact, we can expect some regulators to hold back permits for new data centre builds as a result of limited power supply.
We already saw this within the Dutch municipalities of Amsterdam and Haarlemmermeer, which called for an immediate stop to the construction of data centres in the region to ensure public reliance on the power grid would not be impacted as a result of power-hungry data centre facilities.
Adding 5G into the mix in 2020 can only mean there is less power to go around. Unable to travel the same distance as its far-reaching predecessor 4G, 5G will need more wireless receivers to create substantial aggregation points. This will inevitably grow the power supply needed to keep the network working effectively.
There is a likelihood that some governments will limit data centre operations in order to fulfil power needs of both the public and corporations alike.
AI will be limited in its capabilities
This year sparked off many a conversation around AI and its use cases across various industries. Within the data centre sector, conversations centred around how providers can support businesses in undertaking this technology and in terms of how data centre facilities themselves are employing AI to elevate their operations.
The tail-end of the year saw many coming to the stark realisation that AI can be limited in the benefits it can bring. Within the data centre industry, for example, AI was tipped to be a fantastic replacement for building management systems. It was believed that AI could take over the role engineers played in carrying out manual facility management processes. Yet, that simply has not and will not happen, given the need for engineers on-site to make specific judgement calls.
2020 will be the year where the industry realises that while AI can be a fantastic extension to enhancing human capabilities, it cannot work independently or replace human roles within facility management. Instead, skilled engineers will be more in demand to ensure AI can effectively be incorporated into any data centre facilities to drive higher value for end customers.
Change will be slow to come
The data centre industry has historically been known to be resistant to change.
For instance, over a decade ago we saw a new cooling technology come into play which involved immersing servers in liquid to help regulate their temperature. Both cost-effective and power-efficient, it took off significantly in some regions like the US, yet a decade later its widespread roll-out has yet to come.
In 2020, what needs to change the most is the mindset of data centre providers. “Why change something that isn’t broken” can be a safe option but it will inevitably leave providers lagging behind other industries in harnessing the benefits of fast-moving technology solutions.
Whether it’s integrating a new technology or implementing a new design, providers need to be thinking about offering the best possible solution for their customers. And often, this will involve changes to current ways of operating.
2020 will hopefully be the year that providers embrace change the way they should.
Data centre industry growth will reach new heights… and new locations
2020 will see more and more businesses recognising the data centre sector as a lucrative business opportunity. Hoteliers and real estate companies alike will start branching out into data centre new builds.
We’ll also see more non-data centre establishments investing big money into the industry. One way this will occur is through acquisitions of smaller data centre operators looking to scale up.
As a result, the already competitive data centre market will only become even more so.
We’ll also see regions that have not historically been associated with data centre builds cropping up with new facilities. Data centre hubs like the US and Europe will start reaching saturation point, forcing providers to extend their builds into newer areas.
Major cities with large populations and heavy digital reliance will require a high bandwidth of connectivity, making them the new target markets in years to come. Southeast Asian countries like Vietnam, Thailand, Indonesia and even Malaysia will become new regions to watch for providers.
These emerging markets will allow providers an easier inroad due to fewer restrictions from regulations such as GDPR. Providers will also find themselves facing better cost-saving opportunities, as they can serve these markets through on-site facilities rather than remotely from the nearest country, for instance.
How quantum computing can break the new enigma machine
By Andersen Cheng, CEO at Post Quantum
Between 1943 and 1945, a team of British codebreakers and cryptologists met in secret at Bletchley Park with the intention of cracking ‘Enigma’. The Nazi machine had confounded the Allies since the beginning of the war. Encrypted messages passed between branches of the German military, undetected and untraceable.
The British project came to fruition with the creation of ‘Colossus’, a machine designed with a single purpose - breaking Enigma. When the Allies finally began deploying the technology, it sped up the end of the war by as much as two years.
But despite what some of its inventors may have originally intended, Colossus was not just a single-purpose tool. It was also the world’s very first digital computer. The traditional computer and the internet and everything else, were the fruits of the work at Bletchley Park.
Just this week, Google claimed ‘quantum supremacy’ in an article in the journal Nature. The tech company is a frontrunner in the development of the quantum computer, a technology that will mark a ‘quantum leap’ for computing, with consequences equivalent to the creation of Colossus.
Quantum computers won’t replace the computers we use day-to-day. They are still temperamental and experiments can only be conducted under strict lab conditions. But what they are good at is ‘factorising’ incredibly large numbers into their prime components – the secret to today’s encryption used in all internet traffic and telecommunications.
The encryption standards that currently secure the internet are completely reliant on our computers’ inability to factorise such numbers. Indeed, breaking the internet’s security standards is so difficult because a task of this kind would take a traditional computer 10,000 years to complete. Google’s experiment, however, showed its ‘Sycamore’ quantum processor performing such a task in just over three minutes.
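The factorisation problem behind RSA-style encryption can be illustrated at toy scale. A minimal sketch (the primes below are purely illustrative): trial division recovers small factors instantly, but the same search is computationally infeasible for the 2048-bit products used in real encryption, which is exactly the asymmetry a quantum computer would collapse.

```python
def factor(n):
    """Recover two factors of n by trial division (toy scale only)."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1  # n is prime

# Two small primes stand in for the enormous primes used in real keys.
p, q = 104729, 1299709
assert factor(p * q) == (p, q)
```

At this scale the loop finishes in milliseconds; doubling the length of the number roughly squares the work, which is why classical factoring of real key sizes is out of reach.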
Quantum computers, then, represent an altogether new kind of Colossus, capable of breaking codes previously thought impossible to crack. So, does this render all encryption vulnerable to ‘quantum attack’? From the biggest banks to governments, to you and me?
Not quite yet. It is thought that a quantum computer is still ten years away from being commercially deployable. But that doesn’t mean we can afford to be complacent. Indeed, with the US and China already engaged in a ‘new space race’ to master this technology, the ten-year prediction is revised down with every passing year. They do not need to wait for a pristine, commercial-grade machine to become available; a lab-controlled engine with all the pipes and wires hanging out would be good enough to crack encryption.
As such, experts in the intelligence services have privately briefed against the ten-year estimate, predicting that quantum computers will be just a few years from maturity. If a hostile government or non-state actor possessed the keys to the world’s secrets, would they let the news slip so lightly?
Take it from me: we won’t hear about the arrival of the world’s first encryption-cracking quantum computer. We will only hear about the consequences, after a nation state or single entity has made use of the secret information for untold purposes. The time for governments and enterprises around the world to adopt ‘quantum-proof’ encryption isn’t ten years from now; it’s today, as it will take considerable time for everyone to become quantum-proof.
What used to be an administrative, back-office task, has now grown into an intelligent business function used for the large-scale production of value-added goods. A digital supply chain has the aim of being not only more efficient and cost effective for businesses, but also able to contribute to the improvement of the present effects of social and environmental issues.
By Darren Koch, SAP Ariba.
While up and coming technologies may not be widely used in procurement just yet, a recent global survey of chief procurement officers (CPOs) has revealed that integrating them into the supply chain process is indeed a priority. Despite this, more training and education needs to be actioned, after the same study revealed that many still do not understand exactly how to unlock the full potential of a digitally transformed supply chain. Implementing a complete digital transformation is a significant challenge in itself, often leaving professionals overwhelmed about where to start, even when they have a clear scope of what they’re looking to achieve. In fact, 84 percent of responding CPOs recognised that digital transformation through emerging technology adoption will improve procurement operations. However, adoption remains low: less than a quarter of respondents are currently using AI, ML, 3D printing or predictive analytics.
So, while CPOs understand that digital transformation is important, they don’t necessarily understand why it is important, or the practical value it can bring to their procurement organisation on a day-to-day basis or in the long term. The reality is, when implemented effectively and practically, data-driven digital transformation enables more effective spend management and mitigation of risk, therefore aligning ethical values with business practices, and ultimately saving money and time.
Having risk under control
Disruptions to the supply chain are rife - hurricanes, winter storms, trade wars, corporate financial health and more. And while they aren’t new to businesses, the way they are managed, is. For example, procurement professionals are now able to implement AI or ML to analyse data that was previously siloed and recognise patterns that previously slipped through the cracks, forecast bottlenecks and predict changes in demand.
When businesses utilise these technologies for purchasing and supply partner decisions, they can monitor changing patterns in real time, preventing potentially catastrophic events from causing severe operational disruptions.
Corporate Social Responsibility is steadily becoming more important to businesses and indeed their customers. News story after news story on sustainability continues to dominate the media, and everyday consumers are becoming more conscious of the impact they have on their environment and, indeed, the businesses they buy from.
So much so that they’re willing to pay more for products that are sustainable, a 2019 CGS survey found, and this number is only likely to rise as environmental and human rights issues become more prominent. The good news is that businesses won’t be left scrambling for solutions; instead they will be able to use tools such as blockchain and ML to identify possible human and environmental rights violations among supply partners, and work as a team to fix these issues.
The Cost Debate
The price of efficiency in procurement is high, but when leveraged correctly, AI-powered procurement can help companies drive cost efficiency in a plethora of ways.
For example, predictive analytics can be used to identify potential hotbeds of risk, such as a financially struggling partner or a supplier who is soon entering drought season, or worse – about to go bankrupt. By predicting and resolving these issues before they occur, buyers can minimise or even avoid disruption before it occurs by altering their supply cycle or finding an alternative supplier.
Lastly, by automating manual operations like making purchases or tracking invoices, businesses will be able to consolidate several processes and eliminate the opportunity for user error. This also frees up time for staff who were previously focused on menial and repetitive manual tasks, and enables those same employees to take part in more strategic and productive activities.
Intelligent Spend Management
In today’s always-on, global economy, making sure spend is managed while staying agile is no easy feat, and companies are beginning to realise the power they can harness when they are able to spend more effectively. Intelligent spend management strategies are crucial to implement real-time decision making, assess and tolerate risk, and drive cost effectiveness. And, to make things even easier, emerging technologies, such as AI, blockchain and ML, are especially valuable when used in conjunction with spend management.
It’s a common theme that while procurement leaders understand the urgency of certain initiatives, they often don’t know how to execute them, and this is true for digital transformation also. While there are many benefits to digitalising the supply chain and the knowledge that integrating emerging technologies will prove beneficial, procurement professionals are still slow to adopt these technologies. Fortunately, the roadblocks that are in place are easily removed through digital transformation.
For example, budget restrictions, conflicting priorities, talent shortages and data insights can all be remedied through a digital transformation which can ultimately provide cost savings and drive profit margins.
In order for companies to widen their business network, reduce costs and increase efficiency, the most viable solution is through digitalisation. Only then can they reap the benefits and leave positive, lasting societal and environmental impacts.
The economic implications of noncompliance are enormous and have required companies to completely rethink how they hold, control and manage data, and how they give consumers control over their personal information. Companies must be alert to data loss through attacks, where even the largest of companies are susceptible to putting their customers’ data at risk, as displayed by the recent British Airways data breach.
It’s a difficult task at any scale, but not impossible. A significant shift in the approach to data management, both culturally through processes and via a refresh of legacy technology, makes this more achievable. Switching to the cloud, in particular to cloud-powered data warehouses, can assist in this process, providing a secure, compliant system that gives a company oversight and an individual their rights to privacy.
Keeping the doors locked against hackers
Traditional siloed data storage is fraught with vulnerabilities. Because everything from email addresses to personal medical details sits in a single location, it is susceptible to ransomware and destructive attacks in which hackers can lock away personal data wholesale.
To combat this, organisations must relinquish their reliance on legacy technology and shift to the cloud to take advantage of the advanced capabilities available to manage and secure their data. Cloud providers safeguard consumer data through advanced encryption during transfer and whilst stored. This involves applying an encryption algorithm to translate clear text into cipher text, where access is only granted through encryption keys. This process, known as ‘key management’, allows data to be safely accessed by being decrypted via a specific key.
A modern, cloud-based data warehouse can manage this whole process and add further layers of coded security. This involves using a hierarchical key wrapping approach, which secures encryption keys, negating the opportunity of a key becoming compromised, whilst providing the capability to revoke data access immediately through withdrawal of the master key. For added peace of mind, data warehouse platforms will adopt a flexible key-rotation approach which limits how long a single key is used. This approach means that only those with specific permissions for reading individual data can access it, preventing unsolicited access.
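The hierarchy described above can be sketched in a few lines. This is a toy illustration of the *structure* only (a one-time-pad-style XOR stands in for a real key-wrapping cipher, and all names are hypothetical): a master key wraps per-dataset data keys, so rotating or withdrawing the master key instantly changes what can be unwrapped, without re-encrypting the underlying data.

```python
import secrets

def xor_bytes(a, b):
    """Toy 'wrap' operation; real systems use an authenticated cipher."""
    return bytes(x ^ y for x, y in zip(a, b))

# Master key held by the warehouse's key-management service.
master_key = secrets.token_bytes(32)

# A per-dataset data key, never stored in the clear; only its wrapped
# form sits alongside the data.
data_key = secrets.token_bytes(32)
wrapped_data_key = xor_bytes(master_key, data_key)

# Reading the data: unwrap the data key with the master key.
assert xor_bytes(master_key, wrapped_data_key) == data_key

# Key rotation: issue a new master key and re-wrap the unchanged data key.
# The old wrapped copy is now useless, revoking any access that relied on it.
new_master_key = secrets.token_bytes(32)
rewrapped = xor_bytes(new_master_key, data_key)
assert xor_bytes(new_master_key, rewrapped) == data_key
```

The point of the hierarchy is that rotation touches only the small wrapped keys, never the (potentially enormous) encrypted datasets themselves.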
Iron-tight access control
At a company culture and process level, best practice is now to closely monitor and safeguard the permissions of employees to important or sensitive data. Data breaches are quite simply born out of neglect and the complexity of managing and controlling access across a myriad of legacy systems.
Controlling exactly where your data exists is virtually impossible through legacy, on premise systems. Cloud storage providers can begin to shine a light on all corners of an organisation’s datasets, and provide a clear picture on who is accessing this data. A modern data warehouse will support multilevel, fine-grained, role-based access control (RBAC) ensuring users can only access what they have been permitted to see down to column and row level. Particularly sensitive fields can be masked, tokenised or further encrypted, providing an added level of protection. All access to data is logged, enabling data controllers to see who has accessed the data and for what purpose.
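A minimal sketch of what column-level RBAC looks like in practice, assuming hypothetical role and column names (a real warehouse enforces this in the query engine, not in application code): each role maps to a set of readable columns, and anything outside that set is returned masked.

```python
# Hypothetical role -> permitted-columns mapping.
ROLE_COLUMNS = {
    "analyst": {"order_id", "amount", "country"},  # no personal data
    "support": {"order_id", "email", "country"},
    "dpo":     {"order_id", "email", "amount", "country"},
}

def select(role, rows, columns):
    """Return requested columns, masking any the role may not read."""
    allowed = ROLE_COLUMNS.get(role, set())
    return [
        {c: (row[c] if c in allowed else "***MASKED***") for c in columns}
        for row in rows
    ]

orders = [{"order_id": 1, "email": "a@example.com",
           "amount": 9.99, "country": "UK"}]
result = select("analyst", orders, ["order_id", "email", "amount"])
assert result[0]["email"] == "***MASKED***"   # personal field hidden
assert result[0]["amount"] == 9.99            # business field visible
```

Row-level rules work the same way, with a per-role predicate filtering which rows are returned at all, and every such decision being written to the access log.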
In terms of access security, cloud data warehouses are equipped with multi-factor authentication (MFA), providing an additional layer of protection and compliance. This simple process sends a secondary verification request to anyone who logs into a system, via a token on a mobile phone. This ensures that an unauthorised user with a stolen username and password cannot access company data. Organisations and individuals are now more data savvy than ever, and failure to offer simple MFA processes will likely cause people to think twice before handing over personal data.
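The mobile token in question is typically a time-based one-time password (TOTP). A short sketch of the server-side check, following RFC 6238's SHA-1 variant (the shared secret and timestamp below are illustrative): both the phone app and the server derive a six-digit code from the shared secret and the current 30-second window, so matching codes prove possession of the second factor.

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, timestep=30, digits=6):
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    counter = int((time.time() if at is None else at) // timestep)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"shared-enrolment-secret"   # illustrative; set up when MFA is enrolled
moment = 1_600_000_000                # fixed timestamp so the example is deterministic

user_code = totp(secret, at=moment)          # computed on the user's phone
assert totp(secret, at=moment) == user_code  # server recomputes and compares
```

Because the code changes every 30 seconds, a stolen username and password alone are not enough; the attacker would also need the enrolment secret held on the device.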
Secure data sharing
Safeguarding stored data is not the only benefit of adopting a cloud-based data warehouse. Companies can now find solace in securely sharing data both internally and to external parties.
Data is a hugely valuable resource and companies still need to analyse and leverage their business data to better manage operations and strategy, while protecting personal information. The rise of big data has coincided with that of data science, a practice that has already changed how many businesses produce and manage their goods and services, work with suppliers and sell to customers. In particular, accessing, storing and analysing datasets enables organisations to extract key insights on consumer trends.
Data sharing can take place securely and efficiently through the cloud and provide access to data much faster than through traditional systems. Archaic processes of sharing data through File Transfer Protocol (FTP) systems or sending valuable data spreadsheets via email, are now replaced by secure and real-time data sharing models in the cloud, which prevent the risk of multiple datasets being unnecessarily duplicated. This data can be shared internally within different departments, or offices based in different locations, while external data can be shared across the business ecosystem, with the guarantee of data leakage prevention.
Organisations shouldn’t be deterred from leveraging data due to the security implications of poor data management practices. For companies to continue capitalising on the value of data, it requires a significant rethink of outdated technology structures and the embrace of a modern, cloud-powered data platform that will not only safeguard company and individual data, but also establish a secure data sharing system. With the severe implications in a post-GDPR world, there has never been a greater need for data protection and systems that usher in an improved standard of privacy and security.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, and now we continue the series in the December issue. Part 9.
Zscaler’s transformation strategy experts offer a range of insights about what organisations should expect in 2020 and beyond, with guidance for avoiding pitfalls along the way:
As we look to the decade ahead, we anticipate fundamental architectural changes in enterprises, particularly for those that are well underway in their digital transformation journeys. The rising shift to the cloud will accelerate in 2020 as companies increasingly recognise the benefits of a “no network” strategy, a zero trust-oriented security model, and the importance of delivering a seamless and secure user experience. These are, after all, the prerequisites of the digital workplace.
To help illuminate the path that enterprises are negotiating at the onset of the next decade, Zscaler Transformation strategy experts in network infrastructure and cloud security have offered a range of insights about what’s to come. Not only do they discuss what companies and IT organisations should expect in 2020 and the years that follow, but they also offer guidance for avoiding pitfalls along the way.
#1 – Simplicity becomes key: User experience will move into focus
Users will always take the path of least resistance when accessing the applications required for work, regardless of whether those apps are hosted in the corporate data centre or in the cloud. This habit is not limited to the IT realm – it is human nature to take the shortest path to achieve any goal. To prevent users from taking the kinds of shortcuts that circumvent security controls, IT must offer users the simplicity to which they are accustomed when accessing their personal applications.
IT departments will make it a priority to provide infrastructures that enable a fast, seamless user experience, particularly as enterprises seek to attract a new generation of digitally native workers. With always-on 5G technology on the horizon, it will become even more crucial in the next decade to offer users a secure and short access path to any application on any device.
#2 – Driven by cloudification: Applications will start to live anywhere
Traditionally, corporate applications have lived within the corporate network, where they could be controlled and monitored by the company. While cloud adoption has been on the rise, we will see it increase dramatically to improve usability and to benefit businesses. Even today, users don’t want to think about where their applications are stored. Companies will need to stop having conversations about legacy controls.
Enterprises now have the opportunity to move applications to more cost-effective and more business-effective locations. If applications can reside anywhere, it opens up the ability for enterprises to decide whether or not to change the location of applications at any time without impacting the service for the end user.
Such flexibility also has the benefit of creating greater competition in the marketplace. With enterprises no longer married to a single cloud provider, smaller operators, perhaps better suited to an organisation’s needs and with more competitive pricing, can thrive.
#3 – Reliance on the corporate network will be on the wane as SASE grows
The new networking model defined by Gartner as the secure access service edge (SASE) is based on the concept that reliance on the data centre as the literal centre of a company’s network makes no sense in a world in which more and more applications are moving to the cloud and users are accessing networks anywhere, at any time, from a multitude of devices.
The idea of the service edge pushes compute and services to the edge of the network (in every POP), so they’re close to users, which ensures minimal latency between the endpoint and the application. This model is in stark contrast to the delivery of services through network connectivity, as it is simple, scalable, and flexible, with low latency and high security.
SASE will have dramatic repercussions within enterprise IT, because it means that businesses will no longer need to provide network connectivity. Services will be at the edge, away from the network’s internal workings or functionality. As such, users do not need to know where a network is or where an application is housed and they can, therefore, be anywhere. With the internet becoming the new network over which business takes place, reliance on the physical network will be on the wane and there will be less reliance on IT as an internal service within a company.
#4 – From network to application access: The RAS-VPN will be retired
A by-product of the growth in SASE will be less reliance on the traditional remote-access virtual private network (RAS-VPN). The VPN has always been an extension of the physical network for those mobile workers or third parties who needed to access the network from external locations. When that network context is no longer relevant in the enterprise as applications live in the cloud, there’s no need to have a VPN. What’s needed instead is simple and seamless access to any application, no matter where the application lives or where the user is located.
Although there will still be consumer demand for VPNs, the technology is likely to vanish from the enterprise space. No enterprise wants to risk exposing its network to outsiders. Furthermore, VPNs are costly, requiring expensive infrastructure and expensive personnel with the specialised skills to maintain it.
#5 – Going dark: Enterprises must stop exposing their infrastructure to the internet
To enable partners and remote employees to access internally managed applications, companies have had to rely on technology that exposes these applications to the internet and opens up access to the internal network, both of which pose risks. Zero trust network access (ZTNA), often referred to as software-defined perimeter (SDP), allows users to access applications without ever accessing the network. In essence, it creates a secure segment of one between the authenticated user and a specific app using the internet. And with the ZTNA model, applications and IP addresses are never exposed to the internet, making them completely invisible to unauthorised users.
As the number of private apps that run in multi-cloud or hybrid environments increases, along with the number of employees and third parties connecting from devices located outside the classic perimeter, security will become increasingly difficult if attempted with legacy technologies. ZTNA uses simple policies hosted in the cloud that are globally distributed but enforced locally. These policies provide visibility and grant access to private apps only to the specific users authorised to view them, and never to the internal network, with end-to-end encrypted micro-tunnels forming a secure segment of one between a user and an application.
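The per-user, per-application policy model described above can be sketched in a few lines of Python. This is purely an illustrative sketch of the concept, not any vendor's actual API; all names and data structures here are hypothetical:

```python
# Sketch of the ZTNA idea: access is granted per user, per application,
# never to a network. Unauthorised users learn nothing about an app's
# existence. Names and structures are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Policy:
    # Maps each private application to the set of users entitled to it.
    app_users: dict = field(default_factory=dict)

    def is_authorised(self, user: str, app: str) -> bool:
        # Unknown apps and unknown users both fall through to False.
        return user in self.app_users.get(app, set())

def broker_connection(policy: Policy, user: str, app: str) -> str:
    """Simulate the broker: either stand up a one-to-one encrypted
    'segment of one' between user and app, or reveal nothing at all."""
    if policy.is_authorised(user, app):
        return f"tunnel({user} <-> {app})"
    # Deliberately no error detail: the app stays invisible.
    return "no route"

policy = Policy(app_users={"payroll": {"alice"}, "crm": {"alice", "bob"}})
print(broker_connection(policy, "alice", "payroll"))  # tunnel(alice <-> payroll)
print(broker_connection(policy, "bob", "payroll"))    # no route
```

The key design point is the asymmetry of the failure case: a denied request returns the same opaque answer as a request for an app that does not exist, which is what keeps the application footprint invisible to the internet.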
In the next decade, we expect companies to follow the zero trust path and think about their security posture in a completely different way.
#6 – As the internet becomes the new transport network, network traffic patterns will change
With the ongoing cloudification of the modern business, internet-related traffic will continue to grow year-over-year by up to 50 percent, as seen in the Zscaler security cloud. As organisations realise the cost and flexibility benefits of moving applications to the cloud, they will continue to migrate on-premises applications there. As a result of this shift, bandwidth requirements for MPLS network infrastructures will similarly grow, at significant expense, and companies will be forced to rethink their network architectures. A move away from such hub-and-spoke networks towards direct-to-internet connections at each location presents the ideal strategy.
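The hub-and-spoke versus direct-to-internet trade-off above can be made concrete with a toy latency model. The figures below are purely illustrative assumptions, not measurements:

```python
# Toy comparison of hub-and-spoke backhaul vs direct-to-internet breakout.
# All latency figures are hypothetical, for illustration only.

BRANCH_TO_HUB_MS = 20     # MPLS leg from branch office to central data centre
HUB_TO_CLOUD_MS = 15      # hub's internet egress to the cloud application
BRANCH_TO_CLOUD_MS = 18   # direct local breakout from the branch

def hub_and_spoke_rtt() -> int:
    # Traffic is hauled to the hub before it ever reaches the internet,
    # so both legs are traversed in each direction.
    return 2 * (BRANCH_TO_HUB_MS + HUB_TO_CLOUD_MS)

def direct_breakout_rtt() -> int:
    # Each site egresses to the internet locally.
    return 2 * BRANCH_TO_CLOUD_MS

print(hub_and_spoke_rtt())    # 70
print(direct_breakout_rtt())  # 36
```

Even with generous assumptions for the MPLS legs, backhauling cloud-bound traffic through a central hub pays the hub round trip on every request, which is why the prediction favours local breakout as cloud traffic grows.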
Along with the shift in network traffic patterns is the growing volume of encrypted traffic. In 2018, global encrypted traffic was recorded as high as 80 percent of all traffic, according to Google’s analysis of HTTPS encryption on the web, whereas October 2019 statistics showed that the 90 percent line has since been reached by the U.S., closely followed by Germany and France. We can expect to see this increase move even closer to 100 percent at the beginning of the next decade, as internet security and privacy continue to be of utmost importance to enterprises worldwide.
#7 – Multi-cloud strategies: Companies will stop putting all their apps in the same basket
Multi-cloud strategies are on the rise in enterprises and we will continue to see a diversification of cloud service providers. Alongside the major players—AWS, Azure, and Google—more niche players will start to become established as enterprises move away from hosting apps on-premises.
This diversification can be seen from different angles. It is not always the IT team that chooses the cloud service provider; in fact, business units typically dictate this decision, with the cloud provider often being selected based on the specific requirements of the applications being used. That means it is commonly the app that chooses the cloud and not vice versa. Enterprise business apps may be better suited to one cloud provider, while industrial apps, with their developer-specific use cases, may fit another. However, companies must be careful not to become locked in with a single vendor, thus putting all their eggs in one basket.
Cloud architects are advised to support their internal business units with a catalogue of services offered by the various cloud vendors. On the other hand, they will have to redesign security architectures that meet the demands of staff to seamlessly and securely access multi-cloud environments.
#8 – Shift in responsibility: The CDO role becomes critical to business
The critical assets of a company are now unquestionably digital, which has driven the rise in the importance of the Chief Digital Officer (CDO). We foresee the CDO role becoming better equipped to support companies on their transformation journeys than the traditional CIO role. The CDO's role is challenging, as systems and processes have to be changed for companies to become more digital. Such a function has to take a much more holistic view and will need to be equipped to break up silos. This is the prerequisite for a successful transformation, where application owners, network architects, and security experts must all be on the same page when it comes to implementing the fundamental architectural changes that are on the horizon.
The CIO title is likely to evolve in very much the same way that the role of head of HR has evolved to encompass "People and Culture," and heads of sales are increasingly known as Chief Revenue Officers, representing a shift in priorities and focus for these individuals and their roles. Going along with this change is the new role of the CISO, a role traditionally known for restricting innovation. As digital transformation marches on within businesses, the "I" is set to be replaced, and we will see increasing numbers of Chief Security Officers or Chief Digital Security Officers emerge as a result.
#9 – Corporations will discover the need to invest in transformation teams
The pressure on CXOs will grow to drive transformation initiatives in 2020, yet even with the vision to transform, projects will come to a standstill if enterprises do not have the right resources in place. So, the biggest challenge is no longer to convince the management to take the step into the digital age; it is now a challenge of carrying out projects successfully due to a lack of skilled internal resources.
Based on C-level conversations, enterprises commonly report that the biggest slowdown in progress comes from missing the appropriate skillset. In-house IT teams are involved in maintaining existing infrastructure, and project management skills for transformation initiatives remain scarce. A shift in mindset from clinging to existing hardware and infrastructure towards a “no-network” setup—in which everything is moved to the cloud—requires a huge cultural shift, and one that should be started through the enablement of internal resources or investing in transformation teams.
Once the decision has been made, there is a long way to go to drive the project forward. Without the right skillsets on board, progress may be delayed or halted. Network architects with the vision to leave the old world behind—and have enough technology foresight to change the network infrastructure completely—will be the most sought-after personnel in the next decade.
#10 – The fear of falling behind leads late cloud adopters to emerge
The lack of digital transformation skills available to organisations internally could lead to another shift in the next decade: as cloud adoption gains traction and the majority of businesses have embraced the cloud, at least to some extent, there is likely to be fear among the remaining organisations of falling behind. To keep pace with the competition, even cloud sceptics and those most resistant to change will have to start to adapt come 2020.
What we expect of these companies that have been waiting for mature solutions and proven adaptation strategies is that they will enter the cloud scene with greater velocity and that sales cycles will become shorter as a result. The twenties will be the decade in which the cloud adoption curve moves from the early adopters to the late majority phase.
2019 has been a very busy year for the DCA in many ways. Membership continues to increase, and we are now collaborating with more Strategic Partners than ever before. The Trade Association continues its commitment to supporting, promoting and placing speakers at many of the major data centre related conferences which take place throughout the year. In 2019 the DCA attended, exhibited and represented the industry and its members at 29 individual events across Europe, Asia and the Middle East.
By Steve Hone, CEO and Co-founder, The DCA
In September we hosted the DCA's annual Members conference (Data Centre Re-Transformation) at The Lowry in Manchester. This was the first year we had organised the event without the support of an event organiser and the feedback could not have been more positive. We are working on plans for DCT 2020 in London, which will be the event's 10th anniversary.
One area in which we are constantly busy is with features and articles for data centre publications. The DCA works closely with three industry magazines, providing them with DCA members' articles for regular monthly and quarterly features, both in print and online.
Our editor is continuously reading submissions and researching articles, which are then reviewed by DCA peers before being forwarded for publishing. All DCA members are welcome to submit content and articles, and there is no additional charge, as it is covered by your organisation's annual membership fee. In 2020 we are scheduled to publish 160 articles in 26 separate editions which will be read by over 400K subscribers worldwide, so please look out for our upcoming publication themes and deadlines for 2020.
In this month’s DCA Journal we look back at the top ten most popular/read features of 2019.
For those old enough to remember, this has been done in the style of Smashie and Nicey's chart countdown.
So, here is the big countdown of the Top Ten Published Articles for 2019 (in reverse order of course!)
Kicking things off at number 10
By Kevin Towers, CEO Techbuyer
Sneaking in at number 9
By Colin Dean Managing Director Socomec UK Limited
Down 2 at number 8
By Matteo Mezzanotte, Communications & PR, Submer Immersion Cooling
Unchanged at number 7
By Paul Smethurst, Managing Director, Hillstone Products Ltd
Straight in at number 6
By Dr Umaima Haider, Research Fellow, University of East London
Moving on to the top 5 picks of 2019!
Up 2 at number 5
By Richard Clifford, Head of Innovation, Keysource
Rock steady at number 4
By Steven Carlini, Vice President Innovation and Data Centre IT Division, CTO Office Schneider Electric
In third place
By Robbert Hoeffnagel, PR Green IT Amsterdam
It was close but in 2nd place
By Mark Dansie OCP Data Centre Facility Subject Matter Expert
And our number 1 and most read article of 2019 is…
By Mike Hayes, Applications Specialist at FläktGroup
Congratulations, Mike! And thank you to all those DCA members who submitted content in 2019; please keep those articles coming, as published content continues to be a massive hit with data centre professionals.
As this feature will reach you in December, The DCA would like to wish all our members, collaborative partners and those working in the Data Centre sector a very happy and prosperous 2020.