I still chuckle when I come across the (thankfully rare) example of a highly sophisticated company that invests large sums in data centre and IT infrastructure, with special attention paid to reliability and resilience, only to discover that one of its key applications is running on a twenty-year-old machine because it’s ‘too dangerous’ to migrate!
Thankfully, very few companies leave themselves open to such a single point of failure.
However, the ongoing threat of the coronavirus, allied to yet another bout of extreme weather conditions, in the UK at least, has set me wondering whether an aspect of resilience is being ignored right now. Thomas Malthus’s ideas about war, famine and disease as natural population checks may not hold so well in the 21st century, but disease as a business disruptor could well be a problem both now and into the future. Of course, if it wipes out most of the world’s population, then we’ll have bigger problems to worry about than keeping businesses functioning!
It seems plausible that, thanks in part to growing antibiotic resistance, widespread diseases, easily transported around the world by the globalisation of both tourism and business, could become more common and more disruptive. Right now, the closing down of large parts of China because of the coronavirus is about to have an unexpected consequence for many businesses across the world. In simple terms, the flow of components and finished goods that China manufactures and supplies across the globe is already slowing down significantly. Factories are closed, so production has stopped. And even those goods which are completed cannot find any logistics companies ‘brave enough’ to go and collect them.
Ironically, it is the Far East we have to thank for the concept of Just In Time manufacturing, which then spilled over into the general business world. Why hold vast stocks of anything if you can push them back onto the supplier to hold? A massively reduced inventory, and a great aid to cash flow: you only pay for goods once they’ve already been sold on to customers.
Ah, but all of a sudden, thanks to China’s shutdown, this policy doesn’t look so clever, whether you’re a car manufacturer relying on crucial components, or a retail company awaiting clothing shipments.
Okay, so one major health scare doesn’t, just yet at least, cause a wholesale change in attitude to how, when and where supplies are sourced, but it’s something to bear in mind moving forward. After all, the once-in-a-lifetime extreme weather events which were so memorable for our parents’ and grandparents’ generations now seem to occur every few years. Indeed, in the UK, for example, hardly a winter goes by without pictures of towns and villages under water.
So, maybe there’s no need to panic just yet when it comes to disease, but it is certainly a situation worth monitoring in the coming years, if the increase in climate volatility is anything to go by. The cost of data centre and IT hardware could well increase in the short term, and while any organisation can sweat older assets for a little longer, if the competition has managed to secure the very latest technology, the gap between the technology haves and have-nots could grow that bit wider.
Amidst the changing nature of work, competition, and society, the research suggests ways to reimagine leadership.
A study released by MIT Sloan Management Review (MIT SMR) and Cognizant reveals that most executives around the world are out of touch with what it takes to lead effectively and for their businesses to stay competitive in the digital economy. Reliance on antiquated and ineffective leadership approaches by the current generation of leaders is undermining organizational performance. To remain competitive and lead effectively, executives will need to fully reimagine leadership, the study’s authors have found.
“We are on the precipice of an exciting new world of work, one that gives executives an opportunity to chart a new course for what their leadership should look like, feel like, and be like,” said Doug Ready, senior lecturer in organization effectiveness at the MIT Sloan School of Management and guest editor of the report. “Yet, our study suggests that digitalisation, upstart competitors, the need for breakneck speed and agility, and an increasingly diverse and demanding workforce demand more from leaders than what most can offer. The sobering data underscores the urgent need for a fully reimagined playbook for leaders in the coming digital age.”
The study, reported in “The New Leadership Playbook for the Digital Age: Reimagining What It Takes to Lead”, is based on a survey of 4,394 global executives from over 120 countries, 27 executive interviews, and focus group exchanges with next-generation global emerging leaders.
“A generation of leaders in large companies are out of sync, out of tune, and out of touch with their workforces, markets, and competitive landscapes. What got them to their current exalted status won’t be effective much longer — unless they take swift action,” said Benjamin Pring, report coauthor and director of the Center for the Future of Work for Cognizant. “Allowing unprepared senior executives with outdated skills and attitudes to stick around forces next-generation, high-potential leaders to move on to new pastures, which harms morale and ultimately shifts the organisation further away from where market demand is heading.”
The authors identify three categories of existing leadership behaviors, which they call the 3Es.
“Our experience suggests that the most advanced leadership teams are those committed to developing these 3Es in their organisations,” added Carol Cohen, report coauthor and senior vice president, global head of talent management and leadership at Cognizant. “A key to success is artfully introducing new leadership approaches that particularly appeal to a new generation of employees while at the same time honoring the time-tested behaviors and attributes that inspire trust, build a sense of community, and motivate employees to improve performance.”
The authors caution that the primary leadership challenges in the digital economy are not solved by merely adopting a group of these 3E behaviors; they also require developing new mindsets that anchor, inform, and advance those behaviors. The authors identified four distinct mindsets that together constitute what they believe are the new leadership hallmarks in the digital economy, and illustrate through data and case studies how these mindsets can shape successful leadership.
The report also offers further recommendations for a new leadership playbook and briefs leaders on the need to articulate a powerful leadership narrative, build communities of leaders, demand diversity and inclusion, and align talent, leadership, and business strategies.
An average IT team spends 15% of its total time trying to sort through monitoring alerts as the gap between IT resources, cloud scale and complexity widens.
Software intelligence company Dynatrace has published the findings of an independent global survey of 800 CIOs, which highlights a widening gap between IT resources and the demands of managing the increasing scale and complexity of enterprise cloud ecosystems. IT leaders around the world are concerned about their ability to support the business effectively, as traditional monitoring solutions and custom-built approaches drown their teams in data and alerts that offer more questions than answers. The findings are detailed in the 2020 global report “Top challenges for CIOs on the road to the AI-driven autonomous cloud”, which is available for download.
CIO responses in the research indicate that, on average, IT and cloud operations teams receive nearly 3,000 alerts from their monitoring and management tools each day. With such a high volume of alerts, the average IT team spends 15% of its total available time trying to identify which alerts need to be focused on and which are irrelevant. This costs organizations an average of $1.5 million in overhead expense each year. As a result, CIOs are increasingly looking to AI and automation as they seek to maintain control and close the gap between constrained IT resources and the rising scale and complexity of the enterprise cloud.
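The report’s headline figures can be sanity-checked with some back-of-envelope arithmetic. The 15%-of-time and roughly $1.5 million figures come from the survey; the team size and fully loaded cost per engineer below are illustrative assumptions, not numbers from the report:

```python
# Back-of-envelope check of the alert-triage overhead quoted above.
# The 15% time fraction is from the report; team size and per-engineer
# cost are assumed values chosen for illustration only.

def triage_overhead(team_size, annual_cost_per_engineer, fraction_on_triage):
    """Annual cost of the time a team spends sorting monitoring alerts."""
    return team_size * annual_cost_per_engineer * fraction_on_triage

# Assumed: a 100-person IT/cloud-ops organisation at $100k fully loaded cost.
cost = triage_overhead(team_size=100,
                       annual_cost_per_engineer=100_000,
                       fraction_on_triage=0.15)
print(f"${cost:,.0f} per year")
```

Under those assumptions the arithmetic lands on $1,500,000 a year, consistent with the average overhead the survey reports.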
Findings from the global report include:
IT is drowning in data
Traditional monitoring tools were not designed to handle the volume, velocity and variety of data generated by applications running in dynamic, web-scale enterprise clouds. These tools are often siloed and lack the broader context of events taking place across the entire technology stack. As a result, they bombard IT and cloud operations teams with hundreds, if not thousands, of alerts every day. IT is drowning in data as incremental improvements to monitoring tools fail to make a difference.
Existing systems provide more questions than answers
Traditional monitoring tools only provide data on a narrow selection of components from the technology stack. This forces IT teams to manually integrate and correlate alerts to filter out duplicates and false positives before manually identifying the underlying root cause of issues. As a result, IT teams’ ability to support the business and customers is greatly reduced, as they are faced with more questions than answers.
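The correlation work described above can be pictured with a minimal sketch: collapsing alerts that share a source and symptom within a time window into single incidents. The alert fields and grouping key here are illustrative assumptions, not any vendor’s schema:

```python
# A minimal sketch of alert correlation: group alerts that share a
# (host, metric) fingerprint within a time window, so duplicates collapse
# into one incident. Field names and the window are assumptions.
from collections import defaultdict

def correlate(alerts, window_seconds=300):
    """Group alerts sharing (host, metric) that arrive within the window."""
    incidents = defaultdict(list)
    for alert in sorted(alerts, key=lambda a: a["ts"]):
        key = (alert["host"], alert["metric"])
        bucket = incidents[key]
        # Start a new incident if no prior alert for this key is recent enough.
        if not bucket or alert["ts"] - bucket[-1][-1]["ts"] > window_seconds:
            bucket.append([alert])
        else:
            bucket[-1].append(alert)
    return [group for groups in incidents.values() for group in groups]

alerts = [
    {"host": "web-1", "metric": "cpu",  "ts": 0},
    {"host": "web-1", "metric": "cpu",  "ts": 60},    # duplicate of the first
    {"host": "db-1",  "metric": "disk", "ts": 90},
    {"host": "web-1", "metric": "cpu",  "ts": 1000},  # outside window: new incident
]
print(len(alerts), "alerts ->", len(correlate(alerts)), "incidents")  # 4 alerts -> 3 incidents
```

Even this toy version shows why doing it by hand does not scale: real stacks have thousands of alerts a day and far messier grouping keys, which is the gap the report argues AI-driven tooling should fill.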
Precise, explainable AI provides relief
Organizations need a radically different, answers-based approach to monitoring, with AI and automation at its core, to keep up with the transformation that has taken place in their IT environments.
“Several years ago, we saw that the scale and complexity of enterprise cloud environments was set to soar beyond the capabilities of today’s IT and cloud operations teams,” said Bernd Greifeneder, CTO and founder, Dynatrace. “We realized traditional monitoring tools and approaches wouldn’t come close to understanding the volume, velocity and variety of alerts that are generated today, which is why we reinvented our platform to be unlike any other. The Dynatrace® Software Intelligence Platform is a single platform with multiple modules, leveraging a common data model with a precise explainable AI engine at its core. Unlike other solutions, which just serve up more data on glass, it’s this combination that enables Dynatrace to deliver the precise answers and contextual causation that organizations need to succeed in taming cloud complexity and, ultimately, achieving AI-driven autonomous cloud operations.”
AVEVA has revealed global survey findings identifying the key investment drivers for digital transformation. The survey was conducted with 1,240 decision makers in ten countries in EMEA, North America and APAC across nine industry verticals.
The research identifies a strong demand across both industries and markets to implement advanced technologies such as Artificial Intelligence (AI) and data visualization to make sense of vast data streams in real time, with 75% of respondents globally prioritizing investment in AI and analytics.
The research also identified three key global investment priorities for organizations when it comes to embarking upon the digital transformation journey:
1. Making sense of data utilizing artificial intelligence and real-time data visualization
· The research highlighted a strong demand for technologies that provide predictive outputs from large data flows, with AI and Analytics listed as the most important enabler (75%), closely followed by Real-Time Data Visualization (64%), Augmented, Virtual or Mixed Reality (60%) and Big Data Processing (59%).
· AI was a top-three enabler across all industries globally, with the greatest importance assigned in Power and Utilities (81%) and Oil & Gas (particularly upstream, 79%, and midstream, 78%).
· Japan (88%) and China (84%) prioritised AI highest, with the UK (79%) and US (77%) following closely behind.
2. Fostering collaboration through Advanced Process and Engineering Design
· Advanced Process and Engineering Design was the second most important technology (74%) and was in the top three technology priorities across all industries globally, scoring highest among Engineering, Procurement and Construction professionals.
· This was perceived as an essential technology for global production, ranked as the most important enabler for Marine Ship Building (75%), Buildings/Infrastructure (74%) and Packaged Goods (73%), with Oil and Gas and Energy also ranking the technology highly.
· Japan (85%) and Germany (82%) are early adopters, with high importance attributed across all regions.
3. Stepping up cyber security and safety capabilities
· Cyber security was the third most prioritised technology enabler (71%), a particular focus for Mining (76%), Downstream Oil and Gas (75%), Power & Utilities (70%) and Marine (70%), and the highest priority for Planning & Scheduling specialists.
· Improving safety and security through technology investment was a priority across all regions, with the Middle East (68%), Australia (63%) and India (60%) particularly highlighting this issue.
For global corporates, the two most valuable assets are their people and their data. Businesses today have a great responsibility to protect employees and customers, with technology that provides the foresight to anticipate critical failures before they occur. Lisa Johnston, CMO, AVEVA, commented: “As digital transformation moves to the forefront of the industrial agenda, the power of technology to unify data and break silos is allowing specialists to collaborate and change business models. The world’s most capital-intensive projects, from sustainable energy production and mining to smart factories and connected cities, are now being designed, planned and delivered by global multidisciplinary teams, all connected seamlessly through technology.”
Other findings from the research included:
· Asset Performance Management (APM) professionals were found to have the most demanding requirements for technology investment, needing a far-reaching product set and a visionary approach. APM was set apart from all other professional categories by the strength of its demand for technology to create new services or products, use data to drive new revenue streams, and enable collaboration, with AI and Analytics as its most important enabler. APM respondents also identified technology as having great potential to improve both safety and security and emergency response times.
· Face-to-face engagement and trust remain key vectors for driving sales success. Face-to-face meetings with a vendor, either at a vendor event or as an introductory meeting, are most influential across all categories, with personal referrals also particularly meaningful. Experience with a vendor is highly prioritized in the UK, China and India, while France, Japan and Australia place considerable weight on case studies from the vendor.
· Remote Operations Centers and Learning Management and Training scored highest for relevance across global industries. China and Japan see a Command Center as most relevant, while the US and Australia cite Supply Chain Optimization as key to their business.
· Delivering cost reduction and enhancing safety are prioritised by high-growth organisations. The countries prioritising cost reduction have been fast to scale over the past decades: China (61%), India (58%) and the Middle East (60%) indicated that significant margin improvement is possible from software solutions. This demand for greater efficiency mirrored a requirement to invest in technology to promote safety as these economies continue to mature (China 51%, UAE 68% and India 60%).
“This research was an opportunity to hear from both our existing and potential new customers across the globe, and our findings mirror the growing demand for technology solutions in the market today. Emerging technologies like AI, Machine Learning and Edge Computing are transforming the technology landscape, with vast data streams delivering operable output as well as true business outcomes for our customers today,” concluded Lisa Johnston.
Infosys Digital Radar 2020 finds that few companies have progressed to the most advanced stages of digital transformation this year.
Businesses globally face a “digital ceiling” when it comes to digital transformation, according to new research from Infosys Knowledge Institute (IKI), the thought leadership and research arm of Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting. The study reveals that businesses must change their mindsets to achieve sophisticated levels of digital maturity.
Infosys Digital Radar 2020 assessed the digital transformation efforts of companies on a Digital Maturity Index and found year-over-year progress in basic areas, such as digital initiatives to improve a company’s efficiency. However, most companies come up against a “digital ceiling” when trying to achieve the most advanced levels of maturity.
The report, which surveyed over 1,000 executives globally, ranked the most digitally advanced companies as “Visionaries”, followed by “Explorers” and then “Watchers.”
Companies know how to achieve moderate transformation success, with an 18 per cent increase in companies progressing this year from the lowest tier of Watchers to the middle Explorer tier. However, Explorers struggled to move into the top Visionary cluster, with the top tier remaining the same, indicating a “digital ceiling” to transformation efforts.
The Visionary cluster remains unchanged despite companies reporting fewer barriers to digital transformation than last year. Human, rather than technological, barriers are now the most persistent, with two of the top hurdles being the lack of talent or skills (34 per cent) and a risk-averse corporate culture (35 per cent).
How to break through the digital ceiling?
The research demonstrates that top performers break through the digital ceiling because they think differently.
Firstly, successful companies focus strongly on people, using digital transformation to make improvements centred on customers and employees.
Most companies (68 per cent) across the spectrum stated operational efficiency and increased productivity as a main transformation objective. But successful companies in the Visionary cluster are particularly motivated to make improvements for their employees. Nearly half of Visionaries describe “empowering employees” as a major business objective for transformation, compared with less than one third of Explorers and less than one fifth of Watchers.
Likewise, Visionaries have an increased focus on customer centred initiatives, being significantly more likely than other clusters to undertake transformation to improve customer experiences and engagement and in order to respond more quickly to customer needs.
Secondly, successful companies have a different mindset when it comes to transformation processes.
Traditional linear transformations result in long transformation timelines, meaning a company’s improvements are out of date by the time the process is complete. Instead, top performers demonstrate a cyclical mindset, implementing recurring rapid feedback loops to accelerate transformation and keep updates relevant. The Visionary cluster is far ahead of others in digital initiatives tied to quick cycles: 75 per cent operate at scale in Agile and DevOps, compared with an overall average of 34 per cent for the entire survey group.
Businesses overestimate tech barriers and underestimate the importance of a company’s mindset
The importance of culture and a cyclical transformation mindset in breaking through the digital ceiling was underestimated by businesses last year.
In the 2019 Digital Radar report, companies were asked to predict the biggest barriers to their transformation progress for the following year. This year’s Infosys Digital Radar 2020 compares these predictions to the actual challenges businesses faced in 2019.
Businesses reported dramatic declines in the impact that technological barriers have on their transformation progress.
However, businesses made much less progress against cultural barriers, including lack of change management capabilities (down 7 per cent) and lack of talent (down 6 per cent).
Progress across industries and geographies
Salil Parekh, CEO and MD at Infosys, commented: “We’ve seen enterprises successfully employ emerging technologies to optimise productivity and efficiency, but struggle at the next stage of digital maturity. Faster, better, and cheaper technology alone will not provide the improvements enterprises need. Our research has shown that companies which can keep pace with digital transformation are those that design digital initiatives to improve customer experiences and empower their employees, differentiating themselves and propelling their business to the most advanced levels of progress.”
Jeff Kavanaugh, VP and Global Head at Infosys Knowledge Institute, commented: “This year’s Digital Radar research revealed significant progress across transformation initiatives – however, traditional programme models are not keeping up with the rapid pace of market change and companies face a distinct barrier in reaching top levels of digital maturity.
“The most successful businesses in our survey have an employee focus and a circular transformation mindset, which enable top performers to kick off a virtuous cycle in the company. The result is a “living enterprise” that is constantly sensing, improving, and attuned to its customers and employees. This living enterprise is suited to serving a larger circle of stakeholders – employees, customers, suppliers, local communities, and larger society – not just shareholders.”
61% of CIOs say time spent on strategic planning increased in last 12 months.
The role of the CIO is evolving with more of a focus on revenue and strategy, according to the 2019 Global CIO Survey from Logicalis, a global provider of IT solutions. The study, which questioned 888 CIOs from around the world, found that 61% of CIOs have spent more time on strategic planning in the last 12 months whilst 43% are now being measured on their contribution to revenue growth.
The survey reveals that although CIOs are becoming more strategic and accountable, they are under pressure from reduced budgets and higher security risks. Almost half of respondents (48%) say that their time spent on security defenses has increased in the last year, with CIOs spending 25% of their time on information security and compliance. The maintenance of technology remains a key aspect of the CIO’s role, with CIOs, on average, spending one-third (33%) of their time on day-to-day management of technology.
The increased strain is having a negative impact on CIOs’ enjoyment of their job. Almost half of CIOs (49%) believe their job satisfaction has decreased in the last 12 months, whilst 29% say their work/life balance has worsened. The expanded focus on strategy and revenue has had an impact on the amount of time CIOs are able to spend on innovation, with 30% saying it has decreased in the last 12 months.
Mark Rogers, CEO at Logicalis, said: “It is clear from these survey results that the role of the CIO has changed and is continuing to evolve. Digital transformation has impacted almost every industry, which has led to the role of the CIO increasing in importance. CIOs are now expected to be responsible for business performance at a strategic level which has added to the time that they are expected to spend on maintaining IT infrastructure.”
“This increase in strategic responsibility should be embraced by businesses and CIOs alike, because technology does hold the key to unlocking competitive advantage and operational efficiency. However, these survey results are stark in their findings and show the increased amount of pressure being exerted on CIOs. Organisations must ensure their CIO is fully supported and has the necessary resources to carry out their job effectively. Businesses are pushing their CIOs to understand more about the line of business and input on strategy, whilst CIOs are still under pressure carrying out day-to-day activities. Clearly, this needs to be addressed.”
Nutanix has published the findings of its second global Enterprise Cloud Index survey and research report, which measures enterprise progress in adopting private, hybrid and public clouds. The new report found that enterprises plan to aggressively shift investment to hybrid cloud architectures, with respondents reporting steady and substantial hybrid deployment plans over the next five years. The vast majority of 2019 survey respondents (85%) selected hybrid cloud as their ideal IT operating model.
For the second consecutive year, Vanson Bourne conducted research on behalf of Nutanix to learn about the state of global enterprise cloud deployments and adoption plans. The researcher surveyed 2,650 IT decision-makers in 24 countries around the world about where they’re running their business applications today, where they plan to run them in the future, what their cloud challenges are, and how their cloud initiatives stack up against other IT projects and priorities. The 2019 respondent base spanned multiple industries, business sizes, and the following geographies: the Americas; Europe, the Middle East, and Africa (EMEA); and the Asia-Pacific (APJ) region.
This year’s report illustrated that creating and executing a cloud strategy has become a multidimensional challenge. At one time, a primary value proposition of the public cloud was substantial upfront capex savings. Now, enterprises have discovered that there are other considerations when selecting the best cloud for the business, and that a one-size-fits-all cloud strategy doesn’t suit every use case. For example, while applications with unpredictable usage may be best suited to public clouds offering elastic IT resources, workloads with more predictable characteristics can often run on-premises at a lower cost than in the public cloud. Savings also depend on a business’s ability to match each application to the appropriate cloud service and pricing tier, and to remain diligent about regularly reviewing service plans and fees, which change frequently.
In this ever-changing environment, flexibility is essential, and a hybrid cloud provides this choice. Other key findings from the report include:
“As organisations continue to grapple with complex digital transformation initiatives, flexibility and security are critical components to enable seamless and reliable cloud adoption,” said Wendy M. Pfeiffer, CIO of Nutanix. “The enterprise has progressed in its understanding and adoption of hybrid cloud, but there is still work to do when it comes to reaping all of its benefits. In the next few years, we’ll see businesses rethinking how to best utilise hybrid cloud, including hiring for hybrid computing skills and reskilling IT teams to keep up with emerging technologies.”
“Cloud computing has become an integral part of business strategy, but it has introduced several challenges along with it," said Ashish Nadkarni, group vice president of infrastructure systems, platforms and technologies at IDC. "These include security and application performance concerns and high cost. As the 2019 Enterprise Cloud Index report demonstrates, hybrid cloud will continue to be the best option for enterprises, enabling them to securely meet modernisation and agility requirements for workloads.”
Continuing unprecedented growth in the datacentre sector may be at risk due to increasing concerns around scarce resource and rising labour costs according to the latest industry survey from Business Critical Solutions (BCS), the specialist professional services provider to the international digital infrastructure industry.
The Winter Report 2020, now in its 11th year, is undertaken by independent research house IX Consulting, which captures the views of over 300 senior datacentre professionals across Europe, including owners, operators, developers, consultants and end users. It is commissioned by BCS.
Just over two-thirds of respondents believe that the next year will see an increase in demand, up on the 55% from our previous summer survey. This is supported by over 90% of developers and investor respondents stating they expect to see a further expansion in their data centre portfolio over the coming year.
However, concerns are being raised by many Design Engineering and Construction (DEC) respondents about general shortages among design, construction and operational professionals, with four-fifths expressing resourcing concerns. DEC respondents identified build professionals as being subject to the most serious shortages: 82% stated this view, compared with 78% for design professionals and 77% for operational roles.
When asked to rank the impact of this, respondents highlighted the increased workload placed on their existing staff (96%) and rising operating/labour costs (92%), with over 80% indicating that this has led to an increase in the use of outsourcing over the past 12 months. The increased workload for existing staff has in turn led to problems in resourcing existing work, with just over 70% stating that they had experienced difficulties in meeting deadlines or client objectives.
James Hart, CEO at BCS (Business Critical Solutions), said: “At BCS we are currently doing the rounds of careers fairs, looking for candidates for next year’s graduate and apprenticeship scheme. When we talk to these young people, we often find that they either haven’t considered our sector or have misconceived ideas about what this career path involves. We can address this by going into universities, colleges and schools and telling STEM graduates about the data centre industry and how great it is. Without action, these issues will become more acute, so the rallying cry for 2020 is that the sector is an exciting place to be and we have to get out there and spread the word!”
New study finds that nearly 90% of organisations faced business email compromise (BEC) and spear phishing attacks in 2019.
Proofpoint has released its sixth annual global State of the Phish report, which provides an in-depth look at user phishing awareness, vulnerability, and resilience. Among the key findings, nearly 90 percent of global organisations surveyed were targeted with business email compromise (BEC) and spear phishing attacks, reflecting cybercriminals’ continued focus on compromising individual end users. Seventy-eight percent also reported that security awareness training activities resulted in measurable reductions in phishing susceptibility.
Proofpoint’s annual State of the Phish report examines global data from nearly 50 million simulated phishing attacks sent by Proofpoint customers over a one-year period, along with third-party survey responses from more than 600 information security professionals in the U.S., Australia, France, Germany, Japan, Spain, and the UK. The report also analyses the fundamental cybersecurity knowledge of more than 3,500 working adults who were surveyed across those same seven countries.
“Effective security awareness training must focus on the issues and behaviours that matter most to an organisation’s mission,” said Joe Ferrara, senior vice president and general manager of Security Awareness Training for Proofpoint. “We recommend taking a people-centric approach to cybersecurity by blending organisation-wide awareness training initiatives with targeted, threat-driven education. The goal is to empower users to recognise and report attacks.”
End-user email reporting, a critical metric for gauging positive employee behaviour, is also examined within this year’s report. The volume of reported messages jumped significantly year over year, with end users reporting more than nine million suspicious emails in 2019, an increase of 67 percent over 2018. The increase is a positive sign for infosec teams, as Proofpoint threat intelligence has shown a trend toward more targeted, personalised attacks over bulk campaigns. Users need to be increasingly vigilant in order to identify sophisticated phishing lures, and reporting mechanisms allow employees to alert infosec teams to potentially dangerous messages that evade perimeter defences.
Additional State of the Phish report global findings include the following takeaways. Specifics on North America, EMEA, and APAC are detailed within the report as well.
· More than half (55 percent) of surveyed organisations dealt with at least one successful phishing attack in 2019, and infosec professionals reported a high frequency of social engineering attempts across a range of methods: 88 percent of organisations worldwide reported spear-phishing attacks, 86 percent reported BEC attacks, 86 percent reported social media attacks, 84 percent reported SMS/text phishing (smishing), 83 percent reported voice phishing (vishing), and 81 percent reported malicious USB drops.
· Sixty-five percent of surveyed infosec professionals said their organisation experienced a ransomware infection in 2019; 33 percent opted to pay the ransom while 32 percent did not. Of those who negotiated with attackers, nine percent were hit with follow-up ransom demands, and 22 percent never got access to their data, even after paying a ransom.
· Organisations are benefitting from consequence models. Globally, 63 percent of organisations take corrective action with users who repeatedly make mistakes related to phishing attacks. Most infosec respondents said that employee awareness improved following the implementation of a consequence model.
· Many working adults fail to follow cybersecurity best practices. Forty-five percent admit to password reuse, more than 50 percent do not password-protect home networks, and 90 percent said they use employer-issued devices for personal activities. In addition, 32 percent of working adults were unfamiliar with virtual private network (VPN) services.
· Recognition of common cybersecurity terms is lacking among many users. In the global survey, working adults were asked to identify the definitions of the following cybersecurity terms: phishing (61 percent correct), ransomware (31 percent correct), smishing (30 percent correct), and vishing (25 percent correct). These findings spotlight a knowledge gap among some users and a potential language barrier for security teams attempting to educate employees about these threats. It’s critical for organisations to communicate effectively with users and empower them to be a strong last line of defence.
· Millennials continue to underperform other age groups in fundamental phishing and ransomware awareness, a reminder that organisations should not assume younger workers have an innate understanding of cybersecurity threats. Millennials had the best recognition of only one term: smishing.
A new report published by CREST looks for solutions to the increasing problems of stress and burnout among many cyber security professionals, often working remotely in high-pressure and under-resourced environments. CREST – the not-for-profit body that represents the technical security industry including vulnerability assessment, penetration testing, incident response, threat intelligence and SOC (Security Operations Centre) – highlights its concerns and says that more needs to be done to identify the early stages of stress and provide more support.
Recent statistics show that 30% of security team members experience tremendous stress, while 27% of CISOs admit stress levels greatly affect their ability to do their jobs and 23% say stress adversely affects relationships outside work.
“While most security professionals are passionate about what they do and thrive well under bouts of pressure, it is important to recognise when this healthy and positive stress becomes unhealthy and detrimental to performance and wellbeing, and where people are working remotely, as many are, it can be really difficult to spot because of a lack of support and communication,” says Ian Glover, president of CREST. “The problem can sometimes be compounded by the rise in complex attacks, long hours spent under a constant ‘state of alert’, the shortage of skills and pressure from senior management and regulators. Reported breaches are a frequent reminder of the business and reputational consequences if mistakes are made or malicious activity is missed.”
The report’s author, David Slade, a psychotherapist, points to the main stress warning signs to look out for, which include anxiety, lack of confidence, making erratic decisions, irritability, a reduction in concentration, poor time keeping and generally feeling overwhelmed. These factors can lead to bouts of insomnia, a decline in performance, increasing use of drugs or alcohol, over or under eating, taking more sick days, withdrawal, a loss of motivation and actual physical and mental exhaustion.
“As in many high-pressure professions, it is very rare for people in cyber security to seek professional help when feeling stressed or overwhelmed,” says David Slade. “We need to instil a culture of better communication and peer-to-peer support as well as encouraging practical measures such as taking regular breaks, exercise and holidays, as well as introducing relaxation techniques such as mindfulness and having time set aside to discuss individual worries and concerns.”
The CREST report urges businesses and organisations to accept responsibility to ease staff stress levels by creating an organisational culture of openness at all levels and building a flexible environment in which individuals get encouragement, advice and support. This includes access to sources of advice on mental health issues, training tools and workshops, along with stress and burnout self-help videos. With the increasingly acute skills shortage in cyber security, CREST also believes that more automation can play a part in taking the strain off overworked staff, while the use of DevSecOps can help to move from a reactive approach to cyber security to a ‘security by design’ model.
“Management’s urgent task is to ensure that the organisation flourishes in a way that serves both the people outside and the people inside with a way of assessing how well the psychological needs of both groups are taken into account,” says Slade. “This would ensure that any change of structure or practice does not impinge on these needs.”
The CREST report was borne out of research conducted among its members and an open Access to Cyber Day that included stress and burnout workshops. “The level of interest and engagement in putting the report together was a clear demonstration of both the growing concern around stress and burnout in the industry, and the willingness to do something about it,” adds Ian Glover. “If we want to retain the skills and experience we already have while also encouraging the best new talent into the cyber security industry, we need to recognise the problems and face up to the challenges to create exciting and stimulating careers while providing the right environment and support.”
As one of the leading types of cyber-attacks, ransomware is expected to dominate cybercrime in 2020. According to PreciseSecurity.com research, weak passwords were one of the most common cybersecurity vulnerabilities in 2019, causing 30% of ransomware infections.
Weak Passwords Are the Third Most Common Ransomware Cause Globally
The recent PreciseSecurity.com research revealed that phishing scams caused more than 67% of ransomware infections globally during the last year. Another 36% of Mail Protection Service users reported ransomware attacks caused by the lack of cybersecurity training. Weak passwords were the third most common reason for ransomware infections globally in 2019.
The 30% share in the combined number of ransomware infections during the last year indicates a concerning level of password security awareness. The 2019 Google survey about beliefs and behaviors around online security showed that two in three individuals recycle the same password across multiple accounts. More than 50% admitted using one "favorite" password for the majority of their accounts. Only one-third of respondents knew what a password manager is.
Only 12% of US Online Users Take Advantage of Password Managers
The 2019 Statista survey reveals that 64% of US respondents regard stolen passwords as the most concerning data privacy issue. However, such a high level of concern didn't affect their habits related to keeping track of login information. According to the findings, 43% of respondents reported that their primary method of keeping track of their most crucial login information was to write it down. Another 45% of respondents named memorising the login data as their primary method of tracking. At the same time, only 12% of US online users take advantage of password managers.
23.2 Million Victim Accounts Globally Used 123456 as Password
Using hard-to-guess passwords is the first step in securing sensitive online information. However, according to the UK's National Cyber Security Centre 2019 survey, password re-use and weak passwords still represent a significant risk for companies and individuals all over the world.
The breach analysis indicated that 23.2 million victim accounts from all parts of the world used 123456 as a password. Another 7.8 million data breach victims chose a 12345678 password. More than 3.5 million people globally picked the word "password" to protect access to their sensitive information.
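Screening out the known-bad choices listed above is straightforward to implement. The sketch below is purely illustrative: the tiny common-password set and the 12-character minimum are assumptions for the example, not recommendations drawn from any of the surveys cited.

```python
# Minimal password screening sketch. The COMMON_PASSWORDS set is a tiny
# illustrative sample, not a real breach corpus; production systems would
# check against a full breached-password dataset instead.
COMMON_PASSWORDS = {"123456", "12345678", "password", "qwerty", "111111"}

def is_acceptable(password: str, min_length: int = 12) -> bool:
    """Return True only if the password clears both basic checks."""
    if len(password) < min_length:
        return False  # too short to resist guessing
    if password.lower() in COMMON_PASSWORDS:
        return False  # appears in the common-password list
    return True

print(is_acceptable("123456"))                        # False
print(is_acceptable("correct horse battery staple"))  # True
```

A check like this rejects all three of the passwords named in the NCSC breach analysis on length alone, before the common-password list is even consulted.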
60% of breaches in 2019 involved vulnerabilities where available patches were not applied.
ServiceNow has released its second sponsored study on cybersecurity vulnerability and patch management, conducted with the Ponemon Institute. The study, “Costs and Consequences of Gaps in Vulnerability Response”, found that despite a 24% average increase in annual spending on prevention, detection and remediation in 2019 compared with 2018, patching is delayed an average of 12 days due to data silos and poor organisational coordination. Looking specifically at the most critical vulnerabilities, the average timeline to patch is 16 days.
At the same time, the risk is increasing. According to the findings, there was a 17% increase in cyberattacks over the past year, and 60% of breaches were linked to a vulnerability where a patch was available, but not applied. The study surveyed almost 3,000 security professionals in nine countries to understand how organisations are responding to vulnerabilities. In this report, ServiceNow presents the consolidated findings and comparisons to its 2018 study, Today’s State of Vulnerability Response: Patch Work Requires Attention.
The survey results reinforce a need for organisations to prioritise more effective and efficient security vulnerability management:
The findings also indicate a persistent cybercriminal environment, underscoring the need to act quickly:
The report points to other factors beyond staffing that contribute to delays in vulnerability patching:
According to the findings, automation delivers a significant payoff in terms of being able to respond quickly and effectively to vulnerabilities. Four in five (80%) of the respondents who employ automation techniques say automation lets them respond to vulnerabilities in a shorter timeframe.
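The kind of automation these respondents describe often begins with something as simple as matching an asset inventory against vulnerability advisories. The sketch below is a minimal illustration under assumed data: the package names, versions and advisory entries are hypothetical, and a real pipeline would pull from a CMDB and a vulnerability feed such as NVD.

```python
# Sketch: flag installed software whose version is older than the fixed
# version named in an advisory. All data here is hypothetical example data.

def parse_version(v: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical asset inventory: package -> installed version
inventory = {"openssl": "1.1.0", "nginx": "1.18.0"}

# Hypothetical advisories: package -> first fixed version
advisories = {"openssl": "1.1.1"}

def vulnerable(inventory: dict, advisories: dict) -> list:
    """Return packages installed at a version below the advisory's fix."""
    return [pkg for pkg, fixed in advisories.items()
            if pkg in inventory
            and parse_version(inventory[pkg]) < parse_version(fixed)]

print(vulnerable(inventory, advisories))  # ['openssl']
```

Even a loop this simple removes the manual cross-referencing that the study identifies as a source of the 12-day average patching delay; the flagged list can feed straight into a ticketing or patch-orchestration workflow.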
“This study shows the vulnerability gap that has been a growing pain point for CIOs and CISOs,” said Jordi Ferrer, Vice President and General Manager UK&I at ServiceNow. “Companies saw a 30% increase in downtime due to patching of vulnerabilities, which hurts customers, employees and brands. Many organisations have the motivation to address this challenge but struggle to effectively leverage their resources for more impactful vulnerability management. Teams that invest in automation and maturing their IT and security team interactions will strengthen the security posture across their organisations.”
Cloud Industry Forum finds that the Cloud is critical to more than eight in ten Digital Transformation projects.
Over a quarter (28%) of businesses in the UK now have a fully formed digital transformation strategy in place, and over half (54%) are in the process of implementing one, according to the latest research from Cloud Industry Forum (CIF). Critically, the Cloud is seen as either very important or vital to their organisation’s digital strategy by 82% of respondents.
Launched today, the research, which was conducted by Vanson Bourne and surveyed UK-based IT and business decision-makers, sought to understand how they were exploiting cloud and other next generation technologies, and the barriers standing in the way of adoption.
Almost all of the respondents (98%) said that their digital transformation strategy is at least fairly clearly defined, with just over a third (34%) having full clarity. When asked if their organisation is doing enough to become fully digitised, 28% felt they were doing more than enough, and a further 56% stated that they were doing just enough. There is still work to be done, with a lack of skills (40%) and the perennial lack of budget (38%) being cited as key hindrances to further and deeper digital transformation.
Alex Hilton, CEO, Cloud Industry Forum, stated: “Digital transformation is a well and truly established concept, with only a tiny minority of our sample not embracing it in some way. We are beginning to see greater clarity in the way leaders are formulating their strategies, but it does not mean it is time to rest on our laurels.
“There is still much that businesses can do to speed up processes, build efficiency and convince all leaders that digital is the way to go. Cloud’s role in all of this remains vital, given its emphasis on flexibility at a time when these qualities are more important than ever.”
Key findings include:
Alex Hilton, continued: “UK businesses clearly recognise the need for transformation and are gradually leaving legacy technologies behind in favour of next generation technologies as they pursue competitive advantage. Cloud is critical to this shift, thanks not only to the flexibility of the delivery model, but also the ease with which servers can be provisioned, which reduces financial and business risk. Furthermore, cloud’s ability to explore the value of vast unstructured data sets is next to none, which in turn is essential for IoT and AI.
“However, it’s clear that the majority of UK organisations are right at the start of this journey and many are being prevented from exploiting IoT, blockchain and AI due to skills shortages, a lack of vision, and, indeed, a lack of support from vendors. The research found that the lack of human resources, alongside a general skills shortage and a lack of budget, are the biggest challenges hampering further transformation.
“According to the research 56% of the sample cited that they are looking for strong, trusted relationships with suppliers, and a further 51% are looking for deep technical knowhow. The vendor community and the channel have a big role to play here, refining their service and support capabilities, and helping end users comprehend the transformative potential of these next generation technologies,” concluded Alex.
Worldwide IT spending is projected to total $3.9 trillion in 2020, an increase of 3.4% from 2019, according to the latest forecast by Gartner, Inc. Global IT spending is expected to cross into $4 trillion territory next year.
“Although political uncertainties pushed the global economy closer to recession, it did not occur in 2019 and is still not the most likely scenario for 2020 and beyond,” said John-David Lovelock, distinguished research vice president at Gartner. “With the waning of global uncertainties, businesses are redoubling investments in IT as they anticipate revenue growth, but their spending patterns are continually shifting.”
Software will be the fastest-growing major market this year, reaching double-digit growth at 10.5% (see Table 1). “Almost all of the market segments within enterprise software are being driven by the adoption of software as a service (SaaS),” said Mr. Lovelock. “We even expect spending on forms of software that are not cloud to continue to grow, albeit at a slower rate. SaaS is gaining more of the new spending, although licensed-based software will still be purchased and its use expanded through 2023.”
Table 1. Worldwide IT Spending Forecast (Billions of U.S. Dollars)
Source: Gartner (January 2020)
Growth in enterprise IT spending for cloud-based offerings will be faster than growth in traditional (noncloud) IT offerings through 2022. A high percentage of IT spending dedicated to cloud adoption is indicative of where the next-generation, disruptive business models will emerge.
“Last quarter, we introduced the ‘and’ dilemma where enterprises are challenged with cutting costs and investing for growth simultaneously. Maturing cloud environments is an example of how this dilemma is alleviated: Organizations can expect a greater return on their cloud investments through cost savings, improved agility and innovation, and better security. This spending trend isn’t going away anytime soon.”
The headwind coming from a strong U.S. dollar has become a deterrent to IT spending on devices and data center equipment in affected countries. “For example, mobile phone spending in Japan will decline this year due to local average selling prices going up as a result of the U.S. dollar increasing. The U.K.’s spending on PCs, printers, servers and even external storage systems is expected to decline by 3%, too,” said Mr. Lovelock.
Despite last quarter showing the sharpest decline within the device market among all segments, it will return to overall growth in 2020 due to the adoption of new, less-expensive phone options from emerging countries. “The almost $10 billion increase in device spending in Greater China and Emerging Asia/Pacific is more than enough to offset the expected declines in Western Europe and Latin America,” said Mr. Lovelock.
Collaboration on the increase
Over the next two years, 50% of organizations will experience increased collaboration between their business and IT teams, according to Gartner, Inc. The dispute between business and IT teams over the control of technology will lessen as both sides learn that joint participation is critical to the success of innovation in a digital workplace.
“Business units and IT teams can no longer function in silos, as distant teams can cause chaos,” said Keith Mann, senior research director at Gartner. “Traditionally, each business unit has had its own technology personnel, which has made businesses reluctant to follow the directive of central IT teams. Increasingly, however, organizations now understand that a unified objective is essential to ensure the integrity and stability of core business. As a result, individuals stay aligned with a common goal, work more collaboratively and implement new technologies effectively across the business.”
Evolution of the Role of Application Leader
The role of application leader has changed significantly with the replacement of manual tasks by cloud-based applications in digital workplaces. The application leader must ensure that this transition is supported by appropriate skills and talent.
As more and more organizations opt for cloud-based applications, AI techniques such as machine learning, natural language processing (NLP), chatbots and virtual assistants are emerging as digital integrator technologies. “While the choice of integration technologies continues to expand, the ability to use designed applications and data structures in an integrated manner remains a complex and growing challenge for businesses. In such scenarios, application leaders need to deliver the role of integration specialists in order to ensure that projects are completed faster and at lower cost,” said Mr. Mann.
Application leaders will have to replace the command-and-control model with versatility, diversity and team engagement with key stakeholders. Application leaders must become more people-centric and provide critical support to digital transformation initiatives.
Additionally, in a digital workplace, it is the application leader’s responsibility to serve as the organizational “nerve center” by quickly sensing, responding to, and provisioning applications and infrastructures. “Application leaders will bring together business units and central IT teams to form the overall digital business team,” said Mr. Mann.
Much management work to be fully automated by 2024
Artificial intelligence (AI) and emerging technologies such as virtual personal assistants and chatbots are rapidly making headway into the workplace. By 2024, Gartner, Inc. predicts that these technologies will replace almost 69% of the manager’s workload.
“The role of manager will see a complete overhaul in the next four years,” said Helen Poitevin, research vice-president at Gartner. “Currently, managers often need to spend time filling in forms, updating information and approving workflows. By using AI to automate these tasks, they can spend less time managing transactions and can invest more time on learning, performance management and goal setting.”
AI and emerging technologies will undeniably change the role of the manager and will allow employees to extend their degree of responsibility and influence, without taking on management tasks. Application leaders focused on innovation and AI are now accountable for improving worker experience, developing worker skills and building organizational competency in responsible use of AI.
“Application leaders will need to support a gradual transition to increased automation of management tasks as this functionality becomes increasingly available across more enterprise applications,” said Ms. Poitevin.
AI to Foster Workplace Diversity
Nearly 75% of heads of recruiting reported that talent shortages will have a major effect on their organizations. Enterprises have been experiencing critical talent shortages for several years. Organizations need to consider people with disabilities, an untapped pool of critically skilled talent. Today, AI and other emerging technologies are making work more accessible for employees with disabilities.
Gartner estimates that organizations actively employing people with disabilities have 89% higher retention rates, a 72% increase in employee productivity and a 29% increase in profitability.
In addition, Gartner said that by 2023, the number of people with disabilities employed will triple, due to AI and emerging technologies reducing barriers to access.
“Some organizations are successfully using AI to make work accessible for those with special needs,” said Ms. Poitevin. “Restaurants are piloting AI robotics technology that enables paralyzed employees to control robotic waiters remotely. With technologies like braille-readers and virtual reality, organizations are more open to opportunities to employ a diverse workforce.”
By 2022, organizations that do not employ people with disabilities will fall behind their competitors.
Over the past five years, International Data Corporation (IDC) has been documenting the rise of the digital economy and the digital transformation that enterprises must undergo to compete and succeed. Greater enterprise intelligence has become a top priority for business leaders on this transformation journey. By working with and observing such enterprises, IDC has developed a new Future of Intelligence framework, which provides insight and understanding for business leaders and technology suppliers.
IDC defines the future of intelligence as an organization's capacity to learn, combined with its ability to synthesize the information it needs in order to learn and to apply the resulting insights at scale. The ability to continuously learn at scale – and apply that learning across the entire organization instead of in silos at a faster rate than the competition – is the crucial differentiator that will separate enterprises with greater intelligence from their peers.
The capabilities needed in the drive towards the future of intelligence will depend on a platform that enables ongoing explanation, monitoring, learning, and adaptation that will drive economies of intelligence.
IDC predicts that over the next four to five years, enterprises that invest in future of intelligence capabilities effectively will experience a 100% increase in knowledge worker productivity, resulting in shorter reaction times, increased product innovation, and improved customer satisfaction, in turn leading to sustainable market share leadership (or achievement of their mission) in their industry. These enterprises will be able to:
"In 2019, enterprises globally spent $190 billion on data management, analytics, and AI technologies and services — not even including labor costs or purchases of external data. How much of that spending generated intelligence and how much of that investment generated value are questions many executives are unable to answer," said Dan Vesset, group vice president, Analytics and Information Management and IDC's Future of Intelligence research practice lead. "Enterprises that achieve economy of intelligence will have a competitive advantage just as those enterprises in the past that achieved economies of scale and scope had an advantage over their peers."
Cloud IT infrastructure revenues decline
According to the International Data Corporation (IDC) Worldwide Quarterly Cloud IT Infrastructure Tracker, vendor revenue from sales of IT infrastructure products (server, enterprise storage, and Ethernet switch) for cloud environments, including public and private cloud, declined in the third quarter of 2019 (3Q19) as the overall IT infrastructure market continues to experience weakening sales following strong growth in 2018. The decline of 1.8% year over year was much softer than in 2Q19 as the overall spend on IT infrastructure for cloud environments reached $16.8 billion. IDC slightly increased its forecast for total spending on cloud IT infrastructure in 2019 to $65.4 billion, which represents flat performance compared to 2018.
The decline in cloud IT infrastructure spending was driven by the public cloud segment, which was down 3.7% year over year, reaching $11.9 billion; sequentially from 2Q19, this represents a 24.4% increase. As the overall segment is generally trending up, it tends to be more volatile quarterly as a significant part of the public cloud IT segment is represented by a few hyperscale service providers. This softness of the public cloud IT segment is aligned with IDC's expectation of a slowdown in this segment in 2019 after a strong performance in 2018. It is expected to reach $44 billion in sales for the full year 2019, a decline of 3.3% from 2018. Despite softness, public cloud continues to account for most of the spending on cloud IT environments. However, as demand for private cloud IT infrastructure is increasing, the share of public cloud IT infrastructure continued to decline in 2019 and will be declining slightly throughout the forecast period. Spending on private cloud IT infrastructure has shown more stable growth since IDC started tracking sales of IT infrastructure products in various deployment environments. In 3Q19, vendor revenues from private cloud environments increased 3.2% year over year, reaching nearly $5 billion. IDC expects spending in this segment to grow 7.2% year over year in 2019 to $21.4 billion.
As investments in cloud IT infrastructure continue to increase, with some swings up and down in the quarterly intervals, the IT infrastructure industry is approaching the point where spending on cloud IT infrastructure consistently surpasses spending on non-cloud IT infrastructure. Until 3Q19, it happened only once, in 3Q18, and in 3Q19 it crossed the 50% mark for the second time since IDC started tracking IT infrastructure deployments. In 3Q19, cloud IT environments accounted for 53.4% of vendor revenues. However, for the full year 2019, spending on cloud IT infrastructure is expected to stay just below the 50% mark at 49.8%. This year (2020) is expected to become the tipping point with spending on cloud IT infrastructure staying in the 50+% range.
Across the three IT infrastructure domains, Ethernet switches is the only segment expected to deliver visible year-over-year growth in 2019, up 11.2%, while spending on compute platforms will decline 3.1% and spending on storage will grow just 0.8%. Compute will remain the largest category of cloud IT infrastructure spending at $34.1 billion.
Sales of IT infrastructure products into traditional (non-cloud) IT environments declined 7.7% from a year ago in 3Q19. For the full year 2019, worldwide spending on traditional non-cloud IT infrastructure is expected to decline by 5.3%. By 2023, IDC expects that traditional non-cloud IT infrastructure will only represent 41.9% of total worldwide IT infrastructure spending (down from 51.6% in 2018). This share loss and the growing share of cloud environments in overall spending on IT infrastructure is common across all regions. While the industry overall is moving toward greater use of cloud, there are certain types of workloads and business practices, and sometimes end user inertia, which keep demand for traditional dedicated IT infrastructure afloat.
Geographically, the cloud IT Infrastructure segment had a mixed performance in 3Q19. Declines in the U.S., Western Europe, and Latin America were driven by overall market weakness; in these and some other regions 3Q19 softness in cloud IT infrastructure spending was also affected by comparisons to a strong 3Q18. In Asia/Pacific (excluding Japan), the second largest geography after the U.S., spending on cloud IT infrastructure increased 1.2% year over year, which is low for this region. However, it is in comparison with strong double-digit growth in 2018. Other growing regions in 3Q19 included Canada (4.9%), Central & Eastern Europe (4.6%), and Middle East & Africa (18.1%).
Top Companies, Worldwide Cloud IT Infrastructure Vendor Revenue, Market Share, and Year-Over-Year Growth, Q3 2019 (Revenues are in Millions)
Columns: 3Q19 Revenue (US$M) | 3Q19 Market Share | 3Q18 Revenue (US$M) | 3Q18 Market Share | 3Q19/3Q18 Revenue Growth
1. Dell Technologies
2. HPE/New H3C Group**
3. Inspur/Inspur Power Systems* ***
Source: IDC's Quarterly Cloud IT Infrastructure Tracker, Q3 2019
* IDC declares a statistical tie in the worldwide cloud IT infrastructure market when there is a difference of one percent or less in the vendor revenue shares among two or more vendors.
** Due to the existing joint venture between HPE and the New H3C Group, IDC reports external market share on a global level for HPE as "HPE/New H3C Group" starting from Q2 2016 and going forward.
*** Due to the existing joint venture between IBM and Inspur, IDC will be reporting external market share on a global level for Inspur and Inspur Power Systems as "Inspur/Inspur Power Systems" starting from 3Q 2018.
Long-term, IDC expects spending on cloud IT infrastructure to grow at a five-year compound annual growth rate (CAGR) of 7%, reaching $92 billion in 2023 and accounting for 58.1% of total IT infrastructure spend. Public cloud datacenters will account for 66.3% of this amount, growing at a 6% CAGR. Spending on private cloud infrastructure will grow at a CAGR of 9.2%.
Spending on robotics systems and drones forecast to reach $128.7 billion in 2020
Worldwide spending on robotics systems and drones will be $128.7 billion in 2020, an increase of 17.1% over 2019, according to a new update to the International Data Corporation (IDC) Worldwide Robotics and Drones Spending Guide. By 2023, IDC expects this spending will reach $241.4 billion with a compound annual growth rate (CAGR) of 19.8%.
Robotics systems will be the larger of the two categories throughout the five-year forecast period with worldwide robotics spending forecast to be $112.4 billion in 2020. Spending on drones will total $16.3 billion in 2020 but is forecast to grow at a faster rate (33.3% CAGR) than robotics systems (17.8% CAGR).
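The CAGR figures quoted throughout these forecasts use the standard compound-growth formula, CAGR = (end/start)^(1/years) − 1. A small helper shows the arithmetic; the numbers in the example are illustrative, not IDC's underlying data.

```python
# Compound annual growth rate: the constant yearly growth rate that takes
# `start` to `end` over `years` periods.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

# Example: doubling spend over five years implies roughly 14.9% per year.
print(round(cagr(100.0, 200.0, 5) * 100, 1))  # 14.9
```

Run in reverse, the same formula lets a reader sanity-check a forecast: compounding a starting figure by (1 + CAGR) for each year of the horizon should land close to the projected end figure.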
Hardware purchases will dominate the robotics market with 60% of all spending going toward robotic systems, after-market robotics hardware, and system hardware. Purchases of industrial robots and service robots will total more than $30 billion in 2020. Meanwhile, robotics-related software spending will mostly go toward purchases of command and control applications and robotics-specific applications. Services spending will be spread across several segments, including systems integration, application management, and hardware deployment and support. Services spending is forecast to grow at a slightly faster rate (21.3% CAGR) than software or hardware spending (21.2% CAGR and 15.5% CAGR, respectively).
"Software developments are among the most important trends currently shaping the robotics industry. Solution providers are progressively integrating additional software-based, often cloud-based, functionalities into robotics systems. An operational-centric example is an asset management application to monitor the robotic equipment performance in real-time. It aligns solutions with current expectations for modern operational technology (OT) at large and plays in facilitated adoption by operations leaders," said Remy Glaisner, research director, Worldwide Robotics: Commercial Service Robots. "Equally important is the early trend driven by burgeoning 'software-defined' capabilities for robotics and drone solutions. The purpose is to enable systems beyond some of the limitations imposed by hardware and to open up entirely new sets of commercially viable use-cases."
Discrete manufacturing will be responsible for nearly half of all robotics systems spending worldwide in 2020 with purchases totaling $53.8 billion. The next largest industries for robotics systems will be process manufacturing, resource industries, healthcare, and retail. The industries that will see the fastest growth in robotics spending over the 2019-2023 forecast are wholesale (30.5% CAGR), retail (29.3% CAGR), and construction (25.2% CAGR).
"Despite movement toward a trade agreement between the U.S. and China, it appears that tariffs may remain in place on many robotics systems. This will have a negative impact on both the manufacturing and resource industries, where robotics adoption has been strong. The additional duties will likely slow investment in the robotics systems used in manufacturing processes, automated supply chains, and mining operations," said Jessica Goepfert, program vice president, Customer Insights & Analysis.
Spending on drones will also be dominated by hardware purchases with more than 90% of the category total going toward consumer drones, after-market sensors, and service drones in 2020. Drone software spending will primarily go to command and control applications and drone-specific applications while services spending will be led by education and training. Software will see the fastest growth (38.2% CAGR) over the five-year forecast, followed closely by services (37.6% CAGR) and hardware (32.8% CAGR).
Consumer spending on drones will total $6.5 billion in 2020 and will represent nearly 40% of the worldwide total throughout the forecast. Industry spending on drones will be led by utilities ($1.9 billion), construction ($1.4 billion), and the discrete manufacturing and resource industries ($1.2 billion each). IDC expects the resource industry to move ahead of both construction and discrete manufacturing to become the second largest industry for drone spending in 2021. The fastest growth in drone spending over the five-year forecast period will come from the federal/central government (63.4% CAGR), education (55.9% CAGR), and state/local government (49.9% CAGR).
"We expect to see some price increases as drone manufacturers pass on the cost of tariffs imposed on the import/export of drones. The construction and resource industries will particularly feel the effects of these price increases. In contrast, many consumer drone manufacturers have chosen against raising prices and are absorbing the additional costs in order to maintain supply and to satisfy continuing consumer demand for drones. While the pending trade agreement offers some hope, these industries will face continued headwinds as long as tariffs remain in place," said Stacey Soohoo, research manager, Customer Insights & Analysis. "Elsewhere, robotics manufacturers will continue to face the one-two punch of higher costs for both materials and imported components."
On a geographic basis, China will be the largest region for drones and robotics systems with overall spending of $46.9 billion in 2020. Asia/Pacific (excluding Japan and China) (APeJC) will be the second largest region with $25.1 billion in spending, followed by the United States ($17.5 billion) and Western Europe ($14.4 billion). China will also be the leading region for robotics systems with $43.4 billion in spending this year. The United States will be the largest region for drones in 2020 with spending of nearly $5.7 billion. The fastest spending growth for robotics systems will be in the Middle East & Africa which will see a five-year CAGR of 24.9%. China will follow closely with a CAGR of 23.5%. The fastest growth in drone spending will be in APeJC, with a five-year CAGR of 78.5%, and Japan (63.0% CAGR).
Blockchain has been capturing the imagination of both businesses and government organisations, however it can be difficult to distinguish between hype and the real potential of this technology.
By Srinath Perera, VP of Research, WSO2.
Blockchain promises to redefine trust - it lets us build decentralised systems where we do not need to trust the owners of the system. Likewise, blockchain lets previously untrusted parties establish trust quickly and efficiently. This enables developers to build novel applications that can work in untrusted environments. However, it is not readily apparent where blockchain use cases are feasible and can deliver clear value. Here at WSO2 we recently analysed blockchain’s viability using the Emerging Technology Analysis Canvas (ETAC), taking a broad view of the technology and probing its impact, feasibility, risks and future timelines.
It’s hard to talk about all the different blockchain use cases collectively and come to sensible conclusions, because they are many and varied, each with its own requirements and goals. Therefore, we began our analysis by surveying the feasibility of 10 categories of blockchain use cases, including digital currency, ledgers and lightweight financial systems, among others. This enabled us to identify a number of common traits.
New versus old systems of trust
What we found was that most of the use cases we looked at have already been solved in some way. So, why do we need to implement blockchain?
The answer is that blockchain can provide a new kind of trust.
Traditional, pre-blockchain systems generally function effectively. However, they implicitly assume two kinds of trust. In the first instance, we trust the “super users” of centralised implementations: one person or a few individuals have deep access to the system and are deemed trustworthy. Alternatively, with an organisation or government, we can reasonably assume processes are in place that should deter wrongdoing.
The second way we establish trust is through an out-of-band means, such as signing a legal contract, obtaining a reference or providing a credit card to gain access. This is why most systems or ecosystems require you to provide credentials, created through some other channel, before you can access them.
Unlike traditional trust systems, blockchain-based systems can operate without either of the assumptions being true. Operating without the first assumption is known as decentralisation and doing so without the second is known as “dynamic trust establishment.” However, our ability to operate without these assumptions and achieve a new level of trust does not always mean that we should. We need to consider the trade-off between the cost of using blockchain and the potential return. This is not a technical decision, but one that looks at values and how much risk we’re willing to take.
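The tamper-evidence that underpins this new kind of trust comes from hash-chaining: each record commits to the hash of the one before it, so history cannot be quietly rewritten. The following toy sketch (purely illustrative, not any production blockchain and omitting consensus entirely) shows the mechanism:

```python
import hashlib
import json

def block_hash(index, data, prev_hash):
    # Each block's hash covers its contents AND the previous block's hash,
    # so altering any earlier block invalidates every later one.
    payload = json.dumps({"index": index, "data": data, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(events):
    chain, prev = [], "0" * 64
    for i, data in enumerate(events):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(block["index"], block["data"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
print(is_valid(chain))                   # True
chain[0]["data"] = "alice pays bob 500"  # tamper with history
print(is_valid(chain))                   # False: the stored hashes no longer match
```

No participant needs to trust a “super user” here; anyone holding a copy of the chain can re-verify it independently.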
Weighing the costs of blockchain
Our analysis identified several challenges: some are technical and will likely be fixed in the future, while others are risks inherent to blockchain and unlikely to change. The technical challenges are limited scalability and latency, limited privacy, storage constraints and unsustainable consensus (current consensus algorithms are slow and consume significant computing power). The risks include irrevocability, the absence of regulators, misunderstood side effects, fluctuations in bitcoin prices and unclear regulatory responses.
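The “unsustainable consensus” point can be made concrete with a proof-of-work sketch: a miner must brute-force a nonce until the hash meets a difficulty target, and each extra target character multiplies the expected work. This is a toy illustration of the general idea (assuming SHA-256 proof-of-work in the style of Bitcoin, not any specific chain's parameters):

```python
import hashlib

def mine(data, difficulty):
    """Brute-force a nonce so that sha256(data + nonce) starts with
    `difficulty` hex zeros. Expected attempts grow 16x per extra zero,
    which is why proof-of-work consensus is slow and energy-hungry."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block payload", 4)  # ~65,000 attempts on average
print(nonce, digest[:12])
```

Real networks set difficulties requiring quintillions of hashes per block, which is the root of the energy-consumption concern.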
We evaluated all of the use case categories in the context of blockchain’s challenges and risks and arrived at three levels of feasibility.
First, blockchain technology is ready for applications in digital currency, including initial coin offerings (ICOs); provenance, e.g. supply chains and other B2B scenarios; and disintermediation. We expect to see use cases in the next three years.
Second, ledgers (of identity, ownership, status and authority), voting and healthcare are only feasible for limited use cases where the technical limitations do not hinder them.
For other use cases such as lightweight financial systems, smart contracts, new internet apps and autonomous ecosystems, blockchain faces significant challenges, including performance, irrevocability, the need for regulation and lack of consensus mechanisms. These are hard problems to fix and could take at least five to 10 years to resolve. In most cases, today’s centralised or semi-centralised solutions for establishing trust are faster, have more throughput and are cheaper than decentralised blockchain-based solutions.
So is blockchain worth it?
Neither the decentralisation nor the dynamic trust establishment enabled by blockchain is free. However, while true decentralisation is expensive, once in place, it makes dynamic trust establishment easy to implement.
Decentralisation can be useful in a number of scenarios, such as limiting government censorship and control, avoiding having a single organisation control critical systems, and preventing rogue employees from causing significant damage. It also enables system rules to be applied to everyone evenly, and it reduces the damage done in the event of a hack or system breach because fewer user accounts are compromised.
Moreover, the polarised arguments around blockchain’s value suggest there is no shared understanding of the value of decentralisation. Organisations express concern around the arbitrary power of governments as well as large organisations, but do they understand the trade-offs and additional resources required to attain higher trust? Similarly, privacy is a concern, but most of us share data daily in exchange for free access to the internet and social media platforms.
Clearly, decentralisation needs to be part of a policy decision that is taken only after wide discussion. On the one hand, in an increasingly software-controlled world — from banking to healthcare and autonomous cars — the risks associated with centralised systems are increasing. That said, trying to attain full decentralisation could kill blockchain, especially if overly ambitious targets are set because the cost will be prohibitive.
Fortunately, centralised versus decentralised does not have to be an all-or-nothing decision. Multiple levels of decentralisation are possible. For example, private blockchains are essentially semi-decentralised because any action requires consensus among a few key players. Therefore, it is important to critically examine each blockchain use case.
Significant financial investments have been made in blockchain, but if the quest for a fully decentralised solution takes too long, it will put the future of blockchain at risk. This makes a good case for starting with a semi-decentralised approach to minimise risk, then striving for full decentralisation in a second phase.
A full analysis of the trade-offs
Blockchain provides mechanisms for establishing trust that reduce the risks associated with centralised systems and enable agility by automating the verification required to establish trust. However, compared to current decentralised or semi-decentralised blockchain solutions, centralised solutions are faster, have more throughput and cost less to implement. That said, as governments and market demands address the technical challenges blockchain faces, the associated costs and barriers to implementation will be reduced. In summary, deciding which blockchain use case to invest in and when requires deep critical analysis of all the trade-offs.
Blockchain. The hype was massive – businesses everywhere were going to be leveraging the benefits of blockchain technology – not just the gamers and the financial sector. While blockchain activity hasn’t stalled, neither does it seem to have gained the predicted, unstoppable market momentum just yet. So, what’s going on? We asked a range of industry experts for their take on blockchain’s potential in the business world. Part 1.
Sudhir Pai, CTO of Financial Services, Capgemini, offers the following thoughts:
“Recent research revealed that blockchain is set to become ubiquitous by 2025, entering mainstream business and underpinning supply chains worldwide. This technology is set to provide greater transparency, traceability and immutability, allowing people and organizations to share data without having to be concerned about security. However, blockchain is only as strong as its weakest link. Despite the acclaim surrounding blockchain’s immutable security, there are still risks surrounding it that organizations must be aware of – and mitigate – prior to implementation.
Blockchain’s transparency makes it extremely difficult to manipulate at scale. While the blockchain platform itself may be secure, there is still some work to be done to ensure organizations are equipped to make their networks secure end to end. For true security, organizations must focus on the last mile connection between a physical event and the digitized record of this event.
If these points of entry to the platform are tampered with, the blockchain is rendered worthless. It is therefore imperative that organizations secure all points of entry, and assess the risks, before they consider deploying blockchain on a broad scale. They will need to consider security at all layers, most importantly the application, infrastructure, data and partner layers.
To achieve the most value from blockchain, both now and in the future, organizations must take responsibility for their safety and security at all levels – application, infrastructure, data and partners. By conducting a blockchain risk assessment and addressing key risks, organizations can make sure they are well positioned to leverage the efficiencies, transparency and cost-effectiveness provided by blockchain without opening themselves up to unexpected risks. The most pragmatic way forward for organizations interested in blockchain is to test the concept through pilot programs. Pilots should be focused on the areas that offer organizations the most control, and companies should take these weak links into consideration.
Ultimately, blockchain has the ability to solve business issues relating to traceability, responsiveness, and trust. By taking a carefully planned approach to implementation, and understanding blockchain’s weak links, organizations can unlock the true value of blockchain, creating new opportunities and reducing inefficiencies.”
80% of supply chain blockchain initiatives will remain at a pilot stage through 2022
Through 2022, 80% of supply chain blockchain initiatives will remain at a proof-of-concept (POC) or pilot stage, according to Gartner, Inc. One of the main reasons for this development is that early blockchain pilots for supply chain pursued technology-oriented models that have been successful in other sectors, such as banking and insurance. However, successful blockchain use cases for supply chain require a different approach.
“Modern supply chains are very complex and require digital connectivity and agility across participants,” said Andrew Stevens, senior director analyst with the Gartner Supply Chain practice. “Many organizations believed that blockchain could help navigate this complexity and pushed to create robust use cases for the supply chain. However, most of these use cases were inspired by pilots from the banking and insurance sector and didn’t work well in a supply chain environment.”
This setback should not discourage supply chain leaders from experimenting with blockchain. Blockchain use cases simply require a different approach for supply chain than for other sectors.
From Technology-First to Technology Roadmaps
Adopting a technology-first approach that exclusively targets blockchain infrastructure was the initial idea for use cases in supply chain, mirroring the approach of the banking and insurance sector. However, this approach did not work, because in contrast to the highly digital-only fintech blockchain use cases, many supply chain use cases will need to capture events and data across physical products, packaging layers and transportation assets. Additionally, supply chain leaders need to understand how these events can be digitalized for sharing across a potential blockchain-enabled ecosystem of stakeholders.
“Today, supply chain leaders have now started to treat blockchain as part of a longer-term technology roadmap and risk management planning. We see that many leaders are adopting a broader end-to-end view across their supply chains and map all requirements – from sourcing across manufacturing to the final distribution,” Mr. Stevens added. “Having blockchain as part of an overall technology portfolio has created opportunities for internal collaboration across many areas that have a potential interest in blockchain, such as logistics and IT.”
Blockchain As a Stimulus
Though most blockchain initiatives didn’t survive past the pilot phase, they have provided fresh stimuli for supply chain leaders to conduct broader supply chain process and technology reviews.
“Many supply chain leaders that have conducted blockchain initiatives found that they now have a more complete overview of the current health of their supply chain. Their perception on how blockchain can be used in the supply chain also has shifted,” Mr. Stevens said. “By going through the process of deploying a blockchain pilot, they discovered what needs to change in their organization before blockchain technology can be leveraged effectively.”
Before starting another initiative, supply chain leaders should identify and establish key criteria and technology options for measuring and capturing metrics and data that can indicate an organization’s readiness to explore blockchain.
“In a way, blockchain is a collaboration agent. It forces an organization to continually assess on a broad scale if its structure and employees are ready to embrace this new technology,” Mr. Stevens concluded.
Vincent Manier, CFO at ENGIE Impact, a company that advises organisations on sustainability goals, suggests that blockchain has a role to play in recording carbon emissions.
As more and more businesses prioritise sustainability and pledge to reduce carbon emissions, whether to satisfy investors, customers or employees, effectively tracking a business’ carbon footprint is becoming increasingly important. As it stands, carbon reporting is a complex and manual process, often self-reported with little validation from external parties and next to no involvement from the supply chain, meaning that a company’s carbon footprint can be reported incorrectly and even window-dressed to be more appealing.
One solution to this that’s growing in validity is the use of blockchain to record carbon emissions. Its decentralised nature and networking features mean carbon reporting can be conducted with greater transparency and control, providing insights into the whole company’s carbon footprint, including those of suppliers, distributors and customers, without risk of manipulation or error.
In an age where a company’s reputation is directly affected by its stance on sustainability and more and more consumers are making purchase choices based on an organisation’s contribution to the environment, creating greater sustainability transparency will be the key to satisfying all stakeholders with an interest in sustainability and helping enterprises meet their targets.
Blockchain. The hype was massive – businesses everywhere were going to be leveraging the benefits of blockchain technology – not just the gamers and the financial sector. While blockchain activity hasn’t stalled, neither does it seem to have gained the predicted, unstoppable market momentum just yet. So, what’s going on? We asked a range of industry experts for their take on blockchain’s potential in the business world. Part 2.
Made for manufacturing?
Terri Hiskey, VP product marketing, manufacturing at Epicor Software, talks of blockchain success in the manufacturing sector:
“Blockchain is a promising technology that is being adopted quickly in the manufacturing industry. At a time when companies are dealing with complex and non-integrated supply chain networks, this relatively new technology has the potential to resolve some of the major challenges of this increasingly interdependent environment. Blockchain can increase visibility throughout the supply chain, decrease administrative costs, and improve traceability. As an immutable record of events without a central authority, the technology is being hailed as a breakthrough innovation that could help prevent supply chain scandals in the future.
“The theory goes that because each party in a supply chain has a copy of the blockchain that they can access locally, no one has to log into anyone else’s system to enter data. No one has to rely on emails to keep paperwork in order, and when an event occurs all parties can be notified automatically. Some businesses are already putting blockchain to work to great effect. For example, IBM has launched a service that allows businesses to use blockchain to improve record-keeping. Toyota, for one, has already been experimenting with IBM’s solution, to see how the company might be able to use blockchain to track high-value items through its supply chains.
“Whilst blockchain is a powerful tool with great potential, it can‘t single-handedly bring transparency and accountability to the supply chain, and shouldn’t be considered as a standalone innovation. Blockchain will be most powerful when it’s combined with other solutions, such as a business’ existing enterprise resource planning (ERP) system. Because of an ERP system’s ability to integrate with multiple platforms, and filter relevant data, blockchain can be seen as a new plug-in tool that helps expand analytical possibilities and give users a more comprehensive view of what’s happening in their business. Ultimately, this will enable more accurate decision making.
“We are living in a world where businesses are under increasing scrutiny, where customer trust is invaluable, and where the ability to quickly demonstrate product provenance is vital. The transparency and incorruptibility of blockchain makes it a revolutionary force for the supply chain—and when integrated with an ERP system, this technology can provide business leaders with real-time, actionable data. With blockchain technology at their fingertips, manufacturers can bring extended visibility and clarity to their operations—using these insights to support business growth and improve ethical performance.”
2020 – a decisive year?
This is the view of Maike Gericke, Member of the Board of Directors at INATBA:
When it comes to blockchain and Distributed Ledger Technologies (DLT) going mainstream, 2020 will be a decisive year. We are already seeing major institutions like the Bank of England considering the use of the technology. However, while these are signs of a positive and exciting future, challenges still remain. To achieve mainstream success, the three key challenges of legislation, technology and education must be addressed.
Firstly, on the legislative side, we’re seeing serious action taking place. France, for instance, has recently launched its Pacte Law, creating a legal framework for non-security token offerings. At the same time, others are accelerating their own frameworks too, with the German Blockchain Strategy and Malta’s laws such as the Virtual Financial Assets Bill. While there is still a long way to go, these steps taken by government bodies give us hope that progress will be made on bigger pieces of legislation too.
Away from legislation, another stumbling block is the technology itself. Scalability and interoperability have been issues since blockchain’s inception. Yet, with the move towards greater experimentation in Layer 2.0 and 3.0, we could be on the cusp of a breakthrough. Using these new layers on top of a blockchain will mean that scalability and interoperability become issues of the past. In fact, these breakthroughs will mean blockchain finds greater use not just in finance-based systems, but in public sector areas too. For instance, identity checks at airports or the sharing of healthcare information could be greatly improved.
Finally, and perhaps most importantly, the challenge of education will be addressed. If blockchain is to become mainstream, then knowledge and understanding of it needs to break out of the current community. Presently, there’s still a lot of misinformation around its use. You don’t have to search too far to hear about cryptocurrencies being the best tool for criminals. Yet, this is simply not the case. In fact, blockchain technology can be a huge aid for law enforcement forces when they are conducting an investigation. If we can shift this mindset in 2020 and help the general public understand what it is that blockchain offers, then we will be going a long way to solving a major issue in increasing its adoption. 2020 is set to be a watershed moment in blockchain.
One of the reasons why I am so hopeful that 2020 will be a transformative year is that it will be the first full year of INATBA. Set up to help solve these very issues, INATBA brings together the private sector, public sector and legislators to help create the frameworks for blockchain to prosper. Providing an impartial platform where the necessary discussions can take place, we are already seeing early successes in solving these challenges. So while there is a long way to go, there are very encouraging signs that 2020 is the year that the journey to mainstream adoption really gathers pace.
A solution looking for a problem?
TIBCO Software's Global CTO Nelson Petracek offers the following observations:
Blockchain has certainly had its ups and downs over the past few years. From the crypto-craze a couple of years ago, where any blockchain-related whitepaper, it seemed, could generate funding, to the subsequent crash, blockchain has generated many headlines. In the enterprise, this cycle has led to more confusion than before, as organisations previously challenged to find the “killer app” for blockchain now also need to work through the skepticism and perception that blockchain is a solution looking for a problem.
Common blockchain promises have also been met with a dose of reality. Organisations are discovering that completely decentralised organisations are not easily achievable given constraints such as regulatory considerations and traditional enterprise structures. Programmable cryptocurrencies are met with similar concerns, as well as an unwillingness in many industries to move from fiat currency. Also, the promise of “no middleman” is hard to achieve since a third party is often required to oversee or govern an enterprise blockchain network. And, of course, many technical concerns around blockchain still need to be resolved, including concerns about performance, scalability, security, storage, and the general integration of blockchain stacks into existing enterprise architectures.
Given these challenges, and the relatively quiet state of the technology today, it would be reasonable to assume that enterprise blockchain will gradually fade into the distance. However, outside of the cryptocurrency area, many organisations are still looking at the technology, albeit with a more realistic set of expectations and views. There are a variety of reasons for this, but one main driver behind this continued investigation is the current industry focus on digital transformation. Almost every organisation is looking at ways to become “more digital”, whether through the use of cloud, open architectures, or advanced analytics on vast amounts of data. However, in addition to these approaches, organisations are looking further outward to identify new business models and opportunities for growth. This outward view, in many cases, is then driving the need to create and build out a wide decentralised business network that tightly integrates and automates the interactions between traditional and non-traditional partners. Such a network provides more opportunities for growth, adds new business capabilities, and expands an organisation’s ability to provide a rich customer experience (CX) – a part of almost every digital transformation programme today.
From an IT standpoint, to build this network, many organisations are looking to blockchain. Perhaps the promise of this technology has not yet been met, but blockchain’s support of a trusted, distributed network of participants with automated business logic (“smart contracts”), lineage/tracking, and consensus is appealing and relevant to the problem at hand. Thus, as potential future blockchain concepts such as alternative currencies and self-sovereign identities develop, the use of blockchain as an enabler of broader decentralised business networks is perhaps the use case for which organisations have been searching. This won’t happen overnight, and there are still both technical and non-technical challenges to overcome, but this may be the best use of the technology within the enterprise at this point in time, leading to sustained momentum and continued innovation.
Blockchain. The hype was massive – businesses everywhere were going to be leveraging the benefits of blockchain technology – not just the gamers and the financial sector. While blockchain activity hasn’t stalled, neither does it seem to have gained the predicted, unstoppable market momentum just yet. So, what’s going on? We asked a range of industry experts for their take on blockchain’s potential in the business world. Part 3.
A confusing market?
Nick Fulton - Paybase Head of Partnerships, comments:
Despite having been a part of our lives for almost 12 years, blockchain technology has failed to penetrate the sphere of mainstream use and understanding. Although many traditional businesses have engaged with blockchain-based products, the “blockchain platform market is a confusing array of overlapping and fragmented offerings” that has failed to attract a mainstream following. The question is not, therefore, whether blockchain could be used in mainstream businesses but rather, what it would take to achieve mainstream adoption.
After 2017 when Bitcoin was valued at almost $20,000, more serious interest was taken in the cryptocurrency industry. Since then, blockchain technology has been both tested in proof of concept implementations and utilised in live products by a wide spectrum of traditional businesses - Bank of America, Mastercard, Walmart and IBM, to name a few. But pure blockchain-based businesses have failed to disrupt an industry to the same degree as the platform giants that dominate the market e.g. Uber, Netflix, Deliveroo etc.
What’s holding blockchain back?
A key value proposition for blockchain is enabling multiple parties to transact or share information within a trusted, peer-to-peer ecosystem. Network effects, a mechanism for building value in certain business models, are critical to this. The telephone, for example, relies on network effects.
The more people using telephones, the more valuable the telephone network - but the fewer people using the network, the more redundant the product. This affects blockchain-based businesses even more heavily owing to the saturation of the blockchain market - there is not just one single blockchain but hundreds of different blockchains on which apps are based, all with different rules, use cases and technological frameworks. Gamified ride-sharing app Arcade City, for instance, must not just attract riders and drivers to use their product but they must also convert them to using the blockchain technology on which the app is built.
This difficulty is exacerbated further by the two-sided business model, i.e. one that requires both consumers and suppliers. Blockchain-based businesses using this model must not only overcome the challenge of encouraging users to adopt new technology, but must also gather sufficient users to make up the supply and demand sides of their audience and build marketplace liquidity.
Network effects for blockchain-based businesses are, therefore, not only critical but they can be even more difficult to establish.
Could intermediaries help?
Whilst intermediaries like Facebook could roll out a blockchain solution into their already large networks and utilise their pre-existing network effects, they typically have a vested interest in maintaining their position as the intermediary.
As blockchain technology is largely focused on decentralisation, it may be a conflict of interest (and potentially difficult to monetise) for a big, centralised business like Facebook to migrate all of their operations onto a blockchain.
What will prompt adoption?
Firstly, blockchain businesses should focus on building their minimum viable network (MVN) - “a network that has enough diverse stakeholders on board to be able to create the basic amount of interactions and activity” to function.
Secondly, there remains a large gap in education about the blockchain industry. The lack of regulation and market volatility has led to a bad reputation that’s been difficult to eradicate. Early adopters continue to occupy the space, so for blockchain adoption to grow, prioritising education (rather than building prototypes) is key and should be integrated into a business’s value proposition where possible.
Finally, I believe that businesses must build their front-end applications to be similar in design and UX to those of non-blockchain-based platforms. This strategy will help to encourage a non-blockchain-savvy audience (along with early adopters) to engage with blockchain technology and drive progress towards mainstream adoption.
Increasing adoption in the telecoms market
Andrei Elefant, CPO, TOMIA, explains:
Telecoms companies are increasingly adopting blockchain solutions to help them overcome the issues and challenges that today’s digital world poses. Firstly, there are the difficulties involved in the dramatically increasing traffic, as consumers and enterprises are using more data and communicating more often. This means that the amount of data recorded from calls, texts and browsing becomes harder to manage. Plus, in the context of overseas roaming, this means an overwhelming number of CDRs (call detail records) are sent from one mobile operator to another as subscribers cross networks during their travels.
Moreover, it has never been more challenging for carriers to manage the complexities involved with multi-party connections, to contain the costs of billing, protect margins, and avoid billing and settlement disputes. To complicate matters even more, with 5G around the corner, the faster speeds, new business models, and innovative new services that will be mandated by consumers and enterprises alike, the tracking, fulfillment and settlement of agreements will become more challenging than ever before.
As we can see, the obstacles to avoiding disputes, ensuring efficiency, and optimising costs are many. Unlike 5G, the need to overcome these carrier and system challenges is not just around the corner. It is here right now, and it is very real and very big.
TOMIA, along with industry partners Microsoft, KPMG & R3, recently developed a blockchain platform that streamlines operations, optimises margins, and fosters unprecedented trust among wholesale partners.
Blockchain solutions allow for the centralised management of critical business information including agreement details, supplier and customer information, and tariff and partner settlement information. This can optimise the dispute management process and significantly minimise the frequency of disputes too. Telecoms companies can optimise business through streamlined carrier partner management processes from contract creation and management through settlement.
Such solutions are rising in popularity within the telecoms industry, and it will without a doubt not be long before they are ubiquitous. Blockchain is widely discussed yet still to be implemented in the majority of industries, but telecoms serves as one example where its rise is delivering a multitude of benefits to both consumers and enterprises.
Neil Evans, CTO EMEA at UNICOM Global, believes blockchain has huge potential for managing compliance due to its immutability; once data has been saved onto the chain it is practically impossible to change or delete. Significant transactions, documents and actions can all be written to the chain to provide a trusted record of events:
“There are many practical applications of blockchain from a compliance perspective. Companies and regulators can see the potential, but as with any new technology there is a learning curve while everyone gets their heads around exactly how it can be done. As always, the devil is in the detail.
“One interesting area is how blockchain can help to address a core requirement of the GDPR and other regulations that revolve around protecting and managing data: being able to prove, without a shred of doubt, what has happened to the data throughout its lifetime. Has it been updated, added to or manipulated, for example? Or has it been moved to another system or organisation?
“To address this, you need a tamper-evident digital ‘paper trail’ of all the events that surround the data. And importantly, that trail must be recorded with the same degree of integrity as the data itself. Blockchain really comes into its own here because it’s the ideal place to hold this information in a way that is immutable.
“So, in the case of the GDPR, any personal information about individual customers would go into whatever secure repository the organisation has chosen to store it: an enterprise content management system or CRM system, for instance. It goes without saying that this system must have high levels of security in place to maintain privacy as required by GDPR.
“Meanwhile, the digital log that records the events surrounding that data gets written to a blockchain – providing concrete evidence of everything that happened to the customer’s personal information.
“Another example of the power of blockchain in compliance is the role it can play in managing the ‘right to be forgotten’ (RTBF), an aspect of the GDPR that compliance experts have really struggled with. There’s an in-built paradox: if a customer asks for their data to be deleted from your systems, how do you actually prove you have done that, without leaving a record that can be tied back to the customer in some way?
“Here’s a simple scenario that demonstrates how, say, a bank could use blockchain technology to provide evidence of compliance without leaving any traces of the customer’s personal data behind. First, the bank creates a case ID for the RTBF request on its internal case management system (or other IT system) and then goes ahead and deletes the customer’s data. An audit trail is written to the blockchain, recording all the steps the bank has taken to purge the data from its systems, and this record is associated with the same case ID.
“The bank then notifies the customer that the case is closed and all remaining records of the case (including the case ID) are destroyed. By this point only the customer retains any record of the case ID. At a later date the customer could query the blockchain system using the case ID and view an audit trail of the RTBF, or ask a regulator to do so on their behalf. In this way, the integrity of the deletion process is maintained, while allowing checks to confirm that the process has been successfully completed.
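The case-ID pattern described above can be sketched in a few lines. This is a simplified illustration, not UNICOM's implementation: the `AuditChain` class, its hashing scheme and the `CASE-1138` identifier are all assumptions standing in for a real blockchain, and show only the core idea of a tamper-evident log keyed by an opaque case ID:

```python
import hashlib
import json

# Illustrative sketch (not a real blockchain): a hash-chained audit log.
# Each entry embeds the hash of the previous entry, so tampering with
# any earlier record invalidates every hash that follows it.

class AuditChain:
    def __init__(self):
        self.entries = []

    def append(self, case_id: str, event: str) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps({"case_id": case_id, "event": event,
                              "prev": prev_hash}, sort_keys=True)
        entry_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"case_id": case_id, "event": event,
                             "prev": prev_hash, "hash": entry_hash})

    def verify(self) -> bool:
        prev_hash = "0" * 64
        for e in self.entries:
            payload = json.dumps({"case_id": e["case_id"], "event": e["event"],
                                  "prev": prev_hash}, sort_keys=True)
            if e["prev"] != prev_hash or \
               hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev_hash = e["hash"]
        return True

chain = AuditChain()
chain.append("CASE-1138", "RTBF request received")
chain.append("CASE-1138", "customer data purged from CRM")
chain.append("CASE-1138", "case closed, internal records destroyed")
print(chain.verify())  # True

# Tampering with an earlier event breaks verification.
chain.entries[1]["event"] = "data retained"
print(chain.verify())  # False
```

Note that only the opaque case ID appears in the log, which is the point of the scenario: the trail proves the deletion steps happened without itself retaining any personal data.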
“So blockchain can provide absolute proof that what should happen to enterprise information has actually happened – such as preventing unauthorised access, or deleting a customer’s information in response to a request for erasure.”
By Yevhen Kochuh, Head of Business Analysis Office at Ciklum.
When we look back on the late 90s and early 2000s, it is often defined as the “dot com” era of business — a single leap in technology that radically changed the way we do business and share data. In much the same way, we may be leaving the social media era and beginning a new era defined by an emerging technology: blockchain. Blockchain for business has been talked about and developed since 2008, however, this revolution does need to be carefully managed. If blockchain is to change the way we do business, and how economies operate, then we need to think carefully about what will follow.
For those who do not know, blockchain technology allows for data transfers to be both transparent and automated. Be it a transfer of electronic funds between banks, companies or peers, data transparency is created due to public or private ledgers. All involved with the transactions can audit every step and ensure that each party is holding up their end of the bargain. Therefore, if an inventory record needs to be kept on file and updated every two weeks, the blockchain can store that record and allow all parties to check on its status. The second part of blockchain's formula is automation, or smart contracts. A smart contract will execute a transaction or a data transfer when specific criteria are met. Take the inventory record mentioned above; there may be a stipulation in the blockchain that states data will only be transferred if the record is properly updated. If the record holder misses their deadline, the transaction will automatically not go through, and instead, notify all parties that the record holder is in violation of the agreement.
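The conditional transfer described above can be sketched as follows. Real smart contracts execute on-chain (for example in Solidity on Ethereum); this Python fragment only illustrates the logic, and the record fields and deadline are assumptions invented for the inventory example in the text:

```python
from datetime import date

# Illustrative smart-contract logic: execute a transfer only if the
# inventory record was updated by its agreed deadline. On a real chain
# this check would run automatically, with no human intervention.

def settle(record: dict, deadline: date) -> str:
    if record["last_updated"] <= deadline:
        return "transfer executed"
    return f"transfer blocked: record holder missed {deadline.isoformat()}"

record = {"item": "widgets", "last_updated": date(2018, 3, 1)}
print(settle(record, deadline=date(2018, 3, 14)))  # transfer executed
print(settle(record, deadline=date(2018, 2, 28)))  # transfer blocked: ...
```

The key property the article describes is that both outcomes are visible to every party on the ledger, so a missed deadline notifies everyone rather than surfacing weeks later in a dispute.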
This has been considered in great detail by financial service firms, and for good reason. When dealing with money, there are a great deal of checks and balances that must be in place in order to loan money or release funds. Currently, many of these records are looked into manually, with each step of a loan process requiring human intervention. However, if this could be automated, and transactions could be approved by checking all criteria against a database, millions (if not trillions) of dollars could potentially be saved. The benefits of transparency and automation are beginning to go well beyond finance. Throughout 2018, we will very likely see the dawn of this new technological era as blockchain begins to directly impact a multitude of industries.
As with financial services, efficiency is not a strong suit of the healthcare industry. It can take days (if not weeks) to get approvals on referrals and medication due to strict regulations and confidentiality standards. These delays drive up the cost of healthcare and drive down patient satisfaction. However, because blockchain is automated and can check records as well as regulations instantly, this technology has the potential to make a major impact on how we receive our healthcare.
Take, for example, getting approval from an insurance company for baby formula. In many states, if a newborn has a milk allergy, insurance companies will be required to provide prescription formula for a period of time as long as it can be proven that the allergy exists. As it currently stands, this would require the family to visit their pediatrician and perform a test. If the milk allergy test returns positive, then the doctor will prescribe a formula, which the child’s family will need to submit to insurance. From here, a decision from the insurer can usually take one or two weeks. If denied, resubmission can occur, taking another week or two before a decision is made. This is just one example of how a patient will need to wait on medicine because of the inefficiency of healthcare’s approval process.
If approval or denial was performed on a blockchain, however, the turnaround time would significantly decrease because a patient’s history would already be in the block. An approval process would become automated, with the software knowing what documents need to be attached, what the corresponding law is in each state, what the patient’s plan covers and whether all necessary steps have been completed. Instead of a one- to two-week wait, with manual checks along the way, blockchain’s automation could provide approval or denial in a matter of seconds after it checks against a database. If requirements haven’t been met, the block would provide a clear statement to all parties involved, letting everyone know what is still required (thanks to the visible ledger).
While healthcare regulations deal with federal and state red tape, supply chains will often add international regulations into the mix, which require even more checks and approvals along the way. As it currently stands, 90% of all global trade is moved by ocean freight, and it is highly dependent on non-digital paperwork.
The shipping and receiving process can require up to 30 sign-offs from organizations lasting more than a month. If a form is ever late, it will leave a container at the shipping port for days, if not weeks. These delays cost time and money, but blockchain could solve these problems.
By digitising both the paperwork and the approval processes, the supply chain process will become far more transparent and efficient. All involved parties will be able to track a shipment, see where a container is in its journey and understand, if there is a delay, who is responsible for the hold-up. Once approval is needed from a company or a specific party, they will be notified and approval can be granted digitally and remotely, allowing the supply chain to continue moving. Coupled with the IoT revolution, organisations would even be able to reroute items within a shipping container to their destination without ever needing to manually input any data.
Major Hotels and Hospitality
While healthcare and supply chains have inordinate amounts of paperwork and regulations, the hospitality industry is dealing with a different challenge: crowdsourcing. AirBnB, HomeAway and other home-sharing services are taking would-be guests away from traditional hotels because of price, as well as convenience. Guests want to feel like they’re getting a unique experience. Fortunately, blockchain can help resolve these issues and make larger hotel brands more personalized.
Thanks again to the integration of different types of technology, this time GPS, blockchain can completely tailor a guest’s hotel experience. Once the room is booked, the hotel can be notified and the guest can then select the room they’d like to stay in, be it closer to an elevator, near the breakfast area, or totally away from everyone if they’d like. Then, once the guest arrives in the hotel’s city, the blockchain can recognize this and provide updates on their room, either through email or the hotel’s app. That hotel app can also be automated and used as a keycard once the room is ready. Once the guest is ready to leave, the blockchain will settle their bill (if they haven’t paid ahead of time or if there are any additional charges), then ask if they’d like to rebook in the future, which would start the process all over again.
The above examples have shown how blockchain can benefit large institutions and corporations, but as blockchain technology becomes more popular and simpler to set up, it will also help the profit margin and transparency of smaller business owners such as farmers. Providing food to retailers en masse requires federal regulations to be met from the time a seed hits the ground to when the final product hits store shelves. For family farmers or tiny international suppliers, proving they are meeting all requirements and getting a fair share of the profits can take a significant amount of time and cut into the bottom line.
Through blockchain, however, smaller operations would be able to submit paperwork to the ledger quickly, proving that they are meeting all regulations. Once a harvest comes in and it’s time for shipment, as seen in the supply chain section, they would be able to send their product to stores and ensure it is being delivered properly. Unlike a toy or piece of furniture, fresh food products have a shelf life, so getting a product from a farm to a store shelf as quickly as possible directly affects the farmer’s revenue. If the block continues to the sale of the item, a farmer would be able to tell how well their product is performing and ensure they are getting a proper cut of the profits.
Any small business owner, from startup software providers to mom-and-pop clothing designers, can benefit from blockchain automation. It levels the playing field and, due to transparency, makes sure everyone is doing their part and holding true to their agreements. Thanks to automation, it also significantly reduces paperwork and manual checks, which for many industries are incredibly burdensome and expensive. While financial services may have made the term “blockchain” known, in 2018, expect blockchain to begin its rise in popularity across all industries.
Mattias Fridström, from Telia Carrier, outlines the key factors to locating a sustainable data centre and why it matters.
Sustainability has now become a very important issue for all businesses. Not only because they are gaining a better understanding and profound sense of responsibility about the impact they – and we as individuals – have on the world, but because it affects how successful companies will be. Interest in sustainable businesses will only flourish, impacting how individuals and companies choose products and services, as well as where the talent of tomorrow wants to work.
Simultaneously, our dependency on data centres is set to grow as technologies such as AI, 5G and IoT become part of everyday life, both personally and professionally. IDC predicts worldwide data creation will grow to 163 zettabytes by 2025 – ten times the amount of data produced in 2017. With that in mind, we need to ask ourselves what the factors are in choosing where to locate sustainable data centres and IT infrastructure. The reality is that any data centre provisioned by a company must serve two sustainability masters and find that all-important balance between them.
Firstly, and it goes without saying, the data centre must be sustainable for the business in terms of cost and performance. If the service that customers receive is degraded, or there are outages or latency issues with their websites and applications, it will simply create too many risks and, in a worst-case scenario, place the business itself in peril. Productivity and staff within the business can also be put in jeopardy, given the extent to which most businesses rely on cloud-based services as part of their own infrastructure.
The second ‘master’, becoming increasingly noticeable, is the push we all feel both commercially and personally to be more responsible custodians of our planet. Consumers, potential employees, investors and business partners are increasingly making decisions about the companies they want to be associated with based on their commitment to a sustainable environment. For data centres, this used to simply be about PUE (power usage effectiveness). However, data centre owners now need to consider the sources from which power comes, the physical impact of the site, recyclability, as well as the impact of the communications infrastructure on which the data centre relies to reach all those that use its services. We also need to consider the biggest waste product of data centres, heat. Is it being re-used as part of heat consuming systems, or simply ejected into the atmosphere?
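For readers unfamiliar with the metric mentioned above, PUE is simply the ratio of total facility power to the power delivered to IT equipment, so a value closer to 1.0 means less overhead. The figures below are illustrative, not from any particular facility:

```python
# PUE = total facility power / IT equipment power.
# An ideal facility approaches 1.0; every watt above that ratio is
# overhead such as cooling, power distribution and lighting.
# The numbers here are illustrative only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

print(round(pue(1500.0, 1000.0), 2))  # 1.5 -> 500 kW of overhead
print(round(pue(1100.0, 1000.0), 2))  # 1.1 -> a much leaner facility
```

As the article argues, though, a low PUE alone no longer tells the whole sustainability story if the power itself comes from fossil sources or the waste heat is simply vented.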
Let’s take a look at how some of these factors more directly related to infrastructure are being addressed in the context of one European country, Sweden, which has made the kind of progress that others can only be envious of.
For such an energy-intensive industry, whose consumption is set to increase as volumes of data and processing needs grow, the source of power for data centres has become a global issue of concern. Whilst many countries are making progress on moving to power generation that is less dependent on fossil fuels, Sweden is leading the way.
Sweden's electrical power comes primarily from hydro and nuclear sources. Together, they account for 80 per cent of all electricity production in the country. The remainder comes largely from wind and co-generation of heat and power.
In 2017, total electricity production amounted to 159 TWh. Hydropower accounted for 64 TWh, which represents 40 per cent of Sweden's total electricity production. During a typical year, approximately 65 TWh of electricity is produced this way, but fluctuating precipitation can cause this to deviate by around 15 TWh. Wind is also a growing area, and in 2019 was expected to reach 20 TWh.
Overall, Sweden’s high proportion of hydro, wind and nuclear power make it a very low producer of hydrocarbon- based electricity, and it therefore has a competitive green electricity footprint when compared with other European countries.
Application and latency
Whether the latency added by network overhead is important or not will depend upon the nature of an application and the user’s perception of performance. The total application latency will, in addition to the network latency, be the sum of delay contributed by the hardware, software and application implementation of the user’s system, as well as the load on any external servers that the application relies on.
For the user, it is the total quality of experience that counts. This is where the long-term approach of Nordic countries towards telecommunications infrastructure has put them in an enviable position, embracing technologies that are efficient, fast and with a long lifespan.
The Nordic region is particularly well connected and all countries in the region were early adopters of fibre technology, more than three decades ago. This is especially true for the Stockholm region which is arguably one of the best-connected in the world. Here, an extensive network of fibre optic infrastructure connects locations in all geographical directions: to the north of Sweden – all the way into the Arctic Circle, towards Oslo in the west, eastwards with cables to Helsinki, Tallinn, Riga and Vilnius via the Lithuanian coast then St. Petersburg and onwards into Russia. And finally, towards the south with multiple paths towards Copenhagen, Western Europe, the US, Asia and beyond. In addition to Telia Carrier, there are twelve other international carriers offering services in Stockholm.
Fixed users within range
One of the most widely-discussed topics in the telecom world is the time that it takes for data traffic to reach a particular destination and come back again. This is referred to as Round Trip Delay (RTD) and quite simply, it measures (in milliseconds) the time it takes for traffic to transit the network from its point of origin to a specific destination and back.
How fibre connections are routed and joined together across the globe has a big impact on bandwidth and latency. Sweden, and Stockholm in particular, has made itself the beating heart of fibre backbone traffic, with over 350 million users having an RTD of less than 30ms. A huge factor in this is the way in which Telia and other fibre carriers have located their Points-of-Presence (PoPs) in key metropolitan locations across the available backbones.
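To give a rough sense of the figures above: light in optical fibre travels at about two-thirds of its vacuum speed, roughly 200,000 km/s, so round-trip delay over a fibre path can be estimated from route distance alone. This is a back-of-the-envelope sketch; real RTD is higher once routing, regeneration and equipment delays are added:

```python
# Back-of-the-envelope RTD estimate for a fibre route.
# Light in fibre travels at roughly 200,000 km/s (about 2/3 of c).
# Real-world RTD exceeds this estimate due to routing and equipment.

SPEED_IN_FIBRE_KM_PER_MS = 200.0  # ~200,000 km/s, expressed per millisecond

def estimate_rtd_ms(route_km: float) -> float:
    return 2 * route_km / SPEED_IN_FIBRE_KM_PER_MS

# A ~2,000 km fibre route sits comfortably inside the 30 ms
# round-trip figure quoted for users reachable from Stockholm.
print(estimate_rtd_ms(2000.0))  # 20.0
```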
Choosing the right location for a data centre, or indeed the service provider from which you will procure data centre services is critical – it always has been – but now performance and cost are no longer the major criteria. The world views the impact of data centres very differently and the BBC has even said ‘eco-browsing’ will trend in 2020, as consumers seek to reduce their impact with less browsing.
Ignoring sustainability as part of your data centre planning decisions will have long-term impacts on your business and, as we all know, switching providers is not something that can be done overnight. Consider all the factors and get it right the first time.
As we enter 2020, digital transformation is set to dominate the business agenda for the coming decade and beyond. At the forefront, some of the world’s largest organisations are investing huge amounts of capital into their digitalisation. Thames Water is investing £1 billion over the next five years to digitalise key infrastructure, while Volkswagen recently announced €4 billion to be spent on automation through to 2023.
By James Whelan, MD of Avantus Systems.
However, digital transformation is not reserved for large corporations with extensive resources, and companies of all shapes and sizes are starting their own transformation journeys, from trying out new digital workflows to investing in cloud-based solutions. IDC estimates that around $2.3 trillion will be spent on digital projects by 2023 – accounting for more than half of all IT spending.
Whatever the sector and focus of the business, digitalisation projects align with the objective of making the business more efficient and agile. A successful project can lower costs by reducing infrastructure overheads, improving flexibility and scalability, and even unlocking entirely new business models and markets.
In many cases achieving these results can mean implementing major changes across the business, often overturning decades of established working practices. This means proper planning and preparation is essential for any project to succeed. However, many organisations focus on the technical aspect of digital transformation and overlook one of the most important aspects of their business – their employees.
The risk of forgetting to put employees first
Getting the workforce onside can have a huge impact on a project’s chances of success. Research from McKinsey shows that a staggering 70 percent of digital transformations fail, with poor employee engagement being one of the most important factors.
Implementing new technologies and working practices without properly consulting with the workforce can cause a number of serious issues that prevent a project from reaching its full potential, or even derail it completely.
If the decision to go ahead with new technology comes entirely from the top, it will be made without proper insight into the needs of the workers who will be dealing with it on a daily basis. This can easily result in the launch of new technology that lacks essential functionality or is a poor fit with established processes. On the other end of the scale, the organisation may invest in a top-of-the-line solution without preparing the workforce to use it effectively, resulting in poor ROI. It’s also common to find that digital projects include solutions that do not integrate well with existing systems used by the organisation, or those of third parties, because the decision makers were not aware of these complexities.
Aside from specific technical issues, unilaterally forging ahead with digital transformation can also be bad for morale. As a rule, people do not like being surprised with major changes and are likely to resist change if they feel it is being forced upon them. Unless employee buy-in is established at the start of the project, employees will often go back to the tried and tested solutions they are familiar with, skirting around the new technologies and workflows wherever possible. Without full commitment and take-up from the workforce, the digital project will have little chance of achieving any objectives around improving efficiency and agility, potentially resulting in a damaging series of unintended consequences such as resentment, increased staff turnover, lack of continuity, higher recruitment and training costs and lower overall productivity.
In order to avoid issues such as these, it is crucial that employee engagement efforts begin at the very start of the project, long before implementation and ideally before the solutions have even been chosen.
Research from Gartner has found that managing employee commitment to any change in the workplace becomes more difficult the further along the process is. The research found that nearly three-quarters of employees subjected to major changes in the workplace experience moderate-to-high stress levels.
Focusing on communication
Rather than digital transformation being something dictated by the company’s senior leaders, it should be seen as a collaborative project that involves employees at all levels across the organisation. From day one, senior managers should have a clear strategy for communicating their plans for digital transformation with employees, long before they begin scouting for potential solutions.
Appointing someone to act as a digital transformation evangelist can be a useful approach, ensuring that there is a single figurehead who is able to coordinate and take responsibility. Depending on the size of the company, this could be the CIO, another senior IT decision maker or may even be an entirely new role. Digital transformation leaders should be in constant contact with representatives from different departments across the business to share information and gather feedback.
To establish a proper dialogue with the workforce, the organisation can also provide individual employees with a clear channel of communication to both receive information and provide their own ideas and feedback. A central focal point such as a workplace portal can be very effective at ensuring that staff have their say, particularly if it is integrated with an existing employee intranet or dashboard. This will make it easy for the decision makers tasked with overseeing the project to communicate with the workforce and allow employees to respond without disrupting their own workflows.
Once a communication strategy and the right tools have been implemented, IT and business heads can start gathering valuable feedback and insight that will help to shape both individual projects and the company’s digital transformation journey at large.
Managing this through a central portal or dashboard will also make it easier for the decision makers to compare data from across different departments, such as HR, finance or production, and help them to choose solutions and strategies that will work across the board. As solutions are tested and implemented, project leads can incorporate feedback from department heads and the wider workforce to help shape and guide the implementation.
By including all employees in a digitalisation project from the very beginning and ensuring that they have a centralised resource for staying informed and providing their own feedback, organisations will be able to pave the way for a smoother digital transformation journey.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 1.
Says Babur Nawaz Khan, Product Marketing, A10 Networks:
It’s time to have a look at the year 2020 and what it will have in store for enterprises.
Since we are in the business of securing our enterprise customers’ infrastructures, we keep a close eye on how the security and encryption landscape is changing so we can help our customers to stay one step ahead.
In 2019, ransomware made a comeback, worldwide mobile operators made aggressive strides in the transformation to 5G, and GDPR completed its first full year of enforcement, with the industry seeing some of the largest fines ever issued for massive data breaches experienced by enterprises.
2020 will no doubt continue to bring a mix of the familiar, like the continued rash of DDoS attacks on government entities and cloud and gaming services, and the new and emerging. Below are just a few of the trends we see coming next year.
Ransomware will increase globally through 2020
Ransomware attacks are becoming increasingly widespread because they can now be launched even against smaller players. Even a small amount of data can be used to hold an entire organisation, city or even country to ransom. The trend of attacks levied against North American cities and city governments will only continue to grow.
We will see at least three new strains of ransomware introduced.
Unsurprisingly, the cyber security skills gap will keep on widening. As a result, security teams will struggle to create fool-proof policies and to leverage the full potential of their security investments.
Slow Adoption of new Encryption Standards
Although TLS 1.3 was ratified by the Internet Engineering Task Force in August 2018, we won’t see widespread or mainstream adoption: less than 10 percent of websites worldwide will start using TLS 1.3. TLS 1.2 will remain relevant, and therefore will remain the leading TLS version in use globally, since it has not been compromised yet, it supports PFS, and the industry is generally slow when it comes to adopting new standards. Meanwhile, Elliptic-curve cryptography (ECC) ciphers will see more than 80 percent adoption as older ciphers, such as RSA, disappear.
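As a concrete aside, the TLS capabilities of a client stack are easy to inspect. This minimal Python sketch (standard library only, no network access) checks whether the local OpenSSL build supports TLS 1.3 and configures a client context that refuses to negotiate anything below TLS 1.2 – the floor most sites enforce while 1.3 adoption grows:

```python
import ssl

# Does the local OpenSSL build support TLS 1.3 at all?
print("TLS 1.3 supported:", ssl.HAS_TLSv1_3)

# A client context pinned to a TLS 1.2 minimum; anything older
# (TLS 1.0/1.1) will be refused during the handshake.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
print("Minimum version:", ctx.minimum_version.name)
```

Any recent Python (3.7+) linked against OpenSSL 1.1.1 or later will report TLS 1.3 support here, which is one reason client-side readiness has run well ahead of server-side adoption.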
Decryption: It’s not a Choice Any Longer
TLS decryption will become mainstream as more attacks leverage encryption for infection and data breaches. Since decryption remains a compute-intensive process, firewall performance degradation will remain higher than 50 percent, and most enterprises will continue to overpay for SSL decryption due to a lack of skills within their security teams. To mitigate firewall performance challenges and the shortage of skilled staff, enterprises will have to adopt dedicated decryption solutions as a more efficient option, even as next-generation firewalls (NGFWs) continue to polish their on-board decryption capabilities.
Cyber attacks are indeed the new normal. Each year brings new security threats, data breaches and operational challenges, meaning that businesses, governments and consumers have to always be on their toes. 2020 won’t be any different, particularly with the transformation to 5G mobile networks and the dramatic rise in IoT adoption by both consumers and businesses. The potential for massive and widespread cyber threats expands exponentially.
Let’s hope that organisations, as well as security vendors, focus on better understanding the security needs of the industry, and invest in solutions and policies that would give them a better chance at defending against the ever-evolving cyber threat landscape.
Dave Sterlace, Global Head of Technology, Data Centers at ABB, outlines some of the trends to look out for in 2020 and beyond:
The 2010s saw a huge increase in digitalization, and with most of the world’s IP traffic going through data centers, demand for more digitally-enabled, responsive facilities has also grown.
In 2020, it is likely we will see a number of data-hungry technologies becoming more commonly used. With this in mind, what will be the key drivers for the data center sector in the coming years?
The impact of 5G
Without doubt, 5G will be one of the biggest disruptors. Although it is still in its infancy, the global rollout of 5G will put the growth of digital data into overdrive, with Cisco predicting that we will soon enter the “mobile zettabyte era”. A 5G network will enhance data efficiency and reduce latency, serving users with nearly 100 times the transmission rate of 4G networks - vastly improving the end user experience.
Everything will be impacted by 5G, from high-performance cloud data centers to edge services. With the capability of supporting one million devices per square kilometer, it really will bring forward the IoT age - and with that will come a ‘data tsunami’ that will need to be collected, analyzed and stored very quickly.
As such, we’ll see the increase of the ‘smart’ data center and the use of the cloud, and other technologies such as edge computing, to manage demand.
5G is also expected to enable new data-intensive services like autonomous vehicles, which will require terabytes upon terabytes of data to be processed very quickly. Any lack of connectivity – even for a short period of time – could have significant impact. As a result, we will see an increasing number of edge data centers – many of which will be managed remotely - to enable autonomous vehicles and compatible technologies that rely on data such as Virtual Reality (VR) and Augmented Reality (AR). The infrastructure will be critical in moving these types of technologies from novel gadgets to critical business assets.
For example, Artificial Intelligence (AI) can have a real impact on predictive maintenance, machine-learning can help refine predictive maintenance by learning from exceptions, and AR can couple with both to make ‘remote hands’ for maintenance a reality.
However, giving a non-human, machine-learning algorithm control of mission-critical infrastructure is not something that can happen overnight. For many operators, relinquishing control and putting faith in an AI-based system is a step they’re not quite ready to take. That said, the potential benefits are considerable, particularly in terms of increasing efficiency and freeing up engineers to focus on other crucial tasks such as maintenance and safety.
What is important to consider is – as with human skills – machine-learning technology needs to be ‘trained’ for the job it is there to carry out, so it can effectively respond to the specific functions, needs and pressures at each individual data center.
Another important driver in 2020 and beyond will be sustainability – particularly against a backdrop of increased climate awareness.
As more operations adopt IoT platforms – and 5G-enabled networks become a reality – data demand will grow significantly, and with it will come a major increase in energy consumption. The industry’s focus will shift to how providers can operate in the most sustainable way, without compromising on operational reliability and efficiency. A holistic approach to data center design will look to where energy efficiencies can be gained through small improvements in electrification and digitalization.
With sustainability likely to remain a key economic and political driver in the years ahead, data centers can adapt with digital solutions to save cost and carbon, without compromising critical business operations.
In conclusion, ongoing digitalisation will have a real, positive impact on the speed and efficiency of the data center sector. While the use of some technologies – such as AI-based automated control systems – will be gradual, the advent of 5G will accelerate the need for smarter, more efficient facilities that can manage the increased demand. With the 2020s set to be the ‘decade of data’, digital solutions will be crucial for the data center sector to adapt and respond.
By Neil Murphy, Global VP, ABBYY:
Contribution of digital workers will grow by 50%
“75% of organisations are struggling to recruit digital skills, making room for a boom in the number of digital workers in businesses. These workers can augment automation efforts with AI and machine learning, working in harmony alongside humans. In fact, we found that the contribution of digital workers will grow by 50% in the next two years, illustrating a real shift to a future built on human-machine collaboration.
Automation can and should be human-centric – humans and machines, not human versus machines. Only then can human workers focus on higher-level, creative and socially responsible tasks, and give customers better experiences and faster service. In 2020, businesses that are quick to incorporate digital workers with content intelligence skills within their automation platforms will gain a significant competitive edge.”
Overhauling processes will become a necessity, not a nice-to-have
“With the process mining market set to triple by 2023, and as more complex deployments of digital transformation technologies ramp up, the ability to monitor a business’ processes will become critically important. However, individual technologies like RPA and BPM only have visibility over the steps they control – so new technology is needed to provide visibility of the process end-to-end.
To get the insights needed to improve customer service and operational efficiency, and ultimately boost profits, organisations will need to take advantage of process intelligence tools that go beyond more simple process mining. Process intelligence provides a comprehensive view of running processes, giving businesses the ability to act on what they find and improve processes in real-time. While we expect to see large enterprises leading the way, some smaller businesses in process-intensive industries like customer service or finance will begin to transform their processes too in the coming year.”
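To make that distinction concrete, even a toy script over an event log can surface the end-to-end process timings that process mining tools report. This sketch assumes a hypothetical log of (case ID, activity, timestamp) rows – the minimal shape most such tools ingest:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event log: (case_id, activity, timestamp) rows for
# two instances of a simple received -> reviewed -> approved process.
log = [
    ("c1", "received", "2020-01-06 09:00"),
    ("c1", "reviewed", "2020-01-06 11:30"),
    ("c1", "approved", "2020-01-07 10:00"),
    ("c2", "received", "2020-01-06 09:15"),
    ("c2", "reviewed", "2020-01-08 16:00"),
    ("c2", "approved", "2020-01-09 09:00"),
]

def end_to_end_hours(log):
    """Return each case's duration in hours, first event to last."""
    cases = defaultdict(list)
    for case, _, ts in log:
        cases[case].append(datetime.strptime(ts, "%Y-%m-%d %H:%M"))
    return {c: (max(t) - min(t)).total_seconds() / 3600
            for c, t in cases.items()}

print(end_to_end_hours(log))  # c1: 25.0 hours, c2: 71.75 hours
```

Real process intelligence tools go much further – discovering the process graph, spotting deviations and acting on them in real time – but the raw material is exactly this kind of timestamped event data.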
RPA adoption will boom in the UK as businesses see rapid ROI
“Currently, less than a quarter of UK businesses are investing in RPA. More than 4 in 5 of those who have invested in RPA saw a return on their investment within just a year, and a majority saw improvements in efficiency, market share and revenue growth, so we expect adoption of RPA in the UK to boom in 2020. We predict that adoption will be particularly strong in the banking and financial services sector, which is already leading the way with 38% adoption. By the end of 2020, we can expect this to have increased to two-thirds of businesses in this sector. Indeed, Forrester predicts that in 2020 more than a million knowledge worker jobs worldwide will be replaced by RPA bots, thereby allowing staff to focus on more complex tasks.
Going beyond just RPA, businesses are seeing even greater benefits when investing in RPA alongside process or content automation technologies – with 71% having grown their revenue, compared to 37% of those using RPA alone. Again, we’re likely to see banking and financial services businesses leading the charge, to enhance efficiency and productivity in an uncertain economy.”
CIOs will put AI ethics at the top of their agenda
“The new challenge for CIOs adopting AI technologies comes down to one thing: ethics. It has now become paramount that CIOs know what uses of AI could cause problems – whether bad, biased or unethical – and what they can do to make sure their business remains on the right side. In 2020, we’ll see CIOs begin to question their AI deployments: are the AI applications they are building moral, safe and right? Is the data behind their AI technology good, or does it have algorithmic bias?
With augmented intelligence set to become the new normal, no CIO wants to be known for bad and biased use of AI – especially since the legal ramifications of such actions will ramp up significantly in 2020. This will be the year that AI ethics hits the agenda of every CIO, and businesses, workers, and the public will all benefit from it.”
DB Hurley, CTO of Marketing Cloud at Acquia, provides his predictions on what he expects to see from the open source community in 2020, from refining the customer experience to allowing users to have better control of their data:
From Paul Hampton, Senior Director of Product Marketing at Alfresco:
Companies all over the world in every industry are currently overwhelmed with data. The total amount of data, which doubles every two years, clocked in at 4.4 zettabytes (trillion gigabytes) in 2013 and is likely to hit 44 zettabytes by 2020. This will have a significant impact on the modern enterprise as the digital revolution continues to transform businesses, bringing new challenges.
1. Rich media content surpasses traditional content
The phrase “a picture is worth a thousand words” will hold especially true in 2020 as rich media content like videos and images become increasingly prevalent in modern businesses. How people consume information is drastically changing and is reflected in how they interact with organizations. We’ll see a huge increase in the exchange and sharing of rich media between businesses and customers to simplify and expedite their experiences. For example, a customer might share dash cam footage and photographs of the damage to their car with an insurance agency following an accident. In 2020, expect that rich media will surpass all of the traditional content created to date, posing new challenges for how enterprises manage and regulate more complicated content.
2. Content will increase tenfold
With the rise of rich media and growing number of business applications, the amount of content will increase at least tenfold. As the volume increases, organizations will face a challenge in trying to understand the information they’re responsible for. Already, we’re seeing organizations unable to effectively grasp their vast content sources, an issue that will become a bigger concern as the content grows in volume and complexity.
3. Managing dark data
Simply put, dark data is information, often personal information, that enterprises have, but don’t realize they have. Dark data encompasses unstructured data, which Gartner predicts will encompass 80% of all global data by 2022, and is often stored in silos, network file stores and unregulated tools across an organization. With new data regulations such as GDPR and CCPA, there will be an increased urgency for organizations to get a handle on their dark data. Until they do, I expect to see more organizations getting hit with serious fines for not protecting their information.
4. Blockchain will play a bigger role in privacy
Blockchain will become more pervasive for business applications. Within data privacy and security, there are countless applications that could benefit from blockchain. For example, a healthcare organization utilizing blockchain to manage the authentication of patient records. It removes the potential for human error and ensures a completely secure process.
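The authentication property described here rests on hash chaining: each entry commits to the one before it, so any retroactive edit breaks every subsequent link. A minimal illustrative sketch in Python (a toy, not any real healthcare system's API):

```python
import hashlib
import json

def chain(records):
    """Link each record to its predecessor via a SHA-256 hash."""
    blocks, prev = [], "0" * 64  # genesis "previous hash"
    for rec in records:
        payload = json.dumps(rec, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        blocks.append({"record": rec, "prev": prev, "hash": digest})
        prev = digest
    return blocks

def verify(blocks):
    """Recompute every hash; any tampered record breaks the chain."""
    prev = "0" * 64
    for b in blocks:
        payload = json.dumps(b["record"], sort_keys=True)
        good = hashlib.sha256((prev + payload).encode()).hexdigest()
        if b["prev"] != prev or b["hash"] != good:
            return False
        prev = b["hash"]
    return True

ledger = chain([{"patient": "p1", "event": "admitted"},
                {"patient": "p1", "event": "discharged"}])
print(verify(ledger))                       # True
ledger[0]["record"]["event"] = "altered"    # retroactive edit...
print(verify(ledger))                       # False
```

A production blockchain adds distributed consensus and signatures on top, but the tamper-evidence that makes it attractive for record authentication is exactly this chaining.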
5. Facing the challenge of shadow IT
Younger workers are rejecting traditional enterprise technology in favor of their own tools. Many IT leaders are already concerned with the practice, often referred to as Shadow IT, and in the coming year, the issue will escalate to business leadership. Employees using rogue tools unvalidated by IT professionals puts the entire organization and its customers at risk. It’s an extremely important issue that shows no sign of improving as more young workers climb the corporate ladder.
6. Hype technology will bring value to the modern enterprise
The technologies that have previously delivered more hype than value will start to come to fruition. There will be a democratization of artificial intelligence technology, allowing more and more people to actually start driving significant business value from using the technology to solve business problems. Within the data space, AI will be critical in helping companies understand the information they have. Furthermore, it will dictate what information is valuable and what can be removed. No longer will organizations need to waste time, money and effort storing useless information ‘just in case’.
7. Unpacking the process of digitisation
Organizations making digital investments will increasingly start with process. The most valuable investments always directly connect to the customer. Thus, digital transformations should begin by evaluating the customer journey. What processes are part of that journey? And how can these be improved to provide exceptional experiences for the customer? That’s where the most valuable digital transformation will begin.
8. Democratisation of technology
As new technologies that were once complex - needing a data scientist to understand - become more accessible, there will be a democratization of these technologies. Consider the fact that there used to be typists when typewriters were the latest technology. We'll one day look at certain data science roles as we do typists. Employees across an organization will become more familiar and knowledgeable with artificial intelligence and machine learning so that very niche technology job roles will go the way of the typing pools of old.
2020 is set to be hybrid cloud’s year, and for good reason. It’s a way of working that welds together the benefits of the public cloud infrastructure with on-site data centres.
By Sean Roberts, General Manager of Public Cloud, Ensono.
Hybrid, in theory, provides the pros of both, while offering a far more flexible platform: one that can scale up and down and access the latest technology features of the cloud vendors, whilst maximizing utility of the existing assets on a more predictable capex model.
However, there’s still much confusion around what the hybrid cloud is and isn’t, which is holding back some businesses from achieving the very best from their IT infrastructure.
Clarifying the confusing
It’s worth mentioning right off the bat that hybrid cloud is rarely something that’s decided upon as a primary option – rather, it’s something businesses move to out of necessity.
While small businesses can, with their light infrastructure and minor data footprint, jump straight to a public cloud service relatively easily, for larger organisations it’s a different story. Incumbent enterprises are usually burdened with legacy IT architecture compiled over decades of operation and M&A activity. Trying to move everything to a public cloud service with the click of a button is, unfortunately, impossible, which is why a hybrid half-way house approach is often adopted.
The most successful hybrid journeys begin by addressing the application portfolio rather than a straightforward lift and shift. Is there data that is required to stay on premises for contractual reasons? Are there technical, commercial or licensing requirements that demand some form of private cloud too? Where that’s the case (ruling out a cloud-only option), larger organisations are finding that the hybrid cloud – the best of both worlds – is the way forward.
Still, it tends to rely on a matter of rationalizing the application portfolio. How many businesses, for instance, have separate applications that provide duplicate capabilities to separate parts of the business? This usually comes about as a result of M&A activity or some form of decentralised IT decision making. Understandably, many have taken advantage of a migration to a hybrid cloud system to consolidate their applications and remove such duplication before moving to public cloud.
That’s especially evident in applications that have a cloud-native version available. Where that update is not an immediately viable path, parking existing applications on a hybrid cloud is a solid holding position - assuming, of course, that the cost benefits of shifting an application and its data to the cloud outweigh the price of doing so.
A necessity, but still, a useful place to be
What’s ultimately become clear this year is that hybrid cloud is a useful interim place to be. One that’s especially invaluable for organisations whose infrastructure is in a state of some flux. For companies that don’t have an appropriate native cloud option in place, it can be a bridge to better. A full public cloud service is typically the end goal, and the majority of businesses now acknowledge that it’s a case of ‘when’ not ‘if’.
What’s more, there are plenty of solutions currently on the market that make hybrid infrastructure a viable and long-term option. Private cloud platforms such as AWS Outposts and Azure Stack, for example, bring native AWS and Azure services to both data centres and on-site facilities. They’ve also made inroads into the world of hybrid cloud, which in turn demonstrates that major players in this space are working with hybrid in the medium as well as the long term.
Hybrid cloud offers businesses a way to move on from legacy systems without having to transform to a public cloud architecture overnight. It allows organisations to tread a cautious path to becoming cloud-first, offering a beneficial transitional state and the opportunity to plan the next stage of development.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 2.
We start 2020 with a forecast of 10 of the leading trends that are expected to gain pace in the tech industry in 2020, as anticipated by Alibaba’s DAMO Academy, the global research initiative by Alibaba Group.
From the emergence of cognitive intelligence, in-memory-computing, fault-tolerant quantum computing, new materials-based semiconductor devices, to faster growth of industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies to protect data privacy… more technology advancements and breakthroughs are expected to gain momentum and create a big impact on our daily life.
“We are at the era of rapid technology development. In particular, technologies such as cloud computing, artificial intelligence, blockchain, and data intelligence are expected to accelerate the pace of the digital economy,” said Jeff Zhang, Head of Alibaba DAMO Academy and President of Alibaba Cloud Intelligence. “In addition to exploring the unknown through scientific and technological research, we are also working with industry players to foster the applications of innovation in different industries, making technologies more accessible for the businesses and society at large.”
The following are highlights from the Academy’s predicted top 10 trends in the tech community for this year:
1. Artificial intelligence evolves from perceptual intelligence to cognitive intelligence
Artificial intelligence has reached or surpassed human performance in areas of perceptual intelligence such as speech-to-text, natural language processing and video understanding; but in the field of cognitive intelligence, which requires external knowledge, logical reasoning or domain transfer, it is still in its infancy. Cognitive intelligence will draw inspiration from cognitive psychology, brain science and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference and continuous learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will enable machines to understand and utilize knowledge, achieving key breakthroughs from perceptual intelligence to cognitive intelligence.
2. In-Memory-Computing addresses the "memory wall" challenges in AI computing
In the Von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth between them. With the rapid development of data-driven AI algorithms in recent years, hardware has become the bottleneck in the exploration of more advanced algorithms. In the Processing-in-Memory (PIM) architecture, by contrast, memory and processor are fused together and computations are performed where the data is stored, with minimal data movement. As such, computation parallelism and power efficiency can be significantly improved. We believe innovations in PIM architecture are the ticket to next-generation AI.
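The "memory wall" can be made concrete with a roofline-style estimate: given a machine's peak compute rate and memory bandwidth (the figures below are hypothetical, chosen only to show the arithmetic), a kernel's FLOPs-per-byte ratio determines whether compute or data movement limits it:

```python
# Hypothetical machine figures for illustration only.
PEAK_FLOPS = 100e12   # 100 TFLOP/s peak compute (assumed)
MEM_BW     = 1e12     # 1 TB/s memory bandwidth (assumed)

def bound(flops, bytes_moved):
    """Classify a kernel as compute- or memory-bound (roofline model).

    Returns the limiting resource and the attainable FLOP/s.
    """
    intensity = flops / bytes_moved           # FLOPs per byte
    attainable = min(PEAK_FLOPS, intensity * MEM_BW)
    limit = "compute" if attainable == PEAK_FLOPS else "memory"
    return limit, attainable

# A fp32 vector add: 1 FLOP per 12 bytes (two reads, one write).
print(bound(1, 12))
```

At roughly 1 FLOP per 12 bytes, the vector add sits far below the machine's balance point, so it only ever reaches a small fraction of peak compute – exactly the regime where PIM's "compute where the data lives" approach pays off.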
3. Industrial IoT powers digital transformations
In 2020, 5G, the rapid development of IoT devices, cloud computing and edge computing will accelerate the fusion of information systems, communication systems and industrial control systems. Through advanced Industrial IoT, manufacturing companies can automate machines, in-factory logistics and production scheduling, as a way to realize C2B smart manufacturing. In addition, interconnected industrial systems can adjust and coordinate the production capability of both upstream and downstream vendors. Ultimately this will significantly increase manufacturers’ productivity and profitability. For manufacturers whose production goods are valued at hundreds of trillions of RMB, a 5-10% increase in productivity would mean additional trillions of RMB.
4. Large-scale collaboration between machines becomes possible
Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale intelligent devices. The development of collaborative IoT sensing technology and 5G communications will enable collaboration among multiple agents: machines will cooperate and compete with each other to complete target tasks. The group intelligence that emerges from the cooperation of multiple intelligent agents will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment; warehouse robots will work together to sort cargo more efficiently; driverless cars will perceive overall traffic conditions on the road; and unmanned aerial vehicle (UAV) swarms will complete last-mile delivery more efficiently.
5. Modular design makes chips easier and faster to produce by stacking chiplets together
Traditional models of chip design cannot efficiently respond to the fast evolving, fragmented and customized needs of chip production. The open source SoC chip design based on RISC-V, high-level hardware description language, and IP-based modular chip design methods have accelerated the rapid development of agile design methods and the ecosystem of open source chips. In addition, the modular design method based on chiplets uses advanced packaging methods to package the chiplets with different functions together, which can quickly customize and deliver chips that meet specific requirements of different applications.
6. Large-scale production-grade blockchain applications will gain mass adoption
BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedded with core blockchain algorithms, designed for edge and cloud use, will also emerge, allowing assets in the physical world to be mapped to assets on the blockchain, further expanding the boundaries of the Internet of Value and realizing "multi-chain interconnection". In the future, a large number of innovative blockchain application scenarios with multi-dimensional collaboration across different industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.
7. A critical period before large-scale quantum computing
In 2019, the race to reach “Quantum Supremacy” brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosts overall confidence in superconducting quantum computing for the realization of a large-scale quantum computer. In 2020, the field of quantum computing will receive increasing investment, which comes with enhanced competition. The field is also expected to experience a speed-up in industrialization and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realization of fault-tolerant quantum computing and the demonstration of quantum advantage on real-world problems. Either is a great challenge given present knowledge. Quantum computing is entering a critical period.
8. New materials will revolutionize the semiconductor devices
Under the pressure of both Moore's Law and the explosive demand for computing power and storage, it is difficult for classic Si-based transistors to sustain the development of the semiconductor industry. To date, major semiconductor manufacturers have no clear answer or option for chips beyond 3nm. New materials will enable new logic, storage and interconnection devices through new physical mechanisms, driving continuous innovation in the semiconductor industry. For example, topological insulators and two-dimensional superconducting materials that can achieve lossless transport of electrons and spin could become the basis for new high-performance logic and interconnect devices, while new magnetic materials and new resistive-switching materials can realize high-performance magnetic memory such as SOT-MRAM, and resistive memory.
9. Growing adoption of AI technologies that protect data privacy
The compliance costs demanded by recent data protection laws and regulations related to data transfer are higher than ever before. In light of this, there has been growing interest in using AI technologies to protect data privacy. The essence is to enable the data user to compute a function over input data from different data providers while keeping that data private. Such AI technologies promise to solve the problems of data silos and lack of trust in today's data sharing practices, and will truly unleash the value of data in the foreseeable future.
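One family of techniques behind this is secure multi-party computation. Additive secret sharing, sketched below, lets an aggregator compute a sum over inputs from several providers without ever seeing any individual input – a minimal illustration, not a production protocol:

```python
import random

PRIME = 2**61 - 1  # field modulus; all arithmetic is mod PRIME

def share(value, n):
    """Split a value into n additive shares that sum to it mod PRIME.

    Any n-1 shares look uniformly random and reveal nothing.
    """
    shares = [random.randrange(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three data providers each split their private input into 3 shares...
inputs = [42, 7, 100]
all_shares = [share(v, 3) for v in inputs]

# ...and the aggregator sums the shares, recovering only the total
# (149), never any individual provider's input.
total = sum(sum(col) for col in zip(*all_shares)) % PRIME
print(total)  # 149
```

Federated learning and homomorphic encryption attack the same problem from different angles; all share the goal stated above of computing over data from multiple providers while keeping the raw data private.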
10. Cloud becomes the center of IT technology innovation
With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure, and gradually evolved into the center of all IT technology innovations. Cloud has close relationship with almost all IT technologies, including new chips, new databases, self-driving adaptive networks, big data, AI, IoT, blockchain, quantum computing and so forth. Meanwhile, it creates new technologies, such as serverless computing, cloud-native software architecture, software-hardware integrated design, as well as intelligent automated operation. Cloud computing is redefining every aspect of IT, making new IT technologies more accessible for the public. Cloud has become the backbone of the entire digital economy.
From Henrik Nilsson, Vice President EMEA at Apptio:
AI and IoT adoption will need to be matched with better alignment of IT with the business
The adoption of AI and IoT is on the rise, as many organisations increasingly realise the business benefits of the technologies. However, as these technologies move into the mainstream, companies will struggle to quantify their value over time as their IT finance systems aren’t set up for this. In 2020, the ability to calculate long-term ROI from technologies whose costs aren’t fixed will be increasingly important for CIOs and CTOs looking to justify their technology investments.
IoT, for instance, will widen the cost base for the IT team as smart devices proliferate. While this may reduce other costs, such as labour, over time smart devices are likely to become another layer of legacy technology. This makes it hard to assess the total cost of ownership without having dedicated tools for doing so. Similarly, AI will produce fast results in the short term, cutting down laborious manual processes, but its value is harder to quantify over time. For both of these technologies, as with other emerging innovations, CIOs and CTOs will need to have a defensible strategy for proving their value in order to align with the needs of the business while balancing their budgets.
Roles within IT are going to change and demand broader skillsets
The past 10 years have seen many new roles crop up in IT (CDO, Head of Cloud Excellence, TBM Manager, etc.) that weren’t conceived of the decade before. The pace of change in technology means this is only set to continue, and as a result there will need to be a concurrent improvement in both hard skills (such as AI development skills) and soft skills (to manage IT’s relationship with the rest of the business).
While individual business units will be the ones driving forward with new technologies in order to pursue innovation, the CIO and the expanding IT team will now be the ones to ensure those demands align with overall business needs. This shift in scope of work – and the greater level of responsibility it entails – means that IT professionals need to upskill in order to meet the new demands of the role.
Cloud providers will specialise their offerings instead of price-warring – and businesses need to get wise to the cost implications
As seen in other software industries, overly aggressive price wars would likely upset the cloud market. As a result, AWS, Azure and GCP will all continue to enhance their specialities in 2020 (for instance focusing on scale, or a specific sector, or AI capabilities) to provide differentiation.
This will have a knock-on effect on costs. An apples-to-apples comparison of pricing is already difficult, but moving forward businesses will have to do a much better job of tying value to cloud spend to make the right decisions for their business needs. Cloud services constantly scale to meet demand, which increases cost – but how do you compare providers or services with one another? How do you know exactly what value you get for your money?
To combat this uncertainty, in 2020 companies will need to establish a cloud centre of excellence and a “FinOps” mindset, whereby all areas of the business have greater understanding of, and accountability for, cloud spend. Currently, most IT functions don’t have the right set-up to manage cloud spend, with disparate instances of “Google here” and “AWS there.” This needs to change as organisations mature their cloud strategy and should start with the development of a central company IT strategy.
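A core FinOps practice behind this accountability is "showback": allocating the cloud bill to the business units that incurred it, usually via resource tags. The sketch below illustrates the idea only; the line items, tag names and costs are invented for the example and do not reflect any provider's billing format.

```python
# Minimal "showback" sketch: allocate a cloud bill to business units using
# resource tags. Line items and tag names are hypothetical examples.

from collections import defaultdict

def showback(line_items):
    """Sum cost per owning team; untagged spend is surfaced, not hidden."""
    totals = defaultdict(float)
    for item in line_items:
        team = item.get("tags", {}).get("team", "UNTAGGED")
        totals[team] += item["cost"]
    return dict(totals)

bill = [
    {"service": "compute", "cost": 1200.0, "tags": {"team": "data-science"}},
    {"service": "storage", "cost": 300.0,  "tags": {"team": "web"}},
    {"service": "compute", "cost": 450.0},  # missing tag -> needs follow-up
]
print(showback(bill))
```

Surfacing the "UNTAGGED" bucket explicitly is the point: spend nobody owns is exactly the spend a cloud centre of excellence exists to chase down.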
IT finance management will need to change for agile to work
Agile is becoming an increasingly popular way for forward-thinking IT teams to work, but IT finance systems aren't set up to properly assess costs and business value. As the way companies work moves from a waterfall methodology to an agile one, it's not just the IT finance team that needs to change to keep up; it's also the management of the organisation in general.
Product development has shifted, with multiple iterative trials now taking place for each incremental improvement. It's imperative that the rest of the business also change to align with this shift, particularly the CFO. The CFO needs to ensure that the organisation's capital is spent wisely and in the right areas to support both the short- and long-term goals of the organisation.
The office of the CFO needs to introduce governance and controls that don't hinder or slow down the benefits of agile, but instead provide a helicopter perspective across all of the organisation's investments. This will ensure that all of the team's efforts support the business' short- and long-term plans as agile takes centre stage and becomes the accepted way of working.
Arcserve issues Top Three Data Protection Predictions for 2020
● The ransomware epidemic drives businesses to adopt a proactive approach to cybersecurity
● Cloud for BCDR matures and along with it, easier methods of moving data
● Businesses adopt unique BCDR tactics to combat climate change
Over the last year, the data protection space faced a swath of challenges, compelling vendors to organically and inorganically evolve to meet rapidly transforming business needs. Ransomware attacks soared, with all sectors experiencing a 118% increase in attacks, data protection as a service (DPaaS) deployment rose to dominate many business continuity and disaster recovery (BCDR) strategies, and the impacts from climate change on business operations became clear as states like California intentionally turned off power lines to prevent wildfires.
The year ahead will test many businesses, IT organizations, and data protection vendors as threats to business data become increasingly pervasive. Arcserve, LLC, the world’s most experienced data protection provider, shares its top BCDR predictions for 2020.
Ransomware reaches epidemic proportions
Over the next year, we should expect cyberattacks to escalate, with cybercriminals taking a more tailored approach to disseminate malware, and in many cases, targeting the data backups themselves.
Companies across all industries need to understand that ransomware is a "when" not "if" scenario, and better prepare for this continued onslaught of cybercrime. Instead of relying solely on security solutions, IT leaders must take a two-pronged approach to ransomware mitigation to avoid having to choose between data loss and paying a ransom (and, in many cases, suffering both). This means not only making investments in more advanced threat detection and remediation software, but also ensuring that data backup and disaster recovery protocols have entered the modern era.
In 2020, more businesses will seek out vendors who offer an integrated approach to cybersecurity and disaster recovery with solutions that combine the two. In doing so, IT leaders will move away from segmenting threat prevention and data protection to assure mitigation from cyberattacks, no matter the level of sophistication or target. Further, IT teams will invest more time into making backup plans known to business leaders by more clearly documenting who is responsible for what if their organization were to fall victim to an attack.
Cloud strategies reach full maturity
Migrations to the cloud will only continue to increase. But, now IT professionals will weigh whether to deploy hybrid and multi-cloud strategies, and look for new ways to overcome the obstacles and complexities associated with each – especially as SaaS-based solutions become more prevalent. Many companies are struggling to determine which data and applications should be in the cloud versus on-premises, learn the nuances between varying hyperscaler subscriptions, features and functionalities, and ensure their IT teams have the proper training and skills to manage these environments.
In 2020, we should expect organizations to dedicate more resources to deploy these infrastructures successfully, and standardize a security model that works across different vendors to reduce gaps, avoid misconfigurations, and ensure critical data, workloads and applications remain resilient. And, to match the pace at which organizations are moving to the cloud, we will likely see an emergence of offerings that aim to make it easier for companies to migrate their critical data. These solutions will deliver capabilities to ensure there is no impact to production systems while the migration occurs so organizations don’t need to suffer any unnecessary downtime.
Businesses proactively prepare for epic storms
As weather events become more severe, businesses need to adjust their disaster recovery plans to better anticipate events that could halt their operations and IT services. In the upcoming year, disaster recovery and business continuity specialists will employ DR techniques to mitigate climate-driven disruptions, such as California's planned power shut-offs, and so prevent extended outages, data loss and financial damages. And they will begin documenting these prevention tactics, particularly in areas where severe weather is more likely, to enable business leaders to prepare for these scenarios ahead of time.
“While it’s clear there are several internal and external factors that could impact business continuity, the fact remains that enterprises are working toward making necessary investments to keep their priceless corporate data safe,” said Oussama El-Hilali, CTO at Arcserve. “Now, the onus is on IT teams and business leaders to make sure they’re staying on top of and actively combatting new threats that can cause extended data loss or downtime. Those that clearly define their policies and procedures, and make educated investments into data protection will remain resilient.”
According to Giorgio Girelli, General Manager of Aruba Enterprise
It’s also possible that the new year will bring with it an even higher attack cadence and intensity. Cybersecurity Ventures predicts that there will be an attack on a business every 14 seconds by the end of this year, a trend that will continue into 2020.
IT leaders and business executives of all kinds, across all sectors, should turn their attention to tackling the threat posed by cyber-crime. In 2020, firms must look to migrate data to highly redundant cloud infrastructure, to both safeguard that data and mitigate the potential damage caused by an attack.
Secure your data in the cloud
As the cloud market continues to mature, cloud solutions can offer security standards that match or surpass on-premises systems. As a result, in a significant shift from the position of scepticism held by many IT leaders, recent research from McAfee found that 69 per cent of organisations now trust that their data is secure in the cloud.
With a network of enterprise-level data centres, cloud providers can guarantee the maximum redundancy of systems and maximum levels of business continuity. Further, teams of highly qualified engineers and solution architects can build a bespoke solution, designed from scratch with specific data storage and security needs in mind.
The architecture of data centres used by cloud providers also demonstrates an unerring commitment to security. Heavily guarded cloud servers are housed in warehouses offsite, away from most employees. The data held on the servers is also encrypted, which makes hacking or tampering extremely difficult for cyber-criminals.
Whereas a malware infection on a personal computer could result in the exposure of a significant amount of personal data, this is not the case with cloud infrastructure, whose resilience is orders of magnitude greater.
Putting the right disaster recovery process in place
The cloud won't just secure your data; it will also enable businesses to simplify parts of their IT processes and ensure continuity of service. Baglioni Hotels, for example, an Italian company that owns nine luxury hotels, moved to the cloud in order to optimise the management of the information systems in all its hotels and the commercial activities connected to them.
With businesses of all sizes increasingly threatened by sophisticated cyber-criminals, disaster recovery (DR) has taken on a new significance. What’s more, IT disasters aren’t just limited to cyber-attacks. Hardware failures, human error, power outages and natural disasters such as hurricanes or earthquakes also have the potential to disrupt business processes and put valuable data at risk.
According to a recent study from Spiceworks, 95 per cent of organisations have a disaster recovery plan in place, and 90 per cent included data integrity and backups in their DR plans. However, only 28 per cent of businesses included cloud or hosted services in their DR plans, which is surprising given how many firms now rely on cloud-based platforms. It’s clear that this planning gap must be rectified.
Further, end-users expect data to be continually available, which means businesses simply cannot afford to fall victim to downtime. For this reason, it’s important to invest in a cloud-based solution that allows businesses to respond swiftly should an unavoidable incident occur. Without the right DR processes and technology in place, a business risks the wrath of unhappy customers.
Stay in business with the cloud
Though no data architecture is totally immune to attack, cloud infrastructure has proven itself to be a secure option available to businesses today. Migrating data to the cloud will ensure a business enjoys the highest cyber-security standards, provided by a vendor whose full-time concern is the health and security of the data it holds. The cloud also represents the most effective protection against downtime and data loss, both of which can have a direct impact on a company’s reputation and, most importantly, its bottom line.
In 2020, businesses in all industries have to come to terms with the scale of the security threats facing them. With so much riding on the ability to protect corporate data, it's time they turn to the only solution capable of standing up to those threats: the cloud.
Until now, robotic process automation (RPA) and artificial intelligence (AI) have been perceived as two separate things: RPA being task oriented, without intelligence built in. However, as we move into 2020, AI and machine learning (ML) will become an intrinsic part of RPA – infused throughout analytics, process mining and discovery. AI will offer various functions like natural language processing (NLP) and language skills, and RPA platforms will need to be ready to accept those AI skill sets. More broadly, there will be greater adoption of RPA across industries to increase productivity and lower operating costs. Today we have over 1.7 million bots in operation with customers around the world and this number is growing rapidly. Consequently, training in all business functions will need to evolve, so that employees know how to use automation processes and understand how to leverage RPA, to focus on the more creative aspects of their job.
RPA is set to see adoption in all industries very quickly, across all job roles, from developers and business analysts, to programme and project managers, and across all verticals, including IT, BPO, HR, Education, Insurance and Banking. To facilitate continuous learning, companies must give employees the time and resources needed to upskill as job roles evolve, through methods such as micro-learning and just-in-time training. In the UK, companies report that highly skilled AI professionals are currently hard to find and expensive to hire, driving up the cost of adoption and slowing technological advancement. Organisations that make a conscious decision to use automation in a way that enhances employees' skills and complements their working style will significantly increase the performance benefit they see from augmentation.
Schneider Electric collaborates with industry peers in World Economic Forum Workgroup
By Steve Carlini.
I must be honest and confess that my knowledge of the World Economic Forum (WEF) was limited to a once-a-year meeting in Davos-Klosters where the world's top leaders collaborate to shape global, regional, and industry agendas. The Forum's mission, as I understood it, was to improve the state of the world by bringing together leaders across industries and society – a pretty lofty goal!
What I didn’t know is that the WEF has numerous platforms, councils, communities, and workgroups that are all dedicated to the same goal. I was pleased to find out 5G was high on the Forum’s radar when they contacted me a few months ago and said they were forming a 5G workgroup comprised of stakeholders across the next gen telco ecosystem. I was more pleased when they asked me to join this group, the World Economic Forum’s 5G-Next Generation Networks Programme, and I happily accepted.
It was an exciting invitation for me because I'd spent the prior six months researching and collaborating on the data center architecture needed for 5G. I wanted to share what I'd learned with a broader group and collaborate further with the members of this programme. I was also interested in participating in the cross-industry workshops planned for London, Singapore, and Silicon Valley.
Sharpening Our 5G Focus
The first workshop was held in London and focused on three areas:
The first output from the 5G-Next Generation Networks Programme is a “Repository of 5G Use Cases,” that features 5G case studies from Korea Telecom, Ericsson, Nokia, Ford, Lyft, and others. Schneider Electric’s use case is featured on pages 31 and 32 with the topic of Metro/Regional and Local Mobile Edge Cloud (MEC) buildout. Highlights include:
Hope for Finding the “Killer App” for 5G
In addition to the “Repository of 5G Use Cases,” a white paper called “The Impact of 5G: Creating New Value across Industries and Society” is also available. Schneider Electric collaborated on the content and provided a manufacturing-based case study:
Schneider Electric plans to leverage 5G to simplify factory IT operations, improve support to manufacturing, and accelerate factory digitization. 5G demonstrations at Schneider Electric's Le Vaudreuil factory leverage low latency, high throughput and secure indoor coverage to validate a range of use cases across various aspects. These include:
As the group moves forward, programme members will continue to collaborate and we are confident we will identify the “killer app” that will drive deployment and adoption of 5G.
At Schneider Electric, we are also collaborating on establishing cooperative models for infrastructure investment. Topics under discussion include policy impact and stakeholder mapping (smart cities), business/revenue models, infrastructure investment ecosystem, etc.
Subsequent to the Forum’s next workshops, look for new 5G content to be published. It’s going to be an exciting journey!
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 3.
By Shannon Yost, Director of Strategy, Base2 Solutions
Businesses are increasingly betting on digitization to win in the marketplace. In 2017, a Gartner survey found that 67% of business leaders believed they would no longer be competitive if they weren’t significantly more digital by 2020. That’s likely fueling the boom in digital transformation spending, which IDC predicts will grow from 36% to 50% of all worldwide ICT (information and communications technology) investments by 2023, for a whopping $7.4 trillion combined over the next three years. But where is all that money going and what will it mean for innovation in the near term? Here are six predictions for 2020.
Machine learning and AI won’t replace humans. Yet.
The coming year will see increasing adoption of machine learning models. But while classification accuracy will continue to improve and frequently surpass human capabilities, people won’t become obsolete. That said, we’ll see more and deeper intelligence in embedded devices, web analytics and everything in between, requiring greater parallelism and larger memory footprints in embedded hardware. AutoML services that generate models from data rather than hand-tooled code will increase the pace of adoption while putting a greater emphasis on dataset quality and bias. Everything from sales forecasts to disease diagnoses will be enhanced by AI-driven analytics backed by ML models.
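The AutoML idea mentioned above — generating and selecting models from data rather than hand-tooling them — can be illustrated with a deliberately tiny sketch: enumerate candidate models automatically and keep whichever scores best. The dataset and the "models" below are synthetic stand-ins invented for the example, not any real AutoML service.

```python
# Toy illustration of the AutoML idea: search candidate models automatically
# and keep the one with the best accuracy. Data and models are synthetic.

def accuracy(model, data):
    """Fraction of examples the model labels correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

def auto_select(candidates, data):
    """Return the (name, model) pair with the highest accuracy on the data."""
    return max(candidates.items(), key=lambda kv: accuracy(kv[1], data))

# Synthetic 1-D dataset: the true label is 1 when the feature exceeds 0.5.
data = [(x / 10, int(x / 10 > 0.5)) for x in range(11)]

# Candidate "models": simple threshold rules standing in for real learners.
candidates = {f"threshold>{t / 10}": (lambda t: lambda x: int(x > t / 10))(t)
              for t in range(1, 10)}

best_name, best_model = auto_select(candidates, data)
print(best_name)  # the search recovers the rule that generated the labels
```

Real AutoML searches over architectures and hyperparameters rather than thresholds, but the loop is the same, which is why dataset quality and bias matter so much: the search optimises whatever the data rewards.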
5G gets real-ish
Given the sheer volume of hype that surrounds 5G, there’s a fair amount of skepticism that it can live up to all of its lofty promises. But it’s likely to have a real impact on digital transformation and digital innovation, whether that’s next year or sometime a little further down the road. As carriers expand their currently sparse 5G coverage in 2020, we’ll start to see some of the benefits–like nearly non-existent lag and the ability to connect more devices at higher speeds–begin to drive the convergence of AI and IoT, enabling a new wave of platform innovation. 5G also is expected to supercharge the adoption of XR (both augmented reality and virtual reality). Like previous Gs, however, the transition to 5G won’t happen overnight, with 2020 being the first of a three-year ramp to widespread availability.
First waves of the drone invasion
2020 probably won’t see a mass drone invasion but we will begin to see expanded usage beyond some of the early pilots (pun intended) by companies like Amazon and Google for package delivery. With the FAA working with industry to streamline the waiver process for commercial drone applications, use cases such as inspection (construction, rail lines, utilities, oil and gas), agriculture and timber management, and public safety will accelerate. As the skies get more crowded, safety and air space management will be increasing concerns. AI and ML will play a key role in enabling drones to detect and avoid other objects (both flying and stationary), while 5G communication will allow for faster location/collision response and geofencing. In addition, companies will need to focus on cybersecurity for their drone fleets to safeguard the sensitive data they collect and transmit.
The Rise of Model-based Systems Engineering
Interest in and training for model-based systems engineering (MBSE) picked up significantly in 2019. Expect to see that trend accelerate in 2020. MBSE replaces the documents used in traditional Systems Engineering with models to improve outcomes for development projects, especially projects that are complex and difficult. By some estimates, MBSE can reduce total development costs by as much as 55%.
Medical devices get smaller and more portable
One market that is ripe for digital innovation in 2020 is medical devices. There is a growing demand for devices that are more portable (especially wearables), connected, and allow for better patient data access. A good example is the Butterfly iQ, a hand-held ultrasound device that shows live or stored images (interpreted with a neural network) on a standard smartphone. It costs just a fraction of standard ultrasound devices, making advanced imaging more accessible in the developing world. Expect to see other types of medical devices taking advantage of advances in silicon, interface design, user experience and integrating with existing technology. But devices that rely on connectivity and cloud-based services also will need to be designed with patient privacy and the security of data in mind.
Automotive and IoT cybersecurity
Cars have become software platforms and are rapidly moving toward autonomous operation. Meanwhile, the number of IoT devices for consumer and commercial applications is exploding. Even aircraft avionic systems are now connected. But all of these devices are attractive targets for hackers and other bad actors, especially as more data and computation is handled at the edge and hybrid cloud. 2020 will see an increasing investment in cybersecurity for connected devices that exist outside the corporate firewall.
From a digital innovation standpoint, 2020 promises to be another exciting year. Stay tuned.
Jamie Jefferies, VP and GM of EMEA at Ciena, comments:
“In 2020, the matchday experience for football fans in EMEA will get even more connected. We’ve already seen technology-driven enhancements like e-ticketing, open WiFi and mobile payments, but with the introduction of 5G, this is going to get a lot smarter. What this looks like exactly is only limited by creativity. We may see the resurgence of a 90s favourite in Player Cam using wearable technology to see the game from the perspective of your favourite player or even the referee. We often see club owners bow to fan pressure when hiring and firing managers - so could this go even further to a point where fans will make substitutions and tactical decisions from their smartphones in the stadium?
Whatever happens, recent trends point towards even more network heavy features, all powered by fibre and 5G. I also predict that these 5G-driven experiences will take football closer to the “day-out” culture seen with American sports as entertainment, engagement, and activities become more important to fans, and owners. This approach will lead to new revenue streams for football clubs and network service providers.”
“Analysts have been predicting the growth of the fourth industrial revolution for a number of years, and there are several different opinions as to what this looks like and if we are there. But one trend that is undeniable is this move towards a more connected and integrated world – which is something that will drive IT spending in 2020. As industrial systems get more connected, important aspects of manufacturing and industries like waste reduction, efficiency and resource management will get quicker, smarter and more automated.
But, this drive towards connected devices will put a lot of strain on existing hardware. If we are to truly gain the benefits of this golden digital age, we need to see an increase in companies deploying adaptive networks that can handle the increase in capacity and bandwidth demands. Simply put, connectivity in 2020 and beyond will be fundamental to the innovation and success of organisations.”
“Each year retailers prepare for the crowds and spikes in sales they hope to see during peak season shopping. And while Black Friday and Cyber Monday immediately jump out as key peak days, Boxing Day and the New Year sales also pose a great opportunity for retailers to capitalise on the shopping momentum.
However the way we shop as consumers is significantly changing and the long queues and crowds have been swapped for savvy buyers who are checking out prices online and looking for the best deal. That’s a whole lot of shopping, and a whole lot of bandwidth needed to handle these transactions.
During peak shopping tides, it’s fundamentally important that retailers can support the surge in online traffic and meet customer demand – otherwise they risk losing out on valuable key sales. Network connectivity plays a critical role in supporting this, and now more than ever networks need to be able to adapt dynamically and adjust ‘on the fly’ to handle spikes in traffic. As we continue to move and evolve towards a more online and digital centric approach it is essential that businesses can sustain the unprecedented demand on these holidays that try to ‘break the internet’.”
Explains Deepark Ramchandani Vensi, Account Principal at Contino
Challenge: Which cloud provider do I choose?
“Traditionally, organisations do not have the enterprise support structures to onboard providers like Amazon and Google (not the case with Microsoft, who usually have well established support relationships). Instead, they decide to either progress with an RFP or they choose the supplier that’s easier to onboard. Both of these approaches fail to cater for the impact on engineering and the developer experience.
“So, what approach should you take? Perhaps you already have a small team of engineers who are trained in a certain cloud provider. Perhaps you have a product that needs developing that could best use the services provided by a certain cloud provider. Or perhaps the regions in which your business operates and your customers are based are best aligned to the regional availability of a certain cloud provider.
“In all cases, choosing one initial provider to prove out your organisational maturity for adoption at scale is critical. Which brings us onto our next challenge: trying to take on too much cloud at once!”
Challenge: Cloud brokers - multi-cloud managers
“It seems strange to have this on the list of challenges and blockers to wide-scale cloud adoption as we approach 2020, but it still manages to lurk around. The next step organisations often see as vital (once they’ve forced their engineering teams into a rigid multi-cloud framework) is to look at multi-cloud management brokers. These provide yet another abstraction framework and an inefficient API set to target in order to provide a ‘service catalogue’.
“The history of cloud brokers has shown that this typically ends in either an expensive bill from the broker or a convoluted engineering mesh that hinders scalability and often leads to frustration within the engineering teams.
“Providing engineering teams with a loosely coupled framework that lets them explore and consume the best that cloud providers have to offer has proven to be the only approach that scales. This can then be complemented with certain domain-specific tools that enable a more effective governance model, without hindering engineering creativity.”
Challenge: Lift and shift techniques from on-premises
“The illusory truth effect tells us that if you say something enough times, even if it’s false, people will believe you. This seems to have been the case when it comes to cloud security.
“As organisations have woken up to the importance of security when consuming cloud at scale, security teams have continued to tell us that the on-premises approach to cloud security is the safest approach. In reality, applying this data centre thinking to the cloud does nothing to improve an organisation’s security posture.
“Instead, it leaves behind a cloud environment that isn’t suitable, or flexible enough, for engineering teams to consume due to the restrictive perimeter-based policies that are in place. Additionally, these traditional security approaches bring with them solutions that aren’t designed to use the native services that cloud service providers have to offer. Defaulting to an IaaS based deployment approach results in a bill at the end of the month that negates the business case for cloud.
“Policies to tackle cloud security include policy-as-code: Having your environment defined as code has a plethora of advantages – one of them being the ability to define an organisation’s guardrails into a policy engine (as code) and then subsequently enforcing said policies across the estate and proactively preventing any possible violation.
“Another policy is identity-based and least-privilege security. With the increased number of services and devices that need managing, simply relying on your security perimeter isn’t enough (this is often a practice that is heavily relied upon with on-prem). Identity and least-privilege based approaches force users and services to rely on techniques such as MFA and granular role-based access controls, and have a route to live that is consistent and well managed. Additionally, modern cloud-based identity providers are capable of learning and adapting to user behaviours, providing risk-based scores on the access being granted.”
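The policy-as-code approach described above can be sketched in a few lines: guardrails are expressed as code and evaluated against resource definitions before anything is deployed. This is an illustrative toy rather than any particular policy engine, and the resource fields and rules are invented for the example.

```python
# Minimal policy-as-code sketch: guardrails expressed as code and evaluated
# against resource definitions before deployment. Fields are hypothetical.

POLICIES = [
    ("storage must not be publicly readable",
     lambda r: not (r["type"] == "bucket" and r.get("public", False))),
    ("compute must carry an owner tag",
     lambda r: r["type"] != "vm" or "owner" in r.get("tags", {})),
]

def evaluate(resource):
    """Return the list of policy names this resource violates."""
    return [name for name, check in POLICIES if not check(resource)]

bad_bucket = {"type": "bucket", "public": True}
good_vm = {"type": "vm", "tags": {"owner": "platform-team"}}
print(evaluate(bad_bucket))  # one violation
print(evaluate(good_vm))     # no violations
```

In practice such checks run in the deployment pipeline, so a violation blocks the change proactively rather than being discovered in an audit after the fact.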
Technology is evolving at such a pace that many companies and individuals are finding it hard to keep up with developments and risk being left behind. Understanding and embracing Artificial Intelligence (AI) is a prime example of where this challenge is most keenly felt.
Regarded as the most important current driver of innovation, it has attracted significant amounts of venture funding, which indicates that this dynamic technology is certainly making its mark. Companies tied to AI accessed record funding of $19.5 billion in 2018 (compared to $13bn in 2017) and it looks like it's on track to surpass this record in 2019, with AI companies raising more money across fewer rounds despite wider economic uncertainty and volatile equity markets.
It has the power to transform businesses but, as DataRobot’s Evangelist, I’ve been able to monitor developments and track trends by guiding and supporting real world projects deployed by our customers. It’s for this reason that we believe companies would be wise to consider and embrace the following five ‘stand out’ trends for AI in 2020 so they can keep one step ahead of the game.
AI Platforms for the Enterprise will complete their journey across the "chasm" of technology adoption in 2020. The resulting mass-market adoption of AI will have a profound effect on businesses, with clear winners and losers. A broad range of niche tools will emerge within the AI marketplace, along with a confusing flurry of overlapping marketing initiatives by competing suppliers. However, businesses will coalesce around the most trusted platforms. Platforms that accelerate, democratise and automate AI projects will lead the process of change. The biggest impact of market maturity will be AI’s application across whole industries, enabling much more efficient and effective processes, whilst bringing huge gains for consumers. New applications will be developed in domains previously thought to be uniquely human. Those who scale AI across their businesses will achieve the greatest returns; their competitors – especially those who lag in adopting AI – will suffer.
Hyperautomation will take off in 2020. This involves blending the rules-based automation of RPA with Artificial Intelligence. It was recently named the top trend by Gartner. Many companies have implemented Robotic Process Automation (RPA) to enhance their operations. However, only between 8-13 percent of businesses have “scaled” RPA, and their focus primarily has been on automating simple tasks. This leaves many looking for greater benefits, with promising results shown by successful projects with more intelligent automation. Adding AI to RPA permits the automation of more complex work and end-to-end processes: examples include areas like completing background checks in the financial sector or managing the distribution of fresh produce to different stores within retail chains.
AI success will also be a key business theme in 2020, as enterprises focus on achieving business value rather than “moonshots” and experiments. This is because too few AI models make it into production, despite huge investments. Businesses will need to refine their AI approaches, focusing more on potential ROI, effective deployment and implementing successful business change. In 2020, we expect that executives will prioritise opportunities more aggressively, linking AI projects to KPIs - revenue growth, cost reduction and enhanced customer experience. More AI models should be deployed, with greater emphasis on continuous monitoring and retraining to avoid performance drops. Executives also will place greater emphasis on change management and encourage greater involvement from business users (rather than just data scientists or specialist teams), to increase their AI capabilities across more lines of business and processes.
AI will force companies to clarify their values and revisit ethics. A number of flawed AI projects have rightly attracted criticism, including accusations that a high-profile new credit card is sexist, and the exposure of a racist algorithm in healthcare. Hiring decisions, electoral manipulation and facial recognition usage are also common flashpoints. Companies know they need to embrace AI, but for this to be successful, AI needs to be trustworthy. Focusing on ethics and social responsibility is good for business. Businesses will increasingly reject black-boxes and opaque point solutions, in favour of AI with decisions they can justify and explain. Leaders will define their AI ethical guidelines, aligned to their brand and organisational values (which vary considerably in our pluralistic society). Like any technology, AI applications will reflect the underlying business culture and principles of their users, and when used appropriately, will serve to reinforce and accelerate positive outcomes for society.
The emergence of super-fast 5G data networks combined with the increased ubiquity of devices (the Internet of Things), and the use of AI across industries will make AI omnipresent in 2020. Faster and more-affordable connectivity will enable the growth of even larger data sets to train more sophisticated AI applications. Real-time data feeds from devices will be continuously monitored by AIs to make proactive recommendations. For example, AI reviewing data from your watch might predict a heart attack before it happens, scheduling proactive interventions and care. Intelligent Virtual Assistants like Alexa will continue their relentless march. This will all intensify the feeling that AI is everywhere.
Kurt Muehmel, Chief Customer Officer at Dataiku, offers the following observations:
“As corporations seek to differentiate themselves in an increasingly crowded marketplace with increasingly commoditised products, tailoring the customer experience at the individual, moment-to-moment level is an attractive opportunity. The wealth of data that our digital lives provide supplies all of the raw material needed for such customisation. We could see the same online store presenting itself in different ways to different consumers, and customising the products themselves to what you want before you know that you want it. Of course, this raises significant data privacy concerns. We will also see increased concern to ensure that consumers provide consent for their data to be used in this fashion. After all, some consumers may prefer the fully-customised experience, whereas others would prefer greater privacy. Corporations seeking to get the most out of AI will need to define for themselves where that line is and how they can ensure that they remain competitive while respecting their customers' privacy preferences. Failure to find this balance could lead to market irrelevance in one case or consumer backlash in the other.
“As businesses of all types seek to apply AI across their processes and embed AI in the services that they provide to their customers, they are becoming increasingly aware of the ethical, reputational, and business risks that they run if they do not take sufficient responsibility for their work. So while many efforts will continue to be focused on further developing core AI technologies, 2020 will see both broader participation in AI and a greater focus on education both about and using AI. Furthermore, as AI becomes more prevalent in our lives, we should expect that topics related to AI will be taught in non-technical courses, such as Philosophy and Political Science, while Computer Science courses will also cover the ethics and justice of AI. These trends are positive, serving both to create more responsible AI and to build public trust in this powerful new technology which will come to define our era.”
It’s proving increasingly difficult to find an area of our daily lives that technology hasn’t enhanced in some way, shape or form. Intelligent, innovative solutions are making us all more efficient as we go about our day-to-day lives.
By Megan Johnstone, The Inn Collection.
Everything from dining out to staying in hotels has been brought in line with the digital age, and augmented reality is one of the top emerging tools being used within the hospitality sector. From speeding up check-in desks to providing guests with control over the ambience of their rooms, we’re going to take a look at some of the top ways that this technology is being implemented in the industry.
Adding new dimensions with AR
Augmented reality, known as AR, uses technology to alter a person’s perception of their surroundings by overlaying interactive digital elements onto the real world. It can be applied to a range of purposes, from entertainment to training facilities, and it can be accessed through devices such as smartphones, headsets, or tablets. With AR, the existing surroundings are still there, but the digital elements add a layer over reality — hence the name! Through AR, environments can become more interactive, providing the user with a new perspective. So, how can this be adapted for the hospitality sector?
Interact with your room
AR has become commonplace in hotel rooms, serving guests by enhancing their stay and giving them an element of personalisation. Nowadays, customers are attracted to the concept of an experience, and the idea of any kind of unique element contributes to this. In this way, AR can be used to make rooms more interactive, which adds value to the guest’s stay. Providing AR-enabled maps is a popular approach: when a smartphone is pointed at areas on the map, AR generates a range of helpful visual content. This added convenience is what distinguishes a stay as an ‘experience’, providing customers with a handy solution that makes their stay better overall.
Gamification has huge scope for application within the hospitality sector. It has become a phenomenon of sorts in the wider world, with successes such as Geocaching and Pokémon Go sweeping across the nation, sending participants on augmented missions and treasure hunts. AR has become one of the most accessible, portable applications for gamification, and all a user needs to get started is a smartphone.
As an entertainment platform, AR can be used anywhere with an enabled device. There’s no end to the visual creations which can be generated — from creating simulations of global landmarks to depictions of celebrities, the possibilities are endless! AR based apps could even allow guests to redecorate rooms, allowing you to tailor your stay to your décor preferences, demonstrating how effective the technology could become for allowing guests to have more say over their stay.
Take a tour the smart way
While the booking process has moved online, we could be anticipating a further shift towards AR-based booking systems. While 360-degree tours such as the Kingslodge Inn virtual tour are already popular with guests, this service could become even more thorough. The technology can add real-time dates and information to give guests a better idea of room availability, the facilities on offer, and any events happening near the hotel. By providing these extra details, guests can book a trip that suits their requirements. Information such as transport links could also feature on these AR tours, giving guests a genuine perspective of how they’ll get around the area during their stay. Whether you’re visiting the hustle and bustle of London, or simply looking for some traditional pubs in Durham, an AR tour could give you a first-hand perspective through some creative digital elements.
The hospitality industry looks set to embrace all of these advances, in order to captivate the imagination of an ever-modernising audience!
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 4.
Some thoughts from Bernd Greifeneder, CTO and co-founder of software intelligence company, Dynatrace:
CIOs will increasingly look to AI and automation to bridge the gap between the demands of the enterprise cloud and constrained resources. Research has shown that IT organizations are struggling to keep up with the growing demands of digital transformation. The scale and complexity of today’s enterprise cloud environments have moved beyond the point where humans alone are able to manage them. To bridge this gap, there will be significant moves toward AI and automation in 2020. Moreover, the need for automation will cause CIOs to demand more accuracy and transparency than the mathematical predictions from machine learning-based AI provide. Explainable AI – capable of identifying anomalies and performance problems with precise root-cause and triggering automatic self-healing actions to resolve issues before end-users are impacted – will become increasingly common.
IT organizations will transform toward autonomous cloud operations. Nearly 90% of CIOs contend AI will be critical to their ability to master the increasing IT complexity that is accelerating with the pace of digital transformation. In 2020, we’ll see a new wave of transformation toward autonomous cloud operations starting with CI/CD pipelines and autonomous, self-healing production operations. By bringing together AI and advanced automation for DevOps practices and cloud-native environments, organizations will be able to achieve continuous software delivery pipelines that reduce the need for human intervention and resolve problems before end users are impacted. As a result, we’ll see the beginning of a new era of “NoOps” for cloud native environments, changing how development and operations think, operate and align.
IT will play an increasingly valuable role in driving business decisions by leveraging data flowing through their systems. IT organizations will increasingly mine the wealth of user experience, customer behavior, and application performance data already in their systems and tie it to business metrics to support the business in better decision making. These decisions will be achieved by taming the accelerating data volume and velocity with AI to provide alerts, detect anomalies and deliver real-time answers about high-impact business issues such as degradation in revenue or variations in digital service adoption rates across products, geographies or segments. Leveraging answers from digital performance and user experience data that have previously been difficult, slow, or impossible to maintain will change the way teams work and drive improved business outcomes.
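As a simple illustration of the principle described above, the sketch below flags anomalies in a hypothetical hourly revenue series using a basic z-score test. A production AI platform would use far more sophisticated baselining and real-time data, but the underlying idea (alerting when a business metric deviates sharply from its norm) is the same. All figures and the threshold here are invented for illustration.

```python
import statistics

def find_anomalies(values, threshold=2.5):
    """Return the indices of values that sit more than `threshold`
    standard deviations away from the mean of the series."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)   # population standard deviation
    if stdev == 0:
        return []                       # a flat series has no outliers
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

# Hypothetical hourly revenue (in thousands): steady, then a sudden drop.
hourly_revenue = [102, 98, 101, 99, 103, 100, 97, 42, 101, 100]
print(find_anomalies(hourly_revenue))   # flags the hour with the revenue drop
```

In practice the alert on index 7 (the collapsed hour) is what would page a team or trigger an automated response; the value of tying such checks to business KPIs is that the alert speaks in revenue terms rather than raw infrastructure metrics.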
NoSOC trend on the horizon. While NoOps is increasingly part of an organization’s Autonomous Cloud journey, “NoSOC” is the next trend on the horizon. There is just too much toil resulting from the many false positives that SOC teams are subjected to, distracting them from the actual job of making their services more secure and better protecting their organization’s and customers’ privacy. The dynamics of cloud technology and the continuous delivery of microservices are making this toil worse.
Informed by insight from customers, partners, and industry analysts and insiders, ExtraHop leaders predict a year of tool consolidation, headline-grabbing breaches, and a shifting industry focus on what makes a successful tech start-up.
The Year of Deeper Scrutiny for Fast-Growth Companies: “2019 was a tough year for heavily hyped, fast-growth companies going public in Silicon Valley. Several companies that raised huge rounds ultimately failed to deliver expected results or even approach profitability after they went public, and Wall Street was not amused. In 2020, we expect the investment community to more deeply scrutinize companies' financials and business fundamentals, ultimately leading to the support of companies who deliver on their promises, are capital-efficient with sound vision and innovation, and have truly sustainable business results and models to back them up.” — Arif Kareem, CEO
Antiquated Threat Detection Methods like File Hashing and Signature-Based IDS Waste Time: “Since the 1990s, file hashing has been the default mechanism for detecting malicious threat activity, despite the fact that it's ineffective against modern attacks that use polymorphic or fileless methods to go undetected. The same goes for signature-based IDS, which is extremely noisy while providing very little actual alert context. Security teams will continue to rely on these antiquated methods of detection because they are expected to, regardless of how well they work in today's threat landscape.” —Jesse Rothstein, CTO and co-founder
Accountability for the Ethical Use of Users’ Data: “Recent headlines tell of giant data corporations like Google and Facebook monetizing users' data and lacking sufficient transparency in these activities. There’s already been significant social backlash, but in 2020 we predict that users will demand companies not just follow the often-dated laws, but that they also do what’s right. Regulations like GDPR and CCPA are helping to bring more clarity around what’s appropriate, but 2020 will be the year that the industry is held accountable for the ethical — in addition to regulatory-compliant — use of personal data.” —Raja Mukerji, CCO and co-founder
A Slowing Economy Will Force Tool Consolidation: “In security programs, it's been very difficult to turn tools off. What gaps will I create? What unintended consequences will I see? As the economy has rolled along over the last decade, most security programs have had the necessary funding to add new tools and retain legacy tools under the guise of risk management. Economic slowdown is likely to change all of that, as investments in new technology will require cost savings elsewhere. A tighter economy will finally cause us to pull the plug on legacy security tools.” —Bill Ruckelshaus, CFO
"Observability" Will Gain Ground as Both a Concept and a Vocabulary Term in Security and DevOps: “Observability is a term that several companies are using to describe the practice of capturing metrics, logs, and wire telemetry (or sometimes other data sources, mostly in the DevOps space). The value of correlating insights from these data sources has gained enough ground that vendors need a word for it. Observability, The SOC Visibility Triad, and other terms have been spotted in marketing materials and on big screens and main stages at security and analytics conferences. In 2020, we'll see heated competition to control the vocabulary and mental models that enterprises and vendors use to discuss and market security best practices regarding gathering multiple data sources and correlating insights between them.” —John Matthews, CIO
A Major Information Leak from a Cloud Provider is Coming: “In 2020, we are likely to see a major information leak from a cloud provider. While cloud providers offer many useful built-in tools, it’s not clear that they are using those tools to secure themselves. As a further prediction, the leak will not effectively diminish migration to the cloud. As we have noticed with other breaches, they do not significantly erode confidence in the services.” —Jeff Costlow, CISO
The Wave Begins Towards Security Tool Consolidation: “Organizations will take a strong look at the number of security vendors within their ecosystem in 2020 to determine overlap and begin a move towards consolidation of tools. The winners will include those that have proven their API superiority and ability to work together within an organization’s ecosystem. The losers will be those who have not proven their ability to strengthen core security.” —Chris Lehman, SVP of Worldwide Sales
A Vendor Will Be Responsible for a Major Breach of Data Due to Phoning Home: “In 2019, ExtraHop issued a security advisory about the vendor practice of phoning data home and how this is happening without the knowledge of customers. The problem with this practice is that it expands the attack surface via which that data can be breached, exposing it to threats within the vendor’s environment. 2020 may well be the year that a breach of a vendor’s environment exposes the data of one or more of their customers. Regulations like GDPR have imagined exactly this type of scenario and laid out specific requirements for data controllers and data processors. But when such a breach occurs, it will have broad impact and implications.” —Matt Cauthorn, VP Security
The Big IoT Breach is Coming: “In 2017, major ransomware attacks crippled the networks — and operations — of major global organizations. While those attacks did billions in damage, for the most part, IoT devices were left unscathed. But sooner or later (and probably sooner) the big IoT breach is coming, and it could have global implications. Whether it happens in the US or abroad, in healthcare, shipping and logistics, or manufacturing, IoT devices around the globe are fertile hunting grounds for attackers. Taking down every connected device — from telemetry sensors to infusion pumps to mobile points-of-sale — could easily grind operations to a halt.” —Mike Campfield, VP of Global Security Programs
Vincent Lavergne, RVP of System Engineering at F5 Networks, offers his predictions:
Without wheeling out all the usual clichés, 2019 has been another whirlwind of disruptive innovation and opportunity – with plenty of challenges to tackle and circumvent along the way.
The threat landscape mutated with predictable unpredictability, multi-cloud app deployments became mainstream fixtures, and DevOps methodologies started exerting a newfound influence on business plans.
The big question is what happens next? What anticipated app-centric trends will change the game and tear up the rulebook (again)?
Digital transformation takes shape
2020 will see more organisations shift away from aspirational sloganeering to substantively embrace what can, and should be, a seismic step-change.
Inevitably, business leaders will get more involved in application decisions designed to differentiate or provide unique customer experiences.
Expect a new generation of applications that support the scaling and expansion of business’ digital models to emerge. This will include taking advantage of cloud-native infrastructures and driving automation through software development.
Further down the line, digital transformation efforts will likely be AI-assisted, particularly as they leverage more advanced capabilities in application platforms, telemetry, data analytics, and ML/AI technologies.
End-to-end instrumentation will enable application services to emit telemetry and act on insights produced through AI-driven analytics. We anticipate that these distributed application services will improve performance, security, operability, and adaptability without significant development effort.
The rise and rise of Application Capital
Applications are now firmly established as the main conduit for companies to develop and deliver goods and services. They have become modern enterprises’ most important assets.
Even so, most still only have an approximate sense of how many applications they have, where they’re running, or whether they’re under threat.
This will soon change.
To manage Application Capital effectively, it is essential to establish a company-wide strategy that sets policy and ensures compliance. This includes addressing how applications are built, acquired, deployed, managed, secured, and retired. At a high level, there are six distinct and unavoidable steps that need to take place: build an inventory, assess the cyber risks, define application categories, identify the application services needed for specific activities, define deployment parameters, and clarify roles and responsibilities.
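As a minimal sketch of the six steps above, the hypothetical record below captures one field per step (inventory entry, cyber-risk assessment, category, required application services, deployment parameters, and ownership), together with one policy check a company-wide strategy might enforce. All field names and values are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationRecord:
    """One entry in a hypothetical company-wide application inventory."""
    name: str
    category: str                  # e.g. "customer-facing", "internal", "legacy"
    cyber_risk: str                # outcome of the cyber-risk assessment
    required_services: list = field(default_factory=list)  # e.g. WAF, TLS
    deployment: str = "on-prem"    # deployment parameters, simplified
    owner: str = "unassigned"      # clarified roles and responsibilities

inventory = [
    ApplicationRecord("billing", "internal", "high",
                      ["WAF", "TLS"], "private-cloud", "finance-it"),
    ApplicationRecord("legacy-crm", "legacy", "high"),
]

# A compliance check the strategy could enforce: every high-risk
# application must have a named owner.
unowned_high_risk = [a.name for a in inventory
                     if a.cyber_risk == "high" and a.owner == "unassigned"]
```

Running the check surfaces `legacy-crm` as high-risk but ownerless, which is precisely the kind of gap a policy-driven inventory is meant to expose before it becomes an incident.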
The primary aim of an application strategy should always be to enhance and secure all digital capabilities – even as their reach and influence shift and expand.
DevOps’ culture club
The technical minutiae of DevOps methodologies and associated tools got a lot of publicity this year.
2020 will be all about getting the culture right, marrying theory with best practice and unlocking new levels of productivity without upsetting the operational apple cart.
Culture is not optional. Team structure alone dramatically changes pipeline automation, with traditional single-function teams often falling behind their contemporary, DevOps-driven counterparts.
Consequently, we will see more collaborative team structures and alignment on key metrics that give NetOps additional means to focus on what the business requires: faster and more frequent deployments.
DevOps has a ten-year head start on NetOps in navigating and overcoming obstacles around certain types of integration, tools, and skillsets. Collaborative teams can explode the status quo by promoting standardisation on tools that span from delivery to deployment (like Jenkins and GitHub/GitLab).
DevOps should not – and cannot – end with delivery. That means deployment functions – along with a complex pipeline of devices and application services – must be automated. This won’t happen without effective cultural realignment.
The data centre is alive and kicking!
Conflating the adoption of SaaS with IaaS caused speculation that cloud was cannibalising IT. Pundits warned that data centres would disappear.
The rumours were exaggerated. Data centres are still being built, expanded and run around the globe. The cloud hasn't managed to – and likely never will – kill the data centre.
Early in 2019, an IDC executive told channel partners at the IGEL Disrupt conference that over 80% of companies they surveyed anticipated repatriating public cloud workloads. Security, visibility, and performance remain common concerns.
Repatriation-related opportunities include improving availability of multi-cloud operational tools and a push towards application architectures that rely on more portable technologies such as containers.
The data centre is not dead. It’s just evolving.
Application protection challenges
According to F5 Labs, the server-side language PHP – used for at least 80% of websites since 2013 – will continue to supply rich, soft targets for hackers. Situational awareness is critical to mitigate both vulnerabilities and threats.
Businesses are also realising that applications encompass more than just the code that they execute. Attention needs to be paid to everything that makes them tick, including architecture, configurations, other connectable assets, and users. The prevalence of access attacks such as phishing are an obvious case in point.
F5 Labs’ analysis of 2019 breach data confirms the need for risk-based security programs instead of perfunctory best-practice postures or checklists. Organisations need to tailor controls to reflect the threats they actually face. The first step in any risk assessment is a substantive (and ongoing) inventory process.
As ever, the industry will gradually incorporate emerging risks into business models. For example, cloud computing has gradually shifted from a bleeding-edge risk to a cornerstone of modern infrastructure. The risks associated with the cloud have either been mitigated or displaced to contractual risk in the form of service level agreements and audits.
API and you know it
The word is out. Application programming interfaces (APIs) can transform business models and directly generate revenue. Cybercriminals know this.
More than ever, organisations need to focus on the API layer, particularly in terms of securing access to the business functions they represent.
One of the biggest issues is overly broad permissions, which means attacks through the API can give bad actors visibility into everything within the application infrastructure. API calls are also prone to the usual web request pitfalls such as injections, credential brute force, parameter tampering, and session snooping.
Visibility is another major and pervasive problem. Organisations of every stripe – including IT vendors – have a notoriously poor track record of maintaining situational API awareness.
API security can be implemented directly in an application or, even better, in an API gateway. An API gateway can further protect APIs with capabilities like rate limiting (to prevent denial of service attacks) and authorisation. Authorisation narrows access to APIs by allowing access to specific API calls to only specified clients, usually identified by tokens or API keys. An API gateway can also limit the HTTP methods used and log attempts to abuse other methods so you're aware of attempted attacks.
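By way of illustration, the two gateway protections described above (narrow, token-based authorisation and rate limiting) can be sketched in a few lines of Python. The token table, limits, and endpoint names here are entirely hypothetical; a real gateway product would back these checks with persistent storage, proper key management, and logging of rejected calls.

```python
import time

VALID_TOKENS = {"key-abc123": {"allowed_calls": {"GET /orders"}}}  # hypothetical
RATE_LIMIT = 100          # max requests per window, per token
WINDOW_SECONDS = 60
_counters = {}            # token -> (window_start, request_count)

def authorise(token, method, path):
    """Allow only the specific API calls granted to a given client token."""
    client = VALID_TOKENS.get(token)
    return client is not None and f"{method} {path}" in client["allowed_calls"]

def within_rate_limit(token):
    """Fixed-window counter: reject once a token exceeds RATE_LIMIT."""
    now = time.time()
    start, count = _counters.get(token, (now, 0))
    if now - start >= WINDOW_SECONDS:
        start, count = now, 0                 # start a fresh window
    _counters[token] = (start, count + 1)
    return count + 1 <= RATE_LIMIT

def gateway(token, method, path):
    if not authorise(token, method, path):
        return 403        # unknown token or disallowed method/endpoint
    if not within_rate_limit(token):
        return 429        # throttled, blunting denial-of-service attempts
    return 200            # forward the request to the backend API
```

Note how a disallowed HTTP method on a permitted path is rejected just as firmly as an unknown token: narrowing both the caller and the call is what keeps overly broad permissions from exposing the whole application infrastructure.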
Appy New Year!
All this is of course the tip of an increasingly interconnected iceberg. Any New Year’s resolution worth its salt should include a commitment to comprehensively master the development, deployment, operation, and governance of the application portfolio. The best way to do this, and to get visibility into the code-to-customer pathways for all applications, is to leverage a consistent set of multi-cloud application services. Here’s to a safe, innovative and transformational 2020!
Risk agendas should also include cybersecurity fatigue, privacy laws, and ethics of advanced technologies, says Cory Cowgill, CTO, Fusion Risk Management.
For years, businesses have enjoyed the benefits brought by cloud computing, and today most organisations have adopted some form of cloud into their IT infrastructure. By 2026, the cloud computing revenue market in Europe is predicted to exceed $75 billion. But as with any technology, the cloud has also brought security risks that are of increasing concern in boardrooms and at C-Level. In 2020, executives and risk professionals will need to make sure that the following risks are high on their agendas:
Cybersecurity breaches are becoming more prevalent, and high-profile breaches such as Equifax have demonstrated that no organisation is immune. Playing catch up with the many threats is a full-time job and not one that necessarily focuses on the more famous threats such as cyber criminals. In fact, the threat can be far closer to home, in the form of the cloud servers that so many businesses rely on. As cloud technology has grown, so have the security concerns associated with it. Because of that, we’re going to see more pressure put on the big cloud providers such as AWS, Google, and Microsoft to ensure their cloud infrastructures are secure. To protect their reputations and guarantee positive customer sentiment, these cloud providers are likely to look at increasing their security measures. There will also be a heightened awareness of customer use of SaaS products, such as ServiceNow and WorkDay, and the ownership of responsibility that comes with working with a third-party provider. As we’ve seen from the fallout of security breaches and attacks, boards and senior management are finally realising that security is legally their responsibility.
Digital transformation and business continuity management (BCM) will become increasingly interwoven within organisations. Recently, the Bank of England, Prudential Regulation Authority, and Financial Conduct Authority announced a shared policy summary and co-ordinated consultation papers on new requirements to strengthen operational resilience in the financial services sector. As advancing technologies create more risks for organisations, stakeholders are placing increased emphasis on systems that can cope with disasters and other business disruptions. Unfortunately, the traditional paper documents and spreadsheets that many organisations still rely on are not enough to support these systems. They simply cannot support the resilient, fast-to-respond BCM programme that today’s businesses need. Organisations also need to look at reducing or completely removing the siloes that exist within them, in order to make risk an organisation-wide effort. An integrated approach makes responding to an event simpler and more effective. Advanced new BCM software programmes offer vast improvements in risk management. Of course, getting the buy-in to invest in these kinds of programmes can be difficult, but with senior management increasingly recognising that BCM needs to be part of the DNA of any organisation, that conversation is becoming easier.
Off the back of the introduction of GDPR in Europe comes a U.S. version – the California Consumer Privacy Act (CCPA), coming into law on 1 January 2020. Whilst it’s limited to California businesses and anyone they do business with, we expect to see this become the most widely adopted privacy regulation in the U.S., unless and until Congress passes a federal law. But it’s not just the U.S. where the influence of GDPR is being felt. South Korea and Brazil are also updating their privacy laws – expect privacy and its impact on business to be huge in 2020.
CISOs and IT executives are suffering from too much choice and cybersecurity fatigue. With constant “new solutions” for everything cybersecurity-related, the market is growing every day. CISOs know that cyber threats are real and present tremendous organisational risks. But which technology is going to give them the best chance of defending against or surviving them? With so much choice and a constantly changing cyber risk landscape, it’s more difficult to identify the best approach. We’re likely to see push back against cybersecurity vendors in 2020 to simplify solutions.
Network-connected technology powers much critical equipment (e.g. X-ray and MRI machines), as well as patient record systems and billing software, making healthcare a key target for ransomware attackers. Hitting these targets means not only a loss of revenue but also a potential loss of life. Ransomware threats are emerging faster than security updates and patches can keep up. Hospitals usually pay up because they can’t afford downtime – recovery from an attack can cost more than the ransom. We’ve also seen that municipalities, particularly small to mid-sized cities, are vulnerable for many of the same reasons. With this increase in the impact of ransomware, expect to see a mirrored increase in efforts to find strategies and relief in defending against this growing threat.
Ethics of Advanced Technologies
Today we are awed by the potential of advanced technologies like artificial intelligence and machine learning. In many cases, these technologies are developing faster than society’s ability to figure out their ethical implications. One example – “deep fakes” are manipulated video or audio files produced by sophisticated artificial intelligence that yield fabricated images and sounds that appear to be real. Deep fakes have mostly been used to harm the reputations of celebrities and politicians. But now the technology is being used in criminal scams to trick companies out of big money. Therefore, boardrooms will be asking more questions about the legal and ethical risks of advanced technologies.
As we have seen, the risks to organisations come in all forms. To ensure business continuity and prevent potential business disruptions, organisations must embrace technologies that offer more benefits than risks, and protect their businesses with robust management of cloud computing, cybersecurity, data privacy, and advanced technologies. As we enter 2020, businesses and individuals responsible for these risks are likely to strengthen their focus and investment in these technologies, recognising the benefits that BCM can provide to those looking to prosper in an increasingly difficult risk landscape.
The Great Cloud Migration:
Enterprises will more fully shift their data storage to public, private, or hybrid cloud services. Many of these companies are either midway through or have completed this shift, and they are noticing real cost benefits coming through in the budget post-completion. The only question to ask yourself: do you have control of the consumption and potential additional costs of your company’s data?
Battling for Limited Talent:
Companies will need to have a strong program in place for attracting and refreshing true engineering talent, including Gen-Z. Being able to retain acquired talent will be where organizations gain or lose competitive edge, and the impact of this will start to be seen in 2020. The next big question to ask yourself is “why is your talent attraction program better than your closest competitors?”
Practicing Vendor Talent Management:
The ability to test, manage, and run with multiple vendors has never been more important. For many years, the general wisdom has been to reduce vendors to a manageable number, sticking with those vendors to avoid the red tape and misplaced resources involved in finding and working with new suppliers. This is no longer the case, though; if you want your organization to benefit from today’s technologies, being adept at managing and testing out multiple vendors is critical.
Insights Drawn from Data:
Whether across the enterprise or in specific departments and functions, AI, NLP, ML, DL, bots, insight engines, and many other technologies that fuel data-driven decisions will be included in more enterprise 2020 roadmaps. The technology here has moved so fast in the last few years that it will be important to ask not only whether you have something in place, but whether you are sure it really utilizes all that is available and thus maximizes its output.
Every HR team needs data analytics. With the right analytics, HR teams can be armed with the necessary data to decide where resources can be better allocated, where new team members may be needed – and which candidates are right for the job – and consider the future direction and strategy of the business.
By Brad Winsor, VP Workforce Analytics, SplashBI.
Having the right employees on board for this process is vital, and using data analytics to intelligently identify who these employees are is what will set apart the successful businesses from the unsuccessful.
However, knowing where to start is often the problem. With so much data to look at, where is the best place to begin? Identifying top performers is a key challenge, but by using metrics in areas such as attendance, engagement and performance, these individuals can not only be recognised but also replicated, helping to ensure all employees are happy and productive in their roles.
What does a ‘top’ performer look like?
Most organisations will say that they only want top performers. And why not? Why should they settle for less? Yet many organisations still need to consider what ‘top’ really means – and this can only be achieved by adding data into the mix to identify not only existing top performers but potential top performers too.
For example, the benchmark for a top performer might be exceptional attendance, excellent engagement in tasks and achieving outstanding results. If this is the case, then a potential top performer might be an employee that hits two out of the three areas, with areas to improve in the third. By identifying which area the employee may need to develop skills or characteristics, the organisation can ensure that employee is fully supported to go from being a potential top performer to an actual top performer.
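The benchmarking logic described above can be sketched in a few lines of code. This is a minimal illustration only: the benchmark thresholds, metric names and employee records are all hypothetical, and a real HR analytics platform would draw these from its own data model.

```python
# Hypothetical benchmarks for a 'top performer': thresholds an employee
# must meet in each of three areas (all values are illustrative only).
BENCHMARKS = {"attendance": 0.95, "engagement": 0.80, "results": 0.85}

def classify(employee):
    """Return a label plus the areas where the employee falls short.

    Meeting every benchmark makes a top performer; missing exactly one
    area makes a 'potential top performer' with a clear development gap.
    """
    gaps = [area for area, floor in BENCHMARKS.items()
            if employee[area] < floor]
    if not gaps:
        return ("top performer", [])
    if len(gaps) == 1:
        return ("potential top performer", gaps)
    return ("developing", gaps)

# Illustrative employee records (names and scores are made up).
alice = {"attendance": 0.97, "engagement": 0.88, "results": 0.91}
bob = {"attendance": 0.96, "engagement": 0.85, "results": 0.70}

print(classify(alice))  # ('top performer', [])
print(classify(bob))    # ('potential top performer', ['results'])
```

The useful output here is not just the label but the gap list: it tells the organisation exactly which area to support so a potential top performer can become an actual one.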
The same methodology applies for disengaged employees; what can a top-performing employee that is highly engaged teach the organisation about an employee that has recently become disengaged, and how can they become engaged? By looking at areas such as the employee’s current workload – are they under- or over-worked? – and their current team – has the size of the team decreased, or are they fully supported? – the organisation will gain a better understanding of which circumstances or factors may be contributing to the employee’s current lack of engagement. Looking into these areas will not only help to improve the working life of the employee but also identify if there is untapped potential for the employee to become a top performer.
Top performers in the recruitment process
How can the success of top performers be replicated in the search for future employees? The information gained from looking at the characteristics and skills of top-performing employees can be extremely useful when establishing if a possible new recruit would be a good fit for the company. Do their interests and past experience match that of current top performers? Do they share similar employment history or skills that mean the employee has the potential to also be successful?
At the same time, organisations must look at exiting employees and assess whether they possess any characteristics that should be avoided or replicated, or whether they are leaving a gap that must be filled. If so, can any winning characteristics or skills be identified in the list of candidates being considered? This may be the secret to replicating the success of current top performers – but only data can reveal it.
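One simple way to put the candidate-matching idea above into practice is to compare a candidate’s skill set against the skills shared by current top performers. The sketch below uses a basic Jaccard overlap score; the skill names are hypothetical and a real screening process would use far richer signals.

```python
# Illustrative skill sets for current top performers (made-up data).
top_performer_skills = [
    {"sql", "stakeholder management", "forecasting"},
    {"sql", "python", "forecasting"},
]

# Skills that every current top performer has in common.
shared = set.intersection(*top_performer_skills)

def overlap_score(candidate_skills):
    """Jaccard similarity between a candidate and the shared profile."""
    union = candidate_skills | shared
    return len(candidate_skills & shared) / len(union) if union else 0.0

candidate = {"sql", "forecasting", "copywriting"}
print(round(overlap_score(candidate), 2))  # 0.67
```

A score like this would only ever be one input among many, but it makes the comparison with proven performers explicit and repeatable rather than a matter of interview impressions.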
Retaining top performers
Identifying top performers and understanding how these characteristics can be used within the recruitment process is important, but organisations can’t forget to examine how these top-performing employees can be retained themselves. After all, these aren’t employees that organisations would want to risk losing. Organisations and HR teams can start by looking at employees that have been with the company the longest; what has made them stay for so long? Does the employee have a great relationship with their line manager, or have they taken advantage of particular perks? These areas can also be used to identify what can attract an employee to stay at the organisation, rather than look for a job elsewhere, which can also inform future company policies or incentives.
The right path for success starts with any organisation really knowing who its employees are, which are top-performing, or potentially top-performing, and how this can be replicated both amongst existing and potential employees. Armed with this insight from HR analytics, organisations can create a workforce that is motivated, engaged and dedicated to seeing the organisation grow and succeed.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 5.
Dr Anya Rumyantseva, Data Scientist at Hitachi Vantara, addresses the critical role AI will play in tackling pressing societal challenges over the next few years and the need for improved diversity in the industry:
“Artificial intelligence isn’t just something debated by techies or sci-fi writers anymore – it’s increasingly creeping into our collective cultural consciousness. But there’s a lot of emphasis on the negative. While those big picture questions around ethics cannot and should not be ignored, in the near-term we won’t be dealing with the super-AI you see in the movies.
I’m excited by the possibilities we’ll see AI open up in the next couple of years and the societal challenges it will inevitably help us to overcome. And it’s happening already. One of the main applications for AI right now is driving operational efficiencies and that may not sound very exciting, but it’s actually where the technology can have the biggest impact. If we can use AI to synchronise traffic lights to impact traffic flow and reduce the amount of time cars spend idling, that doesn’t just make inner city travel less of a headache for drivers – it can have a tangible impact on emissions. That’s just one example. In the next few years, we’ll see AI applied in new, creative ways to solve the biggest problems we’re facing as a species right now – from climate change to mass urbanisation.”
“The UK has been struggling with a skills shortage for some time now – and it’s only going to get more pressing over the next few years. The UK is quickly becoming the region’s innovation hub, driven by government investment in areas like AI research. The country’s AI market is booming – it’s projected to contribute a huge £232 billion to the economy by 2030, and the UK is already home to around a third of all of Europe’s AI companies. It’s a very exciting time, but progress will hit a wall if we don’t have the right talent to drive it.
As a woman and a data scientist, diversity in my sector is an issue I’m passionate about. The industry needs to do its part. We need to work hard to build a more inclusive space that holds the door open for people from all walks of life to pursue careers in fields like data science and software development. AI is a diverse and rapidly evolving field – if the UK is to remain at the forefront of innovation it will need people from diverse backgrounds to think differently and bring new ideas.”
Here are IMImobile’s five predictions for customer experience technology in 2020:
Consumers will move from text to tap, as they ditch SMS in favour of conversational messaging and commerce.
“2020 will be the year companies transition from text to tap as they ditch SMS in favour of conversational messaging channels such as WhatsApp, Apple Business Chat and RCS. The smartphone generation expects to be able to interact with companies in the same way they would with friends and family, and today, this is very rarely done via traditional SMS.
“Mobile interactions will no longer be restricted by character limits and basic plain text – they will become richer, more immersive and conversational, presenting brands with an opportunity to increase their reach and raise the customer experience bar,” said Siva Bandi, Senior Product Manager at IMImobile
2020 will be the year of RCS.
“With Google’s recent announcement that it is formally rolling out RCS in the US, 2020 will be the year we will see its mainstream adoption by carriers and enterprises. Customer experience will evolve from text to tap, allowing companies to use the features of consumer messaging (picture messages, video, group chat, location sharing, interactive carousels, etc.) to deliver richer, more engaging and interactive customer interactions,” added Siva Bandi, Senior Product Manager at IMImobile
Companies will take a platform approach as they pull the trigger on rolling out new customer communications channels.
“As we move into 2020, there is simply no excuse for companies failing to adopt new customer communications channels, such as Facebook Messenger, WhatsApp, RCS and Apple Business Chat. Proactive, two-way communication is the name of the game and the evolution of these channels is driving this change. Therefore, companies can no longer rely on traditional communications and development methods.
“They need to be able to roll out such services quickly and effectively, maximising the interactive capabilities these channels bring, in a way that complements their existing customer communication channels. At the same time, they must be able to pull together data from these different channels so they can have a channel-agnostic view of customer interactions. By adopting a ‘platform approach’ towards customer communications, companies will be able to view and manage all channels centrally, while being able to add new ones (and the associated customer journeys) much more rapidly,” said Dan Garner, Senior Vice President, Solutions at IMImobile
Chatbots will hit the mainstream.
“To date, there has been a lot of trial and error when it comes to the deployment of chatbots. There has been a lot of hype as they have the potential to drive new customer experience strategies and capabilities. However, poorly designed and integrated chatbots risk damaging customer experience, especially if the customer processes and systems in the background are siloed. In 2020, we will see chatbots hit the mainstream as a result of the technology maturing and companies understanding appropriate use cases where they can excel. This, coupled with improved integration with back office systems, means they really can now deliver true value and improve customer experience.
“While chatbots won’t outright replace contact centre agents, they will increasingly play an important role in providing first line customer service support. The combination of rich, interactive channel capabilities and chatbot automation means they will be able to drive a far superior customer experience. Key to this is ensuring that there is the right interaction blend throughout the customer journey, as many people will still welcome the opportunity to message or speak to a real person as required,” added Dan Garner, Senior Vice President, Solutions at IMImobile
Customer self-service will become much more intelligent.
“Companies understand the significance of self-service when it comes to delivering a great customer experience, improving operational efficiency and reducing costs. In 2020, we will see further advances in AI and automation drive much more intelligent levels of customer self-service. This will enable customers to do a lot more themselves, at their convenience, without having to change communications channels mid journey, interact with a different department, wait days or even weeks for a transaction or interaction to be completed,” concluded Dan Garner, Senior Vice President, Solutions at IMImobile
Last year, AI reached the ranks of cloud and IoT, as people began name-dropping “the AI effect” wherever they could. One way to tell that a new technology is still at the hype stage is that no one is talking about its problems. When an innovation is rising towards the ‘peak of inflated expectations’, people talk about it in hushed tones as something almost miraculous – and they hardly ever discuss the obstacles to adoption or the challenges that the new technology will bring.
As the new year and the new decade begin, the good news is that people are starting to think much more critically about AI and, as a result, are starting to recognise and talk about these challenges.
So, if 2019 was the year that artificial intelligence (AI) started living up to its hype, 2020 will be about discussing and, of course, solving the key challenges that increasingly widespread AI adoption has revealed.
And some of these challenges are difficult and intractable indeed, not least how we can make AI available to every organisation and how we can solve the thorny ethical problems that such a powerful technology presents to humanity. Here, then, are what I believe to be the most important developments in the field of AI in the year ahead.
Enter the AI Ethicist
As we entrust more parts of our working and personal lives to AI-powered applications, the ethics of AI is becoming an increasingly important consideration.
One of the most important ethical questions is about intention – for example, where, when and how AI interacts with the analogue reality that we have all been accustomed to living in. Who is going to decide what the right playing field is, and what constitutes acceptable or unacceptable uses of AI?
Regulators and legislators are trying to define and control the degrees of freedom, but they are working at analogue speeds. We need faster answers to questions that we’re already facing today. In 2020, we will start to see enterprises employ people or even teams of people whose main role will be to formulate the ethics of our new AI-powered world. These AI Ethicists will need to liaise with the ecosystem of affiliated AI entities and gradually create, from the bottom up, the rules and conditions that will define the field of play.
Catalyst – the 5G effect
If we’re all to enjoy the benefits of AI, we need an infrastructure technology that will enable end users to live, work and interact in the cloud. Future AI applications – and indeed current ones – require vastly increased speeds together with location-agnostic access and minimal latency, which is exactly what 5G will bring.
It’s not hyperbole to suggest that 5G will be the key catalyst for a revolution in the way we experience reality. Truly connected homes and workspaces; advanced, telematics-enabled healthcare services; and “almost real” (VR/AR) interactive experiences – these are only a few of the ways in which 5G could easily become the highway to the adoption of AI technologies across all industries, functions, and users.
AI-as-a-Service
It had to come, didn’t it? Today it seems like everything is available as-a-Service (including, bizarrely, Proof-of-Concept-as-a-Service). But let’s not mock, because this model has genuinely revolutionised the way that organisations procure technology, bringing enterprise-grade tech within the ambit of even the smallest company.
What’s so exciting about AI-as-a-Service is not just that the huge economies of scale will make the technology available to every organisation that wants to use it. It will also give us the much-needed ability to harness all the infrastructure, platforms and knowledge towards creating real and sustainable value.
By packaging AI as part of a solution, we’ll make it much easier to identify valuable new use cases while providing a platform with end-to-end responsibility for delivering them.
Putting solutions at the centre
It used to be products and vendors that accelerated value. With AI, however, it will be the solutions themselves, since they will provide holistic approaches to resolving burning business questions rather than today’s piecemeal approach.
From 2020, businesses will use best-of-breed, more cost-efficient, or even open-source approaches to answering the key questions facing their organisation. These solutions will focus on four categories:
a. ML for the masses. Data augmentation and data-healing techniques to enable the creation of test sets, which will enable more insightful outcomes upon choosing the right algorithm
b. Customer-centric analytics. Utilising AI to “translate” data-rich journey touchpoints into humanly digestible experiences in an easy, UX-friendly way
c. Smart X / Smart Everything. Leveraging IoT technologies to enhance real experiences, utilising ML as an accelerator of adoption
d. Higher degree automation. RPA and machine-to-machine automation that does not just solve one part of a (sub)process but rather identifies and prioritises holistic reactive and proactive value creation across entire value chains. Moving from the bot and the bot farm to an Intelligent Automation-enabled Centre of Excellence
Of course, this can only scratch the surface of how AI will evolve in the coming twelve months. And that’s the fun of moving past the hype into the often challenging but ultimately incredibly rewarding world where AI powers everything.
While the term ‘cloud native’ is fast becoming the latest buzzword, the technology has been used by internet players for some years. But now it is quickly making its way into the wireless and telecoms space, and with it will bring the flexibility and agility that edge computing so promised—but did not quite deliver—in 2019.
Cloud native will make up for the shortcomings of edge computing by bringing edge functions into the cloud-native model, allowing them to exist as microservices/containerized software that can be spun up whenever and wherever needed.
2020 will see the industry realize that edge computing as we know it today is just a point solution, as cloud native delivers, as part of its basic philosophy, everything that the edge cannot.
2020 will expose the slow adoption of cloud gaming. While the technology has the potential to penetrate the mainstream gaming market, it must first prove its value to family audiences and casual gamers.
That said, cloud gaming’s potential to propel mobile gaming won’t go unnoticed by telcos—and with the recent partnership between SK Telecom and Microsoft’s cloud gaming service, xCloud, setting pace, others will be snapping at their heels. In fact, the realization of 5G services— increasing the quality and decreasing the latency associated with cloud gaming—will make telcos even more attractive partners for cloud gaming companies in 2020. But caveat emptor: gamers don’t like glitches – telcos will have to assure constant availability and bandwidth to deliver the perfect experience.
In 2020, artificial intelligence (AI) will become more intelligent and less artificial. AI is data-intensive – advancements in the technology are made by ‘training’ algorithms on mounds of data. However, in 2020 we’ll see AI rely less on data and more on logic. AI’s logic will more closely resemble the way humans reason, decreasing the amount of data needed to train the algorithm and making AI accessible to more businesses and applications than ever before. As such, a more “organic” or “frugal” AI will come into play throughout 2020.
The popularity of cloud services is growing, and threat actors are here to exploit the trend.
We are observing more and more cases where our customers’ infrastructure is partially or entirely located in the cloud – cloud migration has been the dominant trend of the past couple of years. This is resulting in a blurring of infrastructure boundaries. In 2020, we expect the following trends to emerge:
As a result, conducting an attack will become harder, and the actions of threat actors will become either more sophisticated or more frequent – relying on chance rather than planning.
The transition to the cloud has blurred the boundaries of company infrastructures. As a result, it is becoming very difficult to target an organization's resources in a precise manner. On the other hand, it will also be difficult for a company to identify targeted attacks at an early stage and separate them from the overall mass of attacks on the ISP.
Those who plan to deploy cloud infrastructure in 2020 need to talk in advance with their provider about a communications plan in the event of an incident, because time is of the essence when it comes to security incidents. It’s very important to discuss what data is logged, and how to back it up. Lack of clarity on such information can lead to complications or even make successful incident investigation impossible. We note, however, that awareness of cloud infrastructure security is not growing as fast as the popularity of cloud services, so we expect to see an increase in the complexity of investigating incidents, as well as a decrease in the effectiveness of incident investigations, resulting in higher investigation costs.
Due to the blurring of the perimeter and the inability to separate the resources of the attackers from the resources of the provider, the difficulty of tracking and responding to attacks will increase.
It's also worth noting that when companies pass their data to a cloud provider for storage or processing, they also need to consider whether the provider possesses the necessary level of cybersecurity. Even then, it is hard to be absolutely certain that the services they are paying for are really secure, as assessing this requires a level of expertise in information security that not all technical officers possess.
The increase in the availability of cloud services will allow not just companies but also attackers to deploy infrastructure in the cloud. This will reduce the cost of an attack and, consequently, will increase their number and frequency. This could potentially affect the reputation of the cloud services themselves, as their resources will be used in large-scale malicious activity. To avoid this, providers will have to consider reviewing their security procedures and change their service policies and infrastructure.
The good news is that we are observing an increase in the overall level of security of businesses and organizations. In this regard, direct attacks on infrastructure (for example, penetrating the external perimeter through the exploitation of vulnerabilities) are becoming much more expensive, requiring more and more skill and time from the attacker. As a result, we predict:
In particular, this means phishing attacks on company employees. As the human factor remains a weak link in security, the focus on social engineering will increase as other types of attacks become more difficult to carry out.
Growth of the insider market.
Due to the increasing cost of other attack vectors, attackers will be willing to offer large amounts of money to insiders. The price for insiders varies from region to region and depends on the target’s position in the company, the company itself, its local rating, the type and complexity of insider service that is requested, the type of data that is exfiltrated and the level of security at the company.
There are a number of ways such insiders can be recruited:
o By simply posting an offer on forums and offering a reward for certain information.
o The attackers may disguise their actions so that employees don’t realize they are acting illegally, disclosing personal information or engaging in insider activity. For example, the potential victims may be offered a simple job on the side to provide information, while being reassured that the data is not sensitive, though it may in fact relate to the amount of funds in a bank client’s personal account or the phone number of an intended target.
o Blackmailing. We also expect to see increased demand for the services of groups engaged in corporate cyber-blackmail and, as a consequence, an increase in their activity.
Cyber-blackmailing groups that collect dirt on company employees (e.g. evidence of crimes, personal records and personal data such as sexual preferences) for the purpose of blackmail will become more active too in the corporate sector. Usually this happens in the following way: the threat actors take a pool of leaked emails and passwords, find those that are of interest to them and exfiltrate compromising data that is later used for blackmail or cyberespionage. The stronger the cultural specifics and regional regulations, the faster and more effective the attackers’ leverage is. As a result, attacks on users in order to obtain compromising data are predicted to increase.
In 2019 the IT infrastructure market saw notable growth in the adoption of edge computing and hyperconverged solutions as higher numbers of industries realised the potential of the technologies. Now in 2020, the demand from businesses who want to harness the power of edge computing with hyperconverged infrastructure is only forecast to increase.
By Alan Conboy, Office of the CTO, Scale Computing.
In fact, “Fifty-seven percent of mobility decision makers said they have edge computing on their roadmap for the next 12 months,” according to a Forrester Analytics Global Business Technographics® Mobility Survey. That said, this is only the surface of what’s to come - here are my opinions and predictions on what will become mainstream throughout this year.
The growing landscape of the IoT
The global internet of things (IoT) market was valued at US$ 190.0 billion in 2018, and is projected to reach US$ 1,102.6 billion by 2026, according to research from Fortune Business Insights. As a result, organisations will be gathering data and insights from almost everything we use – not just from the moment we wake up but even while we sleep. Technologists say the birth of the iPhone led to the upsurge of edge computing and IoT. Considering the extent of Apple’s accomplishments, we will see a much wider variety of businesses making use of this capability to put reasonable amounts of compute into a tiny form factor and move it into dedicated functions.
As a result, in 2020, rather than revolutionary, expansion in the IoT space will be evolutionary. It will continue to progress, pushed forward by a requirement for more compelling, efficient, and cost-effective solutions, with edge computing at the forefront.
At the edge of the network
The majority of data is being created outside the four walls of the traditional data centre, which is unsurprising, as today’s world becomes more and more data-driven. As organisations get into the swing of 2020, they are beginning to closely analyse their cloud data storage. Cloud was originally perceived as the answer to all problems, but now the question is, at what cost?
More companies are looking at hybrid cloud and edge computing strategies in the hope of keeping data closer to its origin. This year, we’re expecting to see organisations more heavily relying on hybrid environments, and using edge computing to store, process, and minimise extensive quantities of data, which are then later transferred to a centralised data centre or the cloud.
Bigger and badder threats on the horizon
Over the past year, an onslaught of news headlines revealed that organisations from banks to airlines to hospitals, even entire local governments, fell prey to ransomware attacks. These menacing attacks are growing at a frightful pace, and will continue to become more sophisticated, more lucrative, and increasingly devious in 2020. Already this year has seen Travelex hitting the headlines as it fell victim to a ransomware attack. It is time for organisations of all sizes to think about what resources they need to modernise and protect their IT infrastructure.
As these threats continue into 2020, companies must understand that traditional legacy tools are leaving them exposed and vulnerable to tactical and well-organised criminals, as well as obstructing their digital transformation journey. Organisations should now be taking advantage of highly available solutions, such as hyperconverged infrastructure and edge computing, that allow them to not only keep up with changing consumer demands, but deploy the most effective cyber defences, disaster recovery, and backup.
In case IT professionals are required to face the aftermath of corrupted data – and with ransomware at the forefront of many organisations’ minds – a step-change is needed in how organisations respond to an attack. They will increasingly be guided by insurance companies as these take a more active role, not just in the recovery of data, but in deciding whether or not to pay the ransom demand. The total cost of doing business will also escalate in conjunction with the growing threat of cyber-attacks, and every company, no matter the size, should be preparing itself for the impact.
Reducing the footprint
Smaller computing form factors that perform enterprise-level tasks will be required to address the needs of both IoT and edge computing. The principal driving factor of this trend will be the cost of deploying and maintaining computing systems outside of the data centre. The smaller the form factor, the lower the requirements for power, cooling, and space.
Cloud computing is simply not suitable for many computing needs at the edge of the network where IoT is growing, even with 5G networks. Small computing devices and appliances that can run business-critical applications and be highly available will be critical to meeting the growing computing demands of this new decade.
With a much more discrete footprint than traditional servers and server-based appliances, smaller form factor computing will make technology more accessible outside of data centres. Many traditional systems can be replaced with smaller, cooler, and less power-hungry alternatives. A more compact footprint also aligns with the environmental initiatives of organisations around the world to reduce energy consumption into the coming years.
Edge computing, IoT and data protection will experience momentous gains in 2020, ensuring those who deploy these technologies get a great start to the decade. We also expect to see major changes in the way that organisations utilise these technologies and, further, how consumers respond to various innovations.
In 2019 we saw a steady increase in the number and modes of cyberattacks. In fact, more than half of all British companies reported cyberattacks in the last year alone. To prepare for 2020, Tanium looked into the biggest concerns for IT decision makers within organisations in the UK.
By Chris Hodson, CISO at Tanium.
This revealed that the biggest concern for the coming year (25%) is a lack of visibility over the increasing number of endpoints, such as laptops, servers, virtual machines, containers, or cloud infrastructure, which leaves organisations unaware of, and unable to protect, all of their systems. The next biggest area of concern for respondents is the rising sophistication of attackers (23%), followed by employees clicking malicious links (18%), and the complexity of managing physical, virtual, cloud and container infrastructure (15%).
What this all serves to underline is that successful cyberattacks usually occur when businesses don’t get the foundational security concepts right. When an organisation doesn’t have visibility and, by extension, control of the potential entry points across its IT environment, it is inherently vulnerable to attack. To best equip themselves for the threats to come this year, organisations must ensure that they are taking several important steps to build a comprehensive IT security strategy, so that they can protect critical assets, monitor impact, and recover from any unexpected attacks or disruption. This includes:
1. Assessing organisational obstacles: Are security and IT operations teams working in tandem, or in confusion about which department is responsible for ensuring resilience against disruption and cyber threats? According to our latest research, 67 per cent of businesses say that driving collaboration between security and IT ops teams is a major challenge. The IT operations and security teams should be working together to protect the IT environment, company and customer data - without this, they can’t achieve true visibility of their environment and endpoints, which leaves them vulnerable to attack.
2. Knowing your environment: Understanding what is in an IT environment is a crucial step. If a CISO stops by the IT team and asks how many unpatched devices are on a network, can this be answered accurately? As organisations look to build a strong security culture, it is essential that IT operations and security teams unite around a common set of actionable data for true visibility and control over all of their computing devices. This will enable them to prevent, adapt and rapidly respond in real-time to any technical disruption or cyber threat.
3. Decluttering the infrastructure: One of the most cited issues throughout WannaCry and other major security incidents is the challenge of updating operating systems in an environment laden with legacy apps. If a business is running a critical application that requires keeping an outdated operating system on life support, it’s time to rethink its value. Generating awareness of risks around old infrastructure can also help employees better understand vulnerabilities themselves, including how easy it is for opportunistic attackers to exploit outdated tools.
4. Eliminating fragmentation: Most IT security and operations teams operate using a messy combination of point products that are cumbersome to manage and impossible to fully integrate. It is crucial for teams to have clear visibility across their environment, and this means eradicating silos and siloed ways of working, and investing in a unified endpoint management and security platform instead of collections of point tools.
5. Educating your employees: By various estimates, up to 83 per cent of ransomware attacks originate when an employee clicks on a malicious link, opens an infected attachment, or visits a compromised website. Investing in ongoing training for employees to protect against phishing attacks should be a key part of your IT security strategy.
In order to have an effective IT security strategy in place, an organisation must have two lines of defence; employee advocacy and comprehensive IT security structure. Crucial to combatting any type of threat – whether a sophisticated attack, employee clicking on a malicious link or one that exploits an out-of-date piece of software - is clear visibility of all of the endpoints across the network and the ability to stop disruption almost instantly.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 6.
MicroStrategy has revealed its 10 Enterprise Analytics Trends to Watch in 2020 report during Solutions Review’s BI Insight Jam event — a day dedicated to raising awareness around best practices when evaluating, deploying, and using business intelligence software.
In collaboration with leading analysts and influencers from Forrester, IDC, Constellation Research, Ventana Research and others, the annual MicroStrategy compilation highlights trends and insights that range from AI and mobile intelligence, to the explosion of data and data sources, to some very human factors including a predicted shortage of data and analytics talent.
“We’re thrilled to unveil our annual report on the top enterprise analytic trends to watch in 2020. We see a growing opportunity for decision makers to take advantage of the latest trends and advances in enterprise analytics, AI, ML, deep learning, and more,” said Vijay Anand, Vice President, Product Marketing, MicroStrategy Incorporated. “By collaborating with some of the world’s leading experts in the field, the report aims to drive an informed conversation for those leaders seeking disruptive technologies to leverage data, drive greater efficiencies and ROI, and beat the competition.”
The 10 enterprise analytics trends to watch in 2020 include the following:
1. Deep Learning Delivers a Competitive Advantage
“In 2020, the spotlight on deep learning will be the nexus between knowing and doing. No longer just a buzzword, the pragmatic advent of deep learning to predict and understand human behavior is a tempest disruptor in how companies will perform with intelligence against their competitors.” - Frank J. Bernhard, Chief Data Officer and Author, “SHAPE—Digital Strategy by Data and Analytics”
2. AutoML Improves the ROI of Data Science Initiatives
“Machine learning is one of the fastest-evolving technologies in recent years, and the demand for development in machine learning has increased exponentially. This rapid growth of machine learning solutions has created a demand for ready-to-use machine learning models that can be used easily and without expert knowledge.” - Marcus Borba, Founder and Principal Consultant, Borba Consulting
3. The Semantic Graph Becomes Paramount to Delivering Business Value
“The semantic graph will become the backbone supporting data and analytics over a constantly changing data landscape. Organizations not using a semantic graph are at risk of seeing the ROI for analytics plummet due to growing complexity and resulting organizational costs.” - Roxane Edjlali, Senior Director, Product Management, MicroStrategy and former Gartner analyst
4. Human Insight Becomes Even More Important as Data Volumes Increase
“As more and more knowledge workers become comfortable working with data, they should also become conversant with data ethnography, or the study of what the data relates to, the context in which it was collected, and the understanding that data alone might not give them a complete picture.” - Chandana Gopal, Research Director, IDC
5. Next-Gen Embedded Analytics Speeds Time to Insights
“Concise analytics delivered in the context of specific applications and interfaces speed decision making. This style of embedding and the curation of concise, in-context analytics can take more time, but with advances including no-code and low-code development methods, we’re seeing rising adoption of next-generation embedding.” - Doug Henschen, VP and Principal Analyst, Constellation Research
6. The Need to Combine Data Sources Continues to Grow
“We expect to see a continued focus on data diversity. Organizations rarely have a single standard platform for their data and analytics and multiple tools are used to access the data. The need to combine these data sources will only continue to grow.” - David Menninger, SVP and Research Director, Ventana Research
7. Data-driven Upskilling Becomes an Enterprise Requirement
“Enterprise organizations will need to focus their attention not just on recruiting efforts for top analytics talent, but also on education, reskilling, and upskilling for current employees as the need for data-driven decision making increases—and the shortage of talent grows.” - Hugh Owen, Executive Vice President, Worldwide Education, MicroStrategy
8. AI Is Real and Ready
“Next year, more of these confident CDAOs and CIOs will see to it that data science teams have what they need in terms of data so that they can spend 70%, 80%, or 90% of their time actually modeling for AI use cases.” - Srividya Sridharan, Mike Gualtieri, J.P. Gownder, Craig Le Clair, Ian Jacobs, Andrew Hogan, Predictions 2020: Artificial Intelligence—It’s Time to Turn the Artificial Into Reality (Checks), Forrester, October 30, 2019.
9. Mobile Intelligence Evolves for 2020 and Beyond
“Half of organizations will re-examine their use of mobile devices and conclude that their technology does not adequately address the needs of their workers, leading them to examine a new generation of mobile applications that enable a better work experience and far more effective connectivity to the rest of the organization and to customers.” - Mark Smith, CEO and Chief Research Officer, Ventana Research
10. The Future of Experience Management Is Powered by AI
“As apps get decomposed by business process to headless microservices, automation and intelligence will play a big role in creating mass personalization and mass efficiencies at scale. The Intelligent Enterprise will take context and data to power next best actions.” - R “Ray” Wang, Founder and Principal Analyst, Constellation Research
2019 was an interesting year for the technology industry. Many technologies were hyped heading into the year; however, their paths of adoption took different directions.
With these trends in mind, we predict that 2020 will be a year of convergence and course correction for younger technologies, with an increasing trend towards sustainability and greener solutions.
Blockchain finding its niche in the secure, distributed data store
While 2019 saw quite a few interesting applications of Blockchain, there have been two main challenges to its adoption: 1. Lack of standardisation (platforms, specification, interfaces, etc), and 2. The fact that benefits of Blockchain are realized once a majority of the collaborating providers in a chain all start using the same – or interoperable – platform(s).
The current major players in the platform space all have their own standards for their products – design, components, contracts and implementation – thereby tying an early adopter down to a single product. This lack of standardisation has been an area of significant focus recently. ISO and IEEE have both started standards initiatives that should be ready by 2021 – and we would expect the platform providers to start supporting these standards once they hit early access (hopefully by 2020).
In parallel, enterprises today have started adopting Blockchain in a phased manner – the approach now – from the enterprise’s PoV – is to design the to-be state of the information architecture in a future-proof manner (keeping the application code as product-neutral as possible), and realize the true benefits of interoperability and data-sharing once the partners start their implementations.
With the purchase of a few Blockchain products by the existing stalwarts in the market, cloud support and integration with other existing technologies are also on the increase. With all of the above, we believe 2020 would be the year Blockchain enters mainstream adoption as the distributed store of the future.
AIoT adoption enables more sustainable, greener solutions
2019 saw an increasing infusion of IoT into existing scenarios, with most of the challenges around adding IoT/sensor capabilities and enabling intelligence on the edge now resolved (this fusion of IoT and AI is known as AIoT). While the original purpose behind enabling these capabilities may have been early prediction of faults or optimising usage patterns for efficiency, the large volume of data now available from these devices/sensors has opened up new avenues of exploration and optimisation.
The evolution of IoT into AIoT progressed in 3 distinct stages:
1. Enabling core capabilities on the edge – these included basic sensor development, integration with available devices, etc.
2. Collecting the data generated from these sensors and storing them in a structured form on a central data store – typically on the cloud
3. Realizing the synergy between AI/ML and IoT and combining them together into AIoT (2019)
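Stage 3’s edge intelligence can be sketched with a toy example: a device-side check that flags anomalous sensor readings locally, so only exceptions (plus periodic summaries) need to travel to the central store. This is a minimal illustration, not a production AIoT stack; the window size, warm-up length and z-score threshold are assumed values.

```python
from collections import deque
import math

class EdgeAnomalyDetector:
    """Minimal on-device check: flag readings that deviate sharply
    from a rolling window of recent values."""

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold  # z-score cut-off (assumed value)

    def observe(self, value):
        flagged = False
        if len(self.window) >= 10:  # require some history before judging
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var) or 1e-9  # avoid division by zero
            flagged = abs(value - mean) / std > self.threshold
        self.window.append(value)
        return flagged

# Simulated sensor: stable readings, then a sudden spike.
detector = EdgeAnomalyDetector()
readings = [20.0 + 0.1 * (i % 5) for i in range(40)] + [35.0]
flags = [detector.observe(r) for r in readings]
```

Keeping the computation this cheap is the point: the device decides locally what is worth transmitting, which is what makes moving decision-making to the edge viable on constrained hardware.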
Focus in this area has also been evolving along with the core technology itself – shifting towards applications of AIoT (away from initial device capabilities/integration). In other words, while IoT provided access to a large base of information (‘here’s the data’), AI/ML has brought in the intelligence and decision making (‘here’s what you can do with it’, and ‘here’s where you are inefficient’).
We believe that in 2020 this continued focus on AIoT adoption, combined with the ability to move decision making to the edge will drive a responsible, sustainable and greener approach to energy consumption.
Focus shift in the field of AI/ML from ‘narrow’ to multi-modal (or ‘general’) intelligence
2019 saw an increase in adoption of AI/ML solutions in newer and previously unexplored areas. This will continue into 2020 as algorithms get more intelligent. However, the scope of existing machine ‘intelligence’ is still too narrow, and focussed mostly on single objectives. To put it simply: the engine that classifies a picture of a ‘cat’ doesn’t really understand what a cat actually is; that semantic information sits with a different ‘narrow’ engine that only understands the concept of a cat.
There are already efforts underway to create multi-modal intelligence in the industry. We at Mindtree are also looking at implementations that can combine natural language with visual cognition. The goal is to expand the narrow scope of an AI solution, and to enable transferability so that we can prove understanding – in the example above, that would mean an algorithm eventually being able to recognise a picture of a cat and understand what that actually means – much like how humans think.
We believe that in 2020, the focus of AI would shift towards multi-modal intelligence – such an achievement would open the doors to many more uses for AI/ML in future.
Humans’ trust in AI/ML solutions increases
While 2019 has seen increased adoption of AI/ML across the industry, there have also been quite a few ‘unintended consequences’ – incidents leading to an overall ‘trust crisis’ with decisions put forward by algorithms. Algorithms trained on data captured over the past few years naturally reflect biases inherent in the data – but when evaluated through a more evolved set of values, are obviously found lacking. Making AI/ML solutions interpretable has hence been an area of interest: if we can understand or interpret the steps an algorithm took to arrive at a decision, we can judge the limitations of the algorithm itself, or identify the gaps in the data that the algorithm was trained on.
In 2020 we will see two things help to address these limitations. Firstly, we will see increased regulatory support to ensure AI/ML follow certain principles. Secondly, solutions will be built to give an outside-in view of black box algorithms, helping humans better understand black box algorithms and thus alleviating the current trust issues.
As customer experience grows in importance, organizations will need to re-imagine their IT departments in order to unlock their digital capabilities and empower business-wide innovation, according to MuleSoft, provider of the leading platform for building application networks.
As such, MuleSoft has outlined the key trends that will shape these new priorities in 2020:
IT emerges as business enabler with reusable building blocks
IT efficiency is crucial to the success of digital transformation initiatives, and there is increased pressure on IT departments to deliver more, faster. However, IT can no longer keep up with the demands of the business; a little over a third (36 percent) of IT professionals were actually able to deliver all projects asked of them last year.
In order to reduce this growing IT delivery gap, we’ll see IT move away from trying to deliver all IT projects themselves in 2020. The IT team’s new role will center around building and operating reusable assets with APIs, which the rest of the business can use to create the solutions they need. Essentially, IT begins to create new building blocks that can empower both technical and broader line of business users with reusable APIs. With API-led connectivity and organizations educating teams on the power of integration, IT will empower companies to digitally transform and innovate faster than ever before.
Unlocked data will supercharge AI
Businesses are investing more in AI each year, as they look to use the technology to personalize customer experiences, reduce human bias and automate tasks. Yet for most organizations AI hasn’t yet reached its full potential, as data is locked up in siloed systems and applications.
In 2020, we’ll see organizations unlock their data using APIs, enabling them to uncover greater insights and deliver more business value. If AI is the ‘brain,’ APIs and integration are the ‘nervous system’ that help AI really create value in a complex, real-time context.
APIs and containers will navigate multi-cloud complexity
The majority of large enterprises today use multiple clouds (both public and private). But multiple clouds are difficult to manage and being able to move workloads between them remains a challenge for many organizations.
In 2020, we will see organizations use APIs in tandem with containers to navigate multi-cloud complexity. APIs will unlock the data and unique functionalities of applications residing in multiple cloud environments, while containers will neatly package up code and all its dependencies, so the application runs quickly and reliably from one computing environment to another. For example, HSBC has built a multi-cloud application network to meet growing customer demand. Turning to the cloud to accelerate IT delivery, HSBC has built and published thousands of APIs that were deployed across multiple environments using containers to unlock legacy systems and power cloud-native application development.
The continued rise of digital ecosystems
In 2020, we will see the continued rise of digital ecosystems, enabling companies to seamlessly incorporate new products and services into coherent customer experiences. This will see a further shift away from today’s economic model, in which businesses look to ‘own’ customer engagements entirely. In the new model, ushering in a new Coherence Economy, providers will coordinate their services across a shared ecosystem, without ever ‘owning’ the customer.
Organizations will look to extend their own capabilities and their own customer data to other businesses via APIs. For example, Mastercard has turned many of its core services into a platform of APIs, allowing it to create the Mastercard Travel Recommender, which allows travel agents and transportation providers to access customer spending patterns and to offer customers targeted recommendations for restaurants, attractions and activities. We expect to see more organizations take this approach in 2020.
“We’ve reached the point where data and digital transformation are well established priorities for every organization” said Uri Sarid, CTO, MuleSoft. “In 2020, organizations will take these priorities to the next level and focus on the connectivity that drives them forward. By offering their digital assets as API building blocks, organizations can empower every stakeholder in the business to contribute to creating coherent customer experiences, while simultaneously turning the IT department from a cost center into a source of value.”
A look at some of the most significant data analytics trends that will transform how businesses use data.
By Adam Kinney, Head of Machine Learning and Automated Insights at user behaviour analytics platform, Mixpanel and former Head of Advanced Analytics for Twitter.
Data became the mantra for success among innovative businesses a long time ago, and next year this trend will continue to accelerate and evolve, bringing more innovation and more sophisticated approaches to data analytics than ever before. Here are some of these trends:
Augmented Analysis will drive adoption of new approaches to analytics such as causal inference
Augmented analytics uses machine learning and AI to understand complex patterns across data sets and user behaviours, answering questions more quickly and accurately. One key development in this area will be causal inference: using advanced statistical methods to isolate the most likely causes of particular user behaviour. For instance, people who frequently write product reviews buy more online than people who do not. However, this correlation may be driven by other factors, such as review writers being more loyal to the brand, in which case encouraging brand loyalty may do more to increase sales and revenue than encouraging people to write more reviews. Traditionally, the only way to isolate these causal relationships has been to run a controlled A/B experiment, which is both costly and time-consuming. If businesses could narrow ten possible causes down to the three most likely without running an A/B experiment, a lot would be gained: it would simplify decision-making, help product teams prioritise, and allow companies to allocate their data analysis resources more effectively.
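The reviews-and-loyalty example can be illustrated with a sketch on synthetic data: a naive comparison overstates the effect of review-writing, while stratifying by the confounder (loyalty) shrinks the gap towards zero. All numbers here are invented for illustration; real causal inference uses far more careful methods.

```python
import random
random.seed(0)

# Synthetic users: loyalty drives BOTH review-writing and purchases,
# so the raw reviews -> purchases correlation is confounded.
users = []
for _ in range(10_000):
    loyal = random.random() < 0.3
    writes_reviews = random.random() < (0.6 if loyal else 0.1)
    purchases = random.gauss(8 if loyal else 3, 1)
    users.append((loyal, writes_reviews, purchases))

def avg(xs):
    return sum(xs) / len(xs)

# Naive comparison: what a raw correlation suggests.
naive_gap = (avg([p for _, w, p in users if w])
             - avg([p for _, w, p in users if not w]))

# Stratify by loyalty: compare reviewers vs non-reviewers
# within each loyalty group, then average the within-group gaps.
gaps = []
for group in (True, False):
    rev = [p for l, w, p in users if l == group and w]
    non = [p for l, w, p in users if l == group and not w]
    gaps.append(avg(rev) - avg(non))
adjusted_gap = avg(gaps)
```

On this data the naive gap is large while the adjusted gap is near zero, i.e. reviews predict purchases but do not cause them, which is exactly the distinction causal inference tries to make without running an A/B test.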
There will be a stronger focus on localised data strategies
Geographic relevance to data privacy is becoming increasingly important. While large, multi-national privacy regulations such as GDPR or major laws like the California Consumer Privacy Act (CCPA) make headlines, there are many smaller, regional laws and customs that are often overlooked. For instance, apart from the European Union’s GDPR regulatory framework on data privacy, each of the 28 member states has its own data privacy rules. Similarly, in the US a growing number of states are considering introducing data privacy rules that all have unique regulatory features.
This explosion in regional privacy laws has left many companies wondering how to navigate these differences. In 2020 we’ll see a growing number of local and regional data privacy regulations across the world, which is likely to force global businesses to adopt localized data privacy strategies involving regional data residency programs. In this scenario, personal information will be stored within a specific geography where that data is processed in accordance with the local laws, customs, and expectations.
The rise of intelligence augmentation
Today everyone talks about AI, and while we are a long way from true artificial intelligence, machine learning in analytics can already help people make smarter decisions. For example, machine learning algorithms can monitor all of your business metrics and automatically alert you when they change in an interesting way that you’d want to see. Or, if you notice yourself that a critical metric has dropped but don’t know why, machine learning algorithms can sift through the thousands of possible causes to identify the culprit for you.
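The ‘sift through possible causes’ idea can be sketched as a simple contribution analysis: break the dropped metric down by segment and rank segments by their share of the drop. Real systems search thousands of segment combinations with more robust statistics; this toy uses four hypothetical (country, platform) segments.

```python
# Hypothetical daily signups broken down by (country, platform) segment,
# before and after an observed drop in the overall metric.
before = {("UK", "web"): 500, ("UK", "ios"): 300,
          ("US", "web"): 800, ("US", "ios"): 400}
after  = {("UK", "web"): 490, ("UK", "ios"): 295,
          ("US", "web"): 310, ("US", "ios"): 395}

total_drop = sum(before.values()) - sum(after.values())

# Attribute the overall drop to segments by their share of the change.
contributions = sorted(
    ((seg, before[seg] - after[seg]) for seg in before),
    key=lambda kv: kv[1], reverse=True)

top_segment, top_drop = contributions[0]
share = top_drop / total_drop  # fraction of the drop this segment explains
```

Here one segment accounts for almost the entire drop, so an analyst (or an automated alert) can go straight to investigating that slice instead of eyeballing every dashboard.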
If we look even further, we can identify a few other key areas where AI can augment human intelligence. In 2020 and beyond, AI will become more widely used for visual recognition and natural language processing, which is the ability to understand human language. One of the most immediate applications is an area called sentiment analysis, in which the AI can judge how someone is feeling by analysing their speech. Companies can apply this type of analytics to customer service to spot when a customer is getting angry and prevent issues that can damage the relationship with the brand.
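As a toy illustration of sentiment analysis applied to customer service, a lexicon-based scorer can flag a conversation whose running sentiment falls below a threshold. Production systems use trained models rather than word lists; the lexicon, threshold and transcripts here are invented.

```python
# Toy lexicon; real systems use trained models, not word lists.
POSITIVE = {"great", "love", "thanks", "helpful"}
NEGATIVE = {"angry", "terrible", "refund", "unacceptable", "waiting"}

def sentiment_score(utterance):
    # Strip trailing punctuation so "refund!" still matches the lexicon.
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def needs_escalation(transcript, threshold=-2):
    # Escalate when the running sentiment of the call falls below the threshold.
    return sum(sentiment_score(u) for u in transcript) <= threshold

calls = ["I have been waiting an hour.",
         "This is unacceptable, I want a refund!"]
```

The design point survives even in this toy: the score is computed per utterance, so an agent can be alerted mid-conversation rather than after the customer has already hung up.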
AI will help drive more sophisticated predictions
Then there are predictions. Predictive analytics tools look at historical data and use machine learning to identify patterns in the data that have a high probability of predicting an outcome. These patterns are used to create a model, and the model is used to predict outcomes from new data as it becomes available.
In 2020 predictions like this will become more powerful as companies are able to combine data sources. For instance, they’ll be able to take their user behaviour data, combine it with analysis of customer calls and support interactions, and use that to make more accurate predictions about customer behaviour. Going a step further, AI and ML will be increasingly used to personalise products and services and tailor the user experience to the specific needs of the customer.
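Combining sources can be sketched as blending product analytics with support logs into a single churn-risk score. The customer names, data, weights and cut-offs below are illustrative assumptions, not a fitted model.

```python
# Hypothetical customer records from two sources: product analytics
# (sessions last month) and support logs (open complaints).
usage = {"alice": 25, "bob": 2, "carol": 12}
complaints = {"alice": 0, "bob": 3, "carol": 1}

def churn_risk(name):
    """Toy score in [0, 1]: low usage and many complaints raise the risk.
    Weights are illustrative, not learned from data."""
    low_usage = max(0, 10 - usage[name]) / 10   # 1.0 = barely uses the product
    unhappy = min(complaints[name], 3) / 3      # 1.0 = three or more complaints
    return 0.6 * low_usage + 0.4 * unhappy

# Rank customers by combined risk, most at-risk first.
at_risk = sorted(usage, key=churn_risk, reverse=True)
```

Neither source alone tells the whole story: usage data misses the customer who is active but furious, and support logs miss the one who quietly stopped logging in. The combined score is what a trained model would learn to produce.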
The growth of connected devices will drive stronger demand for IoT analytics
The IoT industry is expected to reach $1.29 trillion in 2020, and there are already more IoT devices than there are smartphones. This means more complex data sets and greater volumes of data than ever before. This new environment demands tools and skill sets that most companies don’t yet have, and without IoT analytics they often struggle to adapt.
In the consumer space, companies will use IoT analytics to mine data with the permission of their users. This includes data from mobile apps, fitness trackers, mobile devices, vehicles, and appliances. The same way that companies can track user behaviour online, IoT analytics offers insight into how consumers use products in the physical world. This will enable consumer companies to provide increasingly personalised services and give users a better experience.
In the B2B space, IoT analytics will help drive greater productivity, reduce errors, improve understanding of smart city infrastructure and drive efficiency improvements.
As our world is becoming increasingly connected, data will become ubiquitous and more sophisticated. Understanding how this will impact businesses will be key for making the most of this opportunity and creating better user experiences.
Despite cybersecurity investment increasing by 12 to 15 per cent each year, according to Gartner, organisations are struggling to keep up with the sophistication of threats. Attackers are successfully pivoting away from complex technical exploits and instead identifying simpler ways to exploit a business’s core functionality: the business’s logic.
Business logic attacks are on the rise and pose a significant security threat to organisations across all sectors in 2020. These attacks don’t target what many consider to be traditional security vulnerabilities, but instead use automated bots to exploit weaknesses in the normal, everyday use of a website or app.
The recent Just Eat and Deliveroo hacks are good examples. Each food delivery service relies on a great customer experience and zero friction; to provide customers with this level of convenience, features such as one-click ordering and the storing of card details are incorporated into the service’s core functionality. However, bot technology simplifies the practice of fraudulently taking over an account, enabling hackers to commit fraud via the account or sell on the verified username and password for a profit. In such instances, organisations often remain unaware that any untoward behaviour has taken place and so are unable to stop it in its tracks.
Taking preventative steps is vital. Businesses must focus on identifying bot intent, by not only asking “is this a bot?” but also “what is this bot doing?”. Once they have gained this visibility and understanding of their web-facing traffic, they can both stop the attack and mitigate risks.
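Asking “what is this bot doing?” can be sketched as a simple intent heuristic over request logs: a session hammering the login endpoint with failures looks like credential stuffing, not a hungry customer. The endpoints, thresholds and log format are assumptions for illustration; real bot management uses far richer behavioural signals.

```python
# Hypothetical request logs per session: (endpoint, HTTP status code).
sessions = {
    "s1": [("/menu", 200), ("/order", 200), ("/login", 200)],
    "s2": [("/login", 401)] * 40 + [("/login", 200)],
}

def classify_intent(requests):
    """Toy heuristic: many failed logins dominating one session
    suggests credential stuffing rather than a normal customer."""
    failed_logins = sum(1 for path, code in requests
                        if path == "/login" and code in (401, 403))
    total = len(requests)
    if failed_logins >= 10 and failed_logins / total > 0.5:
        return "credential-stuffing"
    return "benign"

labels = {sid: classify_intent(reqs) for sid, reqs in sessions.items()}
```

Note that both sessions might come from automated clients; the label is driven by what the traffic is doing, which is the visibility the text argues businesses need before they can block an attack without adding friction for real users.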
Streaming services will be the next target
The popularity of Netflix, Amazon Prime and Hulu makes streaming services a prime target for account takeover hackers. Over the past 12 months, we’ve seen a rise in sales of compromised streaming service accounts. With the launch of new services such as AppleTV+ and Disney+, we don’t see this slowing down anytime soon.
These new platforms represent high-value services that are virgin territory for thousands of previously breached credentials. In fact, just hours after the Disney+ launch in the US, Canada and the Netherlands, accounts and combo lists were put up for sale on hacking forums.
Like food delivery companies, streaming services are facing intense competition. That means the products and services they roll out must be as frictionless as possible, including their login systems. Therein lies the problem. These login systems naturally drive consumers to use simple, short passwords and the same password for various platforms, to make it easier to login on TVs and smart devices every time they want to watch a series or movie. On the other hand, this also makes it easier for criminals to access customer accounts using automated bots, like the recent Just Eat and Deliveroo hacks demonstrated.
The fact that users connect to their accounts when on holiday or at friends’ homes only adds to the issue. Streaming and delivery services find it harder to recognise when an account has genuinely been compromised and to put the necessary security processes in place.
As we head into 2020, streaming services need to improve their understanding of what bots are doing and not just how they are doing it. By doing so, they can start to manage good bots and rapidly mitigate malicious bot attacks, without adding friction to the customer journey.
Watch out for free VPNs
Earlier this year, the Motion Picture Association of America (MPAA) reported that at 613.1 million, streaming subscriptions had now surpassed cable subscriptions worldwide; the list of subscription services will continue to grow in 2020 with some big names announcing plans to launch new services.
And with all growth comes new challenges. As the number of streaming services increases, more consumers are going to use VPNs to watch movies or TV shows when visiting a country, and most will likely use those that are free. However, some of these “free” VPN services aren’t completely free; there is always a price to pay, whether monetary or otherwise. Free VPN providers often require users to forgo speed, bandwidth or security, and sometimes all three.
The terms of many free VPNs used by residential users grant the provider the right to make the user’s connection available to automated bot traffic that wants to issue web requests from genuine residential addresses. These ‘residential proxy’ networks are essentially legitimate commercial botnets.
But because there is no infrastructure associated with running them, they are not detected by standard IP address blacklists, and because they use real consumer devices, device-based fingerprinting sees them as real users. Cybercriminals are taking advantage of residential proxies to perform illegal or unauthorised activities from users’ machines. It is therefore important to research a VPN provider diligently, and to read the terms and conditions thoroughly, before installing any VPN software; especially as more streaming services launch in 2020.
5G marks the start of a technology revolution. By making high-speed connectivity ubiquitous, it will accelerate technology adoption across the country and transform how we work, play and innovate.
By Jennifer Major, Head of IoT, SAS UK & Ireland.
However, 5G will also create unprecedented complexity for network operators and service providers. 5G networks aren’t just expected to deliver high-quality sound and pictures – they’ll be the carriers of 8K video streaming, machine communications and a whole host of applications never seen before. 5G networks will produce masses of data that operators will need to analyse, interpret and act on to deliver the one-to-one services consumers demand.
To deliver a truly resilient and customer-centric network, telco operators have no choice but to automate. The latest AI and machine learning technologies can monitor, operate and optimise the 5G networks of the future, delivering an unlimited connectivity service that’s stable, cost-effective and competitive.
A big data problem
The anticipation surrounding 5G is enormous. By connecting everyone to everything, and across every sector, the technology is widely expected to kick off a new industrial and technological revolution.
Yet the advent of 5G brings a massive capacity and operational challenge. As more customers and organisations migrate, 5G networks are expected to grow to cover 65 per cent of the world’s population and carry 35 per cent of all its mobile data by 2024. This data will be generated in unprecedented volumes, and telecommunications operators will need to invest constantly in their infrastructure to support the capacity increases required.
With the number of devices connected and communicating within the Internet of Things (IoT), and the high-speed, high-bandwidth possibilities of 5G, traditional data collection and analysis is no longer sufficient. For the sake of planning, running and optimising customer experiences, operators will have a limitless need for more intelligent decision-making. Decisions also need to be taken across domains and silos in a split second to meet customer expectations and honour service agreements.
Delivering on this promise would be impossible if we could only depend on human staff and operators to generate insight and make decisions. Fortunately, AI and machine learning have a vital role to play here. When you merge AI and IoT, you get the Artificial Intelligence of Things, or AIoT – a revolutionary combination that can transform industries, elevate customer experiences and accelerate business performance exponentially.
By 2022, Gartner predicts that more than 80 per cent of enterprise IoT projects will include an AI component, up from a mere 10 per cent today.
The best laid plans…
Telcos should seek to automate processes continuously during 5G rollout, with each step iterating on the last. By doing this, operators can sidestep major investments and overhauls by performing focused adjustments and expansions that both improve network performance and enhance the user experience.
However, before you maintain or enhance a 5G network, you first need to build it. Even at the planning stage there is enormous potential to make use of AIoT.
In today’s fast-moving digital world, telcos are under tremendous pressure to deliver speed and scalability. To compete, they must be able to move resources, deliver connectivity and construct new capacity rapidly. However, it isn’t easy to respond with agility when supplying connectivity to a new site or location often requires the building of a new radio tower or data centre and significant civil engineering.
Network planning is often time and resource-intensive because the work is so data-heavy. Before a decision can be made, operators need to carefully review and analyse an assortment of population tables and network traffic. The objective is to have the cell site built and ready before customer demands start flooding in, but it’s challenging when the planning process is so complex and time-consuming.
AIoT solutions are invaluable here because they can do much of the heavy lifting. AIoT and analytics can process and produce insights from an immense amount and variety of data faster than any human can. With this information, operators can truly predict demand – rather than simply respond to it – ensuring they can build a cell site in the most lucrative location before the competition moves in.
The customer comes first
AIoT really comes into its own, however, in network operations. 5G networks are micro in nature, made up of thousands of disparate and often siloed cell sites and data centres, each carrying and processing immense quantities of data. Tracking and monitoring all of these sites, and ensuring they are working properly and running efficiently, won’t be feasible without some form of automation.
You also need to consider the huge amount of data that needs to be analysed. It’s no longer feasible to move all of that data to a data centre or the cloud before you can detect network anomalies. This means you need the ability to deploy AI out to the edges of the network, analysing the data in-stream as it’s created. By detecting patterns of behaviour that provide an early warning of likely issues, actions can then be taken to fix problems before they disrupt the network.
AIoT can predict network faults based on historical data and real-time, continuous analysis. It uses many different prediction models to work out the probability that a set threshold will be breached, alerting operators to the threat before it can snowball into a crisis.
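As a rough illustration of the threshold-breach prediction described above, here is a minimal sketch in Python. The metric, window size and threshold are invented for illustration; real operators use far richer prediction models than a linear extrapolation.

```python
from collections import deque

class ThresholdPredictor:
    """Toy fault predictor: extrapolates the recent trend of a metric
    and warns if a set threshold looks likely to be breached soon."""

    def __init__(self, threshold, window=6):
        self.threshold = threshold
        self.window = deque(maxlen=window)

    def observe(self, value):
        self.window.append(value)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history to see a trend yet
        # Linear extrapolation one window-length ahead.
        n = len(self.window)
        slope = (self.window[-1] - self.window[0]) / (n - 1)
        projected = self.window[-1] + slope * n
        return projected >= self.threshold

# Hypothetical cell-site link utilisation (%) creeping upwards.
alert = ThresholdPredictor(threshold=90, window=6)
warnings = [alert.observe(u) for u in [40, 45, 52, 58, 65, 71, 78]]
# The later readings trigger an early warning before 90% is reached.
```

The point of such a check is that it runs cheaply in-stream at the network edge, raising an alert while there is still time to act.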
The process is even more rigorous when machine learning is deployed for anomaly detection. Once a capacity issue is discovered, AIoT can search for the root cause and find the answer fast. From a business perspective, this enables automated monitoring and helps deliver on customer priorities like KPIs and SLAs.
Operators are still some time away from ‘closed loop automation’, where AIoT systems are able to automatically resolve network issues and provision resources. But the building blocks of technology to do this already exist. The challenge is capturing all the data that skilled workers use to make decisions, so that the decision-making process can be automated. For the time being, however, operators can benefit greatly from their workers acting in concert with AI solutions. They can work symbiotically, with the AI performing the analysis and providing the insight, while the human operators use this intelligence to solve challenges. Not only does this reduce human error, it frees the experts to focus on more tactical and valuable tasks.
Ultimately, 5G brings challenges but also great opportunities to the telco market. It will introduce an unprecedented level of complexity and disruption, which can be mitigated by adding a layer of automation to the process. AI-driven automation is an opportunity for operators to meet their business goals and deliver resilient, on-demand services optimised to meet the needs of the customer.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 7.
Roger Magoulas, VP of Radar at O’Reilly takes a look at the new developments in automation, hardware, tools, model development, and more that will shape (or accelerate) AI in 2020.
1. Signs point toward an acceleration of AI adoption
We see the AI space poised for an acceleration in adoption, driven by more sophisticated AI models being put in production, specialised hardware that increases AI’s capacity to provide quicker results based on larger datasets, simplified tools that democratise access to the entire AI stack, small tools that enable AI on nearly any device, and cloud access to AI tools that allows access to AI resources from anywhere.
Integrating data from many sources, complex business and logic challenges, and competitive incentives to make data more useful all combine to elevate AI and automation technologies from optional to required. And AI processes have unique capabilities that can address an increasingly diverse array of automation tasks—tasks that defy what traditional procedural logic and programming can handle, for example, image recognition, summarisation, labeling, complex monitoring, and response.
In fact, in our 2019 surveys over half of the respondents say AI (deep learning, specifically) will be part of their future projects and products—and a majority of companies are starting to adopt machine learning.
2. The line between data and AI is blurring
Access to the amount of data necessary for AI, proven use cases for both consumer and enterprise AI, and more-accessible tools for building applications have grown dramatically, spurring new AI projects and pilots.
To stay competitive, data scientists need to at least dabble in machine and deep learning. At the same time, current AI systems rely on data-hungry models, so AI experts will require high-quality data and a secure and efficient data pipeline. As these disciplines merge, data professionals will need a basic understanding of AI, and AI experts will need a foundation in solid data practices, and, likely, a more formal commitment to data governance.
3. New (and simpler) tools, infrastructures, and hardware are being developed
We’re in a highly empirical era for machine learning. Tools for machine learning development need to account for the growing importance of data, experimentation, model search, model deployment, and monitoring. At the same time, managing the various stages of AI development is getting easier with the growing ecosystem of open source frameworks and libraries, cloud platforms, proprietary software tools, and SaaS.
4. New models and methods are emerging
While deep learning continues to drive a lot of interesting research, most end-to-end solutions are hybrid systems. In 2020 we’ll hear more about the essential role of other components and methods—including Bayesian and other model-based methods, tree search, evolution, knowledge graphs, simulation platforms, and others. We also expect to see new use cases for reinforcement learning emerge. And we just might begin to see exciting developments in machine learning methods that aren’t based on neural networks.
5. New developments enable new applications
Developments in computer vision and speech/voice (“eyes and ears”) technology help drive the creation of new products and services that can make personalised, custom-sized clothing, drive autonomous harvesting robots, or provide the logic for proficient chatbots. Work on robotics (“arms and legs”) and autonomous vehicles is compelling and closer to market.
There’s also a new wave of startups targeting “traditional data” with new AI and automation technologies. This includes text (new NLP and NLU solutions; chatbots), time series and temporal data, transactional data, and logs.
And both traditional enterprise software vendors and startups are rushing to build AI applications that target specific industries or domains. This is in line with findings in a recent McKinsey survey: enterprises are using AI in areas where they’ve already invested in basic analytics.
6. Handling fairness—working from the premise that all data has built-in biases
Taking a cue from the software quality assurance world, those working on AI models need to assume their data has built-in or systemic bias and other issues related to fairness—much like the assumption that bugs exist in software—and that formal processes are needed to detect, correct, and address those issues.
Detecting bias and ensuring fairness doesn’t come easily and is most effective when subject to review and validation from a diverse set of perspectives. That means building intentional diversity into the processes used to detect unfairness and bias—cognitive diversity, socioeconomic diversity, cultural diversity, physical diversity—to help improve the process and mitigate the risk of missing something critical.
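One simple, formal check of the kind the text calls for is the demographic parity gap: compare the rate at which a model selects members of each group. A minimal sketch follows; the group labels and decisions are invented for illustration, and real fairness audits use several complementary metrics.

```python
def selection_rates(outcomes):
    """outcomes: mapping of group label -> list of binary decisions (1 = selected)."""
    return {g: sum(d) / len(d) for g, d in outcomes.items()}

def parity_gap(outcomes):
    """Largest difference in selection rate between any two groups.
    A gap near 0 suggests demographic parity; a large gap flags possible bias."""
    rates = selection_rates(outcomes).values()
    return max(rates) - min(rates)

# Hypothetical model decisions, split by group.
decisions = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 75% selected
    "group_b": [1, 0, 0, 0, 1, 0, 0, 0],  # 25% selected
}
gap = parity_gap(decisions)  # 0.5 -> worth investigating
```

A check like this slots naturally into the QA-style formal process the article describes: run it on every model release and fail the build when the gap exceeds an agreed tolerance.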
7. Machine deception continues to be a serious challenge
Deepfakes have tells that automated detection systems can look for: unnatural blinking patterns, inconsistent lighting, facial distortion, inconsistencies between mouth movements and speech, and the lack of small but distinct individual facial movements (how Donald Trump purses his lips before answering a question, for example).
But deepfakes are getting better. Automated detection methods will have to be developed as fast as new forms of machine deception are launched, and automated detection may not be enough: deepfake creators can use the detection models themselves to stay ahead of the detectors. Within a couple of months of the release of an algorithm that spots unnatural blinking patterns, for example, the next generation of deepfake generators had incorporated blinking into their systems.
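To make the blink-pattern tell concrete, here is a toy version of such a check in Python. The rate bounds are rough assumptions (people typically blink somewhere in the range of 8–30 times per minute); a production detector models the full timing distribution rather than a simple rate.

```python
def blink_rate(blink_times, clip_seconds):
    """Blinks per minute, given blink timestamps (seconds) in a clip."""
    return len(blink_times) * 60 / clip_seconds

def unnatural_blinking(blink_times, clip_seconds, lo=8, hi=30):
    """Flag clips whose blink rate falls outside a typical human range.
    The lo/hi bounds are illustrative assumptions, not calibrated values."""
    rate = blink_rate(blink_times, clip_seconds)
    return rate < lo or rate > hi

# A 60-second clip with only two detected blinks is suspicious.
suspicious = unnatural_blinking([12.4, 47.9], 60)
```

As the article notes, a single tell like this is quickly defeated once generators learn to imitate it, which is why detectors combine many signals.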
Programs that automatically watermark and identify images when they are taken or altered, or blockchain technology used to verify content from trusted sources, could be a partial fix. But as deepfakes improve, trust in digital content diminishes. Regulation may be enacted, but the path to effective regulation that doesn’t interfere with innovation is far from clear.
8. To fully take advantage of AI technologies, you’ll need to retrain your entire organisation
As AI tools become easier to use, AI use cases proliferate, and AI projects are deployed, cross-functional teams are being pulled into AI projects. Data literacy will be required from employees outside traditional data teams—in fact, Gartner expects that 80% of organisations will start to roll out internal data literacy initiatives to upskill their workforce by 2020.
But training is an ongoing endeavor, and to succeed in implementing AI and ML, companies need to take a more holistic approach toward retraining their entire workforces. This may be the most difficult, but most rewarding, process for many organisations to undertake. The opportunity for teams to plug into a broader community on a regular basis to see a wide cross-section of successful AI implementations and solutions is also critical.
Retraining also means rethinking diversity. Reinforcing and expanding on how important diversity is to detecting fairness and bias issues, diversity becomes even more critical for organisations looking to successfully implement truly useful AI models and related technologies. As we expect most AI projects to augment human tasks, incorporating the human element in a broad, inclusive manner becomes a key factor for widespread acceptance and success.
The first wave of companies to go passwordless
“Next year, we will see the first wave of companies go passwordless, embracing more effective ways of securing digital identities. Passwords have failed us as an authentication method for too long and enterprises will move beyond the reliance on this ineffective method.
“According to Verizon’s Data Breach Investigations Report in 2019, 80% of hacking-related breaches were as a result of weak, stolen or reused passwords. Companies will adopt alternative methods to determine access, such as biometrics, IP addresses and geolocation. As these factors are increasingly considered as trusted, more organisations will grant access without the need to enter a password.”
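A toy sketch of the risk-based, passwordless decision described here might look like the following. The field names, weights and threshold are invented for illustration and are not any vendor’s API; real systems weigh many more signals, including biometrics.

```python
def risk_score(request, known_ips, usual_countries):
    """Toy risk-based authentication score: each unfamiliar signal adds risk."""
    score = 0
    if request["ip"] not in known_ips:
        score += 2  # unseen IP address
    if request["country"] not in usual_countries:
        score += 3  # unusual geolocation
    if not request.get("device_recognised", False):
        score += 2  # unrecognised device
    return score

def requires_password(request, known_ips, usual_countries, threshold=3):
    """Grant passwordless access only when the combined risk stays low."""
    return risk_score(request, known_ips, usual_countries) >= threshold

# A familiar device from the usual country skips the password prompt,
# even from a new IP address.
login = {"ip": "203.0.113.9", "country": "GB", "device_recognised": True}
stepped_up = requires_password(login, {"198.51.100.4"}, {"GB"})
```

The design choice is that no single factor decides the outcome; access is stepped up only when several signals disagree with the user’s established pattern.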
Best-of-breed security challenges to be overcome
“Next year, the number of businesses adopting ‘best-of-breed’ applications will continue to increase exponentially. We will see an explosion of applications as the workforce now has a much bigger say in tech purchasing decisions. This bottom-up approach enables teams to go out and pick the tool that they feel will enable them to do their best work.
“But, this new approach to workplace tools has inevitably led to new challenges, the greatest being security. Due to the increasing volume of cloud-based applications, the attack surface is far greater, making companies more vulnerable than ever. With 40 percent of large UK businesses expecting to be cloud-only by 2021, according to McAfee, 2020 will see companies forced to adapt and strike a balance between autonomy and security. To improve security, all businesses will need to embrace the zero-trust paradigm. The old perimeter-centric, binary model of security has failed and firms need to embrace this new approach that encompasses every connection that their workforce and wider stakeholders make.”
Murali Gopalakrishna, Head of Product Management, Autonomous Machines, NVIDIA, comments:
“In the year ahead, we can expect to see a number of critical advancements in robotics, enabled by the proliferation of AI. Traditionally, robots have been good at doing one single thing. Moving forward, we’ll see robotics advancing from this single-functionality approach to being more dynamic and configurable. Using deep learning and AI, we’ll be able to train a robot overnight so it can do more things and multi-task, while being just as fast and efficient. We’ll also see UGVs/AGVs (unmanned and autonomous ground vehicles), which are usually confined to rigid tracks and pre-set paths, now given the flexibility to move safely around factory floors or logistics environments. This also includes co-bots – which are typically caged or placed in isolated environments given the danger or safety risks – now working safely in shared spaces with humans, or in close proximity to each other. As demand grows for food and goods delivery, we’ll see delivery vehicles and drones becoming increasingly important to sustain consumer demand. Finally, safe, efficient and scalable simulation tools will continue to have a vital role in robotics development. Leveraging our expertise in graphics and gaming, our simulation technology can help companies train robots to properly handle diverse and rare conditions occurring in the workplace, and be built at greater scale and reduced cost. 2020 will be a defining year for showcasing how NVIDIA is bringing the power of modern AI, deep learning and inference to embedded systems at the edge.”
1. Authentication Means Everything: With the recent and continued failings of companies to secure customer access, 2020 will likely see the rise of large-scale multi-factor authentication adoption by enterprises and consumers. Companies who are truly looking to protect their customers and most importantly their revenues, will embrace higher forms of authentication to achieve those ends.
2. America Gets Serious: The U.S. is one of the few developed countries without a national data privacy standard. Not only is it hurting America economically and commercially, it has started to raise concerns relative to national security and the protection of American citizens. Congress will likely finally step up to address this gap and ideally take the lead by protecting more than just data, but the digital identities of all Americans.
3. Privacy and Security Become Competitive Advantages: In 2019 we saw the very beginnings of commercial enterprises promoting their privacy and security practices to their customers as a competitive advantage. In 2020, this trend will accelerate as companies begin to adjust to the new reality. A reality where more than 60% of customers hold companies responsible for protecting their data.
4. Rise of the Digital Identity: Digital identities were a thing of fiction just a few years ago. In 2020, commercial and government interests will begin to intersect as state and federal governments, as well as various sectors of business (i.e. financial services, social media, healthcare), rush to build “digital identity standards”. Some standards will be built with the needs of the consumer/citizen in mind, but time will tell if others attempt to capitalize on and commercialize the value of these digital identities.
5. Consumers and Citizens' Patience Runs Out: In 2019 consumers and citizens began to voice their concerns over companies' repeated data breaches and security failures that have exposed their data, finances, families and services to greater and greater risks. In 2020, we’ll see true and substantial consequences for organizations that do not keep their customers, employees, partners and citizens safe in the digital world. With over 80% of consumers reporting they would stop engaging with a brand online following a data breach, it's clear that people are ready to walk away from companies that can’t get identity and security right.
Prediction #1: The importance of Total Cost of Ownership (TCO): HPC storage solutions must deliver value far beyond the initial purchase price.
“As the requirements for HPC storage systems are becoming more diverse with the addition of new workloads such as Artificial Intelligence (AI), there is an increasing need to start looking at the overall impact on the organisation of the ongoing cost of operations, user productivity and the time to quality outcomes. In addition to evaluating the price/performance ratio, buyers will need to start paying close attention to a range of purchasing considerations that go beyond the initial investment. Those include the cost of unplanned downtime in terms of application user productivity, the cost of complexity and the headcount required to manage it, and the need for responsive support for mission-critical infrastructure such as your storage.”
Prediction #2: As Enterprise’s AI projects graduate from “exploratory” to “production” they will leave the public clouds for less costly on-premises solutions, funding a boom in HPC infrastructure build-out, but the requirements for that infrastructure will have changed based upon their cloud experience.
“Public clouds are great for learning and experimentation, but not for high-utilisation production operations. Public clouds will, however, have a large influence on the next generation of on-premises infrastructure that is built. The need for the lowest time-to-solution – quickly taking action based upon the insights that AI can give you – drives AI to push the underlying hardware (e.g. GPUs and storage) as hard as it can go. But the simple truth is that the cost of a dedicated resource in a public cloud is higher than the cost of owning that resource. Another simple truth is that the value of AI is the computer deriving information that you can act upon from mountains of data. Add in the fact that AI has an insatiable need for growth of training data, and that public clouds have never-ending charges for data storage, and the costs climb. Put those simple facts together and it’s clear that production AI will be less costly if it is performed on-premises. The industry has become used to the extreme flexibility and simplicity of management that public clouds provide, and it will want to retain those characteristics in its on-premises solutions at the lower cost they provide.”
The move to the cloud has been a major topic of conversation at board level over the past couple of years. In fact, 77% of enterprises have at least one application or a portion of their enterprise computing infrastructure in the cloud.
By Mike Gatty, Head of Secured Connectivity, Maintel.
This move to the cloud means that many businesses are now reassessing how they connect to their data and applications. Should they use ‘traditional’ wide-area network (WAN) methodologies or should they consider using SD-WAN?
Software-defined wide-area networks (SD-WAN) were seen as a gimmick by many when they first entered the public discourse, due to outlandish promises of reduced bandwidth costs. The truth is that SD-WAN can reduce these costs, but not always – leading to a level of distrust at board level.
Changing perception of SD-WAN
Over the past 12 months or so, the reputation of these types of networks has begun to change. The original myth of cost-savings has been dispelled and CIOs have started to recognise the true value of SD-WAN – agility, scalability, and dynamic network control.
Enterprise network architects are now demanding solutions that are better suited to meet the changing needs of their business and operational requirements. This means quicker deployment, better use of available bandwidth, dynamic congestion control, and application-aware routing. And this is why it’s time to take SD-WAN seriously.
SD-WAN as the future
There are several other things, on top of that widespread adoption of public cloud services, currently causing business decision makers to turn their heads away from the familiar concept of Multiprotocol Label Switching (MPLS) and look to the more agile SD-WAN as the future of their networks.
Another driver is the high cost of traditional bandwidth, especially private circuits, as high-quality Internet services become more prevalent.
Today’s average worker needs access to large amounts of bandwidth on a range of devices to meet enterprise operational requirements, whether downloading or streaming high volumes of content, accessing corporate data and information instantly, or collaborating via video calls. This high use of bandwidth will only increase in the future, with big data and IoT playing a crucial role as network architectures continue to evolve.
To meet these needs, SD-WAN is able to run over a variety of transmission infrastructures, including public Internet, private circuits, and LTE (3G/4G). This connectivity can be added seamlessly without the pain of expensive re-configurations of remote devices, and therefore will greatly reduce the cost of deployment.
In addition to all this, one of the most significant benefits SD-WAN has over traditional WAN models is the ability to dynamically route traffic across any piece of bandwidth a business owns, as the technology is transport-agnostic.
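The dynamic, application-aware routing described above can be sketched as a simple policy: choose the cheapest link that meets the application’s latency and loss requirements. The link names, metrics and SLA figures below are illustrative assumptions, not a real deployment.

```python
def pick_path(paths, app):
    """Application-aware path selection across transport-agnostic links:
    return the cheapest link meeting the app's latency and loss limits."""
    candidates = [p for p in paths
                  if p["latency_ms"] <= app["max_latency_ms"]
                  and p["loss_pct"] <= app["max_loss_pct"]]
    if not candidates:
        return None  # no compliant link; fall back or raise an alert
    return min(candidates, key=lambda p: p["cost"])

# Hypothetical links an SD-WAN edge device could route over.
links = [
    {"name": "mpls",     "latency_ms": 20, "loss_pct": 0.1, "cost": 10},
    {"name": "internet", "latency_ms": 35, "loss_pct": 0.5, "cost": 2},
    {"name": "lte",      "latency_ms": 60, "loss_pct": 1.5, "cost": 5},
]
voice = {"max_latency_ms": 40, "max_loss_pct": 1.0}
best = pick_path(links, voice)  # the internet link meets the voice SLA at lowest cost
```

Because the policy only looks at measured link metrics, it works the same over MPLS, public Internet or LTE – which is what transport-agnostic means in practice.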
Businesses preparing for change
Despite these clear benefits, there has not been widespread adoption of SD-WAN yet. Many companies have been tied down to multi-year service contracts, so have been unable to adopt new technology. Introducing the use of SD-WAN can also require significant investment, so many CFOs are keen to sweat the existing technology they use as much as possible.
However, over the next 12 to 18 months this looks certain to change.
SD-WAN gives enterprises the agility, scalability, and visibility from a network perspective to meet the structural change in application and data location, as well as meeting the impending requirements for big data and IoT applications.
There’s no doubt we’ll see significant market uptake for SD-WAN once legacy network obligations have been met and existing assets have been written off. In the meantime, businesses are in danger of falling behind the curve if they are not already looking at how SD-WAN could underpin their business application and data access strategy.
As the SD-WAN migration can be complex and costly, businesses should ideally look to partner with a company that has experience in helping companies utilise the benefits of the new SD-WAN model. The benefits of SD-WAN are plenty, but only if the migration is managed well and the technology leveraged to its full capacity. SD-WAN technology may not have reached its full potential yet, but it’s time for business leaders to take it seriously.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 8.
BYOD will increase the use of mobile platforms as an attack vector and drive zero trust security adoption
“The bring your own device (BYOD) trend being championed by the mobile and remote workforce is making mobile platforms a lucrative attack vector for fraudsters. It’s much easier to infect a mobile application and let it do your work than to attack a larger system – and the industry has witnessed an increase in malware attacks on internal networks vs. external networks as a result. This needs to be a strong focal point in every fraud prevention strategy for 2020.
As the BYOD trend continues to proliferate - and employees no longer have designated “work” or “personal” computers and are increasingly using mobile devices, and even smart watches, on company networks – organizations need to abandon the belief system that everything in the perimeter is secure. In 2020, organizations need to stop differentiating between remote and local users and instead apply the same security postures and compliance checks to all users. This will ensure it is the right user, using a clean device. Employing Zero Trust principles will help stop the dramatic increase in attacks on internal networks, while allowing companies to remain agile in digital transformation and future of work initiatives.” - Prakash Mana, VP of Product at Pulse Secure.
Security tools will continue to consolidate
“As more and more enterprise technologies have been introduced, security architects and managers have been forced to adopt and juggle an increasing number of security tools. A pattern has emerged where customers and their vendors explore new products whenever the organization adds a new use case, including cloud infrastructure or SaaS applications. For example, it’s not unusual for an organization to have a multitude of different gateways to protect different applications. The amount of effort and the level of complexity for a security admin to manage all these different proxies is high, and it exposes organizations to visibility and control gaps, as well as vulnerabilities due to a larger attack surface. In 2020, we will see increased enterprise demand for comprehensive security solutions to manage access across a company’s entire digital ecosystem – from mobile access, to cloud, to data centers and even IoT devices – on one unified platform.” - Sudhakar Ramakrishna, CEO at Pulse Secure.
Continued adoption of AI for cybersecurity threat response and risk mitigation
“65% of enterprise cybersecurity teams aren’t using automation to manage their environments. And, given the increasingly complex nature of the digital ecosystem, this will change in 2020 as AI and ML technologies for automated response to cybersecurity and risk mitigation continue to evolve. Furthering this trend is a better understanding of how AI and ML can be deployed in the fraud prevention space without creating false positives and while ensuring the solution implemented is doing what it is supposed to do. There is currently a knowledge gap among many security experts with how these solutions work, what devices they should be pulling ML information from and how they can do it in a secure fashion. Any organization implementing an automated, intelligent solution in 2020 needs to first secure its infrastructure to ensure it isn’t exposing the company to new vulnerabilities.
Can you imagine the chaos and cascading security and regulatory implications that would occur if an AI solution was compromised and used against an organization? Companies must proactively defend their AI automation from attackers. If an organization is relying on AI or ML technologies to make intelligent decisions for them, they must know definitively that no one has tampered with it and that the information is accurate.” - Mike Riemer, Chief Security Architect at Pulse Secure.
Regulatory requirements catch up to reduce IoT and IIoT device security exposure
“After years of haplessly watching technology race ahead of regulation, governments around the world have started to enact regulations to protect consumers and mitigate security risk. A big focus for 2020 will be the increase in regulatory requirements around IoT and IIOT devices as they proliferate in corporate networks and OT systems. When organizations do not know where a device is on their network, or who it is communicating with, that poses severe security risks. And, as more organizations adopt IoT and IIoT devices in the workforce, there need to be security policy and controls in place. In the United States, much of this regulatory reform has been spearheaded by the state of California, which recently passed SB-327, the first law to cover IoT devices. It will take effect January 1, 2020, and regulators around the world will certainly be watching to see how effective the legislation is at minimizing security risks from IoT devices. Since the regulatory laws often have a cascading effect, we can certainly expect to see similar bills appearing across the country and eventually at a federal level. Organizations will need to make sure they, or any third-party security vendors, are compliant to protect IoT devices and the information they contain.” - Mike Riemer, Chief Security Architect at Pulse Secure.
Consumers will expect security in cloud and SaaS
“Cloud is now the norm across a wide range of industries, as is SaaS. Traditionally, the response to adopting a new enterprise technology is to purchase new products to secure it. As the digital enterprise has become more complex, that model is simply not sustainable for stretched security personnel. Adding to this complexity are regulations like the GDPR, which introduced penalties for not adequately securing consumer data. In 2020, cloud and SaaS providers will work with security vendors to proactively secure cloud and hybrid environments, as well as secure SaaS products. As customers adopt Zero Trust policies, they will scrutinize any outside software for security flaws that could compromise their business, raising the industry standard for security in cloud and SaaS products.” - Mike Riemer, Chief Security Architect at Pulse Secure.
Zero Trust goes from “nice to have” to “must have”
“Zero Trust garnered a significant amount of attention in 2019 as companies at the cutting edge of enterprise technology began adopting it. In 2020, organizations at all levels of digitization will convert to Zero Trust frameworks as the threat landscape diversifies. As remote work and hybrid IT models become increasingly common, organizations across all industries will adopt Zero Trust in order to better manage user access and data. With hybrid IT adoption comes an additional challenge: transition. Securing legacy systems while ensuring data is moved securely requires a complete understanding of the ecosystem, as well as the ability to translate different policies across shifting systems. By vetting every user and device before allowing them access, Zero Trust minimizes threats while ensuring nothing is lost in transition.” - Sudhakar Ramakrishna, CEO at Pulse Secure.
Healthcare braces for a flood of cybercrime
“It is already well-understood that the healthcare industry struggles to secure its trove of sensitive data. But, as widely discussed as this issue is, the healthcare industry has been slow to adopt effective security measures and quick to embrace an even greater influx of data during digital transformation efforts. As healthcare continues to evolve towards the convenient, self-service model that today’s digital-first consumer demands, there will be serious security implications as companies try to control the release of data and information. For example, telemedicine is making patient care extremely convenient, but is the doctor-patient communication secured and encrypted? If not, anyone can intercept the data and communication in transit. How do you secure that information stored on the end-user’s phone? The security of any network is only as strong as the weakest link. In this service model, the end-point device is most likely to be compromised and healthcare organizations need to ensure they are meeting all the security and regulatory requirements.
Adding to the pressure is the looming threat of new patient data regulations, including a revamp to HIPAA that could require that health data be accessible to patients. To deal with regulatory scrutiny, the healthcare industry will have to rapidly modernize cybersecurity practices with a Zero Trust model that can adapt to the flood of new data sources but also secure cloud and hybrid environments. Should the patient access data requirement pass, providers will also need to manage an influx of new access points and users.” - Mike Riemer, Chief Security Architect at Pulse Secure.
From edge, third-party and cyber-physical to identity risk, RSA’s ‘20 Predictions for 2020’ highlight the rising complexity and growing challenge of managing digital risk
Digital risk management experts at RSA Security have today released their ‘20 Predictions for 2020’ eBook, detailing key cyber trends for the year ahead. With contributions from across the RSA team – including President, Rohit Ghai and CTO, Dr. Zulfikar Ramzan – the predictions offer a steer to companies on emerging threats and highlight that cybersecurity issues remain the number one digital risk for organisations undergoing digital transformation.
“In 2019, across all verticals, cyber-attack risk ranks as one of the top digital risk management priorities,” commented Rohit Ghai, President at RSA Security. “Don’t expect anything to change in the New Year. In fact, across both public and private sectors, organisations will continue to embrace digital transformation initiatives. On the risk register, cyber-attack risk will remain the leading business risk and inevitably, organisations will continue to struggle to gain visibility across a growing number of endpoints and a more dynamic workforce. Both will create gaps for potential exploitation.”
Some of the key trends that businesses need to be aware of include:
Cybersecurity becomes a matter of safety
“There will be a shift in mindset from cybersecurity to “cyber safety” in 2020. Global events like the Summer Olympics in Japan or World Expo in Dubai are blending physical infrastructure with connected systems to deliver better user experiences,” comments Alaa Abdulnabi, Regional Vice President of META. “However, these events underscore a new reality: cyber is much more than just a data security issue. It will become a component of physical security, too.”
Expect to see a cyber incident at the edge in 2020
“The continued proliferation of IoT devices will make edge computing an essential component of enterprise IT infrastructure in 2020,” comments Rohit Ghai, President. “To power these systems, 5G will become a bedrock for organisations looking to speed up their IT operations. With this innovation and speed will come greater digital risk. A security incident in the New Year will serve as the wake-up call for organisations leaning into edge computing. It will remind them that threat visibility is essential as their attack surface expands and the number of edge endpoints in their network multiplies.”
Breach accountability becomes murky with third parties
“A high-profile case where an organisation is breached due to an API integration will create confusion over who is responsible for paying the GDPR fine,” comments Angel Grant, Director of Digital Risk Solutions. “This will spark conversations about regulatory accountability in a growing third-party ecosystem.”
The rise of cyber-attacks in the crypto-sphere
“The security of cryptocurrencies rests on safeguarding users’ private keys, leaving the ‘keys to the kingdom’ accessible to anyone who fails to adequately protect them,” comments Dr. Zulfikar Ramzan, CTO. “Cybercriminals usually follow the money, so expect that cryptocurrencies will be at or near the top of attackers’ wish lists in 2020.”
The API house of cards will start to tumble
“Many organisations have stitched together a fragile network of legacy systems via API connections to help better serve customers and improve efficiency,” comments Steve Schlarman, Director & Portfolio Strategist. “A security incident in the New Year will disrupt the patchwork of connections and it will lead to major outages. The event will serve as a call-to-action for security and risk teams to evaluate how their IT teams are patching systems together.”
The identity crisis will worsen
“Businesses are coming to realise that mismanaged credentials and passwords are often the weakest link in a security chain and identity compromise continues to be at the root of most cyber incidents,” comments Rohit Ghai, President. “Next year, we will see identity risk management become front and centre in cyber security programs as organisations adopt more and more cloud solutions; as workforces become more dynamic with gig workers and remote employees and as the number of identities associated with things or autonomous actors continues to dwarf the number of human actors on the network.”
By Paul Speciale, Chief Product Officer, Scality
Object storage at the edge will be on Flash
Object storage will move into the edge, for applications that capture large data streams from a wide variety of mobile, IoT and other connected devices. This will include event streams and logs, sensor and device data, vehicle drive data, image and video media data and more, with high data rates and high concurrency from thousands or more simultaneous data streams. These applications will be developed for cloud native deployment, and will therefore naturally embrace RESTful object style storage protocols - making object storage on flash media an optimal choice on the edge to support this emerging class of data-centric applications.
Decentralisation and multi-cloud:
Data storage will become massively decentralised, as enterprises leverage a combination of on-premises and public cloud IT resources. This will create a need for a unified namespace and control plane to simplify data visibility and access. Moreover, corporations will use a variety of public clouds, each one selected to help solve specific business problems, thereby creating a multi-cloud data management problem. In addition, the emergence of edge computing will further drive decentralisation as corporations choose to deploy IT resources “near” the edge devices they manage. These trends all help to create a new and extreme “cloud data silos” scenario that can only be addressed by solutions that provide global data visibility across these distributed clouds and data centres.
Digital Transformation and Multi-Protocol:
Multi-Protocol systems will be embraced during digital transformation: Customers transforming from legacy applications to cloud native applications will continue to embrace RESTful protocols as their standard mechanism for accessing data storage services and systems. Systems that are multi-protocol (legacy protocols such as NFS and SMB for file access plus new RESTful APIs such as AWS S3 and Azure Blob Storage for object style access) will be adopted to help companies transition during this phase. Moreover, object storage services and systems will become a standard solution for stateful container storage.
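As a rough illustration of the multi-protocol idea, and without assuming anything about any particular product, the sketch below shows the same stored bytes reachable both through object-style (bucket/key) calls and through a file-style path. Real systems map S3 or Azure Blob requests and NFS/SMB paths onto one shared backend; the class and names here are hypothetical.

```python
# Toy sketch: one backend, two access styles (object-style and file-style).
class MultiProtocolStore:
    def __init__(self):
        self._backend = {}  # single store of record for both protocols

    # Object-style access (S3-like bucket/key semantics)
    def put_object(self, bucket: str, key: str, body: bytes) -> None:
        self._backend[f"{bucket}/{key}"] = body

    def get_object(self, bucket: str, key: str) -> bytes:
        return self._backend[f"{bucket}/{key}"]

    # File-style access (NFS/SMB-like path semantics)
    def read_file(self, path: str) -> bytes:
        return self._backend[path.lstrip("/")]


store = MultiProtocolStore()
# A cloud-native application writes via the object API...
store.put_object("reports", "2020/q1.csv", b"revenue,1000")
# ...while a legacy application reads the same data as a file.
assert store.read_file("/reports/2020/q1.csv") == b"revenue,1000"
```

The point of the pattern is that data written during the transition is never stranded behind a single protocol: old and new applications see one copy of the truth.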
Infrastructure application will be deployed on Kubernetes, including storage:
Kubernetes will be the default platform for infrastructure deployment in the data centre: as enterprises transform and adopt cloud-native applications, the need for a standard deployment and orchestration framework for containers will increase, just as it did during the Virtual Machine (VM) wave over the course of the last two decades. Kubernetes will be that standard orchestration platform, not only for applications deployed in containers, but also for infrastructure elements built as services and microservices. This will extend to data storage and data management infrastructure deployed on Kubernetes.
Monopolies and the cloud:
It’s already started, but in 2020, IT teams will make the move from “all-cloud” initiatives to hybrid- and multi-cloud data management solutions as they continue to recognise that to depend 100% on a single cloud provider is to empower a monopoly. Cloud providers have capitalised on lock-in, and their customers see it. And this is a key reason why 53% of enterprises that had moved everything to public cloud are already repatriating some of their data (IDC). Storing data in one cloud and on-premises (hybrid cloud infrastructure), or in multiple clouds (multi-cloud infrastructure), are both sensible, proven approaches to ensure organisations can remain in control and beat the monopoly.
Monopolies and AI:
AI will compete more strenuously against… AI, fuelling monopolistic practices and reducing competitive situations (a key early example of this includes the homogenisation of air travel pricing). To be ready for what the fourth (and fifth) industrial revolution brings, the division between what requires ‘humans’ and what does not will accelerate, so we will continue to see the divvying-up of those tasks and functions that require humans, and those that AI does well. As time goes on, humans will do what requires care, creativity and artisanship; and everything else will be automated. 2020 will see this division of ‘labour’ accelerate.
Hackers and Data Breaches:
New ways of identifying patients, customers, and depositors will be developed in 2020, as the already accelerating pace of hacking and data breaches continues. There’s huge value in stored data. Until they make these changes, hospitals and medical providers, for example, will remain strong targets due to the value of the data they store: not just patient health information, but also the patient identification that goes along with it (government ID, birth date, address, etc.).
Organisations will stop unnecessary "rip and replace" to reduce waste in 2020. When technology refresh cycles come around, many organisations are compelled by their vendors to take on full replacement of both hardware and software. This results, of course, in a large amount of technology waste that gets processed, or 'demanufactured' (using energy and human resources), for recycling and disposal. Servers, which can contain toxic chemicals like beryllium, cadmium, hexavalent chromium, lead, mercury, BFRs and more, should be used until they 'break', not just until a vendor wants to sell its customers a new round. It's time for that "rip and replace" culture to reform.
Storage is a great example of a place where that reform can happen. Software-defined storage, with ultra-strong data resiliency schemes, is a great way to take data servers to their true end-of-life, rather than replacing at refresh time. Adopting a robust software-defined storage solution that can scale infinitely using standard servers, and that is 'generation-agnostic' so it can accommodate the steady evolution of hardware over time, is a good way to reduce waste. What is ultra-strong data resiliency in storage? When storage is spread across a collection of storage servers, those nodes can share a highly parallel distributed logic that has no single point of failure: it doesn't depend on any single component. This kind of system is resilient, self-healing, adaptive, location aware and constantly renewing. In that kind of scenario, you can wait for hardware to fail before you replace it, because it won't affect data availability; server outages are not a problem. Even better, some resiliency models can survive the loss of a full datacentre, or a datacentre plus a server. Eventually, servers will fail. When that happens, their metal, plastics and glass can be recycled, and toxic components disposed of safely. Why accelerate and increase the processing, waste and energy use, when running systems until they truly must be replaced is a solid option?
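The parity idea behind such resiliency schemes can be illustrated with a toy XOR example; production systems use stronger erasure codes (e.g. Reed-Solomon) spread across many nodes and sites, but the principle is the same: a failed server's data can be rebuilt from the survivors.

```python
# Toy sketch of single-parity resiliency across storage nodes.
# Three equal-sized data chunks live on three nodes; their XOR parity
# lives on a fourth. Any one lost chunk is recoverable from the rest.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(chunks):
    parity = chunks[0]
    for c in chunks[1:]:
        parity = xor_bytes(parity, c)
    return parity

def reconstruct(surviving, parity):
    # XOR of the parity with every surviving chunk yields the lost chunk.
    missing = parity
    for c in surviving:
        missing = xor_bytes(missing, c)
    return missing

chunks = [b"node0data", b"node1data", b"node2data"]  # one chunk per node
parity = make_parity(chunks)                          # stored on a 4th node

lost = chunks[1]                                      # simulate a dead server
recovered = reconstruct([chunks[0], chunks[2]], parity)
assert recovered == lost  # availability survives the outage
```

With a single parity chunk the cluster tolerates one failure; real deployments raise the parity count (and geographic spread) to survive multiple servers, or an entire site, going dark.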
With Thanksgiving in the rear-view mirror, we find ourselves rapidly hurtling towards that annual unknown: the new year. 2019, much like 2018 before it, left businesses and customers awash in a tidal wave of data breaches. Just as prior years’ data breaches led to 2018’s General Data Protection Regulation (GDPR) furore, so the increased number and sophistication of data breaches led to 2019’s increase in regulatory oversight initiatives such as the California Consumer Privacy Act (CCPA) and the New York Stop Hacks and Improve Electronic Data Security (NY SHIELD) Act. As companies begin thinking about their primary 2020 cybersecurity activities, they need to proactively strategize.
With that in mind, here are SecurityScorecard’s top 6 cybersecurity predictions for 2020.
1. Forecasting cloudy days
Organisations seeking to retain their competitive edge will be accelerating their digital transformation strategies from “cloud first” to “cloud only” over the next few years. According to Gartner, the worldwide Infrastructure-as-a-Service (IaaS) public cloud market grew 31.3% in 2018 while the overarching cloud services industry grew 17.5%. More than a third of polled organisations listed cloud services as one of their top three technology investment priorities for 2019. Based on the data, Gartner estimates that the cloud services industry will nearly triple its size by 2022.
As companies migrate their mission critical data and applications to the cloud, we predict that malicious actors will focus more on open ports, Distributed-Denial-of-Service (DDoS), and web application attack methodologies. Securing the cloud will need to be a primary initiative for organisations throughout 2020 unless they want to be another news headline.
2. Bringing in The Terminator
As more organisations look to mitigate data breach risks and costs, artificial intelligence and machine learning might be one answer to the problem. According to IBM’s “2019 Cost of a Data Breach” report, organisations with fully deployed AI/ML security solutions spent an average of $2.65 million per breach, compared with the $5.16 million spent by organisations without automation.
As organisations face the stark reality that data breaches are now a “when” rather than an “if,” more will incorporate new big data analytics technologies to mature their cybersecurity programs. In combination with increased cloud migration, more companies will mature their cybersecurity programs using AI/ML for greater visibility and control over digital assets.
3. Malicious software phishing for critical infrastructure
Malicious nation-state actors will continue to focus on malware and ransomware attacks. Nation-state actors don’t just want to sell cardholder data on the Dark Web, they’re targeting critical infrastructure such as electricity and water companies.
In August of 2019, emails sent to U.S. utilities companies contained a remote access trojan as part of a spear phishing campaign. The advanced persistent threat is another in a long line of attacks targeting critical infrastructure.
With at least thirteen global presidential elections scheduled for 2020, we can expect to see more malware and ransomware attacks attempting to undermine voters’ confidence.
4. A flood of data privacy regulations
The cybersecurity Magic 8 Ball indicates that “all signs point to yes” when asking whether more regulations would come in 2020.
CCPA and NY SHIELD foreshadow 2020’s privacy and security trends. The United States Congress debated a federal privacy regulation in June 2019. Despite being derailed at the end of the year, businesses and congresspeople alike are pushing to create a single, cohesive federal law governing privacy and security.
The United States isn’t the only country looking to formalise and consolidate its privacy laws. The Saudi Arabian Monetary Authority (SAMA) cybersecurity framework in conjunction with the GDPR’s extraterritorial impact pressures other Middle Eastern countries to update their privacy regulations. For example, the Dubai International Financial Centre Authority (DIFCA) sent out a call for public commentary in June 2019.
5. More than quantity – also quality
If the GDPR and CCPA taught the cyber community one lesson in 2019, it would be that not all laws are created equally. While the GDPR and CCPA are testing just how far a “local” law can reach, India’s Personal Data Protection Bill and the failed New York Privacy Act test the standard of care companies need to provide.
Both of these regulations use the term “data fiduciary.” Traditionally used in terms of money, a fiduciary duty requires a company to act in someone else’s (often shareholders’) best interests. If regulations continue to use the term “data fiduciary,” organisations may be held to a higher standard of care than “negligence.” If regulations begin to adopt the term “data fiduciary” in 2020, we predict a cultural shift recognising information as a financially valuable asset.
6. Building a security dam for your supply stream
Judging by the increased regulatory and industry standard focus on governance, compliance requirements will continue to focus on protecting your organisation from third-party risks. As more organisations add Software-as-a-Service (SaaS) applications to their IT catalogue, they also share more data with third parties.
As new laws are enacted and enforced, companies will see more stringent vendor risk monitoring requirements and increasingly be held liable for losses caused by breaches arising from their supply stream. Continuous monitoring of your third-party risk may be one of the few ways to mitigate the financial impact of those breaches.
Enterprise technology continues to be at the forefront of innovation, driving technologies such as artificial intelligence (AI) and software-as-a-service (SaaS), all powered by the cloud. This has influenced IT decision making, spending and governance across all categories of technology – pushing IT budgets to increase in line with innovation, employee demand and business demand. Research by Spiceworks has revealed that 44% of businesses plan to grow IT budgets in 2020, compared to 38% in 2019. The research also found the biggest drivers for IT budget increase to be the need to upgrade outdated IT infrastructure, followed by escalating security concerns, and employee growth.
So, what is in store for enterprise technology in 2020, and how will this influence the way businesses use such technology?
Line of business driven IT spending: Traditionally, IT focused on managing the capacity and availability of technology investments purchased by the business and serving as the official gatekeeper for the organisation. But there has been a market shift, with individual business units driving technology buying decisions and the technology itself shifting towards cloud-based consumption models. As a result, IT now needs to help organisations better understand and more intelligently consume technology. That speaks to cost, productivity, risk mitigation, and otherwise using technology to its greatest advantage. In the year ahead, we’ll see more organisations focus on consuming technology in a more intelligent way.
Software usage vs. compliance: Cloud offerings enable users to buy and consume what they want, when they want it without centralised oversight. This creates new challenges for IT as they attempt to track and manage all the technology in their environment. Many organisations have tolerated the unmanaged use of cloud solutions because they believe that the biggest spend is still under control. Yet, businesses don’t really know if all that SaaS and cloud spend is creating unmanaged compliance and data risks. Businesses will continue to struggle to manage any associated compliance and governance risks. Without proper visibility and insight, these challenges are only going to grow and become more difficult over time.
Cloud overspend vs. ROI: There is a growing disconnect between an organisation’s understanding of usage and spend on cloud services, and how vendors are charging for those services. Azure and AWS are now tracked and billed by the hour or even the second, yet many businesses are still trying to analyse usage data on a monthly or even yearly basis. That creates a significant challenge for organisations trying to understand, manage and optimise spend. This dissonance will only increase as new enterprise innovations, like serverless technology, take hold in the years ahead.
Serverless computing: Today’s cloud infrastructure is relatively easy to understand compared to what it will look like in 2020 and beyond. For example, when provisioning a cloud instance today, a user only needs a basic idea of how it operates and what it will cost. But when new cloud approaches like serverless become more popular, cloud usage will be managed by the people writing code. In serverless computing, the code drives the cost to deliver the service, and businesses are not yet prepared to deal with these new consumption models. In the next year, companies need to prioritise understanding consumption models because those models will have a significant impact on their business.
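The cost dynamic described above, where the code itself drives what the service costs to run, can be sketched with a simple model. The rates below are illustrative placeholders rather than any provider's published pricing, but serverless platforms do typically bill on memory-seconds consumed plus a per-request charge:

```python
# Hedged sketch of a serverless billing model: cost is a function of how
# the code behaves (invocations, duration, memory), not of provisioned servers.
def serverless_cost(invocations, avg_duration_ms, memory_mb,
                    price_per_gb_second=0.0000166667,   # illustrative rate
                    price_per_million_requests=0.20):   # illustrative rate
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute = gb_seconds * price_per_gb_second
    requests = (invocations / 1_000_000) * price_per_million_requests
    return compute + requests

# 10 million invocations at 120 ms and 512 MB: compute dominates the bill.
monthly = serverless_cost(10_000_000, 120, 512)
print(f"${monthly:.2f}")  # → $12.00
```

Note the implication for the teams involved: shaving 20 ms off a function, or halving its memory footprint, changes the bill directly, which is why consumption analysis has to move from monthly reviews to per-deployment scrutiny.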
Cloud and decentralised IT will continue to have ripple effects on how organisations enable their businesses throughout the year. The key will be for enterprises to embrace the ongoing shift to empower their workers, create a more strategic IT team and identify opportunities where technology can drive value for the overall business.
We often hear that data is the fuel of modern business, but we think that food provides an even better analogy. When we go to fill our car up at the pumps, very few of us prefer a particular brand – we just want a full tank. But when it comes to what we eat, it’s not enough to have a full belly; we need the right sort of food that is both nourishing and tastes good.
By Nick Goode, EVP Product, Sage.
It’s the same with data. Filling up on information doesn’t necessarily make a business better; in fact, the wrong sort of data can have a highly damaging effect on the health of the whole organisation. That’s because – in the era of the connected business – the effects of bad data aren’t confined to the system in which it resides. Instead, it ripples out to a range of other applications and processes that rely on that information.
Businesses may not realise it, but bad data is a serious and costly issue. In 2016, IBM estimated that poor quality data costs over $3 trillion in the US alone. (By comparison, the size of the entire big data industry in the same year, according to IDC, was a ‘paltry’ $136 billion.)
This can only ever be an estimate, though, because it’s difficult to put a price tag on the missed opportunities, reputational damage and lost revenue that comes from having the wrong data – not to mention the time and effort knowledge workers spend searching for and correcting errors in it.
Other researchers provide further evidence for the devastating impact of bad data. Gartner found that the average cost to organisations is $15 million a year, while a report from the Royal Mail suggested that it causes a loss of six per cent of annual turnover. Why are businesses failing to address an issue with such a direct impact on their bottom line – especially given today’s fixation on data-powered insight?
The domino effect of bad data
You would expect that the figures listed above would provide plenty of food for thought, especially as every line of business, from marketing to finance, customer service to supply chain, is now so completely dependent on accurate data on which to base their insights. Yet in our pursuit of data quantity we seem to have forgotten one of the oldest tenets of the information age: ‘Garbage In, Garbage Out’.
Too often, businesses lack a coherent data integration strategy which means that inaccurate or incomplete data causes a domino effect through the organisation.
Nothing highlights the interconnected nature of modern business better than the issue of bad data. If a department does a bad job of keeping data clean, up-to-date, and accurate, it affects every other department that relies on that data. This means that the effects are not limited to those who are responsible for managing records and updating systems; instead, they spread throughout the organisation. This results in all manner of problems: from badly-targeted marketing campaigns to poor customer service outcomes, to errors in payroll, resource allocation and product development.
Another grave consequence of inaccurate data is that it can lead to people mistrusting the insights that they gain, and even resenting the data creators who have allowed erroneous information to creep into their systems.
A recipe for success
For all the hype around data-driven insights, businesses are facing a data credibility problem, with insights and performance metrics badly skewed by inaccurate information. So, while no-one discounts the importance of having large data sets from which to draw insight, the more urgent challenge facing organisations is to improve the quality and accuracy of the information that they hold.
Just as the food we eat has a direct effect on our wellbeing, so the quality of its information has a bearing on the health of a business. That’s why businesses need to treat data as a delicacy, rather than just fuel. By focusing on data quality, they can then ensure a positive domino effect throughout the organisation, with departments and workers able to trust the insight they derive from it.
To do this, every organisation must undertake a regular data quality audit that not only verifies the accuracy of information that is kept, but also examines the internal processes and workflows associated with gathering and storing information.
For example, the organisation needs to have complete confidence that employees are capturing all relevant information in systems such as ERP systems, and that all data is entered accurately and kept up to date. This should include cross-referencing with information held in other systems such as CRM, ensuring that the business can have faith in the data on which it bases its most important decisions.
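The cross-referencing step described above can be sketched in a few lines. Everything here is hypothetical, the system names, field names and records are purely illustrative, but it captures the shape of an audit check: compare the records two systems hold for the same customer and flag gaps and mismatches for follow-up.

```python
# Minimal sketch of a data quality audit step: cross-referencing customer
# records between an ERP and a CRM (all records are invented examples).
erp = {
    "C001": {"name": "Acme Ltd", "postcode": "SW1A 1AA"},
    "C002": {"name": "Globex",   "postcode": "M1 2AB"},
}
crm = {
    "C001": {"name": "Acme Ltd", "postcode": "SW1A 1AA"},
    "C002": {"name": "Globex",   "postcode": "M1 2AZ"},  # conflicting value
    "C003": {"name": "Initech",  "postcode": "B1 1BB"},  # absent from ERP
}

issues = []
for cid, record in crm.items():
    if cid not in erp:
        issues.append((cid, "missing from ERP"))
        continue
    for field, value in record.items():
        if erp[cid].get(field) != value:
            issues.append((cid, f"mismatch in {field}"))

print(issues)  # each flagged issue feeds the audit report for correction
```

Run regularly, a check like this turns the audit from a one-off clean-up into an ongoing process, surfacing bad records before they ripple into marketing, payroll or reporting.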
The recipe for success is simple: be as discriminating with your data as you are with the food you put in your mouth – prioritise data quality to ensure you get accurate insights.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 9.
From Nicole Alvino, Co-Founder and Chief Strategy Officer at SocialChorus:
The battle against “fake news” in the workplace will come to a head
In 2019, we saw the proliferation of “fake news” spill out of the political sphere and make its way into organisations. Companies are now fighting on new frontiers we’ve never seen before - from the fight against deepfakes, to “digital water coolers” running rampant among employees, to the spread of social and collaboration platforms that make it easy for anyone to spread disinformation. And all the while, employee engagement is only becoming more critical, as organisations must effectively engage and retain employees to win the war for talent. In 2020, these forces will come to a head for business leaders as they look to combat the infiltration of “fake news” and deliver a unified message to employees. Organisations will risk losing trust and transparency with their workforce or turn to new strategies, such as creating a single source of truth for company communications, establishing “truth ambassadors” as trusted sources and building mechanisms for transparency and feedback.
Expect increased investment in employee engagement to keep up with digital workplace challenges.
Amid the digital workplace shift, current technologies will be exchanged for new ones and new solutions adopted internally, as organisations race to create a more connected, engaged and productive workforce. The employee engagement space will be one to watch as investment continues in collaboration and “productivity” tools like Slack, which are often touted as an answer to this issue of employee engagement. But with so many devices and employee preferences in play, the type of peer-to-peer communication offered by these tools doesn’t deliver the alignment that companies are really after. Companies will need to focus on a multi-channel strategy and delivering information, from benefits updates to compliance training to company news, from a unified platform to reach all employees. These types of employee communications platforms will allow managers and supervisors to communicate quickly and seamlessly with team members whether they are behind a desk, in the field or on a factory floor.
Ethical leadership will make or break the bottom line
Throughout 2019, a myriad of factors have forced companies to recognise the importance of ethical leadership. From employee protests and walkouts to GDPR and the data privacy troubles of companies like Facebook, ethics has become the crux of both employee satisfaction and business success as employees demand more out of their employers. Especially with forecasts predicting a potential economic slowdown, in 2020 we will see the C-suite grasp ethical practices as a competitive advantage, revamping and restructuring corporate social responsibility programmes and efforts to demonstrate their commitment. Ethical leadership will no longer be an option, but an imperative that directly impacts the bottom line, pushing companies to build ethics into policies and practices, place a renewed focus on company culture and seek ways to measure the impact of their efforts.
IT will become more user experience-focused and drive the employee experience. IT can no longer be all about point solutions and ensuring governance, compliance and ticket velocity; it must connect to broader business objectives, as the need to recruit and retain top talent becomes more imperative. As the workforce continues to evolve and organisations shift toward the digital workplace, IT will increasingly focus on employee adoption, usage and the end-user experience, delivering technology and strategies that meet employee demands and rising expectations. That means everything from more automated processes to mobile-first platforms so employees can work faster, smarter and better, wherever they may be. In the coming year, we will see more technology-focused initiatives aimed at supporting a culture of transparency and collaboration and driving organisational alignment, all of which are central to improving the employee experience.
Employee engagement strategies will centre on the multi-generational workforce
We’ve all heard the talk of millennials and Gen Z taking over the workplace, and organisations can no longer ignore this seismic demographic shift when it comes to the employee experience. Businesses today are facing an employee engagement crisis, grappling with more factors and distractions among employees than ever before - from decreased attention spans to the proliferation of chat tools, social platforms and consumer-like technologies that have changed how employees consume information. With so many varied preferences, behaviours and devices across generations, organisations will need to adopt a multi-generational, multi-channel engagement strategy in order to win employees’ mindshare in 2020. This type of approach allows for flexibility, targeting and personalisation so businesses can deliver the right message to the right demographic - and retain workers before they go elsewhere.
IT hiring initiatives will become more “soft skills”-focused and personality-based. To succeed in digital transformation initiatives, the workforce must be flexible, creative and motivated to drive change. While IT has traditionally focused on solving problems in the quickest and most cost-effective way, this hasn’t left much room for creative problem solving, in turn stunting team collaboration and company growth. In fact, according to a Gallup poll, only 2 in 10 employees agree their performance is managed in a way that motivates them to do outstanding work, and this is costing businesses between $960 billion and $1.2 trillion a year. In the coming year, IT leaders will increasingly look for employees who can see the bigger picture and work with a sense of purpose on top of knowing the tricks of the trade. Creative employees who can take a larger business problem and present a technical solution, or who can come up with “what if” scenarios to develop new solutions, will help teams become more collaborative and goal-oriented and improve the employee experience through technology.
Insights from employee communications will deliver organisational intelligence
Communications and HR teams will adopt a data-driven approach to employee engagement and communications, one that focuses on micro-moments and behaviour instead of relying on annual or even quarterly surveys. They will implement quantitative methods that correlate effectiveness of communications with business performance – from reduction of safety incidents to delivering business transformation to sales. This will allow leaders to get a real-time pulse on their organisation that will be invaluable to predict and enable high performers, as well as predict and intervene on retention issues.
By David Feller, VP Product Management and Solutions Engineering at Spectra Logic
Watch out for PCIe v4 and the death of Fibre Channel in the data centre in 2020
We will see dramatic physical changes in the data centre, as 2020 will begin the changeover from Fibre Channel to IP connectivity. At the same time, 2020 will see a revolutionary acceleration of the data centre with PCIe v4 and NVMe over fabric connectivity. These new technologies can easily be retrofitted into current equipment and existing data centres, resulting in great cost savings.
Encryption of individual items for data privacy and security in 2020
Data privacy and security are no longer optional, yet organisations are struggling to rigorously secure their data while also ensuring compliance with privacy regulations such as GDPR. Typically, data is encrypted in bulk, but this makes it difficult to respond to requests to have an individual's data removed. The advent of new technologies will enable the encryption of individual items, each with its own key, which will transform the ability of organisations to respond to 'right to be forgotten' demands.
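The per-item encryption pattern described above is sometimes called 'crypto-shredding': each individual's records are encrypted under their own key, so deleting that one key makes their data irrecoverable without touching the bulk store. The sketch below illustrates the idea with a toy XOR keystream and invented names; a real system would use a vetted cipher (such as AES-GCM) and a proper key-management service.

```python
import hashlib
import os

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a key (SHA-256 in counter mode).
    Illustrative only -- use a vetted AEAD cipher in production."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))

decrypt = encrypt  # XOR against the same keystream reverses itself

# One key per individual: deleting a key renders only that person's data unreadable.
keys = {"alice": os.urandom(32), "bob": os.urandom(32)}
records = {name: encrypt(keys[name], f"{name}'s personal data".encode())
           for name in keys}

print(decrypt(keys["alice"], records["alice"]))  # readable while the key exists

del keys["bob"]  # 'right to be forgotten': destroy only Bob's key
# records["bob"] still sits in bulk storage, but is now irrecoverable ciphertext
```

The appeal for compliance is that the ciphertext itself never has to be hunted down across backups and archives; destroying the key is sufficient.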
2020 will see broad adoption of a two-tier storage architecture
The traditional four-tiered storage pyramid, comprising flash (SAS then SATA), disk and tape, has run its course. Instead, 2020 will see wider adoption of a more streamlined two-tier storage architecture that defines data by its usage rather than the underlying storage medium. Traditionally the two-tier model has been used by high performance computing, but recent software innovations are making it readily available for all market segments, such as media and entertainment, oil and gas, or even smaller businesses.
Many say that over 80% of the world's data is stored on the wrong tier of storage, costing businesses millions of euros every year. The new two-tier model enables organisations to move the majority of their data, which is inactive, off the expensive Primary Tier of storage (made up of flash, NVMe and other solid-state technologies and high-performance disk) to a more economical tier, called the Perpetual Tier. Users can then keep multiple copies of data on multiple storage mediums, including cloud, NAS, object storage disk and tape. The Perpetual Tier can also be used for secondary storage, distribution, backup, archive and disaster recovery. In fact, the Perpetual Tier can be configured to be as responsive as customers’ workflows demand – enabling users to create responsive copies on NAS and disaster recovery copies on cloud and/or tape.
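Since the two-tier model classifies data by usage rather than medium, the core placement decision reduces to a simple policy over access recency. The sketch below shows one way such a policy might look; the 90-day threshold and the dataset names are illustrative assumptions, not part of any vendor's product.

```python
from datetime import datetime, timedelta

# Hypothetical policy: data untouched for 90 days is considered inactive.
INACTIVE_AFTER = timedelta(days=90)

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Place active data on the Primary Tier (flash/NVMe/fast disk) and
    inactive data on the Perpetual Tier (cloud, NAS, object storage, tape)."""
    return "primary" if now - last_access < INACTIVE_AFTER else "perpetual"

now = datetime(2020, 1, 1)
datasets = {
    "trading-db":   datetime(2019, 12, 30),  # touched two days ago
    "2017-archive": datetime(2018, 3, 14),   # untouched for years
}
placement = {name: choose_tier(ts, now) for name, ts in datasets.items()}
print(placement)  # {'trading-db': 'primary', '2017-archive': 'perpetual'}
```

In practice the policy engine would also weigh file size, project metadata and the number of Perpetual Tier copies required, but the usage-driven decision is the heart of the model.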
Tape no longer resembles its ancestors
Tape's story goes back to the very first computers of the 1950s and 1960s, and the technology has been declared dead for the last three decades. However, over that time, tape has advanced in both capacity (from 1 to 2 MB per tape to a whopping 50 terabytes*) and in performance, from 10KB/s to 360 MB/s. In the last five years, vendors have continued to develop innovative tape drives, libraries and software to support the technology. Perhaps what is most interesting is that the everyday usage of tape has evolved beyond secondary storage for archive and backup, with customers starting to retrieve as much as they save. For example, most of the large television networks (typically data-hungry organisations) are using tape as a reliable landing place from which they regularly retrieve data in exceedingly high volumes. This more dynamic application of tape is enabled by various factors, including rapidly increasing performance and capacity of tape along with decreasing costs; innovations in tape robotics; developments in object storage; and innovative software that enables simple and fast data movement and retrieval. 2020 is the year we will see continued recognition that tape has evolved as the most economical and flexible medium for data storage, rather than as just a backup medium.
* Assuming a 2.5:1 compression ratio.
Some thoughts from Jonathan Rowan, Business Development Director at SSE Enterprise Telecoms:
Darkness bringing light
Dark Fibre has been a hot topic in telecommunications circles for a while now. The potential for it to boost connectivity through vast untapped capacity – and to aid with the ongoing rollout of 5G – means we can expect even more activity in 2020, particularly with Ofcom’s announcement on Physical Infrastructure Access (PIA), which will open the door to many new providers.
Still, issues remain in its path. So it’s important that network providers in 2020 persevere with the mission to offer Dark Fibre (and Dark-Fibre like) services, partnering with other industry experts where necessary to ensure viable solutions are available.
Fibre in the shires
There have been innovative advances made in urban connectivity, and in overcoming network challenges in cities (our fibre in the sewers project being just one of them).
Yet, we’re a way off from the nationwide connectivity boost that’s going to be fundamental to the UK’s economic development. The fact remains that many rural areas struggle with basic connectivity needs. Never mind the super-fast challenges that trouble the metropolitan connectivity elite (to twist a popular 2019 term).
The fibre-to-the-home moves made this year have been great, and the number of homes with ‘full fibre’ has doubled since last year. However, despite significant growth, this still only equates to 10% of UK premises.
Clearly more needs to happen in 2020. Not least because solving regional connectivity has huge advantages for all sectors. The future of modern businesses will see more people operating remotely, away from the trappings of a conventional office environment. They’ll need fast, reliable access to data and tools, just like their colleagues at (what’s left of their) HQ. Better connectivity can also be a huge piece of the puzzle in regional regeneration work.
Outsmarted by our devices
Cities, cars, fridges. You name it, it’s got a smart future. And the 2020s will be the decade we will really notice things take a massive leap forward.
For one, we’ll see smart grids become part of everyday life, as energy firms look to move into a new world of highly efficient and robust provision to customers. We’re well-versed in this, working with Synaptec on a technical partnership to provide powerline condition monitoring services. This will give energy providers real-time data on a number of key variables, ensuring their power networks are at their most effective and efficient – a key step in becoming truly smart.
And smart won’t stop there. The 2020s will see big developments, with autonomous vehicles and connected devices growing in popularity, and people seeing real change from early in the decade.
In terms of what this looks like smart city-wise, we’ll see the continued rollout of IoT sensors to collect the data that will help urban planners make more informed decisions and enable self-driving cars to navigate congested city centres. Meanwhile, emergency services will benefit from more reliable data, supplied in real time, to assist people in their moment of greatest need; for example, by pre-empting potential issues in policing, or ensuring the right level of response in urgent care.
Public behaviour will evolve, too. Richer, technology-led retail experiences will change the high street, while consumer-led sectors like banking will continue to do more with tech. Real-time banking is a great example of this – an intensely data-dependent service, that more and more people are already coming to expect as standard.
The uniting factor here is capacity and pressure on the network. Some sources predict that there will be more than 75 billion IoT-connected devices by 2025 (up from an estimated 30 billion in 2020). A sensor-rich, data-greedy, IoT- and AI-focused world demands speed, efficiency and reliability.
And it’s infrastructure that has to support this, providing the high capacity, robust and secure connectivity needs of an exciting, but complex, digital future.
Comment on biggest IT concerns for 2020 from Chris Hodson, CISO at Tanium:
As 2019 comes to a close, we’ve seen a steady increase in the number and modes of cyberattacks. In fact, more than half of all British companies reported cyberattacks in the last year alone. Going into 2020, Tanium looked into the biggest concerns for IT decision makers within organisations in the UK. This revealed that the biggest concern for the coming year (25%) was a lack of visibility over the increasing number of IT endpoints, such as laptops, servers, virtual machines, containers and cloud infrastructure, which leaves organisations unaware of, and unable to protect, all of their systems. The next biggest area of concern for respondents was the rising sophistication of attackers (23%), followed by employees clicking malicious links (18%) and the complexity of managing physical, virtual, cloud and container infrastructure (15%).
What this all serves to underline is the fact that successful cyberattacks usually occur when businesses don’t get the foundational security concepts right. When an organisation doesn’t have visibility and, by extension, control of the potential entry points across its IT environment, it is inherently vulnerable to attack. To best equip organisations for the threats to come in 2020, CISOs must ensure that they are taking several important steps to build a comprehensive IT security strategy so that they can protect critical assets, monitor impact, and recover from any unexpected attacks or disruption.
In order to have an effective IT security strategy in place, an organisation must have two lines of defence: employee advocacy and a comprehensive IT security structure. Crucial to combatting any type of threat – whether a sophisticated attack, an employee clicking on a malicious link, or one that exploits an out-of-date piece of software – is clear visibility of all of the endpoints across the network and the ability to stop disruption almost instantly.
By Craig Smith, Vice President of IoT & Analytics, at Tech Data
The end of the year is a time for the channel to take stock – although perhaps not in a literal sense given the shift in the market to SaaS. For many, thoughts are preoccupied with meeting sales targets and tying up loose ends before the festive season begins. However, the end of the year is also a good time to reflect on the year just gone and the preparations needed for the year ahead.
There have been a lot of changes over the course of the year: IT investment, distribution and consumption structures have continued to evolve as we move away from linear channels supplying products from point to point.
It is crucial that partners use their time at the end of the year to identify the areas where they can add higher value to end customers through their products and services.
Critical to creating this higher value is identifying how to create business solutions with next generation technologies such as hybrid cloud, IoT and analytics, machine learning, AI and cyber security.
Plugging the network
2020 will see 5G mature as European rollout gathers momentum. Businesses are incredibly interested in potential commercial use cases and this in turn creates opportunities for the channel. It is not the only new networking technology on the market, however. Having been launched in 2019, Wi-Fi 6 will also become more commonplace in the coming year. Those partners that can bridge the gap between the two to deliver high speed connectivity for businesses whilst helping to manage costs and optimise investment in networking should have a successful 2020.
Analytics drives efficiencies
The next 12 months will be a tipping point for analytics: those that invest in it will reap the benefits, whilst those that fail to grasp the opportunity will find themselves falling behind. Businesses from all verticals have realised the benefits of real-time insight, which is why we’ve seen market consolidation of analytics capabilities, particularly Salesforce’s acquisition of Tableau. Analytics creates a great opportunity for the channel, which can not only help customers navigate the complex business of choosing the right solution but also help them manage all the associated data and keep it secure.
AI, AI, Captain!
Robotic Process Automation (RPA) has been around for a while but interest in it has grown thanks to businesses looking for technology that will achieve operating efficiencies. There will be considerable growth in the RPA market as businesses look to use the technology to augment their workforce, but that also means someone needs to look after the RPA and businesses don’t always have those skills.
Of chief importance is security. No business wants to score a cyber security own goal by deploying RPA in an unsecure manner and leaving themselves open to being hacked, so the channel has a crucial role to play here.
Everything as a Service
The shift to as-a-Service has been taking place for a number of years, but in 2020 it will continue to gain momentum in the channel, even in the most hardware-driven industries. With the development of onsite, off-site, cloud and hybrid, on-premises-as-a-Service will become as commonplace as the various Software-as-a-Service offerings.
The new year and the new decade bring a continuation of the sweeping digital transformation that is changing the way in which businesses and society operate. AI, the Internet of Things (IoT) and biometrics are becoming increasingly integrated into our data-driven world. Though rapid technological change invites business opportunities and societal good, vulnerabilities can outpace security measures, with savvy cybercriminals all too willing to exploit such windows as they appear.
But what are the specific new threats that are likely to emerge this year and throughout the coming decade?
By Andrew Hollister, EMEA Director of LogRhythm Labs.
An insider will manipulate AI to frame an innocent individual
It’s an unfortunate fact that, because people train artificial intelligence (AI), AI adopts the same human biases we thought it might ignore. Despite this being the case, the legal system has been happy to employ the technology to try to secure prosecutions. This was seen in a judge ordering Amazon to divulge Echo recordings in a US double murder trial. Unless guarded against, this will allow nefarious insiders to feed AI false information to convict an innocent party – the criminal justice system may well have to wrestle with such circumstances in future.
The increasing spread of biometrics will bring unforeseen consequences
In another example of technology advancing beyond the pace of regulation, unfortunate members of the public stand to be victimised through their biometric data. Such intrinsically personal information may be stolen and used time and time again for fraud. Unlike stolen data such as credit card details, it is not possible to change the compromised information, outside of changing one’s face. Unfortunately, the industry might only standardise regulation once the dangers of biometric data have been made fully apparent.
Ransomware attacks will target critical business infrastructure
Ransomware has already become an oft-utilised weapon in the arsenal of the cybercriminal. Indeed, researchers found that the average pay-out by victims increased in 2019 to $41,000. Given the success of this tactic, attackers will now train their sights upon critical business infrastructure. The health sector has already been hit across the Western world and power grids may now become the next lucrative target for ambitious cybercriminals. Such attacks may have worrying societal implications, eroding trust in a government’s ability to protect its citizens.
Iran’s offensive cyber operations will grow at a faster rate than China’s
Information surrounding Iran’s cyber operation capabilities is especially relevant given the extreme geopolitical tension between Iran and the U.S. that the new year has already heralded. Iran is now due to overtake China – long seen as the West’s biggest adversary in this sphere – and the current diplomatic climate points to the Islamic Republic exploring every avenue to bloody the American nose. There is far less in the way of a deterrent to stay Iran’s hand – unlike China, the theocracy has no diplomatic relations with the U.S. and the crippling sanctions already imposed upon it mean that it cares little for trade ramifications. Combine these factors, and the pace of current events, and we are due to see far more cyber activity from this bellicose state.
Quantum computing will reach more widespread use, including towards malicious ends
Google’s ‘Sycamore’ project is leading the charge towards quantum computing and, although such projects are still far from advanced quantum computing, further progress is sure to be made in 2020. Such developments will change the way in which we perform cryptography, and this technology will eventually enter into widespread use. Already, Microsoft has announced a new Azure Quantum service, through which select customers will be able to run quantum code and make use of quantum hardware. These advancements are expected to lead to developments in modern AI, and we can expect more efficient AI data analysis and a better decision-making process. With quantum technology’s eventual uptake by the masses, we may see widespread development, adoption, and usefulness of quantum and modern AI throughout 2020. This technology will inevitably be put towards both legitimate and malicious ends.
Deepfakes as a defence
The phenomenon of deepfakes has already entered the public consciousness, with Facebook recently announcing its intention to ban the technology. The pernicious use of such videos has also coloured debate in the Western world about their use by political operatives, and organised crime groups have already successfully used deepfakes to impersonate executives to secure the illicit transfer of large sums of cash. Without a doubt, this trend will continue.
Beyond the direct implications of their use itself, deepfakes will serve to further muddy the water surrounding who might have said or done something. Anything captured on video will be called into question as we truly continue our march into a post-truth era. In 2020 and beyond, we can expect to see deepfakes used as a defence against professional or legal repercussions of events purportedly caught on video. The advent of such technology means that ‘seeing something to believe it’ no longer holds weight.
Eavesdropping on smart speakers will result in a major political scandal
If our smart devices are listening to us to improve the decisioning in the devices’ AI, then a human needs to be listening too. Live microphones caused enough embarrassment for our political class before such technology was even conceptualised, and we can expect the proliferation of listening ears to cause a corresponding uptick in scandals.
Behind the scenes, employees are well-positioned to become whistle-blowers, and we are due an explosive political scandal originating from such a source in 2020. If it’s any consolation to politicians, they can always resort to the deepfake defence.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 10.
David Parry-Jones, VP EMEA, Twilio
This has been a year of disruption, and in 2020 we’ll observe accelerators of change starting to drive a model where only innovators win. Businesses can’t rest on their laurels when nimble competitors are evolving to compete with the giants in their respective spaces - as we’ve seen in the banking sector, for instance - and 2020 will be the year that more large enterprises begin to rise to this challenge.
Communications and customer engagement are key areas in which incumbents need to catch up with the cloud-native, agile startups that have had responsive, intuitive customer service baked into their model. When you look at the benefits retail and banking giants such as M&S, John Lewis and ING have gained by applying these lessons and building flexible communications into their customer service models, it’s inevitable that more will start to follow suit and even rethink their strategy as a whole.
With the advent of APIs and cloud-based communications platforms, customers can now be easily reached across a multitude of different channels according to their preference, and call-time vastly reduced through the application of AI, so in 2020 more organisations should begin to ask – why can’t we do this too? And can we really afford not to?
Customers are at the heart of business success, and in 2020 it’ll be vital that businesses apply the lessons we’ve learnt from disruptors more widely.
Jay Gurudevan, Principal Product Manager, AI/ML
As enterprises embark on their transformation to become AI-first companies, a strategy focused on augmenting current systems and processes with AI, rather than fully replacing them, will win.
We’ll see more enterprise and businesses leverage AI tools and automated communication to better understand the entire customer journey. As consumers become more comfortable interacting with AI agents, Natural Language Processing - the area of machine learning that allows humans and computers to communicate - will become more accurate and advanced and implementation will expand.
Len Shneyder, VP Industry Relations, Twilio SendGrid
The internet will welcome over one billion new users by 2022. Another billion users represents a target-rich environment for cybercriminals, who will use every means at their disposal to compromise their PII for the purposes of fraud. With this said, email will continue to be the most basic and foundational identifier on the internet, because its reach connects today’s users and will connect tomorrow’s users as more of the world comes online.
Email authentication will become increasingly important in maintaining the health of the inbox ecosystem, protecting brands from spoofing and preventing phishing attacks. Moreover, the need to enable and align email authentication will present new branding opportunities for legitimate senders. As Brand Indicators for Message Identification (BIMI) moves from limited testing and development into general availability, it will give senders the ability to publish logos in DNS and have them displayed in email clients around the world that validate their email authentication.
The interactive inbox will only be possible for brands that authenticate their email: creating more interactive experiences using technologies such as BIMI, Schema and AMP for Email requires senders to differentiate their mail traffic from that of spammers. Since brands will want to take advantage of increased visibility in the inbox through BIMI, they will be one step closer to taking advantage of AMP.
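To make the 'publish logos in DNS' mechanism concrete: a BIMI assertion is simply a TXT record published at the `default._bimi` label of the sending domain, pointing at an SVG logo (and, optionally, evidence such as a Verified Mark Certificate). The domain and URLs below are hypothetical placeholders, and the record only takes effect for senders whose mail already passes authentication checks such as DMARC.

```text
default._bimi.example.com. IN TXT "v=BIMI1; l=https://example.com/brand/logo.svg; a=https://example.com/brand/vmc.pem"
```

Receiving mail clients that support BIMI look up this record for authenticated mail and render the referenced logo alongside the message.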
Paul Farrington, EMEA CTO at application security firm Veracode, offers the following thoughts:
1. We should expect elections to be compromised
When it comes to election hacking, we’re in a time where we need to assume it’s happening across the globe until we can prove it isn’t.
There are plenty of reasons why foreign nation-states and big business would want to influence the results of an election, and the incentives – both monetary and power-based, are only going to grow.
From leveraging social media to create echo chambers that propagate certain agendas, to planting surveillance software on applications to monitor voter behaviour, bad actors are finding more and more ways to sway elections, and it’s going to take a lot of voter education and awareness to outmanoeuvre them.
I expect we’ll start to see new, even stealthier tactics aimed at influencing voter opinion, and the main targets will continue to be the political parties themselves. Our recent 2019 State of Software Security report found the government and education sector has the highest rate of security debt (unresolved software flaws) amongst the industries studied. Knowing this, all parties should assume there is a significantly increased risk of being targeted by attackers – and take appropriate steps to limit a breach, including addressing application flaws to minimise the risk of an attack.
2. Reducing mounting security debt will be paramount
One of the major reasons behind successful cyberattacks is the ability to exploit vulnerabilities in an application’s code. When organisations don’t address vulnerabilities, they leave themselves wide-open to attacks.
Our 2019 State of Software Security report found the median time to fix flaws is 68 days for applications scanned 12 or fewer times per year, but this decreases by 72%, to 19 days, when applications are scanned 260 or more times a year.
In 2020, reducing cybersecurity debt (the number of open vulnerabilities) by scanning code more frequently, at regular intervals, should be a priority for any organisation serious about best-practice cybersecurity.
3. Leading development teams will incentivise secure coding
Most organisations today acknowledge that they could not do what they do, or remain competitive without software. Software runs the world. The absence of security isn’t always conspicuous, until you are confronted with the effects of being attacked. In 2020, we’ll see companies looking at ways to incentivise best-practice security at every point in the software delivery process.
In order for people to truly care about making security a priority, organisations must integrate security into the ways in which development teams and engineers are incentivised. We should be providing developers with simple cues to encourage the right behaviour, but this has to be realistic. Very few software applications will ever reach zero security defects, nor would achieving that be a good use of company resources.
First, we should agree what the security standards are for the team. Next, we classify those security bugs that are the highest priority, those that are important but not showstoppers, and those which, whilst not ideal are acceptable to exist. Especially for the first two categories, we should track the average time to fix a security bug. Once a baseline is established, then we need to negotiate targets so that engineers and product owners can buy-in. These metrics may ultimately help to determine compensation, but perhaps initially are linked to softer benefits for the team.
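The baselining step above is straightforward to compute from a bug tracker export. The sketch below, with invented data, shows one way to derive an average time-to-fix per severity class, against which targets could then be negotiated.

```python
from collections import defaultdict
from datetime import date

# Hypothetical bug-tracker export: (severity class, date opened, date fixed)
bugs = [
    ("critical",  date(2019, 11, 1),  date(2019, 11, 5)),
    ("critical",  date(2019, 11, 10), date(2019, 11, 16)),
    ("important", date(2019, 10, 1),  date(2019, 11, 12)),
]

# Group the number of days each bug stayed open by severity class.
days_open = defaultdict(list)
for severity, opened, fixed in bugs:
    days_open[severity].append((fixed - opened).days)

# Baseline: average time-to-fix per class, the starting point for targets.
baseline = {sev: sum(times) / len(times) for sev, times in days_open.items()}
print(baseline)  # {'critical': 5.0, 'important': 42.0}
```

Once a baseline like this exists, the team can agree per-class targets (say, halving the critical figure over two quarters) and track drift against them each sprint.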
At the end of the day, as businesses we are trying to sell more products at a higher margin than our competitors do, so one way to differentiate is by leveraging security as a strength. If an organisation and its developers can work together to create, and stick to, these accountability measures, security will improve, in turn creating a competitive advantage.
4. Developers will try to find the balance between security and innovation
One billion Docker images are downloaded every two weeks. This is empowering developers to have control over how their code is deployed in target systems, helping organisations scale far faster and ensuring fidelity of thought is maintained.
Containers are enablers, but as of today, they do not adequately address security issues.
With containerisation now considered the standard when creating code, there is a greater need to ensure security is a core part of the process going forward, especially as the technology continues to grow, making it an increasingly lucrative target for cybercriminals.
There is a paradigm shift taking place and the rules are being made up as we go along. A lot more research and innovation needs to happen on a security level to empower developers, whilst giving them the tools necessary to do their job securely. In 2020, all organisations using containers need to make sure they are secure without stifling innovation.
5. DevSecOps will be key to clearing software flaws
In organisations, development teams are being asked to take ownership of integrating security earlier in the software development lifecycle. Likewise, security teams are more actively engaging on the development side, and there is less friction between the two than in the past.
As we approach 2020, organisations are looking at DevSecOps as a way to address the complexities of managing and securing cloud-native applications. Building understanding and cooperation between development and security teams, while also automating testing, can help organisations address security earlier in the development process while also creating secure code. Our latest State of Software Security report found that fixing vulnerabilities has become just as much a part of the development process as improving functionality, suggesting developers are shifting their mindset to view the security of their code as equal to other value metrics.
Going into the new year, development teams can’t ignore existing flaws nor choose to fix the new flaws before the old ones. Instead, they should be resolved in tandem, by fixing new flaws as they are discovered and using periodic ‘security sprints’ to fix unresolved flaws that could be exploited. There are challenges that can arise when incorporating security earlier in the development process, but with the threat landscape continuing to grow, organisations will see the benefits of making DevSecOps a priority in 2020.
6. Developers will select security tools which take less than ten minutes to run in a development pipeline
DevOps teams that are able to integrate security testing into their development pipelines are twice as confident in their security as those that don't automate security tests. As engineers look to automate tests in the integration pipeline, tools which complete in a short space of time will be selected as a priority. Typically, this will mean that results are expected back within ten minutes, and accuracy is paramount. Noisy, lengthy security tests will either not go into pipelines or will be kicked out of the automation process in 2020.
7. Cloud-native technologies will become the de facto choice for development teams – organisations will need to prioritise security
There has never been a better time to work in software than now. Developers are presented with an abundance of choice when designing and creating software applications. Systems of the past were designed as monoliths. Having core logic tightly bound to a huge blob of software is today seen as an anti-pattern for stability and development velocity.
Overwhelmingly, developers are choosing architectures that allow failure to happen in one part of the system without having an impact on the remaining system. Microservices, containers, orchestrators, service meshes and serverless computing are all enabling technologies that are allowing developers to achieve greater velocity. This, however, brings a security challenge.
A 2019 survey found that 35% of respondents lacked understanding of how to deal with attack vectors specifically relating to cloud-native applications. Interestingly, 33% admitted that their development teams don’t involve cybersecurity experts, for fear of being slowed down.
8. ‘Everything-as-code’ (EAC) will include security
Everything needs to be code, but that’s not always the case today. We need software to be deployable and, increasingly, not to have to worry about how that will happen. ‘Infrastructure-as-code’ takes care of that by ensuring that the configuration of how software gets run just happens, without humans needing to execute manual steps to bring services up. This principle can apply to anything, including application security.
In 2020, performant DevOps teams will make security-as-code a design feature. Writing security tests into the configuration of how software is checked (and increasingly patched) before deployment will become standard.
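As an illustration of the idea, security expectations can be written as version-controlled rules evaluated against a deployment description before release. The rule names and configuration keys below are invented for the example, not taken from any particular tool:

```python
# 'Security-as-code' sketch: the checks live in version control alongside
# the deployment configuration they guard, and run before every release.
deployment = {
    "container_runs_as_root": False,
    "ports": [443],
    "secrets_in_env": [],
}

SECURITY_RULES = [
    ("no root containers", lambda d: not d["container_runs_as_root"]),
    ("only TLS port exposed", lambda d: set(d["ports"]) <= {443}),
    ("no secrets in environment", lambda d: not d["secrets_in_env"]),
]

def check(deployment):
    """Return the names of any failed rules; an empty list means deployable."""
    return [name for name, rule in SECURITY_RULES if not rule(deployment)]

print(check(deployment))  # []
```

Because the rules are plain code, they can be reviewed, versioned and tested exactly like the software they protect.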
9. Supply chain security needs to be brought into the next decade
Supply chains are becoming more complex, and with this complexity comes more opportunity for cybercriminals to cause chaos.
Research shows that third-party software has more vulnerabilities than internally developed software. More often than not, the people who care about the security of a particular piece of software are the ones using it, not the ones creating it.
With developers releasing updates far more frequently, as well as leveraging different open source code or software from multiple parties, organisations need to ensure they have the full picture of the code they are using at all times.
Third-party penetration tests no longer work for the modern development cycle – which is now often daily rather than every few months. It is the organisation's job to make sure certain standards are upheld so security can be ensured throughout the supply chain in real-time.
Eighty-four percent of professionals agree that their companies are concerned about the potential data security risk posed by third-party applications. In 2020, the focus should be on making security part of an organisation’s competitive advantage, and this should start with the supply chain.
2020 is all about executing on enterprise digital transformation. Early digital adopters are now beginning to show real traction and gains – particularly when it comes to customer, and more specifically, human experience technology. Those who haven’t opted in, and who chose to focus on cost efficiencies at the expense of network and technology investment, are now or will soon be suffering from innovation attrition, which can leave them unable to effectively compete. The focus in 2020 needs to be on how organizations can use technology to help them deliver on their customers’ business objectives in uncertain times. They need to improve agility, security and performance, and focus on the customer experience – and both basic and niche technologies are the answer.
With this in mind, here is Verizon Business’ view of those enterprise technology trends that are most likely to impact our global business and government customers in 2020.
“In this new era of the democratization of data and technology in general, we’ll continue to see global innovation accelerate faster than Moore’s Law,” said George Fischer, President of Global Enterprise for Verizon Business. “Businesses and governments need a well-defined strategy and execution plan to avoid losing ground due to analysis paralysis. We are entering a new age where these technologies enable highly customized experiences built around end-user and corporate values.”
Fischer continued: “We’ve been talking about the importance of operating within a Real-Time enterprise environment for some time now, which is all about using technology to get insights into customer and operational activities. We think that 2020 will be a real breakthrough year for early digital adopters, who now find themselves well-placed to reap the benefits of disruptive technologies like 5G, but also niche technologies that have the potential to differentiate their competitive service proposition. There’s no doubt but that we live in uncertain times – but they are also exciting times when it comes to technology potential. I’m looking forward to helping my customers drive their business forward.”
Kofax®, a leading supplier of Intelligent Automation software to digitally transform end-to-end business operations, has revealed 10 Intelligent Automation Predictions for 2020 – looking at how Intelligent Automation is poised to transform organisations over the next 12 months. Kofax believes there are reasons for optimism at forward-looking organisations.
“Intelligent automation technologies are primed and available to help companies realise their automation aspirations,” says Dan Kuenzig, Vice President of Strategy for Kofax’s Center of Excellence. “Dreams of automating end-to-end operations can become a reality faster than might be expected, especially as many organisations are ready to implement an integrated Intelligent Automation platform.”
Kofax’s 10 Intelligent Automation predictions for 2020:
Mike Smith, Managing Director (Direct) at Virgin Media Business, comments:
“There’s a new decade on the horizon, and just like the ones before it, it will be shaped by innovation that will positively change the everyday experience of customers and employees. In 2020, technology will become faster and more flexible than ever before.
Since 2008 there has been a 74% rise in remote working in the UK, with more than 1.54 million of us now working away from the office. Remote workers enjoy flexibility, freedom and more streamlined use of working time. No more time wasted on the daily commute.
For businesses, more flexible, more empowered people means increased productivity, efficiency and creativity.
With the expansion of ultrafast broadband across the country, working from home is set to boom. Virgin Media’s ‘Project Lightning’ has already begun expanding gigabit capable technology to homes.
For the first time in the UK, home broadband offering speeds of more than 1Gbps is available to hundreds of thousands of homes in and around Southampton, Manchester and Reading. By 2021, 15 million homes across the UK will have received this Gig1 broadband boost.
Hyperfast speeds mean that home workers can share large files, complete data-rich projects and clearly communicate without delay or the dreaded buffer wheel. With the infrastructure in place, expect to see more remote workers than ever before.
The networks of the 2020s need to be rooted in customer experience. Because no matter what the business, customers expect more than ever. They don’t want their online experience of a company to be hampered by time outs and website crashes. They want to be in and out like the Flash—fast and effortless.
It’ll be vital that businesses use a network with the agility, flexibility and resiliency that won’t let down their customers. SD-WAN is one such technology which can transform legacy infrastructure into a more secure and responsive platform. It can give a business full control of network traffic, allowing near-real-time changes to optimise performance across every aspect of a corporate network.
That means employees can benefit from greater productivity and efficiency, no matter how they work. And it means customers can interact with a business when, where and how they expect.
Whatever 2020 brings, the networks to support growth and change the technology around us are already being put in place. Next time you catch yourself wondering how the 2010s went by so quickly, just think about the innovation and prosperity that lies ahead.”
Streaming movies, playing computer games or finding the latest hidden musical gems are getting easier with each passing year. Just look at the last two months of 2019. We’ve seen the launch of Disney Plus, Google Stadia and Spotify providing its 248 million users with custom made playlists. Powering all of this and more is the hidden force in our internet age – the data center. However, while being a booming industry, all is not quite so rosy for data center businesses. The growth in usage coupled with the 3.6 billion people who are now connected is putting pressure on the internet infrastructure like never before.
By Jonathan Leppard, Director at Future Facilities.
The pressure is being felt too. In fact, in a recent independent report commissioned by Future Facilities, it was found that over three quarters (77%) of decision-makers are feeling the strain of a consistent increase in demand. This increase in data center density and workloads naturally results in more power being needed, which in turn generates more waste heat. Consequently, areas such as cooling and power are now monitored more than ever before to ensure downtimes are avoided.
Juggling the need to avoid downtime and prevent costs from dramatically increasing has seen data center operators and business decision-makers search for solutions that can help. While adding in extra cooling or power can often feel like the answer, these are expensive solutions that overlook the true state of the data center.
Increase capacity, and capital expenditure
This is a worry for both the operations and business teams. As found in the research, the average cost of downtime is £122,000. That’s a large sum of money, and it can quickly multiply if a business suffers multiple outages across a year. This figure doesn’t even begin to cover the loss of loyal customers and damaged reputations. It’s no wonder, then, that businesses are so keen to avoid outages. What is needed are solutions that lower the risk of downtime while also, in operational terms, increasing the capacity of the data center; and all in a way that is both cost-effective and quick to deploy.
So how are businesses currently coping? Well, as found in the research, the three main areas that businesses focus on are power, cooling and networking solutions. In fact, 45% of all businesses admitted they are focused on investing in these specific areas in order to increase capacity while maintaining or decreasing the risk of an outage. The catch, though, is that with all the extra cooling and power installed, businesses are operating their data centers far below their optimised level. The net result is that data center capacity is being left on the table at exactly the time when it is most valuable. In fact, as an industry average, businesses are over-provisioning to the tune of 36%! This is not just wasting power and capacity, but hitting businesses’ bottom lines too.
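The arithmetic behind that figure is simple but sobering. A minimal sketch, assuming a hypothetical 1,000 kW installed capacity (only the 36% industry average comes from the research):

```python
# Back-of-the-envelope view of the over-provisioning figure above.
# The installed capacity is invented; the 36% average is from the research.
installed_capacity_kw = 1000
overprovisioning_percent = 36

# Capacity paid for but effectively stranded by over-provisioning.
stranded_capacity_kw = installed_capacity_kw * overprovisioning_percent / 100
usable_capacity_kw = installed_capacity_kw - stranded_capacity_kw

print(stranded_capacity_kw, usable_capacity_kw)  # 360.0 640.0
```

On these assumptions, over a third of the capital tied up in power and cooling delivers no usable capacity at all.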
Digital twin on the rise
With a lack of knowledge of how their data centers are performing, businesses are blind to simple, cost-effective alternatives. A key solution to this challenge is to effectively utilise any stranded capacity without having to spend on additional infrastructure. Enter the digital twin: a three-dimensional, physics-based computer model that replicates the precise behavior and performance of the real data center. The digital twin can be used to predict the response of the real data center to any change, but in a risk-free virtual environment. By quickly trying out different setups and layouts, businesses can make small adjustments that have a huge impact on the thermal efficiency of a data center. Once a setup has been found that achieves the desired results, it can be implemented in the physical data center itself, safe in the knowledge that it has already been rigorously tested. This process saves money in both designing and operating a data center.
As well as helping to reclaim stranded capacity, digital twins have also proven very effective in reducing downtime. In fact, the research found that businesses using a digital twin were three times more likely to avoid an outage over a 12-month period than those without one. With its ability to save time and reduce outages, it is no surprise that over two thirds (67%) of businesses expect to have a digital twin in place within the next 12 months.
Demand for data center infrastructure is not going to slow down or go away. The growth of over-the-top (OTT) services like Disney Plus, as well as increased AI applications, means that data center capacity and density is going to be pushed to even greater limits. However, the good news is that there is a proven solution that is ready to help. Already delivering benefits for data center businesses, the digital twin has become a key tool across the industry in 2020.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 11.
1. Targeted ransomware attacks will focus on smart cities and big businesses
The growth trend of targeted ransomware attacks we've seen in 2019 will continue. Attackers will focus more on large businesses, where each minute of downtime costs enormous sums of money, as well as on government and healthcare organizations that are more likely to make substantial payments in order to recover their data.
Attackers who have been distributing ransomware through mass spam campaigns may change their strategy: instead of going for as many victims as possible, they will focus on high-profile targets where they can demand ransoms in the hundreds of thousands or even millions of dollars. This may mean fewer attacks on individuals, but increased attacks on MSPs, government organizations, schools, banks, telecoms and manufacturing industries.
Attacks through third parties and MSPs will become more popular, as these are among the easiest routes into a company’s infrastructure for executing a ransomware attack. While phishing is still expected to be the number-one infection vector here, more attacks will be performed via vulnerabilities in SMB and RDP, two commonly used protocols.
In this context it is very important to have proper data protection in place; as we saw throughout 2019, simple backup solutions won’t work. Backup software is being attacked, disabled and compromised, and we will continue to see this trend next year. That means data protection against ransomware should not only include top detection technologies, but should also be able to withstand attacks on its own desktop agents.
2. Social-engineering attacks as the main vector
The prevalence and complexity of social-engineering attacks, including phishing, will increase. Remote code execution exploits are getting more and more expensive (around $1M for a zero-click exploit for Windows), so email will remain the leading attack vector. Criminals will also take advantage of social media platforms, tricking victims into giving up personal information, login credentials, or even sending money.
Attackers will also abuse common browser bugs to hang a victim’s browser or environment, which leads to scamming money through fake support.
3. High profile APTs against critical infrastructure and nation-wide attacks
Advanced persistent threats (APTs) against critical infrastructure (the energy sector, healthcare, financial institutions and other governmental organizations) for political and financial gain will be on the rise globally, and especially in the developing nations of Asia Pacific.
It is likely that more cases of nations' sovereignty being endangered will be seen in 2020: meddling with election processes, attacking political parties to steal their agendas, and amplifying biases on social media by breaching users' private and confidential data.
4. Attackers get stealthier but still use Red Team tools
Attackers will make more use of public file sharing and hosting over secure connections (SSL) to deliver malware, phishing and more, making detection harder for cybersecurity vendors. Even though file-sharing services such as Google’s try to prevent malware from being shared, cybercriminals will encrypt their payloads to stay effective.
We will see hackers and malware authors taking advantage of new privacy-preserving techniques such as DoH (DNS over HTTPS), ESNI (Encrypted SNI) and end-to-end encryption to conceal their activities, which will make the job of AV and cybersecurity companies even harder.
These attacks may focus on data alteration as well, a potentially very serious threat the industry has been talking about for a while. As data authenticity becomes more and more important, it is better to be ready and to use software and technologies that deliver immutability of sensitive data, for example via blockchain.
At the same time, various Red Team attack frameworks (Metasploit, Empire, PowerSploit, Core Impact) are still being used widely to penetrate modern defenses.
5. Cryptojacking fading out
Cryptojacking and crypto-mining malware will decrease overall, for a few reasons. First, most security vendors now offer crypto-mining protection. Second, some web browsers already offer cryptojacking protection as a built-in feature, and protective plugins are available for all the others. Finally, the dropping prices of cryptocurrencies have made mining less profitable. The only segment where crypto-mining malware could persist is compromised servers, where protection capabilities are less developed than for endpoints.
6. More adoption of A.I. in cybersecurity
Artificial intelligence will be adopted more widely in cybersecurity. With an increasing number of devices requiring protection and a limited number of human analysts to process the incoming security data, A.I. adoption is the only reasonable and cost-effective way to detect and neutralize threats before they can cause any significant harm. Cybercriminals are starting to use A.I. for attacks, and will do so more and more; to combat them, defenders need A.I. as well, otherwise it will be close to impossible to recognize and process these attacks. A.I. can help to uncover new exploits using guided fuzzing, and can be used more in data protection, authenticity verification and threat similarity/origin analysis.
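The core idea, stripped down to a toy example, is letting software surface the outliers so that scarce human analysts only review what matters. The data and threshold below are invented for illustration; real detection tooling is far more sophisticated than a simple z-score check:

```python
from statistics import mean, stdev

# Hypothetical hourly counts of failed logins; the final hour is anomalous.
login_failures_per_hour = [4, 6, 5, 3, 5, 4, 6, 5, 4, 97]

def flag_anomalies(series, threshold=3.0):
    """Flag values that deviate sharply from the historical baseline."""
    baseline = series[:-1]  # treat everything before the latest hour as baseline
    mu, sigma = mean(baseline), stdev(baseline)
    return [x for x in series if abs(x - mu) / sigma > threshold]

print(flag_anomalies(login_failures_per_hour))  # [97]
```

Only the flagged values reach an analyst, which is the cost-effectiveness argument made above.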
7. More attacks on the Cloud and in the Internet-of-Things area
Attackers will go for poorly protected services running in the cloud in order to compromise organizational business strategies, stealing intellectual property, financial data and employee data. As more and more services are provided in the cloud, more and more cybercriminal attention will be focused on this area.
More attacks against IoT ecosystems should be expected, with the number of smart devices growing exponentially. Still lacking in security, they are easy targets for attackers. Criminals will explore ways to profit more from IoT attacks, targeting consumer devices to snoop on personal or business conversations, and industrial machinery to disrupt assembly lines.
LAN trends 2020
Wetzikon, London - Reichle & De-Massari (R&M), the globally active Swiss developer and provider of cabling systems for high-quality network infrastructures, again presents its market outlook for 2020.
In our recent ‘Trends for 2020’ article, we described several developments in Public, LAN and DC networks. Let’s take a closer look at the challenges and trends, along with the solutions …
Cloud services, Fiber to the Home, 5G, IoT and smart buildings will continue to change the network landscape. More 5G phones will enter the market, driving bandwidth demand. Next generation WLAN access points are now available and new smartphones from manufacturers such as Apple and Samsung are Wi-Fi 6 enabled, further boosting bandwidth and backbone requirements.
LAN trends are, to a significant extent, being driven by the need for intelligent building infrastructure, in which a wide range of functionalities is managed and monitored over a converged network. This network needs to be capable of powering large numbers of remote (data gathering and processing) devices, such as sensors and peripheral equipment. Ideally, data and power are integrated – which is possible with an ‘everything over IP’ approach…
Single Pair Ethernet (SPE): uniform, application and manufacturer-independent continuous IP-based transmission
The connectivity landscape is becoming increasingly standardized and unified, with IP as a common medium for previously disparate systems. R&M, closely involved in the standardization of Single Pair Ethernet (SPE), sees this type of connectivity as a key future technology, for example in smart buildings and Industry 4.0. Using SPE without interfaces to replace the traditional field bus can help realize the high connection density required for the networks of today and tomorrow, and makes installation faster and easier. SPE works with 10BASE-T1 to 1000BASE-T1, offers 15 - 1,000 meter link ranges at up to 1Gbit/s transmission rates, and can supply terminal equipment with up to 50 watts via Power over DataLine (PoDL).
SPE based on xBASE-T1 protocols uses a single twisted pair for data transmission. LAN is compressed into a thin two-core cable with miniaturized connectors, making it possible to significantly increase terminal equipment connection density. IT and field bus components are integrated, installation and maintenance are simplified and the costs of material and operating expenses are reduced. Compared to traditional Ethernet cabling, this approach offers a significantly higher number of possible connection points. Connection to the LAN is done with switches either centrally in the floor distributor or distributed in the zone at the service outlets. Ethernet/IP transmits large quantities of (complex) data faster than field bus systems, allowing the collection and distribution of data from the entire network. Synergies reduce operating expenses and manufacturer-neutral standard products can be used.
‘All over IP’: an integrated approach to networking smart buildings
R&M has united LAN and Ethernet/IP cabling with related technologies such as Wireless LAN, Power over Ethernet (PoE) and Single-Pair Ethernet (SPE). The resulting ‘All over IP’ approach enables digital building automation exclusively using Internet Protocol. This provides high levels of standardization, availability and reliability, with LAN providing the physical communication layer and Power over Ethernet. IP devices and networks speak the same language ‘end to end’ and don’t need ‘translation’ between servers, operating systems (e.g. via gateways), cabling and end devices. Buildings can be connected and controlled digitally throughout. SPE is ideal for connecting large numbers of small sensors and actuators. What’s more, devices and systems that work with Ethernet/IP technology are comparatively inexpensive. The current Internet Protocol version (IPv6) can theoretically allocate around 1,500 IP addresses per square meter. In practice, there is no limit to the number of devices that can be addressed. The star-shaped topology reduces the number of connection points and improves IP networks’ operational reliability. Access controls and authentication measures incorporated in IP improve building automation security.
The digital ceiling
Cabling of smart buildings should be application-neutral and manufacturer-independent. Combining structured cabling for data networks with IP offers a perfect solution for this. ‘All over IP’ also makes R&M’s ‘digital ceiling’ concept possible. This approach extends the data network through an entire building’s ceiling in a ‘honeycomb’ fashion to so-called zones. Within a zone it is possible to connect devices to building automation via pre-installed overhead connecting points (service outlets). Real estate managers or tenants can benefit from digitization with ‘Plug and Play’ – fit for purpose, without barriers, fast and at low cost. All they need to do is plug in network switches, sensors, controls, WLAN access points and other distributed building services. PoE makes it possible to connect applications with just one cable. R&M’s contacts and connectors are ready for this. The R&M package also supports Passive Optical LAN (POLAN). This fiber-optic cabling for extended systems such as airports, malls, resorts, and hotels delivers virtually unlimited bandwidth for miles.
Introduction of smart, converged networks means new energy-conserving technologies and applications can be introduced, such as intelligent management of building space, resources and LED lighting. PoE can power LED lighting throughout entire buildings and address each luminaire via its own IP address. Infrastructure companies can integrate more and more devices in their systems, leveraging the benefits of a unified network.
By John Gentry, CTO, Virtana
New technologies and architectures bring with them much promise of increased performance, greater productivity and, ultimately, higher revenues and lower costs. And while it can take a few months for tangible results to be visible, there comes a time when all that matters is TCO and, even more, RTO. 2020 is that time. Organisations have been exploring and adopting cloud-based strategies, AIOps, automation and several other new technologies in a bid to make their businesses more agile, adaptable and generally better performing. The coming year will see those same organisations take stock of their datacentre investments in the name of accountability. Below are my predictions for 2020 in more detail:
In 2020 the IT infrastructure will be recognised as the business itself. Whereas in the past it would support the business, moving forward it will become the business. It will allow the creation of IT-based businesses that could not have existed in the past. The availability of technology and its use in innovative ways will allow entrepreneurs to take a traditional business and turn it into a brand new one where the technology is such an enabler that without it the business would not exist. Examples include RealPage and Plex Systems.
Automation: last year I predicted that in 2019 automation would become intelligent and there is no doubt that this trend has started in earnest. In 2020 we will witness ever greater levels of intelligent automation in the datacentre, with organisations leveraging this development to increase the efficiency and reliability of their IT infrastructure services. Governance will have a key role in this development.
AIOps: although there are still a number of companies laying claim to the term without actually having a real solution, AIOps has undergone some maturation, evolving beyond hype into real-world applications, with significant interest in its benefits from end users. Unlike general AI, AIOps is not deployed through a singular solution; rather, it is an approach that relies on an ecosystem involving several providers. 2020 will see the evolution of the AIOps ecosystem into a more established option for organisations, one they can rely on to bring a new level of innovation to the business. There will also be a realisation that layering logic on top of existing noise is not productive, resulting in some AIOps players’ claims becoming debunked.
Cloud: back in 2018 I predicted that the cloud would continue to see a slowdown in its adoption in the form of repatriation. Earlier this year the US federal government shifted its cloud strategy from ‘cloud first’ to ‘cloud smart’. Bank of America claims it saved $2 billion by turning its back on the public cloud. So there has been broad acknowledgment that cloud is not necessarily less expensive than on-prem. 2020 will be the year of the shift to ‘cloud accountability’, when organisations take a more deliberate approach to the cloud with a financial focus, and we will see a continued increase in hybrid environments rather than 100% cloud strategies. There will also be a more purposeful approach as to what goes into the cloud versus on-prem, based on specific use cases and true business requirements, paying particular consideration to cost, performance and capacity (with a realisation of the impact of underutilisation).
Containers: 2020 will be the year that marks the shift towards the purposeful placement of workloads throughout hybrid and multicloud environments, enabled by private cloud container-as-a-service infrastructure on-prem. This is driven by the need to manage where workloads reside. Container-as-a-service will allow companies to move cloud-native workloads and hybrid applications on or off-prem, between public and private clouds, based on specific workload requirements such as elasticity, geolocality or seasonality, providing a true multi-cloud approach. With this capability, organisations will be able to dynamically optimise their business spend relative to return.
CIOs and CEOs: in 2020 CIOs and CEOs will take a closer look at the return on their technology investments. As the IT infrastructure becomes even more critical to the ongoing success of the organisation, company executives will become more involved in strategic decisions related to the agility, cost and performance trade-offs of the IT infrastructure, both on premise and in the cloud.
Geo-political impact: throughout 2020 organisations will become increasingly aware of the impact of geo-political factors on IT and on business overall. The 20% - 30% increase in manufacturing costs in China, Brexit and the US presidential election all create uncertainty and influence the global economy. As a result, in 2020 we might see a significant proportion of businesses hitting the pause button on aggressive strategies, with increased caution and more purposefulness with regards to expenditure.
After years spent focussing on technology adoption, speeds and feeds, the industry is now undergoing a real shift towards closing the gap between the business and the underlying IT infrastructure, be this on premise or in the cloud. Decisions about critical technology investments are moving firmly up the executive ladder and into the boardroom, and are now dependent on the ultimate impact on the organisations’ bottom lines. The coming 12 months will be critical for vendors, suppliers and end users to understand how to best manage this change.
Maria Sirbu is the VP of Business Development at Voxility.
In 2019, the hype around 5G, artificial intelligence (AI), big data, cloud, and the Internet of Things (IoT) was in overdrive. Each of these innovations captured a lot of headlines, but it is time to go beyond the hype. In 2020, there will be a shift from inflated expectations to looking to these innovations to deliver real-world outcomes.
Instead of obsessing over new tech or being first with a product launch, enterprises and service providers should be focused on successful transformation and using technologies to serve new customer demand. The technology isn’t as important as finding ways to deliver outcomes and enabling customers to solve challenges and grow.
In the IaaS industry, this means exploring new models that go beyond basic cloud-based IaaS delivered by tech giants. AWS, Google Cloud, Microsoft Azure and a few other cloud giants are showing rapid growth, but they are not the only way forward for enterprises and service providers. As cloud service providers (CSPs) battle each other, organisations throughout 2020 will be looking at ways to remove the limitations on cloud-based IaaS and deliver security and simplicity, while gaining new levels of control over their services.
Despite being in a large market led by some of the top global businesses today, IaaS is still evolving and adapting in a changing market. Enterprises and service providers are increasingly looking for alternative IaaS models with better solutions than those offered by the global tech giants.
Here are some of the trends we predict will shape the IaaS market in 2020:
“Cloud-Only” Isn’t the Future
It is no longer a question of cloud versus on-premises. Enterprises are looking at a range of models to serve different needs and outcomes. “Cloud-only” or “all-in-the cloud” strategies don’t account for the complexities of enterprise IT environments. Cloud will no doubt influence a trillion dollars in IT spend by 2020, as noted by Gartner, but that doesn’t mean all data, applications and services will be hosted in the cloud. It won’t be that simple.
New models need to offer the same flexibility and scalability as the cloud, plus the data integrity and control of on-premises. Enterprises shouldn’t have to sacrifice one for the other.
Moving from the Cloud Back on Premise
Enterprises are moving applications and services to the cloud and then back on-premises. A report by IHS Markit and Fortinet noted that 74% of companies have repatriated workloads in this way, with the top two reasons, each selected by 52% of respondents, being performance and security. Some applications and services are better suited to an on-premises environment, or need the support of an IaaS model that can deliver more accountability and control than CSPs can offer.
Demand for New Levels of Control
Enterprises increasingly want greater control over their infrastructure and are under growing pressure to show who is managing their data, as well as where and how it is being hosted. When data is hosted with a public cloud provider, enterprises and service providers lose control over it and have to simply trust their CSPs. They sacrifice their control for an easy OPEX-based model.
In 2020, there will be greater scrutiny of how data is managed, and a growing number of organisations will look at alternative models for IaaS. An alternative IaaS model, competing with the cloud-based IaaS provided by the cloud giants, is renting physical hardware in physical locations. This OPEX-based model provides a comprehensive solution when bundled with networking, security and global hubs – a key differentiator providing increased efficiency.
A report by Risk Based Security found 2019 to be a record-breaking year for data breaches. Three breaches were among the top ten largest breaches on record, and the number of reported breaches was 54% higher than in 2018. The average cost of a data breach is also on the rise, with an IBM report recording a 12% increase to $3.92 million over the last five years.
With data breaches rising so rapidly, enterprises are looking for alternative solutions to cloud-based IaaS that can provide more security and control over their infrastructure. Security should be seamlessly integrated into IaaS offerings and easily accessible for customers. In 2020 data breaches will only increase, and organisations need to protect their IaaS deployments.
Growth in Multi-Cloud and Edge Computing
Multi-cloud strategies went mainstream in 2019, and edge computing is set to follow. A Gartner survey found that 81% of respondents were working with two or more cloud providers simultaneously in 2019, and multi-cloud strategies will grow even further in 2020. This will add complexity to enterprise operations. As they manage a growing number of CSPs, more resources will need to be allocated to management and it will show the limitations of cloud-based IaaS.
These challenges will only get bigger as edge computing adoption is driven by IoT deployments throughout 2020. According to Gartner, by 2022, $2.5 million will be spent every minute on IoT, and 1 million new IoT devices will be sold every hour.
With current models, enterprises will find it difficult to efficiently serve this exciting market. As we enter a new decade, approaches to IaaS and cloud-based infrastructure models will go beyond current offerings and provide new opportunities.
The Need for Comprehensive IaaS Models
Exploring different approaches to IaaS will be vital in 2020, particularly as more tech giants increase prices like Google did in 2019. With a shift in mindset from enterprises and carriers, alternative IaaS solutions can be found to provide more opportunities and better align with the needs of their business.
If we only accept what is offered by the biggest players, we risk losing out on new competitive advantages and greater profitability. Organisations of all kinds can remove the limitations on their hosted infrastructure and find a new path forward with IaaS.
2020 will see a willingness to explore new models that go beyond the basics and offer different kinds of capabilities and characteristics. A continual hype can be expected for technologies including cloud, big data, AI and IoT, but the real focus should be on creating a model that can succeed and continually meet the changing needs of our industry into the future.
In the Age of Software, competitive advantage—or disadvantage—is determined by the velocity, quality and efficiency with which organisations can continuously turn digital ideas that matter into digital experiences that customers care about. Large enterprises that used to dominate their markets are today scrambling to compete against nimble digital disruptors who are flexed to respond to customers’ always escalating expectations for more, better, faster.
By Stuart Ashby, DevOps Specialist at Compuware.
Companies will move toward creating high performance development teams
In this new world order, where every company is a technology company, the role of the developer is appropriately changing for the better. These digital artisans are no longer order takers expected to bend to the will of the business. More and more—thanks in part to leaders who understand the immense value they bring to the business—developers are empowered to innovate on existing core systems, as well as deliver and support new means of digital engagement with customers.
But to do so, they require a milieu only afforded through Agile and DevOps, namely an open and collaborative culture, inspiring and challenging projects, modern methods of working, and tools and processes that continuously improve their abilities. As agents of innovation, they must be coached like high-performance athletes based on KPIs of velocity, quality and efficiency to ensure their ongoing success.
Enterprises will place a greater focus on automated testing (and it’s a long time coming)
Enterprises are continuing to lose vital mainframe development and operations skills. Automating processes like testing helps to mitigate the effects of that lost knowledge.
However, unit and functional testing in the mainframe environment have traditionally been manual and time-consuming for experienced developers, and prohibitively difficult for inexperienced developers, to the degree that they skip it altogether.
According to an independent study commissioned by Compuware, the vast majority of IT leaders believe that test automation is the single most important factor in accelerating innovation, but less than 10 percent of organisations automate tests on mainframe code. Arcane manual testing practices are creating a bottleneck that hinders the delivery of innovation and prevents organisations from meeting their business goals.
The good news is modern mainframe testing tools enable developers to automatically trigger tests, identify mainframe code quality trends, share test assets, create repeatable tests and enforce testing policies. Empowered with these capabilities, developers can confidently make changes to existing code knowing they can test the changes incrementally and immediately fix any problems that arise so they can deliver updates faster.
Development organisations will experiment with coupling test automation with a “shift-left” approach
Businesses expect to achieve significant benefits by not only automating more testing on the mainframe, but also doing it at every stage of the development process.
To that end, as companies ramp up automation, they are also experimenting with coupling test automation with a “shift-left” approach—where developers write unit tests at the same time as they write source code. This enables teams to focus on quality as soon as a project is kicked off instead of waiting for defects to be surfaced later in the app dev lifecycle—defects that could disrupt operations, introduce security risks, hinder customer experiences or impact business revenues.
While a shift-left approach can help reduce the number of bugs that make their way into production, it can put more pressure on developers. That’s why it’s imperative developers have access to tools that enable them to automate the creation and execution of unit, functional, integration and regression testing on the mainframe, while empowering even novice developers to validate COBOL and PL/I code changes with the same speed and confidence as they can with other code.
Automation coupled with a shift-left approach improves the quality, velocity and efficiency of mainframe software development and delivery.
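As a minimal illustration of the shift-left idea, here is a sketch in Python (the function, values and business rule are hypothetical; mainframe teams would do the equivalent for COBOL or PL/I code): the unit tests are written alongside the code they exercise, so a defect surfaces the moment the change is made rather than later in the lifecycle.

```python
import unittest

# Hypothetical business rule being modernised: one month's interest,
# in pence, rounded down.
def monthly_interest(balance_pence: int, annual_rate: float) -> int:
    return int(balance_pence * annual_rate / 12)

class TestMonthlyInterest(unittest.TestCase):
    # Written at the same time as the function above ("shift-left"),
    # so regressions are caught on every change.
    def test_typical_balance(self):
        self.assertEqual(monthly_interest(120_000, 0.05), 500)

    def test_zero_balance(self):
        self.assertEqual(monthly_interest(0, 0.05), 0)
```

Run with `python -m unittest` as part of every build, so the tests execute automatically rather than relying on a manual step.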
Enterprises will continuously measure to improve software delivery quality, velocity and efficiency
Today, 57% of firms run more than half their mission-critical applications on the mainframe, and 72% of organisations say their customer-facing applications are completely or very reliant on mainframe processing, according to a 2018 Forrester Consulting study commissioned by Compuware.
As organisations work to create high performance development teams to support accelerated mainframe application development and delivery, they need a way to continuously measure and improve mainframe DevOps processes and development outcomes. A program of KPIs is necessary for accomplishing this.
Measures such as mean time to resolution (MTTR), code coverage and number of defects trapped in test vs. production can provide a picture of developer efficiency and quality metrics. Machine learning can be leveraged to continually monitor and analyse behavior patterns enabling teams to continuously improve on essential measures. Enterprises that strategically leverage their data to tackle development and delivery constraints will see significant improvements at the individual, team and organisational levels.
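The KPIs above are simple ratios and averages once the raw incident and defect data is collected. A minimal sketch, with illustrative numbers rather than any real tool's schema:

```python
from datetime import timedelta

# Illustrative data: time-to-resolution for three resolved incidents,
# and defect counts by the stage at which they were caught.
incidents = [timedelta(hours=4), timedelta(hours=10), timedelta(hours=1)]
defects_in_test = 42        # defects trapped before release
defects_in_production = 3   # defects that escaped to production

# Mean time to resolution (MTTR)
mttr = sum(incidents, timedelta()) / len(incidents)

# Share of defects trapped in test rather than production
trap_rate = defects_in_test / (defects_in_test + defects_in_production)

print(f"MTTR: {mttr}, defect trap rate: {trap_rate:.1%}")
```

Tracking these numbers per team over time, rather than as one-off snapshots, is what turns them into the continuous-improvement programme the paragraph describes.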
In 2020 it’s not enough for enterprises to adopt Agile and DevOps. They must recognise that their development teams hold the key to customer satisfaction. As such developers must be treated like high-performance athletes and given challenging projects that are meaningful, modern methods of working, and tools and processes that continuously improve their abilities. Their behaviors must also be continuously measured so their performance will consistently improve, benefitting them as well as the business. To improve software delivery quality, velocity and efficiency, mainframe organisations must adopt more automation, especially automated testing, and test earlier and often in the DevOps lifecycle. And, they must take a granular look at their software delivery process to uncover bottlenecks and resolve friction points so ideas that matter can be turned into customer deliverables that make a difference—continuously.
Bearing in mind this note of caution expressed by Sir Winston Churchill, enjoy our final set of 2020 and beyond predictions, assembled from a range of technology experts from right across the IT, data centre and wider business sectors. Part 12.
Commentary from three of WWT’s key subject matter experts, covering topics including:
1. AI-driven, hyper-personalised experiences
In 2020 AI and ML will be at the core of every application with a user interface. The hyper-personalisation of customer experiences will be on most companies’ agendas. They will leverage applications to deliver highly personalised, and relevant, predictive outcomes to end users. This means delivering omnichannel experiences which use new technologies to engage with a business’s audience, for example, advertising on Spotify, through video games, or in specific regions of a city.
Be it health, shopping or advertising, every app that has a user interface will have an AI element. The benefits for enterprise will be clear, as companies are able to process data, and deliver results at a scale that was simply unachievable before. AI will help to correlate vast troves of information from different sources for use in multiple use cases. 2020 will be the year AI and ML really begin to deliver to users and enterprise.
Deepening this ‘hyper-personalisation’ will be the proliferation of interlinked APIs. End users will be delivered highly unique experiences, with truly individual pathways between apps which in turn create unique outcomes. Enterprises will be able to cross-reference different, and vast, data sets to create flexible models that will deliver the best return.
2. The rise of the ‘Chief Trust Officer’
2019 has been a year of failing trust from the private to the public sector, and there needs to be a huge swing to regain confidence. Corporate mismanagement or abuse of data has been a topic of constant media interest and consumers are increasingly aware of the information that businesses hold about them. Customers in 2020 will demand more transparency and will need to be sold on the benefits they can reap before they feel comfortable sharing information with businesses.
So in 2020, we will see companies prioritise identity management, security and privacy. To help win back public trust we could see the rise of the ‘Chief Trust Officer’ as businesses seek to rebuild a trusting relationship with the public. People will need to be given reasons to trust organisations. Amid rising awareness of the value of their personal data, consumers need to feel in control, and be given a return of significant worth.
To further secure personal data we will see blockchain technology, as it matures, move from cryptocurrency to trusted content. This technology can help with verifying and securing data: if content is underpinned by a blockchain hash, its source can be verified as trusted and untampered with.
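The tamper-evidence property comes from chaining hashes: each record's hash incorporates the previous one, so editing any earlier record invalidates everything after it. A minimal sketch (the record format here is illustrative, not any real ledger's):

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash for the first block

def block_hash(content: str, prev_hash: str) -> str:
    # Each hash covers the content AND the previous hash, forming a chain.
    return hashlib.sha256((prev_hash + content).encode()).hexdigest()

def build_chain(contents):
    chain, prev = [], GENESIS
    for c in contents:
        h = block_hash(c, prev)
        chain.append({"content": c, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(block["content"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["press release v1", "press release v2"])
assert verify(chain)
chain[0]["content"] = "tampered"  # any edit breaks every later link
assert not verify(chain)
```

Real blockchains add distributed consensus on top of this structure, which is what removes the need to trust any single party holding the chain.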
The march towards a cashless society will continue apace in 2020. As countless new tech-driven financial services develop, consumer financial data will grow exponentially, making it more difficult to identify suspicious activity.
Data breaches have long displaced the classic “bank robbery” as the biggest security threat to the banking industry. In 2020, we can expect financial institutions to begin establishing adaptive security systems to manage the ever-evolving threat of cyber crime.
Banks and fintechs alike will begin adopting artificial intelligence as a means of protecting both their assets and consumers. Given the quantity and breadth of data circulated throughout the banking industry, the use of AI is now a necessity, providing an additional level of security by detecting suspicious patterns and automatically taking remedial action.
Insider threats, unusual network activity, or changes to a business’s data will be analysed in real-time and prevented. Technology like this will allow banking security to be more agile, more capable of managing larger volumes of data and free leaders to focus on broader projects which drive business value.
2. A win-win ecosystem for banks and tech companies
Tech companies will continue to push into financial services in 2020. Although a number of big tech companies have recently announced their intentions to move into the financial sector with the introduction of various fintech products, banks must actively remind themselves to not misconstrue this move as a threat to their business.
These moves represent the beginning of a customer service shift which will change financial services from siloed organisations (where money is stored) towards a larger ecosystem of data and enhanced end user experiences.
Tech companies are ultimately data companies, and working alongside banks will give them access to a huge source of data. Banks will set up the financial back-end of tech companies’ services (such as Apple Card and Facebook Libra), fulfilling the traditional functions.
But it's not just tech companies who stand to benefit from this ecosystem. Banks will also benefit by recognising the skills that tech companies have with integration, gaining access to customer data, in turn becoming empowered to deliver stronger, more innovative customer experiences. These customer benefits could include quicker AI-enabled banking or tailored product offers based on past transactions.
There is plenty of excitement around the rollout of 5G. But despite the promise of supporting more devices, at lower latency and higher speeds, enterprises still have a long way to go before they are able to deliver useful 5G solutions. Operators are important to the UK rollout of 5G, but enterprises will likely be the key drivers of innovative solutions for consumers.
Businesses which have a direct relationship with their customers will understand the kinds of experiences consumers want 5G to deliver and will fear being left behind by competitors. This will drive new innovations and business models which add real value, from augmented reality for retail shopping to 360-degree first-person viewing experiences for live sports. 5G is simply a new platform on which innovation will take place.
Secondly, the rollout of 5G will be accompanied by specific deployments of mobile edge computing to start addressing particular use cases. We are unlikely to see public edge cloud computing in 2020, but even at this early stage of 5G deployment, there will be a recognition that some processing is better managed at the network edge.
2. Balancing personalisation with data privacy
Hyper-personalisation of customer experiences will be on every company’s agenda, but this will contrast with a greater consumer focus on data privacy. A more digitally fluid 2020 landscape will mean more opportunities to engage a business’s audience with omnichannel experiences, for example, advertising on Spotify, through video games, or in specific regions of a city.
Going beyond mobile and desktop into a digitally fluid landscape requires enterprises to reduce data silos and gather targeting information from a wider variety of sources and platforms. This will lead to more conflict between wanting a personalised experience, and strong data protection. While this promises greater engagement and closer bonds to customers than ever before, it will also invite more questions on data privacy and the way private institutions protect and store data. GDPR regulations will continue to be talked about prominently, and scrutiny of tech companies is likely to increase.
Having AI systems which can be clearly and ethically justified will be essential to building and maintaining public trust. We may even see some companies opening up the workings of their AI systems to demonstrate how they make decisions in more detail.
1. Big Data is well and truly dead, but the data lake looms large. Large-scale, feature-rich data warehouses, in the cloud and on premises, have improved radically to provide multi-petabyte scale using MPP architectures. That scale is made possible by pushing compute and data closer together, and by allowing SQL to express JOIN semantics and aggregations, which can be optimized by the database. These factors have killed “big data” as we knew it, but one element of big data lives on: the data lake.
2. Best-of-Breed cloud is coming – under the name of Hybrid. Public cloud vendors have extortionately high prices. The public cloud makes sense for small-and-medium sized businesses, which don’t have the scope to amortize their engineering spend; it doesn’t make sense for technology companies. Companies like Bank of America have gone on record as saving 2 billion dollars per year by not using the public cloud. A best-of-breed architecture envisions building blocks within the technical stack, then selects not from a single cloud vendor, but from a variety of service providers. Assumptions that a given cloud provider has the lowest or best prices, or that the cost of networking between clouds is prohibitive, become less and less true.
3. Data Exchanges are the exciting data trend – but must evolve to Data Services. While industry specific data exchanges have been around for a decade, datasets that can be trivially loaded into a database using cloud methodologies, at the click of a button, seem exciting and new. The industry needs lower friction means, more standardization, and applying cryptographic tools (such as blockchain) to custody, correctness, and access. These are not supported by today’s data exchanges. Until these features are added, data exchanges will languish.
4. Database innovation will be linked to hardware improvements. The most exciting and innovative databases are leveraging hardware innovation to bring the next levels of price and performance. The cloud enables this innovation. Cloud companies roll forward their hardware plans without on-premises installations, and users can trial innovative hardware easily and experience the power of innovation. You’ll be running your databases on more and more specialized hardware, but you’ll never even know it.
5. AI is becoming a standard technique. Between random forests, linear regression, and other search patterns, AI has become a standard technique. AI, like standard numeric techniques, is best done with compute close to data. This means the techniques of “big data” (separating compute and data) are a poor choice just like they were for a majority of analytics. Running AI as code on a compute grid, or within your database, does not allow the kinds of optimizations that an AI framework, or an AI-centric query system can provide. In 5 years, we’ll wonder why custom code lasted so long in the AI space.
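Point 1 above turns on pushing the JOIN and aggregation into the database, where the engine can optimise them, rather than pulling raw rows into application code. A small self-contained sketch using SQLite (tables and rows are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, region TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5);
""")

# JOIN semantics and aggregation expressed in SQL, so the database
# engine (not the application) plans and optimises the work.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount)
    FROM orders o JOIN customers c ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('EU', 25.0), ('US', 7.5)]
```

At multi-petabyte scale an MPP warehouse distributes exactly this kind of query across many nodes; the principle – ship the query to the data, not the data to the query – is the same.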
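To illustrate how ordinary a "standard technique" like linear regression now is, here is the closed-form least-squares fit in a few lines of plain Python (the data points are illustrative):

```python
# Illustrative data, roughly y = 2x + 0.1
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.1, 6.1, 8.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares solution for slope and intercept
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"y = {slope:.2f}x + {intercept:.2f}")
```

The point of the prediction is that this computation, like the aggregations above, belongs next to the data – inside the database or an AI-centric query system – rather than in custom code on a separate compute grid.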
Top Trends Driving Technology Innovation Investments in 2020 Revealed by Virtusa
Financial Services, Insurance, and Life Sciences Leading in Innovation to Start the New Decade
Virtusa Corporation has published the findings of detailed research that identifies trends that will drive emerging technology investments in 2020. The third annual Virtusa xLabs’ Trend Almanac details the top ten technology trends that will strategically align with business and technology investments.
The major trends, which Virtusa identified in collaboration with technology and business leaders, focus on the Financial Services, Insurance, and Life Sciences industries. Aligning investments to the most influential technology trends will help businesses to stay competitive and drive growth in 2020 and throughout the new decade.
The full report identifies three overriding themes:
● The democratisation of emerging tech, reducing the investment and skill needed to develop and capitalise on tech across infrastructure, applications, and data.
● The need for transparency, ethical practice, and good governance to balance the manic rush to explore and exploit "The New."
● A change in how companies manage tech-enabled innovation programs, ensuring that tech is seen as a means to drive better commercial outcomes, rather than being an end in itself.
“The Virtusa xLabs’ Trend Almanac exposes the new business imperatives that business and technology leaders need to invest in to remain competitive,” said Senthil Ravindran, EVP and global head of cloud transformation and digital innovation, Virtusa. “Digital transformation demands a new way of thinking and getting a jump start on the trends that will impact customers tomorrow. This Trend Almanac will be a key resource in helping businesses do just that.”
The top ten trends include:
Open Banking Goes Global: Open Banking is starting to become a global movement. To date, actual regulations and market changes have only occurred in Europe, even though it has been a topic on everyone’s lips. But the move by Australian regulators to implement substantial regulatory reforms this year, giving consumers greater control of their data, signals the paradigm shift to the API economy is well underway. We explore this trend and classify the current approaches to Open Banking across the world.
No Child's Play in the Sandbox. A new breed of regulatory sandbox is making waves. Thematic regulatory sandboxes are credited with taking a more focused approach to sandbox design, where the policy objectives, as well as the problems tackled by the sandbox, are clearly defined. This appeals to regulators as much as fintechs and big firms, who are looking to build engagement and accelerate promising innovations to markets in a controlled and safe environment.
Value Over Volume in Healthcare. Healthcare systems around the world are sinking under the weight of escalating costs. However, there are positive signs that a transition is underway from the reigning fee-for-service delivery model, which has contributed to the cost overhang in many countries, towards the value-based care model. Regulatory and technology forces are key factors that are driving this shift. In 2020, we believe this pivot to value-based care will gain further momentum.
Shop Now, Pay Later. Point of sale finance is getting red hot. In recent years, digital startup lenders have carved up the segment, leveraging the latest tech to introduce instalment loans to the millennial market, but also thinking outside the box by devising new business models. Consumers are on board, as are retailers. Meanwhile, banks have mostly been onlookers, having decided not to develop their own customer-facing solutions. In 2020, we think banks will get into the game.
The End of All Disease? CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) is a breakthrough technology lauded for its extraordinary precision and ease of use. Since Doudna and Charpentier’s landmark paper was published in 2012, there has been a flurry of research activity and much progress as well as media interest in the field. CRISPR is now poised to move into the next phase in the journey from lab to market.
From Factory to App: Automobiles Take a New Route. Consumer adoption of “mobility as a service” (MaaS) is challenging the auto manufacturers' traditional approach to "mobility as an asset." Tech-enabled innovations, such as ride-sharing, last-mile micro-mobility solutions, and the rise of autonomous cars as a utility, are driving the growing popularity of MaaS offerings. In 2020, auto manufacturers will join this trend by creating new propositions around MaaS.
Ready, Set, Quantum! After decades in the wilderness, quantum computing is finally hitting its stride. Instead of being held back by hardware hurdles, in a surprising twist, enterprises have been powering ahead with innovations on “near- term” quantum machines. No longer is this esoteric field the domain of researchers alone, it is now being addressed by a broader set of players, helped along by greater access to tools and collaborations.
Corporate Innovation Hubs Grow Up. The reputation of corporate innovation hubs has taken a hit in recent years, causing businesses to rethink their operating model. This year, there will be a greater focus on empowering a network of affiliate labs in a federated model, which will enable labs to pursue innovation rather than incremental optimisations. With this model, there will be greater alignment between the business unit’s objectives, its funding, and the activities of the innovation program.
Self-Aware Infrastructure. In a digital world run by infrastructure, machine learning is weakening the dependence on human supervision. Progressively, smart infrastructure will be able to self-govern, self-optimise and self-heal, resulting in highly optimised, fault-tolerant infrastructure. As cloud service providers continue to expand their footprint in the infrastructure market, we will see the first signs of self-aware infrastructure in 2020.
Machine Learning for All.
In 2020, we believe there will be more efficient algorithms to automate Machine Learning (AutoML). This will spur the adoption of AutoML at the enterprise level, helping non-tech firms access the capabilities to build ML applications quickly. This democratisation of machine learning will also make AI experts and data scientists more productive and advance the field of AI to new frontiers.
Several DCA Partners are now working within the concepts of the ‘Circular Economy’ – here at the DCA we thought we’d find out a little more about this and have been reading more about this subject.
The definition of the circular economy made us realise that it’s a very simple concept – a little bit like the ‘Circle of Life’ in The Lion King. Every manufactured product, on its demise, returns to its creator to be re-purposed and contribute to the next version of the product.
A circular economy follows the principles of 3R: reduce, reuse and recycle.
During our investigations we read an interesting case study that we would like to share with you.
Written by Deborah Andrews, Associate Professor of Design, and Beth Whitehead, Associate Sustainability Engineer, Operational Intelligence Ltd; here are the first few paragraphs, and a link to the entire case study can be found at the end.
The case study provides predictions for Data Centres in ten years’ time, key areas covered include:
Data Centres in 2030: Comparative case studies that illustrate the potential of Design for the Circular Economy as an enabler of Sustainability
By Deborah Andrews, Associate Professor of Design, and Beth Whitehead, Associate Sustainability Engineer, Operational Intelligence Ltd
During the 1980s the British engineer and computer scientist Sir Tim Berners-Lee developed a digital information and communication language and network, which subsequently evolved to become the World Wide Web in 1989. Since then the user group has expanded far beyond ‘geeks’, researchers and academics: over 4.2 billion people, 55% of the global population, are now ‘connected’.
While ‘devices’ (desk and laptop computers and mobile phones) serve as human-digital data interfaces, the hidden but critical enabler of connectivity is data centres (DCs). These facilities may be cupboard-sized or, like the largest in the world, equivalent in area to 93 football pitches, but all house digital data processing, networking and storage (ICT) equipment. Such is the popularity of the internet that since its launch the number of DCs around the world has grown to 8.6 million (Infiniti Research, 2015) with a total floor space of 180 million m2; 10 million m2 of which is in Europe, with 70% concentrated in North West Europe (NWE). The main concern of the DC industry is 100% uninterrupted operation for customers, and consequently focus within the sector has been on technical and product development, manufacture and operation, with limited consideration of treatment at end-of-life. This paper considers two potential scenarios and their impacts for the data centre industry (DCI) in 2030; the scenarios are speculative and are based on past and present trends in, and experience of working with, this unique sector.
Current and future growth in Connectivity and the Data Centre Industry
Such is the popularity and success of the internet that in Europe and the USA 85% and 95% of the population respectively are connected, and more and more businesses, education and other service providers are becoming increasingly reliant on connectivity; in Africa and Asia, even though the percentage of connected individuals is lower (36% and 49% respectively), population groups are much larger and consequently many more people are connected, thanks to cheaper mobile devices (Miniwatts Marketing Group, 2018). Patterns of internet use vary according to user age, location and affordability: in developed countries such as the UK, adults typically spend 4.75 hours per day online (IPA, 2018). In addition, data consumption has increased exponentially and concurrently with the number of work and leisure services on offer: for example, in 2016 the demand for data centre storage capacity increased by 1 Petabyte every day (Brewer et al, 2016). Growth will continue in order to process the increasing volume of data generated by the expansion of services via the Internet of Things (IoT), commerce, healthcare, education and leisure, alongside population and economic growth in countries such as China and India.
It is apparent that there are differences in connectivity according to geographical location but there are even more extreme examples: in Iceland 98% of people are connected while in Somalia and Eritrea connectivity is limited to 2% and 1% respectively. There is also a disparity among demographic groups and women, the rural poor and residents of remote islands ‘are substantially excluded from education, business, and other opportunities that the internet can provide’. Sadly, since 2007 growth in many developing countries has slowed due to a number of factors including: limited and/or no 3G, 4G and wi-fi infrastructure, and the cost of network access, smart phones and computers (A4AI, 2018). As a result, the connectivity gap between different social and national groups is growing.
Reliance on and demand for data centres will increase as more people, smart products and services are connected. In NWE alone, capacity will increase by 15%+ per year (300% in total) by 2025, and a global increase of 500% is predicted by 2030. DC operational energy consumption will rise concurrently to facilitate this growth, and even though DCs are becoming more energy efficient, it is predicted that by 2025 20% of global energy will be consumed by the sector (Andrae, 2017).
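As a rough sanity check on those figures, 15% annual growth does indeed compound to roughly 300% over the period (an illustrative calculation only, assuming the growth runs over the eight years from 2017 to 2025):

```python
# Illustrative check: 15% annual growth compounded over 8 years
# (assumed period: 2017 to 2025) roughly matches the quoted 300%.
annual_growth = 0.15
years = 8

total = (1 + annual_growth) ** years
print(f"{annual_growth:.0%}/year over {years} years -> {total:.0%} of current capacity")
# 1.15^8 is about 3.06, i.e. roughly 300% of current capacity
```

On the same assumption, the predicted 500% global increase by 2030 would correspond to a slightly lower compound rate of around 13% per year sustained over 13 years.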
Environmental and social impacts
At present the largest environmental impact from DCs derives from operational energy; this is being addressed by improved operational efficiency and the use of renewables. However, in view of the above growth the embodied impact of DCs must not be ignored. During overall DC building life (60 years) 15% of embodied environmental impact derives from the building and facilities while 85% derives from IT equipment (Whitehead et al, 2015). Impact is high because equipment is regularly refreshed (servers every 1-5 years, batteries every 10 years and M&E equipment every 20 years). Although specific sectoral data has not been published, the DCI is a significant contributor to the global total of 11.8 Mt/year of Waste Electrical & Electronic Equipment (WEEE), which is one of the fastest growing waste streams across Europe and the world.
DC equipment is typically composed of ‘common’ metals (steel, copper, aluminium, brass and zinc), polymers (ABS, HDPE, PUR, PVC, GPPS, PBT, EVA) and 12 critical raw materials (CRMs): Sb, Be, Cr, Co, Li, Mg, Pd, Si, Dy, Nd, Pr, Tb. They are vital for economic growth but risk to supply is high and is affected by: their abundance/scarcity in the earth’s crust; their geological and geographical location (which influences technical ease of extraction and political circumstances); current recycling rates; and potential substitution by more readily available materials. DC equipment comprises 99%+ ‘common’ metals and polymers and 0.2% CRMs; however, their importance cannot be overstated because electronics cannot work without them. Gold, tin, tantalum and tungsten are similarly essential to electronic products; they are identified as Conflict Minerals because they are produced in central Africa and specifically the Democratic Republic of Congo, where their (unethical) mining and sale funds armed conflict and political instability. The extraction processes of many of these and other materials also involve hazardous substances (e.g. arsenic, mercury, sulphides), and because much of their mining is unregulated and/or illegal the associated negative environmental and social impacts are high.
Read the full case study here: https://www.cedaci.org/publications
The Circular Economy
By Robbert Hoeffnagel, Green IT Amsterdam
Currently, only 10 percent of the so-called 'critical raw materials' used in data centres are recovered. If we want to further reduce the impact of data centres on the environment and our living environment, the percentage of devices and materials that are re-used or recycled will have to be drastically increased. That is why a group of companies, universities and other parties - including Green IT Amsterdam - are starting a research programme under the name 'CEDaCI' into circular models for data centres. Organisations from the four main data centre countries in Europe - the Netherlands, Germany, France and the United Kingdom - are participating in the project.
"North-West Europe - and in particular the UK, Germany, France and the Netherlands - is the EU's data centre hotspot," says Julie Chenadec, Project Manager at Green IT Amsterdam. "Servers and other hardware in data centres often have a replacement period of 1 to 5 years. This contributes substantially to the production of 11.8 megaton WEEE per year. These four letters stand for 'Waste Electrical & Electronic Equipment. This makes WEEE one of the fastest growing waste streams in the European Union”.
This waste contains so-called critical raw materials (CRMs): raw materials that are of great technological and economic importance and whose supply is vulnerable to interruption. "With the CEDaCI project we facilitate the creation of a circular economy for data centres in North-West Europe. This circular economy reduces the impact of data centres on the environment. This will be possible if we are able to recover more raw materials, reduce the use of new raw materials and develop a safe and economically healthy chain for critical raw materials."
Currently, only 10% of critical raw materials are recycled and recovered. CEDaCI aims to increase this to 40% (107 tonnes) by the end of the project in 2021, and further to 242 tonnes of WEEE after 10 years.
"At the moment, the greatest environmental impact of data centres comes from the substantial use of energy," says Chenadec. "This is being addressed through improved operational efficiency and the use of renewable electricity generation technologies. However, given the enormous growth, the impact of data centres on the availability of resources such as the critical raw materials mentioned should not be overlooked”.
Over the lifetime of a data centre an estimated 15 percent of the environmental impact comes from the building and its installations, while 85 percent comes from IT equipment. The impact is high because equipment is typically renewed every 1 to 5 years. "Although accurate data is not published, the data centre industry makes a significant contribution to the global total of 11.8 million tonnes of waste electrical and electronic equipment (WEEE)," says Chenadec. "This is one of the fastest growing waste streams in the EU. WEEE contains critical raw materials of high economic importance and vulnerable to supply disruption. In addition, production is energy-intensive and thus contributes to the environmental impact of the sector”.
Both the speed and volume of growth of 'digital waste' is unprecedented, but this is not accompanied by the development of a recycling infrastructure. Moreover, it is clear that the reuse of components, as well as the recycling and reuse of materials, is low.
Chenadec: "Currently, recycling of WEEE in North-West Europe is limited to 26.9 percent in the United Kingdom, 26.3% in France, 36.9% in Germany and 38.1% in the Netherlands. A large part of the remaining equipment is exported and reprocessed or sent to landfills. These exports waste millions of tonnes of valuable resources from this sector every year or are no longer accessible. While some of these substances are dangerous and have harmful effects on the environment and the living environment. Yet these materials are often simply considered as 'waste'. It is important that these critical raw materials remain available or become available for reuse, precisely because access to them is threatened and substitution by other materials is currently not feasible".
For Green IT Amsterdam CEDaCI is more or less the successor of the ReStructure project. The latter project was purely Dutch and aimed to map the entire chain involved in the responsible use and disposal of IT and other data centre equipment. "ReStructure also looked at the possibilities of creating digital marketplaces where used data centre equipment can be sold or bought," explains Chenadec.
Robbert Hoeffnagel is communications manager at Green IT Amsterdam
Mat Jordan – Procurri
We all try to ‘do our bit’ for the environment where we can but when it comes to business, is the circular economy yet in force?
‘The circular economy’ refers to the abandonment of the traditional linear model of make, use and dispose; instead, the usage of resources is maximised to increase their value, and products are recovered, recycled or regenerated at the end of their life.
Procurri subscribe to the circular economy and are one of very few innovators offering such services within the data centre sector. Whilst it seems the logical thing to do, there’s no doubt that the circular economy is still not the most commonplace solution for data centres, but it’s growing in popularity; and once it’s understood what can be done, it fast becomes a no-brainer for businesses.
Procurri’s lifecycle services for data centres incorporate many more options than the industry standard, offering more opportunity for businesses to enhance their corporate social responsibility by recycling, reusing and extending equipment use. The following three unique services work to the benefit of the business, its customers and the environment.
A great deal of unnecessary e-waste is produced as a result of companies upgrading their data centre hardware for no reason other than that it is soon to be out of warranty or no longer supported by their vendor. Whilst this may cause concern within your firm, there’s no need to operate unsupported – Procurri offer comprehensive, vendor-neutral support through a team of product experts deployed across the globe. What’s more, support is available 24/7 and in multiple languages through a single touchpoint, and can include parts planning should it be required for the installation of new or like-for-like hardware.
IT Asset Disposition
ITAD is all too often taken lightly and considered only as an afterthought by businesses, because the equipment they’re paying to dispose of is simply no longer of benefit to them. In reality, disposing of hardware should only be done with the utmost care and consideration: ensuring data is properly and securely cleansed and removed, relevant legal requirements and legislation are followed, and the equipment itself is disposed of in as eco-friendly a way as possible.
By managing ITAD as a full end-to-end process from verification right through to the ultimate disposal and destruction or recycling and resale of equipment, Procurri are able to continue the circular economy and not ‘break the loop’ by simply destroying hardware without any consideration as to the potential alternatives.
Procurri offers the ability to sell data centre hardware as well as to buy it, so you’re able to gain an income stream from an otherwise redundant piece of kit. Trade-in and buy-back programmes work to allow businesses to generate income from hardware once it is no longer of use to them.
Procurri’s hardware resale portfolio includes a wide range of enterprise servers, storage and networking products from vendors of all shapes and sizes. The system is popular amongst those looking to buy because of its hassle-free global inventory system – deploying equipment quickly and easily wherever it’s needed. This popularity means that we always need to re-stock, so if your data centre hardware is at the end of its useful life to you, but you think it may be of use to someone else, get in touch to discuss options!
By Rich Kenny, IT Director at Techbuyer
When talking about the Circular Economy, it’s always good to start with the “why”, so here goes... Earth Overshoot Day (the date by which humanity’s demand for ecological resources exceeds what Earth can regenerate in that year) is 29th July in 2020, but this is a global average. In most developed countries the date falls a lot earlier (mostly clustered between March and May), meaning we are using over double our budget when it comes to the environment. Unless we change our approach to one of reusing, repurposing and recycling, there will be serious consequences. The other side of the coin is that transitioning to a circular economy will improve the security of raw materials supply, increase competitiveness, stimulate innovation, boost economic growth and create 580,000 jobs in the EU alone, according to the European Parliament.
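A back-of-the-envelope way to read an Overshoot Day date (a simplified sketch, not the Global Footprint Network’s full methodology) is to divide the length of the year by the day on which the annual budget runs out:

```python
import datetime

def implied_earths(overshoot_date: datetime.date) -> float:
    """Naive estimate: if the annual ecological budget is spent by
    day N, consumption runs at roughly 365/N times what one Earth
    can regenerate in a year."""
    day_of_year = overshoot_date.timetuple().tm_yday
    return 365 / day_of_year

# 29 July is day 211 of (leap year) 2020: humanity is living as if it
# had roughly 1.7 Earths; a March/April date implies more than 2.
print(round(implied_earths(datetime.date(2020, 7, 29)), 2))
```

This simple ratio is why a March-to-May overshoot date, as seen in many developed countries, corresponds to “over double our budget”.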
The Carbon Cost of Digital
A significant proportion of digital’s carbon footprint is tied to the manufacture of our hardware. Around 121 million servers are due to be deployed between 2019 and 2023. Each one of these contains a high proportion of steel, aluminium and plastic, three of the top five materials for industrial greenhouse gas emissions worldwide. In addition, servers and other IT hardware contain copper, gold and 12 of the 27 materials identified by the EU as in low or politically unstable supply.
According to the European Commission’s JRC Science and Policy Report, the percentage of materials that are recoverable by conventional recycling technologies ranged from 0% to 93%. Many rare earths are in the “zero” category, which may be why there have been moves towards deep-sea mining as an alternative. Neither solution seems optimal, and the second raises the possibility of further environmental harm.
Shining a light on materials usage
As a company that specialises in buying, refurbishing and selling servers, storage and networking equipment, Techbuyer has a strong interest in, and knowledge of, materials usage. In January 2020, we became an Associate Partner in the CEDaCI project, a three-year collaboration between industry and academia covering the entire supply chain for equipment in the data centre sector. Running across France, Germany, the Netherlands and the UK, it aims to provide solid data on materials usage, increase recovery rates and provide a decision-making tool for upgrades and refreshes.
Materials usage in the sector is an important issue given the high refresh rate in many data centres. However, finding reliable information on it is no easy feat. For one thing, Original Equipment Manufacturer (OEM) servers are assembled using components from a wide variety of suppliers, not all of whom publicly release the materials contained within them. In addition, there is the myriad of makes and generations on the market at any one time, all of which vary slightly. There is good information from organisations such as Deloitte, but it is relatively old; given the pace of change in manufacturing technology, it is likely outdated by now.
What about Carbon Emissions?
In amongst all this is the energy question. $1 invested in digital technology in 2019 resulted in 37% more energy consumption than it did in 2010. CO2 emissions from the sector have risen by around 450 million tons since 2013 in OECD countries, whereas global emissions decreased by 250 million tons in the same period. A significant amount of this is Scope 3 emissions from the pre-use phase. However, emissions at the use phase cannot be completely ignored. While most of the hyperscalers are striving toward 100% carbon-neutral energy mixes, not all of the smaller players are able to follow suit yet.
Running alongside this is the data we have on server refresh and the impact of energy efficiency, which we know is significant. A recent study from the Uptime Institute shows that ageing IT kit (older than five years) accounted for 66% of IT energy use but contributed just 7% of the compute capacity across 300 sample data centres. As the sector accounts for around 20% of the digital contribution towards greenhouse gas emissions, which in turn account for 3.7% of total global emissions, this is significant. By 2025, the impact of the digital sector is expected to rise to 5.5%, and possibly 8% in the worst-case scenario. This means we need to do everything we can both to save on manufacturing emissions and materials and to optimise efficiency at the use phase… which is no easy feat.
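To see why that 66%/7% split matters, compare the energy consumed per unit of compute by the old kit with the rest of the estate (a simple illustrative calculation based on the Uptime Institute figures quoted above):

```python
# Uptime Institute figures quoted above: kit older than five years
# used 66% of IT energy but delivered only 7% of compute capacity.
old_energy, old_compute = 0.66, 0.07
new_energy, new_compute = 1 - old_energy, 1 - old_compute  # 0.34, 0.93

old_ratio = old_energy / old_compute  # energy per unit of compute, old kit
new_ratio = new_energy / new_compute  # energy per unit of compute, newer kit

print(f"Old kit uses {old_ratio / new_ratio:.0f}x more energy per unit of compute")
```

On these figures the five-year-old estate is roughly 26 times less energy efficient per unit of compute than the newer kit around it, which is the case for refresh (or component-level upgrade) in a nutshell.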
Balancing the books
Help is at hand in the form of recent research carried out by Techbuyer in partnership with the University of East London. Beginning with the premise that refreshing IT hardware is an environmental as well as a business imperative, we set up a Knowledge Transfer Project co-funded by Innovate UK in order to discover the best solution on this from a performance, efficiency and environmental standard. In blunt terms, we wanted to find out if data centres need the latest and greatest in order to reap the full benefits on the energy bill and bottom line.
The answers we found were interesting. For one thing, we demonstrated that there is no discernible difference when comparing like-for-like refurbished and new equipment. For another, we demonstrated that a previous generation of server was able to outperform the latest generation in terms of both performance and energy efficiency with the addition of extra RAM and an upgrade to the processor. While this won’t apply to pre-2015 servers, it does demonstrate that performance gains can be made by upgrading existing hardware at component level, saving on materials use without losing out on energy.
Our initial findings are groundbreaking because they prove that the right approach to systems can yield great results when it comes to efficiency, compute power and the bottom line. This is particularly important because the energy intensity of digital is predicted to rise by 4% per year, in contrast to global GDP’s energy intensity, which is reducing by 1.8% year on year. If our sector adopts more sustainable solutions, we could make a massive difference overall.
One of the best things to come out of our work with the University of East London is a confidence in component-level upgrades. We are about to apply a similar mindset to laptops. The average lifecycle of these is around three years, by which time the hardware is often outpaced by software performance gains. Upgrading at component level rather than buying a complete new machine saves significantly on outlay as well as keeping resources in use for as long as possible. We are confident we can push a laptop’s lifecycle up to six years; something that environmental think-tanks such as the Shift Project recommend. It will benefit the bottom line as well as the environment, and that is a big part of what the Circular Economy is all about.