The January issue of Digitalisation World contains a substantial number of predictions for 2020 and beyond – following on from a similar series of articles in both our November and December 2019 issues. The good news is that…the February issue will include/conclude the predictions series. So, by then you’ll have a substantial quantity of ideas and opinions which will, hopefully, help you make your own decisions as to which technology developments and business trends are worth investigating and/or implementing during the coming year.
If I might be allowed to add a prediction of my own? Well, I just do not believe that the hype around 5G will turn into serious deliverables for quite some time. By this I mean that, while 5G may well find deployments within organisations – whether in the office or on the factory floor – the wholesale rollout of 5G infrastructure across cities, regions and, ultimately, whole countries is some way off, primarily because of the expense of building out the required infrastructure against a fairly unpredictable return.
Of course, it may be that our friends the hyperscalers come up with some killer apps and decide that, rather than wait for the telcos to do the expensive ‘heavy-lifting’, they’ll build out the infrastructure themselves, but it’s not immediately obvious what those apps might be.
Right now, as I may well have mentioned before, when I make the drive home from London to Wiltshire, as soon as I leave the M4 in Berkshire and join the A4 (still a major road) my mobile phone signal varies from an unreliable 4G, through 3G and GPRS to absolutely nothing. If the telcos don’t think it financially worth their while to provide decent mobile phone connectivity right now, why would they go to the expense of building out a 5G network for no obvious return?
And don’t be fooled into thinking that autonomous vehicles will be that killer app. The more I think about it, the less I can see AVs ever gaining serious traction, unless – and this is a big unless – the insurance industry has a massive overhaul, and the automotive industry is prepared to cannibalise itself – or a new breed of vehicle manufacturers springs up.
So, the multi-billion car industry is going to stand by and watch as pretty much commoditised AVs replace the massive variety of vehicles currently available? They are barely accepting of electric vehicles…
However, even if AVs do gain momentum, there is still the issue of responsibility. The ‘dream’ AV experience is supposed to be one where a central pool of vehicles can be called upon when required. So, a vehicle is ordered for a night out, duly collects and delivers passengers to their required destination, and picks them up at the end of the night and drops them off home. And if a few drinks have been taken, then no worries, because the AV has everything under control.
Right now, AI-assisted vehicles do a great job of helping drivers manage their speed, their parking and various other tasks, but will any AV manufacturer, or, more importantly, an insurer, ever remove any shred of responsibility from the driver? Unlikely.
Oh, and as above, who’s going to build the required, reliable, resilient control network for those stretches of road which are used by a handful of cars a day?
And we haven’t even started to talk about the security aspect.
The vast majority (92%) of debt and equity investors surveyed expect the overall value of investment into Europe’s data centre infrastructure to increase over the next 24 months, according to research commissioned by DLA Piper and published today.
Data centres are used by organisations for the remote storage, processing and distribution of large amounts of data, and are currently estimated to use 3-4% of the world’s power. According to DLA Piper’s report, European Data Centre Investment Outlook: Opportunities and Risks in the Months Ahead, investors anticipate an investment increase in data centres of between 10% and 29% over the next two years.
Data from Acuris in the report shows that the first half of 2019 has already seen a notable rise in investment - with EUR1 billion flooding into the data centre market in H1 alone, compared with a total of EUR1.5 billion for the whole of 2018.
Data centre investment levels in Europe have been affected by Brexit uncertainty. All respondents agreed that Brexit has negatively impacted the data centre infrastructure market since June 2016, with 56% of equity investors going as far as to say that the negative impact has been ‘significant’. On the flip side, the continuing weakness of sterling means UK assets may look like a bargain for Eurozone investors.
In an increasingly interconnected world, with an ever-expanding need for data storage facilities, respondents are expecting rent charges to increase for data centres with superior technology, with over a third expecting the increase to be 10% or more.
The majority of respondents chose Germany as the European country that will see the biggest growth in data centre project investment over the next 24 months. Investors also expect the UK to see some of the biggest investment growth in the industry, followed by the Netherlands and France.
Commenting on the findings, partner and head of the Infrastructure sector, EMEA and Asia Pacific, at DLA Piper, Martin Nelson-Jones, said: “Investment into European data centres has spiked recently, with transaction values reaching a new high. Figures for the first half of 2019 suggest strongly that another record year could be in sight. While not without risks, data centres are attractive to many infrastructure investors.”
Intellectual Property & Technology partner at DLA Piper, Anthony Day, said: “What makes data centres so attractive to many investors? Strong fundamentals help. While data centre investment can involve a higher level of risk as compared to other types of infrastructure assets, demand for big data, cloud computing, artificial intelligence and the Internet of Things is rising significantly. The macro trend is that these technologies drive significantly increased demand for data and digital services and, by extension, the buildings and equipment that make them possible.”
Survey with IDG reveals that empowering business units has led to more complex audits, unchecked costs and security vulnerabilities.
In a new report from IDG Connect and Snow Software, 67% of IT leaders said at least half of their spend is now controlled by individual business units. While most believe this is beneficial for their organisation, it presents new challenges when combined with increased cloud usage – 56% of IT leaders are concerned with hidden cloud costs and nearly 90% worry about the prospect of vendor audits within cloud environments. The survey, conducted to understand how the rise of infrastructure-as-a-service (IaaS) and democratised IT spending is impacting businesses, found that more than half of IT leaders expressed the need to gain better visibility of their IT assets and spending across their organisation.
Business Units Control a Significant Share of Tech Spend – Which is a Mixed Bag
Traditionally, technology purchasing and management was controlled by IT departments. The cloud and as-a-service models shifted this dynamic, enabling employees throughout the organisation to easily buy and use technology without IT’s involvement. IT leaders are embracing this trend, with 78% reporting that the shift in technology spending is a positive for their organisations. But decentralised IT procurement also creates new complexities for organisations as they try to manage their increasingly diverse IT estates.
The IT leaders in the study voiced concern about the shift in spending to business units. In fact, more than three-quarters (78%) said audit preparation is growing increasingly complex and time consuming.
Executives are Justified in Worrying About Audits
Results suggest that annual audits are now the rule rather than the exception – 73% of those surveyed said they have been audited by at least one software vendor in the past 12 months.
When asked which vendors they had been audited by within the last year, 60% said Microsoft, 50% indicated IBM and 49% pointed to SAP. Such enterprise software audits can put a tremendous strain on internal resources and result in six, seven and even eight-figure settlement bills.
The vast majority of IT leaders surveyed said they are concerned about the looming possibility of audits, specifically when it comes to IaaS environments. When asked if the thought of software vendor audits for licensed usage on the IaaS front worries them, 60% responded “yes, very much so” and 29% said they are somewhat concerned.
The Roles and Requirements for IT Have Changed
Survey respondents also voiced concern that with the decentralisation of IT spending within their organisations, they will be held responsible for something they currently can’t control. More than half (59%) said that in the next two years they need to gain better visibility of the IT estate. Just slightly less than that (52%) said in that same timeframe, they would have to obtain an increased understanding of who is spending what on IT within the larger organisation.
“As the research highlights, the shift to cloud services coupled with democratised technology spend is fundamentally changing the way businesses and IT leaders need to operate,” said Sanjay Castelino, Chief Product Officer at Snow. “Empowering business units to get the technology they need is largely a positive development, but it creates challenges when it comes to visibility and control – and that can put organisations at risk of having problematic audits. It is more important than ever for organisations to have complete insight and manageability across all of their technology in the IT ecosystem.”
Over half of European manufacturers are implementing AI use cases in the sector, with Germany a frontrunner at 69% AI adoption versus the US at 28% and China at 11%.
A new report from the Capgemini Research Institute highlights that the European market is leading in terms of implementing Artificial Intelligence (AI) in manufacturing operations. 51% of top global manufacturers in Europe are implementing at least one AI use case. The research also analyzed 22 AI use cases in operations and found that manufacturers can focus on three use cases to kickstart their AI journey: intelligent maintenance, product quality control, and demand planning.
Capgemini’s report entitled ‘Scaling AI in manufacturing operations: A practitioners’ perspective’ analyzed AI implementation among the top 75 global organizations in each of four manufacturing segments: Industrial Manufacturing, Automotive, Consumer Products and Aerospace & Defense. The study found that AI holds tremendous potential for industries in terms of reduced operating costs, improved productivity, and enhanced quality. Top global manufacturers in Germany (69%), France (47%) and the UK (33%) are the frontrunners in terms of deploying AI in manufacturing operations, according to the research.
Key points from the report include:
AI is being utilized and making a difference across the operation value chain
Leading organizations are using AI across manufacturing operations to significant benefit. Examples include food company Danone which has succeeded in reducing forecast errors by 20% and lost sales by 30% through using machine learning to predict demand variability. Meanwhile, tire manufacturer Bridgestone has introduced a new assembly system based around automated quality control, resulting in over 15% improvement in uniformity of product.
Manufacturers tend to focus on three main use cases to kickstart their AI journey
According to the report, manufacturers start their AI in operations journey with three use cases (out of 22 unique ones identified in the study) as they possess an optimum combination of several characteristics that make them an ideal starting point. These characteristics include clear business value, relative ease of implementation, and availability of data and AI skills, among others. Executives interviewed by Capgemini commented that product quality control, intelligent maintenance, and demand planning are areas where AI can be most easily implemented and deliver the best return on investment. For instance, General Motors piloted a system to spot signs of robotic failures before they occur. This helps GM avoid the costs of unplanned outages, which can be as high as $20,000 per minute of downtime. While there is consensus on which use cases are best to get started with AI in operations, the study also points out the challenge of scaling beyond the first deployments and then systematically harvesting the potential of AI beyond those initial use cases.
“As implementation of AI in manufacturing operations matures, we will see large enterprises transitioning from pilots to broader deployment,” said Pascal Brosset, Chief Technology Officer for Digital Manufacturing at Capgemini. “Quite rightly, organizations are initially focusing their efforts on use-cases that deliver the fastest, most-tangible return on investment: notably in automated quality inspection and intelligent maintenance.
“The executives we interviewed were clear that these are functions which can deliver considerable cost savings, improve the accuracy of manufacturing and eliminate waste. However, the leaders do not solely focus on these use cases but, in parallel with their deployment, prepare for the future by reinvesting part of the savings into building a scalable data/AI infrastructure and developing the supporting skills,” he added.
The market for extended reality devices shows the typical signs of early volatility, but the long-term outlook for the technology remains positive.
The market for extended reality products, which comprises virtual reality (VR) and augmented reality (AR) devices, will enjoy 21% growth in 2019, to slightly more than 10 million devices, according to the latest forecast by technology analyst firm CCS Insight. Marina Koytcheva, vice president of forecasting, notes, "Although this growth rate might seem disappointing for a market that has had so much hype, it should be assessed pragmatically. Right now, there are still just a handful of successful devices, and a lot rides on every new iteration or new model brought to market by the established players Sony, HTC and Oculus".
CCS Insight believes that Sony is on track to record a solid 2019, and Facebook has also started gaining meaningful revenue from its Oculus devices. HTC, however, despite leading in the premium end of the market, looks set for less growth in 2019. But Koytcheva notes, "It's still early days in the nascent VR market and I have positive expectations for this exciting product category".
The recently published forecast projects that market demand will grow sixfold to 60 million units in 2023. Leo Gebbie, senior analyst at CCS Insight, covering wearables and extended reality devices, explains, "Our optimism is supported by our consumer research of early technology adopters: those who don't yet own a VR device show a strong willingness to buy an extended reality device within the next three years, particularly as more attractive and affordable products with richer content and experiences become available. I believe this is particularly good news for entry-level devices like the standalone Oculus Go headset, which retails at less than $200".
CCS Insight also notes that the variety of VR devices is growing. The company expects new products in 2020 from Oculus, HTC and others, including the highly anticipated Sony PlayStation 5, which is widely expected to be accompanied by an updated PlayStation VR headset.
A further boost to the VR market is coming from the advent of 5G mobile networks. In South Korea, currently the most advanced 5G market in the world in terms of adoption, mobile operators have successfully positioned 5G as a technology that can deliver an attractive VR experience. "Operators in other markets should be encouraged by the success of their South Korean counterparts, and we expect more consumer offerings to be built on the coupling of VR and 5G in the near future", says Koytcheva.
China will also play an important role in the growth of the VR market. The Chinese government's initiative to be a global leader in VR and AR technologies by 2025 will have an impact in the near future. Education is being targeted as one of the major sectors where VR should be adopted. This tallies with CCS Insight's prediction that extended reality will become a standard educational tool in schools in at least two countries by 2025.
"All these trends support our view that the VR market is just heating up. Some quarters will be better than others, but the direction of travel is onward and upward", concludes Koytcheva.
When it comes to AR, adoption by business users is picking up, although the numbers are currently very small. CCS Insight expects about 150,000 AR devices will be sold globally in 2019.
Adoption of AR devices in logistics and remote assistance continues to rise, and other industries are also starting to follow suit, with important vertical markets such as medicine, entertainment and travel beginning to show signs of growth. Importantly, end-to-end AR solutions, consisting of hardware, software and support, are also improving, making adoption much easier than it was a couple of years ago.
The consumer market for smart AR glasses, however, is still a few years away. Gebbie comments, "We've seen some very exciting products starting to emerge, but they're generally prototypes or early iterations of future device designs. It will be some time before mass-market products capable of delivering significant volume hit the market".
CCS Insight also notes that component miniaturization remains a major challenge. Unlike warehouse workers and other enterprise users, consumers are unlikely to be willing to accept anything heavier than and much different in appearance from traditional eyeglasses in their everyday lives. However, Gebbie concludes, "We're confident that this technology will improve rapidly, and predict that by 2022 a major consumer electronics brand will enter the consumer smart glasses market, opening the gates for strong growth in years to come".
98% say improved mobile access would benefit business outcomes.
Domo has released new research that finds business executives want greater access to company data on their mobile devices to power digital transformation across the enterprise. In fact, 98% of respondents believe that improved mobile access to decision-making data would benefit business outcomes.
Decision makers increasingly use mobile devices to conduct business but they often struggle to access the most up-to-date information while on the go. According to the study conducted by Dimensional Research, 96% of teams say it would help decision-making if stakeholders at all levels of the organization had access to up-to-the-minute data on their phones. 87% want to instantly share reports with their team and collaborate on their phones.
According to 2018 McKinsey & Company research, deployment of mobile internet technology has emerged as the most impactful technology deployed in successful digital transformation efforts across organizations of all sizes. McKinsey’s research found that 68% of organizations that deployed mobile internet technologies reported successful digital transformations compared to 53% of companies that did not.
“When teams have immediate access to their data, they can make better and more timely decisions,” said Josh James, founder and CEO, Domo. “The global mobile workforce is expected to reach 1.87 billion workers by 2022, and Domo is the best and easiest platform for customers to have the same experience with real-time data on a mobile device as they do on their laptop. Business leaders are spending more time away from their desks so that means they need data on their mobile devices to make key decisions on the go.”
The vast majority (94%) of environmental services managers in councils, local authorities, government or infrastructure agencies admit their organisation is yet to digitalise operations. Budget restrictions (47%) and outdated systems architecture (50%) are seen as two of the main barriers to digitalisation.
Nearly three quarters (72%) of those surveyed noted, however, that they expect the pace of digital transformation to accelerate in the next three to five years. Cloud architectures are expected to have the biggest impact in shaping the future of environmental services over that timeframe (42%), while mobile technologies were referenced by more than a third (35%) of respondents.
Artificial intelligence and machine learning were regarded as impactful by a third of the sample (33%), while Internet of Things (IoT) technologies, including sensors were close behind, with 25% of respondents citing them among the technologies likely to have the most impact in shaping the future of environmental services over the next three to five years.
Tim Woolven, product consultant at Yotta, said: “While most environmental services teams are well underway on their journey to digital or even nearing completion, there is still work to do. Outdated technology systems are seen as one of the main barriers but we would expect that to change over the coming years as mobile systems and cloud architectures become ever more pervasive and advanced technologies like AI, sensors and machine learning continue to mature”.
The study discovered that just under half (48%) of those polled expect improved operational efficiency to be one of the main benefits digital transformation would bring to their organisation’s environmental services provision in the future. Further to this, 44% cited ‘better employee morale’, clearly showing the importance of giving a workforce the latest technology to assist them in their roles. In total, 39% highlighted ‘enhanced data quality’ as a main benefit.
Major global data centre markets are seeing soaring construction costs as development in new and emerging hubs continues to heat up, according to research from global professional services company Turner & Townsend.
The Data Centre Cost Index 2019 highlights the intensification of investment in leading locations in the global data centre network as a trigger for escalating costs. Globally, over 40 per cent of markets surveyed are showing 'hot' construction conditions – where competition for supply chain resources is putting pressure on budgets.
The research analyses input costs – including labour and materials – across 32 key markets, alongside industry sentiment and insight from data centre professionals.
The 2019 report points to the rise of new hotspots across the globe as technological investment in emerging economies takes hold. In Nairobi, Kenya, average build costs stand at US$6.5 per watt on the back of investment required to meet the government’s focus on digitisation of the economy and in response to the arrival of tech giants.
Cost pressures are contributing to the growth of secondary markets in key geographies – including in the US and Europe. In California, Silicon Valley has risen to be the third most expensive place to build globally at a rate of US$9.4 per watt – with unprecedented construction market conditions. Inter-state competition to attract hyperscale investment in the US continues, and the study indicates that construction costs in both Dallas and Phoenix (US$7.4 and US$7.1 per watt respectively) are favourable over the world’s largest data centre market of Northern Virginia, where costs stand on average at US$8 per watt.
European markets are seeing a significant shift, with capital costs of hyperscale development in the dominant markets of Scandinavia - in Stockholm (US$8.6 per watt) and Copenhagen (US$8.5 per watt) - now exceeding those of the established FLAP markets of Frankfurt (US$7.6 per watt), London (US$8.5 per watt), Amsterdam (US$7.8 per watt) and Paris (US$7.7 per watt). Zurich remains the most expensive market to feature in the report and is also expected to be one of the hottest markets in Europe in 2020.
In this environment, respondents to Turner & Townsend’s survey view delivering within budget as a critical challenge, with 90 per cent seeing this as more important than innovation.
Global demand for new space looks set to continue into 2020, with just nine per cent of respondents to the research believing that data centre demands have been met in their markets in the last year – down from 12 per cent in 2018. 70 per cent of those surveyed highlighted the impact of data sovereignty and data protection acts – including those being brought in by the EU, Switzerland and Kenya – as a major catalyst for demand.
The most significant limiter on growth over the next five years is seen as availability of power, especially in the context of pressure on the industry to decarbonise. In Turner & Townsend’s survey, the industry is split 50:50 on whether technological advances with solid state batteries alongside green energy sources can render traditional fossil fuel generators obsolete.
Dan Ayley, global head of hi-tech and manufacturing at Turner & Townsend, said: “Data continues to be one of the most valuable commodities. As deals get bigger and more profitable, we are seeing investment in both established hot spots and emerging markets heat up – putting pressure on cost and resources.”
“Although our report points to certainty in delivery as the key issue for the sector across global markets, sustainability is one of the most pressing challenges coming down the track. With power density requirements for data centres increasing by as much as 50 per cent year on year, demonstrating steps towards decarbonisation needs to be a priority for how hubs are conceived, built and operated across their lifecycle.”
Leaseweb USA, a leading hosting and cloud services company, has released the results of its “Developer IaaS Insights Study,” based on a survey conducted at DeveloperWeek Austin. The research revealed that 61.4% of companies see hybrid cloud (31.6%) or private cloud (29.8%) as the infrastructure for the future of their company, and 75.9% of developers prioritized scalability, speed, ease of use and cost as top factors when choosing their IaaS hosting solution.
“As companies evolve, their hosting needs and capabilities also evolve,” said Lex Boost, CEO of Leaseweb USA. “Understanding why companies choose to migrate to an IaaS solutions vendor provides insight not only into the marketplace, but also into the value vendors can bring to businesses. This survey is a microcosmic example of the current industry trend. The power, speed, flexibility and functionality of dedicated, hybrid and private cloud infrastructure environments are undeniable. Companies are shifting back to custom solutions designed to fit their exact needs, in this precise moment of their company lifecycle. The mettle of metal cannot be ignored.”
The results are particularly significant when considering that less than 20% of respondents believe they are using the industry standard, are happy with the performance of their infrastructure and have no plans to change. Further, 51.7% plan on migrating in the next two years while 26.7% have not yet made a decision as to whether they will migrate in the same timeframe. Clearly, most companies are reevaluating their DevOps infrastructures, and will either be migrating to solutions or considering solutions that more thoroughly meet their hosting and infrastructure requirements.
Barriers to Migration
To assist developers looking for the right fit for their company’s DevOps infrastructure needs, providers must address the cost of migration and the size of the job, which emerged as the most prohibitive barriers. Cost of migration or the size of the job were cited as top barriers to migration by 37.5% of companies, while 24.2% of respondents identified cost alone as the top barrier.
The survey also revealed that 15% aren’t sure what to outsource, 8.3% are not finding the right partner and 3.3% feel held to ransom by public cloud providers.
Global report examines latest trends in business communications and how they affect employee productivity and the bottom line.
Mitel has published the results of its latest global report on workplace productivity and business communications trends. The independent research study - conducted by research firm Vanson Bourne with advisement and analysis from KelCor, Inc - surveyed 2,500 business professionals in five countries across North America, Western Europe and Australia to examine overarching trends in business communications and how existing communications and collaboration practices are impacting both workforce productivity and the bottom line.
The report also looked at which methods of communications and collaboration employees find most efficient and effective, offering helpful insights for IT and business leaders looking to ensure digital transformation initiatives and related technology investments are successful.
Given the significant waste surrounding resource costs and time, it’s no surprise that 74% of respondents felt that more effective use of technology within their organization would improve their personal productivity. However, the report brings to light several interesting findings on what is causing communications inefficiency and what organizations should consider to reduce it.
Businesses have an opportunity to lessen the impact of lost productivity by aligning their tools, processes and culture to achieve better results. When it comes to communications and collaboration, clear leadership, improved planning, effective training and employee education on the goals and benefits of the tools could have a strong positive impact on adoption and ROI.
Training remains a critical form of defence against cyber-attacks.
Organisations are leaving themselves unnecessarily exposed to significant security risks. This is according to data from Databarracks, which reveals that over two-thirds of IT decision-makers believe their employees regularly flout internal IT security policies.
With industry practitioners speculating on how the cyber security landscape will evolve in 2020, Peter Groucutt, managing director of Databarracks, highlights why training is still a critical form of defence against cyber-attacks.
“People are often the weakest link in the information security chain and to prevent your organisation being caught, it’s important you make employees aware of the risks. Our research has revealed two-thirds (67 per cent) of IT decision-makers believe their employees regularly circumvent company security policies.”
Groucutt continues, “Employees flouting security policies are never deliberately threatening the business – either they don’t know the possible consequences of their actions or feel too restricted by the policies in place. In any case, this neglect for security leaves an organisation exposed to threats.
“To reduce the danger, there are practical steps an organisation can take. Firstly, to develop a culture of shared responsibility, where the cyber security burden doesn’t just rest with the IT department. We understand this in the physical working environment – an unknown person would not be allowed to walk into an office and start taking belongings unchallenged – so why should digital security be any different?
“Secondly, lines of communication between the IT department and the rest of the business need to improve. For users to feel like they are part of the solution, they need to be aware of the ongoing battle IT face. Often, IT teams handle incidents in the background with only key senior individuals being informed, but if threats aren’t communicated internally to all employees, they won’t know how to change their behaviour in future. The IT department has a responsibility to educate the entire business on why an incident took place, what the implications were and, most importantly, what can be done to prevent this from happening again.”
Groucutt continues, “When security processes hinder an employee’s performance, they will often find a way around them to get a job done quicker. To avoid staff taking the easy route, security must be built into an organisation’s overall strategy and communicated down through employees’ objectives. Equally, IT needs to be receptive when policies are flagged as too restrictive. That creates a dialogue and an understanding of a shared goal for IT and users.
“Finally, regular training and education are vital. Awareness training is typically carried out only annually or as part of an initial induction, but this should be increased. Employees need ongoing security refreshers throughout the year, at least twice annually, to address new threats and keep security front of mind.”
Almost a quarter of security leaders are experimenting with quantum computing strategies, as more than half worry it will outpace existing technologies.
More than half (54%) of cyber security professionals have expressed concerns that quantum computing will outpace the development of other security technologies, according to new research from the Neustar International Security Council (NISC). Keeping a watchful eye on developments, 74% of organisations admitted to paying close attention to the technology’s evolution, with 21% already experimenting with their own quantum computing strategies.
A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cyber security professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years. Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true ‘quantum supremacy’ will never happen.
Despite expressing concerns that other technologies will be overshadowed, an overwhelming majority (87%) of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious, believing the technology would create more harm than good.
“At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years,” said Rodney Joffe, Chairman of NISC and Security CTO at Neustar. “Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we’ve ever seen.”
“For both today’s major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately. The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organisation holding sensitive data should have quantum on their radar. Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity,” added Joffe.
The latest NISC report also highlighted a steep two-year increase on the International Cyber Benchmarks Index. Calculated based on changes in the cybersecurity landscape – including the impact of cyberattacks and changing level of threat – November 2019 saw the highest score yet at 28.2. In November 2017, the benchmark sat at just 10.1, demonstrating an 18-point increase over the last couple of years.
Rethink Technology Research finds that smart building penetration of the total commercial and industrial building stock will reach only 0.49% by 2025, leaving huge room to grow.
The smart building market will grow to $92.5bn globally by 2025, according to our research, up from around $4.2bn in 2019, with the primary driver being the desire to improve the productivity of the workers housed within those buildings’ walls. In most instances, no matter how you slice it, when you look at the cost of occupying a building per square meter, human capital is almost always the largest single component.
To this end, if you want to use smart building technologies to save costs or increase margins, the main use case you should be targeting is human productivity. While the technologies can certainly help manage operating costs, such as energy bills, or provide improved services such as secure access or usage analytics, on a per-dollar basis, these should not be the priority targets for new installations.
This is something of a surprise for many in the technology markets. We are accustomed to IoT technologies being used for process or resource optimization, such as smart metering providing better purchasing information for energy providers, or predictive maintenance helping to reduce operational costs and unplanned downtime.
This line of thinking is not typically extended to human workers however, but when you evaluate how buildings are used, it becomes clear that getting more out of your workforce is a better use of your budget. To this end, the IoT technologies needed to better understand and optimize a building’s internal processes and the patterns of its workers are vital, and will account for a large number of the devices installed in the smart building sector.
A useful rule of thumb here is the ‘3-30-300 Rule,’ popularized by real estate firm JLL. The gist of it is that for every square foot of space a company occupies in a building, it will spend $3 annually on utilities, $30 on rent, and $300 on payroll. Based on this ratio, you can see how smart building efforts should be prioritised. Even a 100% efficiency improvement – doubling energy efficiency, and so halving the utility bill – would save only $1.50 per square foot per year, the equivalent of a 0.5% change in payroll costs.
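The ratio arithmetic above can be sketched in a few lines of Python. The per-square-foot figures come straight from the rule as stated; the reading of a 100% efficiency improvement as a halving of the utility bill is an assumption made explicit in the comments:

```python
# '3-30-300' rule of thumb: annual spend per square foot of occupied space.
UTILITIES, RENT, PAYROLL = 3.0, 30.0, 300.0  # dollars per sq ft per year

def energy_saving_vs_payroll(efficiency_gain: float):
    """Return (dollars saved per sq ft, saving as a fraction of payroll).

    A 100% efficiency gain (efficiency_gain = 1.0) is read as doubling
    efficiency, i.e. the utility bill falls to 1 / (1 + gain) of its old level.
    """
    saved = UTILITIES - UTILITIES / (1.0 + efficiency_gain)
    return saved, saved / PAYROLL

saved, frac = energy_saving_vs_payroll(1.0)
print(saved, frac)  # 1.5 dollars/sq ft, 0.005 (i.e. 0.5% of payroll)
```

The same shape of calculation explains why productivity, not energy, dominates: any plausible saving on the $3 utilities line is tiny against the $300 payroll line.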
Using the rule, JLL argued that reducing employee absenteeism by 10% would equate to a $1.50 per-square-foot annual saving, and a 10% improvement in employee retention would translate to $11 per square foot annually. Increasing employee productivity by 10% would be worth $65 per square foot per year – and JLL points to the World Green Building Council’s (WGBC) finding that an 18-20% improvement is quite achievable in the right environment.
The WGBC published a quite influential meta-study back in 2013 that combined the findings of dozens of other pieces of research, to examine the impact of sustainable building design on employee health. JLL was interested in this from the productivity perspective, and the WGBC found that eight primary factors had direct positive impacts on building occupants – in this case, workers.
These factors included natural light, good air quality and ventilation, temperature controls, views, and green spaces. The WGBC posited that better use of them could deliver the following productivity increases: better lighting (a 23% increase), access to green natural spaces (18%), improved ventilation (11%), and individual temperature control (3%). JLL calculated the returns using an algorithm developed by some of its real estate brokers, but of course those figures won’t apply to every building or task.
Payroll, in the commercial sector at least, usually accounts for north of 80% of a company’s operating costs, and often closer to 90%. Industrial output carries much higher materials costs, so JLL’s rule is less applicable there. However, given that the commercial sector accounts for around 63% of global GDP, with industry at around 30% and agriculture at about 7%, the rule remains useful for evaluating the value of smart buildings across the spectrum.
To this end, if a company has a given budget to invest, it seems prudent to spend that cash on trying to make employees more productive, rather than save on energy bills. That’s a message that isn’t going to go down well in this environmentally-charged climate, but thankfully, many of the energy providers and associated systems integrators will install Demand Response (DR) and automation technologies through regular upgrade and replacement cycles, which will help optimize energy usage in these buildings.
Collectively, buildings and construction account for around 30-40% of global energy use and energy-related carbon dioxide emissions. Because of this, the per square-foot energy efficiency of buildings needs to improve by around 30% in order to meet the Paris Agreement environmental targets. By 2060, it is expected that the total buildings sector’s footprint will have doubled – reaching around 230bn square-meters.
This forecast examines the value of smart building technology globally, covering the proportion of the hardware that can be directly attributed to smart buildings, the associated software and management platform services, and installation and management related consulting. It does not try to forecast the total value created by the technology, nor the installation and upkeep revenues. That would be such a large number that it would not be useful.
In terms of market variation, we expect North America and Western Europe to be the strongest initial markets, with parts of APAC (China, Japan, South Korea) making up for the rest of that region’s low adoption. This is a similar story to many of our other IoT forecasts, and there is no real reason to think this one will be markedly different.
This trend will also take longer to emerge, and we expect the years immediately after the forecast period to post some impressive growth. We foresee this market growing more gradually than the explosive curves seen in other IoT markets, but given its potential size, the slower penetration should not be viewed in a negative light.
Gartner, Inc. has highlighted the trends that infrastructure and operations (I&O) leaders must start preparing for to support digital infrastructure in 2020.
“This past year, infrastructure trends focused on how technologies like artificial intelligence (AI) or edge computing might support rapidly growing infrastructure and business needs at the same time,” said Ross Winser, senior research director at Gartner. “While those demands are still present, our 2020 list of trends reflects their ‘cascade effects,’ many of which are not immediately visible today.”
During his presentation, Mr. Winser encouraged I&O leaders to take a step back from “the pressure of keeping the lights on” and prepare for 10 key technologies and trends likely to significantly impact their support of digital infrastructure in 2020 and beyond. They are:
Trend No. 1: Automation Strategy Rethink
In recent years, Gartner has detected a significant range of automation maturity across clients: Most organizations are automating to some level, in many cases attempting to refocus staff on higher-value tasks. However, automation investments are often made without an overall automation strategy in mind.
“As vendors continue to pop up and offer new automation options, enterprises risk ending up with a duplication of tools, processes and hidden costs that culminate to form a situation where they simply cannot scale infrastructure in the way the business expects,” said Mr. Winser. “We think that by 2025, top performing leaders will have employed a dedicated role to steward automation forward and invest to build a proper automation strategy to get away from these ad hoc automation issues.”
Trend No. 2: Hybrid IT Versus Disaster Recovery (DR) Confidence
“Today’s infrastructure is in many places — colocation, on-premises data centers, edge locations, and in cloud services. The reality of this situation is that hybrid IT will seriously disrupt your incumbent disaster recovery (DR) planning if it hasn’t already,” said Mr. Winser.
Often, organizations rely heavily on “as a service” (aaS) offerings, where it is easy to overlook the optional features necessary to establish the correct levels of resilience. Gartner predicts that by 2021, the root cause of 90% of cloud-based availability issues will be the failure to fully use cloud service providers’ native redundancy capabilities.
“Organizations are left potentially exposed when their heritage DR plans designed for traditional systems have not been reviewed with new hybrid infrastructures in mind. Resilience requirements must be evaluated at design stages rather than treated as an afterthought two years after deployment,” said Mr. Winser.
Trend No. 3: Scaling DevOps Agility
For enterprises trying to scale DevOps, action is needed in 2020 to find an efficient approach for success. Although individual product teams typically master DevOps practices, constraints begin to emerge as organizations attempt to scale the number of DevOps teams.
“We believe that the vast majority of organizations that do not adopt a shared self-service platform approach will find that their DevOps initiatives simply do not scale,” said Mr. Winser. “Adopting a shared platform approach enables product teams to draw from an I&O digital toolbox of possibilities, all the while benefiting from high standards of governance and efficiency needed for scale.”
Trend No. 4: Infrastructure Is Everywhere — So Is Your Data
“Last year, we introduced the theme that ‘infrastructure is everywhere’ the business needs it. As technologies like AI and machine learning (ML) are harnessed as competitive differentiators, planning for how explosive data growth will be managed is vital,” said Mr. Winser. In fact, by 2022, 60% of enterprise IT infrastructures will focus on centers of data, rather than traditional data centers, according to Gartner.
“The attraction of moving selected workloads closer to users for performance and compliance reasons is understandable. Yet we are rapidly heading toward scenarios where these same workloads run across many locations and cause data to be harder to protect. Cascade effects of data movement combined with data growth will hit I&O folks hard if they are not preparing now.”
Trend No. 5: Overwhelming Impact of IoT
Successful IoT projects have many considerations, and no single vendor is likely to provide a complete end-to-end solution. “I&O must get involved in the early planning discussions of the IoT puzzle to understand the proposed service and support model at scale. This will avoid the cascade effect of unforeseen service gaps, which could cause serious headaches in future,” said Mr. Winser.
Trend No. 6: Distributed Cloud
Distributed cloud is defined as the distribution of public cloud services to different physical locations, while operation, governance, updates and the evolution of those services are the responsibility of the originating public cloud provider.
“Emerging options for distributed cloud will enable I&O teams to put public cloud services in the location of their choosing, which could be really attractive for leaders looking to modernize using public cloud,” said Mr. Winser.
However, Mr. Winser points out that the nascent nature of many of these solutions means a wide range of considerations must not be overlooked. “Enthusiasm for new services like AWS Outposts, Microsoft Azure Stack or Google Anthos must be matched early on with diligence in ensuring the delivery model for these solutions is fully understood by I&O teams who will be involved in supporting them.”
Trend No. 7: Immersive Experience
“Customer standards for the experience delivered by I&O capabilities are higher than ever,” said Mr. Winser. “Previous ‘value adds’ like seamless integration, rapid responses and zero downtime are now simply baseline customer expectations.”
Mr. Winser warned leaders that as digital business systems reach deeper into I&O infrastructures, the potential impact of even the smallest of I&O issues expands. “If the customer experience is good, you might grow in mind and market share over time; but if the experience is bad, the impacts are immediate and could potentially impact corporate reputation rather than just customer satisfaction.”
Trend No. 8: Democratization of IT
Low-code is a visual approach to application development that is becoming increasingly appealing to business units. It enables developers of varied experience levels to create web and mobile applications with little or no coding, largely driving a “self-service” model in which business units build what they need instead of turning to central IT with a formal project plan.
“As low-code becomes more commonplace, the complexity of the IT portfolio increases. And when low-code approaches are successful, I&O teams will eventually be asked to provide service,” said Mr. Winser. “Starting now, it is in I&O leaders’ best interest to embed their support and exert influence over things that will inevitably affect their teams, as well as the broader organization.”
Trend No. 9: Networking — What’s Next?
In many cases, network teams have excelled in delivering highly available networks, which is often achieved through cautious change management. At the same time, the pace of change is tough for I&O to keep up with, and there are no signs of things slowing down.
Mr. Winser said that the continued pressure to keep the lights shining brightly has created unexpected issues for the network. “Cultural challenges of risk avoidance, technical debt and vendor lock-in all mean that some network teams face a tough road ahead. 2020 needs to be the time for cultural shifts, as investment in new network technologies is only part of the answer.”
Trend No. 10: Hybrid Digital Infrastructure Management (HDIM)
As the realities of hybrid digital infrastructures kick in, the scale and complexity of managing them is becoming a more pressing issue for IT leaders.
Organizations should investigate the concept of HDIM, which looks to address the primary management issues of a hybrid infrastructure. “This is an emerging area, so organizations should be wary of vendors who say they have tools that offer a single solution to all their hybrid management issues today. Over the next few years, though, we expect vendors focused on HDIM to deliver improvements that enable IT leaders to get the answers they need far faster than they can today,” said Mr. Winser.
Infrastructure-led disruption will drive business innovation
By 2025, 60% of infrastructure and operations (I&O) leaders will drive business innovation using disruptive technologies, up from less than 5% who do so today, according to Gartner, Inc. This trend of infrastructure-led disruption will support the growth of the I&O function within the enterprise.
“As businesses face increased pressure to lower operating costs, many I&O leaders have been siloed into a tactical role rather than a strategic one — essentially, becoming custodians of legacy infrastructure,” said Katherine Lord, research vice president at Gartner. “The result is stunted I&O maturity over the past decade. I&O leaders who harness the power of disruptive technologies, such as cloud and artificial intelligence (AI), will discover new opportunities to serve as business innovators.”
Infrastructure-led disruption is the use of I&O technologies, processes, people, skills and capabilities to promote disruption and embrace risk. “I&O leaders who champion infrastructure-led disruption are constantly looking for new ways to use technology to deliver business value, rather than just remaining reactive to stakeholder needs,” said Ms. Lord.
New Technologies Make Way for I&O Maturity
IT infrastructure remains in a period of protracted change, spurred by new technologies that only increase the complexity of modern, distributed infrastructures. Recognizing the need for growth to remain relevant in the digital age, 45% of respondents in the Gartner 2019 I&O Executive Leaders Survey* indicated that improving maturity was among their top three goals for their I&O organizations. Embracing technologies such as automation, edge and quantum computing can help I&O leaders mature their infrastructure for the next wave of digital.
“Infrastructure-led disruption is about more than avoiding obsolescence,” said Ms. Lord. “It presents an opportunity for I&O to purposefully take an entrepreneur-like approach to extend the function beyond existing organization charts, and even reinvent the role itself. Keeping a pulse on emerging technologies will help center I&O as a part of the business innovation and market disruption they support.”
Evolve Talent, Culture and Practices for the Future of I&O
Through 2022, traditional I&O skills will be insufficient for more than half of the operational tasks that I&O leaders will be responsible for. Further, while 66% of I&O leaders believe that behaviors related to culture hinder their agility, 47% have not adapted these behaviors to align with their cultural and organizational transformation.
“I&O leaders are at a critical decision point, in which they can either choose to embrace disruption or sit in a ‘watch and wait’ mode,” said Ms. Lord. “For those who choose to take the proactive approach, new skills, talent and culture will be critical for driving change in the early stages.”
Strategic alignment and purposeful alliances with the C-suite will also be crucial for the future of I&O. As business becomes increasingly digital, I&O leaders have the chance to use their technology expertise to collaborate with other leaders on breakthrough opportunities.
“Digital transformation presents a real opportunity for I&O to align more closely with the CEO and other key business stakeholders, elevating the function to a more strategic role within the organization and helping the enterprise stay ahead of the curve when it comes to embracing disruptive technologies,” said Ms. Lord.
In its first worldwide 5G forecast, International Data Corporation (IDC) projects the number of 5G connections to grow from roughly 10.0 million in 2019 to 1.01 billion in 2023. This represents a compound annual growth rate (CAGR) of 217.2% over the 2019-2023 forecast period. By 2023, IDC expects 5G will represent 8.9% of all mobile device connections.
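IDC's headline growth rate can be sanity-checked with the standard CAGR formula; the small gap to the quoted 217.2% presumably reflects rounding in the published endpoint figures:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` periods."""
    return (end / start) ** (1.0 / years) - 1.0

# IDC 5G forecast: ~10.0M connections (2019) to 1.01B (2023), four growth years.
rate = cagr(10.0e6, 1.01e9, 4)
print(f"{rate:.1%}")  # roughly 217%, in line with IDC's stated 217.2% CAGR
```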
Several factors will help to drive the adoption of 5G over the next several years:
"While there is a lot to be excited about with 5G, and there are impressive early success stories to fuel that enthusiasm, the road to realizing the full potential of 5G beyond enhanced mobile broadband is a longer-term endeavor, with a great deal of work yet to be done on standards, regulations, and spectrum allocations," said Jason Leigh, research manager for Mobility at IDC. "Despite the fact that many of the more futuristic use cases involving 5G remain three to five years from commercial scale, mobile subscribers will be drawn to 5G for video streaming, mobile gaming, and AR/VR applications in the near term."
In addition to building out the 5G network infrastructure, mobile network operators will have a lot to do to ensure a return on their investment:
According to the International Data Corporation (IDC) Worldwide Quarterly Server Tracker, vendor revenue in the worldwide server market declined 6.7% year over year to $22.0 billion during the third quarter of 2019 (3Q19). Worldwide server shipments declined 3.0% year over year to just under 3.1 million units in 3Q19.
In terms of server class, volume server revenue was down 4.0% to $17.9 billion, while midrange server revenue declined 14.3% to $3.0 billion and high-end systems contracted by 23.7% to $1.1 billion.
"While the server market did indeed decline last quarter, next generation workloads and advanced server innovation (e.g., accelerated computing, storage class memory, next generation I/O, etc.) keep demand for enterprise compute at near historic highs," said Paul Maguranis, senior research analyst, Infrastructure Platforms and Technologies at IDC. "In fact, 3Q19 represented the second biggest quarter for global server unit shipments in more than 16 years, eclipsed only by 3Q18."
Overall Server Market Standings, by Company
Dell Technologies and the combined HPE/New H3C Group ended 3Q19 in a statistical tie* for the number one position with 17.2% and 16.8% revenue share, respectively. Revenues for Dell Technologies declined 10.8% year over year while HPE/New H3C Group was down 3.2% year over year. The third ranking server company during the quarter was Inspur/Inspur Power Systems, which captured 9.0% market share and grew revenues 15.3% year over year. Lenovo and Cisco ended the quarter tied* for the fifth position with 5.4% and 4.9% revenue share, respectively. Lenovo saw revenue decline by 16.9% year over year and Cisco saw its revenue grow 3.1% year over year.
The ODM Direct group of vendors accounted for 26.4% of total revenue and declined 7.1% year over year to $5.82 billion. Dell Technologies led the worldwide server market in terms of unit shipments, accounting for 16.4% of all units shipped during the quarter.
Top 5 Companies, Worldwide Server Vendor Revenue, Market Share, and Growth, Third Quarter of 2019 (Revenues are in US$ Millions)
Columns: 3Q19 Market Share; 3Q18 Market Share; 3Q19/3Q18 Revenue Growth. Rows: T1. Dell Technologies*; T1. HPE/New H3C Group(a)*; 3. Inspur/Inspur Power Systems(b); Rest of Market.
Source: IDC Worldwide Quarterly Server Tracker, December 5, 2019.
* IDC declares a statistical tie in the worldwide server market when there is a difference of one percent or less in the share of revenues or unit shipments among two or more vendors.
a Due to the existing joint venture between HPE and the New H3C Group, IDC will be reporting external market share on a global level for HPE and New H3C Group as "HPE/New H3C Group" starting from 2Q 2016.
b Due to the existing joint venture between IBM and Inspur, IDC will be reporting external market share on a global level for Inspur and Inspur Power Systems as "Inspur/Inspur Power Systems" starting from 3Q 2018.
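IDC's statistical-tie rule, as quoted in the footnotes, amounts to a one-line check. As a minimal sketch, applied to the 3Q19 revenue shares quoted earlier in this story:

```python
def is_statistical_tie(share_a: float, share_b: float) -> bool:
    """IDC declares a statistical tie when two vendors' revenue (or unit)
    shares differ by one percentage point or less."""
    return abs(share_a - share_b) <= 1.0

print(is_statistical_tie(17.2, 16.8))  # Dell vs HPE/New H3C Group: True
print(is_statistical_tie(5.4, 4.9))    # Lenovo vs Cisco: True
print(is_statistical_tie(17.2, 9.0))   # Dell vs Inspur: False
```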
Top 5 Companies, Worldwide Server Unit Shipments, Market Share, and Growth, Third Quarter of 2019 (Shipments are in thousands)
Columns: 3Q19 Unit Shipments; 3Q19 Market Share; 3Q18 Unit Shipments; 3Q18 Market Share; 3Q19/3Q18 Unit Growth. Rows: 1. Dell Technologies; 2. HPE/New H3C Group(a); 3. Inspur/Inspur Power Systems(b); T5. Super Micro*; Rest of Market.
Source: IDC Worldwide Quarterly Server Tracker, Dec 5, 2019
* IDC declares a statistical tie in the worldwide server market when there is a difference of one percent or less in the share of revenues or shipments among two or more vendors.
a Due to the existing joint venture between HPE and the New H3C Group, IDC will be reporting external market share on a global level for HPE and New H3C Group as "HPE/New H3C Group" starting from 2Q 2016.
b Due to the existing joint venture between IBM and Inspur, IDC will be reporting external market share on a global level for Inspur and Inspur Power Systems as "Inspur/Inspur Power Systems" starting from 3Q 2018.
Top Server Market Findings
On a geographic basis, Asia/Pacific (excluding Japan) (APeJ) and Japan were the only regions to show growth in 3Q19 with Japan as the fastest at 3.3% year over year and APeJ flat at 0.2% year over year. Europe, the Middle East and Africa (EMEA) declined 9.6% year over year while Canada declined 4.7% and Latin America contracted 14.2%. The United States was down 10.7% year over year. China saw its 3Q19 vendor revenues remain essentially flat with year-over-year growth of 0.7%.
Revenue generated from x86 servers decreased 6.2% in 3Q19 to $20.6 billion. Non-x86 servers declined 13.1% year over year to $1.4 billion.
According to the International Data Corporation (IDC) Worldwide Quarterly Converged Systems Tracker, worldwide converged systems market revenue increased 3.5% year over year to $3.75 billion during the third quarter of 2019 (3Q19).
"The converged systems market continues to grow despite a challenging overall datacenter infrastructure environment," said Sebastian Lagana, research manager, Infrastructure Platforms and Technologies at IDC. "In particular, hyperconverged solutions remain in demand as vendors do an excellent job positioning the solutions as an ideal framework for hybrid, multi-cloud environments due to their software-defined nature and ease of integration into premises-agnostic environments."
Converged Systems Segments
IDC's converged systems market view offers three segments: certified reference systems & integrated infrastructure, integrated platforms, and hyperconverged systems. The certified reference systems & integrated infrastructure market generated roughly $1.26 billion in revenue during the third quarter, which represents a contraction of 8.4% year over year and 33.7% of all converged systems revenue. Integrated platforms sales declined 13.9% year over year in 3Q19, generating $475 million worth of sales. This amounted to 12.6% of the total converged systems market revenue. Revenue from hyperconverged systems sales grew 18.7% year over year during the third quarter of 2019, generating nearly $2.02 billion worth of sales. This amounted to 53.7% of the total converged systems market.
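The three segment figures above can be cross-checked against the $3.75 billion total; small differences from the quoted shares come from rounding in the published revenue figures:

```python
# 3Q19 converged systems revenue by segment, in $ billions (figures as quoted).
segments = {
    "certified reference systems & integrated infrastructure": 1.26,
    "integrated platforms": 0.475,
    "hyperconverged systems": 2.02,
}
total = sum(segments.values())
for name, revenue in segments.items():
    print(f"{name}: {revenue / total:.1%}")  # close to the quoted 33.7/12.6/53.7%
```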
IDC offers two ways to rank technology suppliers within the hyperconverged systems market: by the brand of the hyperconverged solution or by the owner of the software providing the core hyperconverged capabilities. Rankings based on a branded view of the market can be found in the first table of this press release and rankings based on the owner of the hyperconverged software can be found in the second table within this press release. Both tables include all the same software and hardware, summing to the same market size.
In the branded view of the hyperconverged systems market, Dell Technologies was the largest supplier with $708.4 million in revenue and a 35.1% share. Nutanix generated $262.2 million in branded hardware revenue, representing 13.0% of the total HCI market during the quarter. There was a three-way tie* for third between Cisco, Hewlett Packard Enterprise, and Lenovo, which generated $109.0 million, $91.9 million, and $91.5 million in revenue respectively, representing 5.4%, 4.6%, and 4.5% of the market.
Top 3 Companies, Worldwide Hyperconverged Systems as Branded, Q3 2019 (revenue in $M)
Columns: 3Q19 Market Share; 3Q18 Market Share; 3Q19/3Q18 Revenue Growth. Rows: 1. Dell Technologies(a); T3. Hewlett Packard Enterprise*; Rest of Market.
Source: IDC Worldwide Quarterly Converged Systems Tracker, December 12, 2019
The EMEA purpose-built backup appliance (PBBA) market rose in value 2.7% year-on-year to reach $325.5 million in the third quarter of 2019, according to International Data Corporation's (IDC) Worldwide Quarterly Purpose-Built Backup Appliance Tracker. This follows the 6.3% YoY decline seen in the second quarter of 2019, bringing the market back to growth.
Total EMEA PBBA open systems shipments were valued at $307.5 million, which represented an increase of 5.7% year-on-year. Conversely, mainframe system sales decreased 31% year-on-year in 3Q19.
The PBBA tracker for Western Europe indicates a nearly flat performance of this region in terms of value, with 0.9% year-on-year growth, reaching $262.9 million in the third quarter of 2019.
The DACH market became the largest in Western Europe in 3Q19, accounting for 36.2% of the market's value after growing 33.3% year-on-year.
The United Kingdom slipped to second place in the Western European PBBA market, shedding 6.3 points of market share as its value contracted 19% year-on-year.
The French PBBA market ranked third with 39.3% year-on-year growth in value, giving it a 14.2% market share.
"The United Kingdom and Germany are the main drivers in the development of data protection technology in Europe," said Jimena Sisa, senior research analyst, EMEA Storage Systems, IDC. "Organizations are becoming increasingly disposed to update their legacy or third-platform technologies with tools that provide more functionality in terms of automation, better monitoring deployment, data management, analytics and orchestration. This is creating more desire to engage in cloud-based data protection-related projects that would help companies to grow their business in a digital transformation era."
The PBBA market in Central and Eastern Europe, Middle East and Africa (CEMA) again recorded growth in value (12.2% YoY) in 3Q19, reaching $58.38 million.
The Middle East and Africa (MEA) subregion prevented the EMEA backup appliances market from recording a decline, with the major vendors in the open systems space recording significant growth. The Central and Eastern European (CEE) region had a more subdued performance, but nevertheless most companies closed a successful quarter.
"In countries like Saudi Arabia, Egypt, and Israel, there was a demand for larger-drive systems with increased data reduction, backup, and restore rates," said Marina Kostova, research manager, EMEA storage systems, IDC. "In CEE, the large countries of Poland and Russia saw increased shipments for both incumbents and data protection companies, while smaller countries experienced an overall slowdown in infrastructure spending, affecting PBBA as well."
The importance of proactive performance monitoring and analysis in an increasingly complex IT landscape. Digitalisation World launches new one-day conference.
The IT infrastructure of a typical organisation has become much more critical and much more complex in the digital world. Flexibility, agility, scalability and speed are the watchwords of the digital business. To meet these requirements, it’s highly likely that a company must use a multi-IT environment, leveraging a mixture of on-premise, colocation, managed services and Cloud infrastructure.
However, with this exciting new world of digital possibilities comes a whole new level of complexity, which needs to be properly managed. If an application is underperforming, just how easily can the underlying infrastructure problem be identified and resolved? Is the problem in-house or with one of the third party infrastructure or service providers? Is the problem to do with the storage? Or, maybe, the network? Does the application need to be moved?
Right now, obtaining the answer to these and many other performance-related questions relies on a host of monitoring tools. Many of these can highlight performance issues, but not all of them can isolate the cause(s), and few, if any, of them can provide fast, reliable and consistent application performance problem resolution – let alone predict future problems and/or recommend infrastructure improvements designed to enhance application performance.
Application performance monitoring, network performance monitoring and infrastructure performance monitoring tools all have a role to play when it comes to application performance optimisation. But what if there was a single tool that integrated and enhanced these monitoring solutions and, what’s more, provided an enhanced, AI-driven analytics capability?
Step forward AIOps. A relatively new IT discipline, AIOps provides automated, proactive (application) performance monitoring and analysis to help optimise the increasingly complex IT infrastructure landscape. The four major benefits of AIOps are:
1) Faster time to infrastructure fault resolution – great news for the service desk
2) Connecting performance insights to business outcomes – great news for the business
3) Faster and more accurate decision-making for the IT team – great news for the IT department
4) Helping to break down the IT silos into one integrated, business-enabling technology department – good news for everyone!
AIOps is still in its infancy, but its potential has been recognised by many of the major IT vendors and service and cloud providers and, equally important, by an increasing number of end users who recognise that automation, integration and optimisation are vital pillars of application performance.
Set against this background, Angel Business Communications, the Digitalisation World publisher, is running a one-day event entitled AIOps – enabling application optimisation. The event will be dedicated to AIOps as an essential foundation for application optimisation, recognising the importance of proactive, predictive performance monitoring and analysis in an increasingly complex IT landscape.
Presentations will focus on:
Companies participating in the event to date include: Bloor Research, Dynatrace, Masergy, SynaTek, Virtana, Zenoss.
To find out more about this new event – whether as a potential sponsor or attendee, visit the AIOPS Solutions website: https://aiopssolutions.com/
Or contact Jackie Cannon, Event Director:
Tel: +44 (0)1923 690 205
When it comes to 5G, we’re just getting started. Entire industries are simultaneously planning for a new era of connectivity, bracing themselves for what is set to be one of the most influential roll-outs in technological history. Enabling higher speeds, lower latencies, and more machine-to-machine connections, the arrival of fifth generation networking is already sparking a revolution amongst data centres. And there’s more to come.
By Eric Law, Vice President Enterprise Sales Europe at CommScope.
In the next year alone, Gartner has forecast worldwide 5G network infrastructure revenue will reach a staggering $4.2 billion. That’s almost double its current value. The analyst house has claimed that, despite still being in the early days, ‘vendors, regulators and standards bodies’ alike all have preparations in place.
Meanwhile, the UK government is pushing back against planning authorities that are attempting to halt the installation of 5G masts, with a spokesperson for the Department for Culture, Media and Sport (DCMS) expressing the government’s commitment to 5G, highlighting the economic benefits of the roll-out.
Even as the first 5G services are starting to go live, however, there are still many questions that remain unanswered. To ask how fifth generation networks will affect life inside the data centre, therefore, is similar to asking how a city would stand up to a natural disaster. It depends on the city and the storm.
Moving to the edge
We now know enough about 5G to know that it will without doubt change how data centres are designed and, in some cases, will even alter the role that they play in the larger network. From a technical standpoint, 5G will have several defining characteristics.
The most obvious of these is the use of the 5G New Radio (NR) air interface. Exploiting new spectrum bands and offering latencies of just a few milliseconds, this enhanced performance will drive the deployment of billions of edge-based connected devices. It will also create the need for flexible, user-centric networks, pushing compute and storage resources closer to both users and devices.
The only way to meet these ultra-reliable, low-latency requirements will be to deploy edge nodes as mesh networks, with an east-west traffic flow and parallel data paths. In some cases, these nodes may be big enough to qualify as pod-type or micro data centres in their own right, similar to those being used by both telecom and cable providers.
Preparing for disruption
Cloud-scale data centres – as well as larger enterprise facilities – may see only limited impact from this move: they already use distributed processing and have been designed to deal with increased data flow from the edge. At the other end of the scale, retail multi-tenant data centres (MTDCs) will likely incur the most disruption, as they have traditionally grown in response to rising demand for cloud-scale services.
The biggest changes, however, will be seen among service providers. As they begin to refine their relationship between core data centres and evolving centralised RAN (CRAN) hubs, adaptation will become a choice of sink or swim. Increasing virtualisation of the core networks and the radio-access networks will be key when it comes to handling the anticipated 5G data flow. This will also enable service providers to be more flexible with compute and storage capacity, easily moving it to where is most needed.
More broadly across service provider networks, increasing virtualisation may have a more direct effect in the core data centre, with wireless and wireline networks becoming more converged. This will generate an even stronger business case for a single physical-layer infrastructure; how strong depends on the degree of convergence that occurs between the core network and the RAN. Whether this happens in a central office or in a data centre, we still don't know.
The role of new tech
Aside from considering cloud models, we also need to focus on the impact that new technologies such as artificial intelligence (AI) and machine learning (ML) will have on data centres when 5G makes its mark. These technologies will require not only accelerated server speeds but also higher network capacity to support the growing volume of edge services. Building these data models means processing vast data pools – a task that, in most cases, is best suited to core data centre capabilities.
Most of the data that goes on to develop AI models will come from the edge. This nods to a potential shift in how more established, cloud-scale data centres will support the network. One potential use-case involves using the power of the core data centre to assemble data from the edge to develop the AI models. These would then be pushed out to deliver localised, low-latency services. The process would then be repeated, in turn creating a feedback loop that revolutionises the operating model.
A balancing act
As with any other aspect of digital transformation, the level of change required within various data centre environments will depend on the individual application. Some of the billions of sensors and devices may well produce a steady flow of data, whereas others might deliver it intermittently or in irregular bursts. Either way, that is largely out of our control. It's how the data is then collected, processed and analysed that needs to be optimised, considering factors such as how much data should remain local to the edge device and how much should be processed in the core data centre.
Once these questions are answered, network engineers need to determine the best way to move the data through the network. Different latency and reliability requirements demand the ability to prioritise data traffic at a very granular level. What can be off-loaded onto the internet via local Wi-Fi, versus what has to be backhauled to the cloud service provider (CSP) data centre? And remember, edge networks must fit into a financial model that makes them profitable.
Infrastructure as an enabler
The widespread roll-out of 5G is still a few years away, but there is no better time to start preparing for what's to come. At the very core of this evolution, infrastructure must adapt to support higher wireless bandwidth and more universal data usage. Behind closed doors, organisations and building owners are considering more than just Wi-Fi to enable robust and dependable in-building mobile wireless with distributed antenna systems (DAS). Outdoors, it's a different story. Service providers are upgrading and expanding their fibre networks to carry wireless data back to the core of the network, or in many cases, to edge data centres.
What we really need to think about is the applications and innovations that these changes will enable as part of a broader 5G era. Self-driving cars, facial recognition, smart cities and industrial automation will all be made possible, or more advanced, by 5G. The problem remains, however, that each of these applications has a different set of requirements regarding reliability, latency, and the type and volume of data traffic generated. Unless you understand the parameters of the situation, it's difficult to pinpoint its exact impact on the data centre.
Something we do know, however, is that the avalanche of new data arriving from the network edge thanks to 5G goes hand-in-hand with demand for high compute and storage power. Exactly how much power? You guessed it: that depends!
The continued growth of the Internet of Things (IoT) depends on ICT infrastructures that can support its data processing and communications bandwidth requirements. Accordingly, 5G, with its ultra-low latency and high transmission rates, has a vital role as an IoT enabler. However, implementing the new standard has major implications for data centres in terms of their size, distribution and internal ICT hardware.
In this article, Alex Emms, Operations Director at Kohler Uninterruptible Power (KUP), looks at this data centre evolution, and discusses how modern UPS technology can be deployed to meet the challenges it presents.
Ever since analogue mobile phones first appeared in the early 80s, we’ve become used to a steady evolution of performance and functionality, through 2G, 3G, 4G and their variations.
This evolution has led us to 5G; a concept that is now becoming reality. Services are already starting to roll out in the UK, US and South Korea, and another 80 operators in 46 countries plan to launch 5G services between now and 2022.
The standard is truly a game-changer because its performance, in terms of speed, latency and connection densities, is such a significant improvement over 4G (or even the ‘4.5G’ technologies, 4G LTE-Advanced and LTE Advanced pro) that it will enable entirely new applications that simply weren’t viable with the earlier standards.
Remote surgery is a good example of this; because a 5G network virtually eliminates the small lag between pinging the network and receiving its response, a surgeon can control a robot arm remotely without having their precision compromised by feedback delay.
5G could also facilitate driverless cars; the network’s instant response and ubiquitous coverage will allow the cars to use 5G to talk to other cars and sensors built around the city, from street lamps to petrol stations.
Many of these emerging opportunities owe as much to timing as they do to the promise of 5G. Progress in cellular wireless technology has been matched by equally impressive developments in the Internet of Things (IoT) – and the IoT currently comprises sets of geographically distributed smart devices that must communicate, often wirelessly, with a local facility and on to a centralised – typically cloud-based - data processing and analysis resource. So, by giving the IoT wireless channels a step increase in power, 5G will enable a new generation of remote smart devices that reach similarly heightened levels of performance and functionality.
But this new 5G landscape has major implications for the data centres that must handle the data they generate, in terms of performance, capacity, and geographical distribution. Below, we take a closer look at this impact on data centres and their associated power requirements; finally, we show how modern, modular UPS technology can meet this new set of challenges.
Expected 5G impact on data centres
The relationship between 5G and the IoT is highlighted by a Gartner survey[i], in which 57% of respondents reported that their main interest in 5G was to drive IoT communications - and with potentially billions of IoT-connected devices, the amount of data generated will be huge. 5G is estimated to be able to handle 1000 times more connected devices per metre than 4G without being affected by bandwidth congestion, meaning more devices can work in closer proximity.
Additionally, 5G’s bandwidth and speed - estimated to be between 20 and 100 times faster than 4G - will allow it to compete with ISPs’ traditional broadband wired connections; so creating a further huge demand for data processing and storage capacity, and hence data centres. Data centres will also need to compress data to handle the increase in, for example, video streams. This will require new technologies and compression algorithms.
Data centres and telecom providers will therefore require the right high-performance hardware and technology to handle the routing and switching necessary to match the flow of data; additionally, they will need to manage their power consumption. This is not only for cost efficiency, but also due to increasing pressure to demonstrate a greener footprint, and in many cases, the need to avoid upgrading power feeds into the data centre. One solution is installing software defined power (SDP) based on both hardware and software that can skilfully allocate power throughout the data centre, and another is to select more energy efficient equipment; this will save both direct energy costs, and costs associated with keeping the equipment cool.
Data centres that are set up for 4G will have the capacity to handle 5G data. However, they will have to change their infrastructure to cater for 5G's frequencies. 5G uses short wavelengths, which means small cells rather than large cell towers scattered around the country. These super-high frequencies (30 GHz to 300 GHz) will only work if devices are in close proximity to antennas. Therefore, it is likely we will see multiple-input, multiple-output (MIMO) antennas and many more small cells installed around public infrastructure.
Small cells have a clear advantage over traditional cell sites as they can be located in areas which lack space for large cell towers. What’s more, they are cheaper to deploy. Data centres will, however, need to be close enough to these cells to maintain 5G’s low latency performance and meet service-level agreements. In some cases micro data centres might even be deployed at the base of cell towers, allowing limited data processing with even faster response times for critical applications like autonomous vehicles.
These trends will likely lead to the break-up of larger data centres into smaller, more local data centres close to these cells. The new, low-latency IoT-type applications, such as the remote surgery and driverless car examples above, will clearly depend on data processing, communications channels and power infrastructures that offer very high reliability alongside ground-breaking performance. This will call for Tier 3 and Tier 4 data centres with near-100% redundancy.
Implications for UPSs
The above predictions for 5G indicate that there will be a proliferation of small and very small data centres - but these data centres, however small, must be just as reliable as the best of their larger counterparts. So how can UPS technology facilitate the scaled-down yet highly-reliable power solutions essential to such environments?
The answer lies with today’s transformerless modular topology, as exemplified by Kohler Uninterruptible Power’s PowerWAVE 8000DPA UPS – See Fig.1. Designed for low to medium-scale high power density applications, the UPS offers high energy efficiency, 99.9999% (‘Six Nines’) availability and flexible scalability in a tower or rack-mountable configuration.
While a single module can be used to deliver 10 kW or 20 kW, further modules can be added incrementally to increase capacity or provide N+1 (or N+n) redundancy. Up to 80kW (60kW N+1) rack-mounting capacity can be provided, with an ultimate capacity of 400kW for two fully-populated tower versions in parallel.
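The incremental-capacity arithmetic above is straightforward and worth making explicit: with N+n redundancy, n modules are held in reserve, so only the remaining modules count towards usable load. The sketch below is a generic illustration of that calculation, not a KUP tool; the function name is ours.

```python
# Generic sketch of modular UPS capacity arithmetic. Module ratings below
# echo the 20 kW modules and 80 kW (60 kW N+1) rack figures quoted above.

def usable_capacity(modules: int, module_kw: float, redundant: int = 0) -> float:
    """Usable load capacity (kW) when `redundant` modules are held
    in reserve for N+n redundancy."""
    if modules <= redundant:
        raise ValueError("need at least one non-redundant module")
    return (modules - redundant) * module_kw

# Four 20 kW modules in one rack frame:
print(usable_capacity(4, 20))               # full 80 kW, no redundancy
print(usable_capacity(4, 20, redundant=1))  # 60 kW usable with N+1
```

The same formula explains the parallel-tower figure: two fully populated towers of 20 kW modules reach the quoted 400 kW ultimate capacity without redundancy.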
The PowerWAVE 8000DPA's rack-mounting capability and high power density allow it to fit into restricted spaces, while its high energy efficiency minimises the heat to be extracted from those spaces. Each module is a completely self-contained UPS, including battery and communications capability. This allows UPSs in remote data centres to be monitored from a centralised location.
The system’s Decentralised Parallel Architecture (DPA) means that single points of failure are eliminated, and uptime is maximised. Mean time to repair (MTTR) is minimised, as the modules are ‘hot swappable’ and can be safely removed and replaced without interrupting power to the critical load.
Effective maintenance is essential too
While providing a UPS of suitable size, scalability, efficiency and availability is essential for the new generation of small 5G-oriented data centres, this alone is not enough. To be kept in prime condition and optimally reliable, the UPS, especially its battery, must be regularly monitored and maintained. This can be challenging for an implementation comprising many small data centres distributed over a wide geographical area.
An experienced and well-resourced UPS supplier like Kohler Uninterruptible Power can address situations like this. Firstly, they offer remote monitoring services, including PowerREPORTER Remote UPS Monitoring and PowerNSURE Battery Monitoring.
PowerNSURE not only provides battery monitoring, but also performs proactive maintenance. It checks every battery’s internal resistance, temperature and voltage, and reports on the results – see Fig.2. Additionally, it executes an equalisation process to keep battery charging voltages within operating range. By preventing gassing, dry-out and thermal runaway, this assures continuous battery availability.
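To make the per-battery checks described above concrete, here is a minimal sketch of the kind of threshold logic such monitoring performs. PowerNSURE's actual logic and thresholds are proprietary; every threshold value and name below is an assumption for illustration only.

```python
# Illustrative only: the kind of per-battery-block check a monitoring
# system performs. Thresholds are assumed, not KUP specifications.
from dataclasses import dataclass

@dataclass
class BatteryReading:
    resistance_mohm: float   # internal resistance, milliohms
    temperature_c: float     # block temperature, degrees Celsius
    voltage_v: float         # charging voltage, volts

def needs_attention(r: BatteryReading,
                    max_resistance_mohm: float = 8.0,
                    max_temperature_c: float = 35.0,
                    voltage_range: tuple = (12.8, 13.8)) -> list:
    """Return a list of out-of-range parameters for one battery block."""
    issues = []
    if r.resistance_mohm > max_resistance_mohm:
        issues.append("internal resistance high")   # early sign of dry-out
    if r.temperature_c > max_temperature_c:
        issues.append("temperature high")           # thermal runaway risk
    lo, hi = voltage_range
    if not lo <= r.voltage_v <= hi:
        issues.append("charging voltage out of range")  # gassing risk
    return issues

print(needs_attention(BatteryReading(9.5, 36.0, 13.2)))
```

A real system would run checks like this continuously across every block in every string, escalating to a service centre when any list comes back non-empty.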
PowerREPORTER communicates constantly with UPS systems, and detects any fault conditions or alarm messages. While customers can monitor all of their UPS systems centrally, any incident is reported to KUP’s Service Centre by email, complete with status and any available details. The Centre’s personnel can pass this information to the field team, allowing them to reach the UPS site within the contracted service agreement time, already briefed on the issue and equipped to resolve it rapidly.
KUP’s remote monitoring services are complemented by their nationwide team of trained field engineers. Covering the entire UK, this team not only responds to alarm callouts as described, but also performs scheduled maintenance visits to comply with any established service agreement. Emerging problems are fixed before they can cause failures, so the power protection system’s maximum possible availability is assured.
In recent years, popular news has focused on the drive by organisations like Google and Facebook to build hyperscale data centres. However, the rise of the IoT and expected growth in 5G means that, for many users, the emphasis will shift towards smaller, sometimes micro-scale facilities.
These will call for UPSs that are suitably scaled down, while still offering levels of availability associated with Tier 3 and 4 installations. They must be compact, scalable, simply installed, efficient, and easy to manage remotely.
But it’s not just about providing the right hardware. Critically, the UPS supplier must have the remote monitoring technology and field service infrastructure needed to guarantee that UPSs in multiple facilities spread over a wide geographical area unfailingly achieve their full performance and availability potential.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 1.
2020 will be the year of API connectivity, says Carolyn Crandall, Chief Deception Officer at Attivo Networks:
“Driven by the need for on-demand services and automation, there will be a surge in requirements for the use of technology that interconnects through APIs. Vendors that don’t interconnect may find themselves passed over for selection in favour of others with API access that add value to existing solutions.
DevOps capabilities will continue to increase their significance in moving projects to products, with only 9% of technology professionals responsible for the development and quality of web and mobile applications stating that they had not adopted DevOps and had no plans to do so. This will drive an increased focus on DevSecOps and how open source software is managed within projects.
We will begin to see more examples of the theft of encrypted data as cybercriminals begin to stockpile information in preparation for quantum computing, when traditional encryption will become easy to crack. The advances in quantum computing that Google has recently published bring this possibility closer to becoming reality.
Significant issues will surface around the lack of adequate detection of threats that have bypassed prevention defences. To combat this, in 2020, we will see the addition of deception technology into security framework guidelines, compliance requirements, and as a factor in cyber insurance premiums and coverage.”
“The continued growth of bespoke in 2020” - comment from Philip White, Managing Director, Audacia:
Over the last few decades we’ve seen a constant shift in approaches to IT solutions, from taking out-of-the-box software and heavily customising it to fit business processes, through to changing business processes to work the way the out-of-the-box system does things. More recently we’ve seen a hybrid approach with out-of-the-box software being integrated with bespoke systems.
These changes have not stopped, and in 2020 we will see the shift towards more bespoke solutions continue.
The old argument that ‘we’re too big for bespoke’ is no longer valid. Some of the world’s biggest and fastest growing businesses, such as Google, Uber and Amazon, are continuously and successfully delivering large scale bespoke software projects.
This change of attitude is proof that bespoke projects are not inherently bad, however, they must be carefully managed and gain the full support and engagement from across the business in order that project goals can be achieved.
The amount of time to develop bespoke software can be seen as a barrier, but agile working practices have definitely helped in this regard, giving people the ability to see their system developing in increments, with the ability to provide feedback and react to change throughout the process. There is, however, a danger that businesses over complicate agile practices. Businesses delivering bespoke projects should stick to the basic principles of delivering something early, working in continuous increments and adapting to feedback, otherwise there is a high probability that they will come undone. This isn’t the fault of choosing bespoke over out-of-the-box, rather an illustration that clear objectives and carefully plotted milestones must be agreed before any development commences.
With a younger, more tech-enabled workforce that has an app for everything, tablet and mobile adoption will only continue to grow: this generation comprises both customers with expectations of mobile offerings and employees with the capability to adopt mobile working. The need to engage and empower end users in the development process will also increase, and we're already seeing more non-technical users taking up cloud-based wireframing and prototyping tools to drive digital projects forward, with some continuing into the domain of low-code/no-code development platforms.
As businesses strive to find ways to differentiate themselves against the competition, the case for building bespoke is a very strong one. While out-of-the-box can deliver satisfactory results and get you to where you want to go quickly and easily, it’s unlikely to give you a competitive advantage as your rivals are able to buy and install exactly the same piece of software.
With attitudes changing at board level and technology moving from a supporting framework for operations to being a tool for real competitive advantage, CEOs are demanding far more. It's this demand that will see the need and desire to develop bespoke systems grow in 2020.
2020 cybersecurity predictions, the next web
By Josh Lemos, VP of Research and Intelligence, BlackBerry Cylance:
Uncommon attack techniques will emerge in common software
Steganography, the process of hiding files within a different format, will grow in popularity as online blogs make it possible for threat actors to grasp the technique. Recent BlackBerry research found malicious payloads residing in WAV audio files, a format that has been used for decades and categorised as benign. Businesses will begin to recalibrate how legacy software is defined and treated, and invest effectively in operational security around it. Companies will look for ways to secure less commonly weaponised file formats, like JPEG, PNG and GIF, without hindering users as they navigate modern computing platforms.
Changing network topologies challenge traditional assumptions, require new security models
Network-based threats that can compromise the availability and integrity of 5G networks will push governments and enterprises alike to adopt cybersecurity strategies as they implement 5G spectrum. As cities, towns and government agencies continue to overhaul their networks, sophisticated attackers will begin to tap into software vulnerabilities as expansion of bandwidth that 5G requires creates a larger attack surface. Governments and enterprises will need to retool their network, device and application security, and we will see many lean towards a zero-trust approach for identity and authorisation on a 5G network. Threat detection and threat intelligence will need to be driven by AI/ML to keep up.
2020 will see more cyber/physical convergence
As all sectors increasingly rely on smart technology to operate, the gap between the cyber and physical worlds will officially close. This is evident given the recent software bug in an Ohio power plant that impacted hospitals, police departments, subway systems and more in both the U.S. and Canada. Attacks on IoT devices will have a domino effect, and leaders will be challenged to think of unified cyber-physical security in a hybrid threat landscape. Cybersecurity will begin to be built into advanced technologies by design to keep pace with the speed of IoT convergence and the vulnerabilities that come with it.
State and state-sponsored cyber groups are the new proxy for international relations
Cyber espionage has been going on since the introduction of the internet, with Russia, China, Iran and North Korea seen as major players. In 2020, we will see a new set of countries using the same tactics, techniques, and procedures (TTPs) as these superpowers against rivals both inside and outside national borders. Mobile cyber espionage will also become a more common threat, as mobile users are a significant attack vector for organisations that allow employees to use personal devices on company networks. We will see threat actors perform cross-platform campaigns that leverage both mobile and traditional desktop malware. Recent research discovered nation-state-based mobile cyber espionage activity across the Big 4, as well as in Vietnam, and more attacks are likely in future. This will create more complexity for governments and enterprises as they try to attribute these attacks, with more actors and more endpoints in play at a larger scale.
Sumant Kumar, Director of Digital Transformation at CGI, shares some thoughts:
1. Practical digitisation
As margins grow tighter and businesses feel the squeeze even more in 2020, you can expect to see higher investment in practical digitisation – in fact, 57% of our clients have a digital transformation strategy in place, and we expect this to keep growing. No longer can companies afford to treat digitisation as a way to spend money on new, shinier, and fancier toys. Instead, you can expect to see its implementation being used to drive business outcomes, cut down the cost base and improve customer experience. In 2020, as markets compress under pressure, it's going to be more about asking where the value lies and how business outcomes are achieved. Think less flash, more cash.
Nowhere is this more apparent than the retail sector, which has famously struggled to adapt to rapid digital transformation. If established businesses don’t begin viewing digitisation as a necessary tool in their armoury, they’ll be knocked out of the game by start-ups and others who see digitisation for what it is, and use it to disrupt markets.
2. AI and Machine Learning
81% of our clients already see Advanced Analytics as a top investment area, so in my opinion AI and Machine Learning will continue to become incredibly influential in 2020, helping businesses save costs. We work closely with a lot of UK manufacturing and utility companies, and all of them want to use IoT and AI for things like predictive maintenance. Using AI in this way would prevent assets in the field from breaking down, as well as help optimise workforce scheduling. If the UK were to be hit by another Beast from the East, utilities companies, for instance, would be able to use AI to predict the likelihood of increased callouts and prepare their staff numbers accordingly.
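As a minimal illustration of the predictive-maintenance idea described above (the data and model here are hypothetical, not CGI's actual tooling), a utility could fit a simple model relating forecast temperature to historical callout volumes and staff up accordingly:

```python
# Illustrative sketch: predicting engineer callouts from forecast temperature.
# Hypothetical data and a deliberately simple model; a real deployment would
# use richer features (asset age, IoT telemetry) and a proper ML library.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Historical daily minimum temperature (C) vs. callouts (invented numbers).
temps = [5, 2, 0, -3, -6, -10]
callouts = [20, 24, 31, 45, 60, 82]

a, b = fit_linear(temps, callouts)

def predict_callouts(forecast_temp_c):
    """Estimated callouts for a forecast temperature; never negative."""
    return max(0, round(a * forecast_temp_c + b))

# A "Beast from the East"-style forecast of -8C suggests scaling up staff:
print(predict_callouts(-8))
```

The point of the sketch is the workflow (fit on history, score the forecast, adjust staffing), not the toy model itself.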
3. Cloud
At CGI we found that 71% of our clients are using a public, private or hybrid cloud, and 41% use the cloud to store or process data for their customers. Cloud migration will remain a significant trend in 2020 as it continues to remove the barrier to entry, allowing start-ups to create a product with minimum investment. You can now open an AWS account and build an application. This would have been impossible not too long ago, when you would have needed a data centre and the money for one; thanks to cloud, that is no longer an issue.
As well as lowering the barrier to entry, cloud applications offer a host of benefits, including increased agility and greater personalisation. The use of cloud-based technologies, for instance, allows businesses to provide a personalised service to customers by leveraging big data and AI capabilities in a 'pay as you go' model. This in turn could lead to an increase in customer loyalty and, potentially, better profits.
4. Robotic Process Automation (RPA)
2020 will see RPA implementation continue to scale across organisations, as it allows businesses to save costs by automating low-value tasks and letting the rest of the workforce focus on high-value work such as customer engagement. Not only can an RPA bot complete tasks three times faster than its human counterpart, it also minimises risk for businesses through increased visibility and audit trails.
Ultimately, RPA allows organisations to free-up time for their employees, which in turn means they can focus on achieving specific business outcomes rather than losing time and money on admin.
5. Customer experience
Ultimately, the biggest trend facing the tech industry in 2020 is older than all the technology put together: creating a positive customer experience. I think we can expect to see greater integration of technologies in 2020 to achieve this outcome. After all, we live in a world where we no longer just want a highly personalised experience but expect one. We see awareness and adoption of 'human-centred design' growing as a way to design engaging customer journeys.
Integrating technology such as IoT and AI will allow all businesses, including traditional brick and mortar companies, to provide this personalised service. In fact, businesses have already started to seize this opportunity to make themselves stand out in the crowd, and you can see native digital brands creating highly personalised experiences for their customer base.
If this last year has taught us one thing, it is never to take cyber security for granted. Threats continue to develop as quickly as new technology emerges. Hackers continue to use age-old techniques on new vulnerabilities. Organisations that fail to adequately protect their data continue to suffer the ramifications of large fines. As we move into a new decade, there is no room for complacency.
New Year, old tricks
Another key learning point from the year has been to never underestimate the opportunistic skills of hackers. When there’s economic gain to be had, they’ll always find innovative ways to use existing malware and ransomware techniques to mount new attacks.
For example, we are seeing image-based spam re-emerge as a major threat in cyber space. Thirteen years after image-based spam was at its peak, a newer, darker version is back. If a message is composed using an image, there's a chance it can evade most spam filters, even today. Steganography, a technique prevalent over a decade ago, is now being used by hackers who leverage the rapid growth of social media to deliver malicious payloads. If recipients are not cyber aware, it's easy to take the bait; and with nearly half of data breaches in the last year caused by internal error, it's no wonder hackers circle back and take their chances using old-school methods.
To help put these threats to bed, optical character recognition (OCR) and anti-steganography technology within data loss prevention solutions need to be adopted as standard by organisations as we move into 2020. With these technologies applied, images can be processed just like a normal electronic document: they are analysed, and any sensitive information is redacted or removed from the file.
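To see why an innocent-looking image can smuggle data past a filter, here is a minimal sketch of least-significant-bit (LSB) steganography, the class of technique described above. It operates on a plain list of pixel bytes rather than a real image file; the visible change to each pixel is at most one brightness level:

```python
# Minimal LSB steganography sketch: hide a payload by overwriting the lowest
# bit of each pixel byte. The "image" here is just a list of byte values.

def embed(pixels, payload: bytes):
    """Write payload bits (MSB first) into the low bit of successive pixels."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear lowest bit, then set it
    return out

def extract(pixels, n_bytes: int):
    """Read n_bytes back out of the low bits of the first n_bytes*8 pixels."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = list(range(200, 0, -1))  # stand-in for raw pixel bytes
stego = embed(cover, b"exfil")
print(extract(stego, 5))          # b'exfil'
```

Because the carrier still looks like an ordinary image, detection requires the kind of statistical anti-steganography analysis the paragraph above argues DLP products should include as standard.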
Clearly, in the same way that we know not to trust an image just because it looks innocent, we must not assume that just because certain threats haven’t been seen for a while, they won’t occur again moving forward.
Innovation will threaten security
The latest claim from Google of achieving ‘quantum supremacy’ reignites fears that quantum computers could break most modern cryptography, and effectively be the end of encryption as we know it. Though this threat is hypothetical, it feeds into the wider concern that the increasing sophistication of the technology used by both businesses and cybercriminals will boost the ability to exploit vulnerabilities in 2020.
The reality is that as new technologies emerge, new threats appear alongside them. With the increasing use of biometric data by the public and private sector alike, critical national infrastructure (CNI) organisations are storing more valuable data than ever. The defence sector will increasingly be targeted due to the lucrative nature of the data it stores, and such attacks will continue to rise.
To deal with these threats effectively, an evolving cyber strategy for 2020 is needed which can adapt as technology advances. Regular reviews of cyber strategies are imperative, and any new technologies introduced throughout the year must be assessed and paired with appropriate levels of security. If we’re prepared for what’s around the corner, we can make the most of new technology at the same time as protecting our businesses.
In my last few talks on 5G, I was frequently asked: given that there are close to a billion cellular IoT devices deployed today, will the introduction of 5G replace all the existing IoT technologies and render current investment in IoT obsolete? The simple answer is no, but to understand this better, let's dive a little deeper into existing IoT technologies.
By Adrian Taylor, Regional VP of Sales for A10 Networks.
The Internet of Things (IoT) is growing exponentially, with some forecasts estimating the total number of IoT devices to be north of 25 billion by 2025. To a layman, IoT devices are small, low-powered devices that generate infrequent low-volume traffic, such as smart meters. While that is generally true, IoT devices cover a much larger segment. One could classify IoT into two categories: wide-area IoT and short-range IoT.
Wide-area IoT: Low Power and Long Range
Wide-area IoT devices are generally low power and transmit infrequent short bursts of data over a low-power wide area network (LPWAN). These devices are often installed in remote areas with no external power supply (i.e., they are running on batteries) and are therefore expected to be very power efficient. LPWANs can work in both licensed and unlicensed spectrum. Two of the most common LPWAN technologies that work in unlicensed spectrum are SigFox and long-range wide area network (LoRaWAN). Those that work in licensed spectrum are narrowband IoT (NB-IoT) and LTE-M (LTE machine type communication, also known as Cat-M1), both of which are specified by 3GPP in Release 13 and are collectively known as cellular IoT.
SigFox is a proprietary technology owned by a French company of the same name. It is an ultra-narrowband technology enabling wireless signals to pass freely through solid objects, making it suitable for underground and in-building operations. SigFox operates in the 868MHz band in Europe and the 902MHz band in the U.S. It is designed to support arrays of devices likely to send small amounts of data (around 100 bits per second) in short bursts (e.g. parking sensors, utility meters). As of the end of 2018, a little more than 60 countries had SigFox-based deployments.
LoRaWAN is a spread-spectrum technology with a wider band than SigFox. LoRaWAN is actually a medium access control protocol running on top of the Long Range (LoRa) physical layer (PHY) protocol, operating in unlicensed bands such as 433MHz and 868MHz in Europe, and 915MHz in Australia and North America. LoRa technology is licensed by Semtech, and LoRaWAN has been developed under the LoRa Alliance since 2015. As of the end of 2018, about 100 countries had LoRaWAN-based IoT deployments.
Short-range IoT: Higher Power Requirements, PAN or LAN Connected
Before we jump into cellular IoT, let us briefly look at short-range IoT devices. These generally need higher power, though still significantly less than most smartphones, and exchange higher amounts of data. Because of the higher power requirement, these devices either need to be frequently recharged or need an external power supply. They can be further divided into personal area network (PAN) and local area network (LAN) devices. PAN IoT devices have a range of a few feet, and the most common technologies connecting them are Bluetooth, ZigBee and 6LoWPAN (IPv6 over low-power wireless personal area network); typical examples are headphones and fitness trackers. LAN IoT devices generally have a range of hundreds of feet and use some version of 802.11 wireless LAN technology for the underlying connectivity. PAN IoT devices generally run on rechargeable batteries, while LAN IoT devices run on both rechargeable batteries and external power sources; typical examples are security cameras and home appliances such as refrigerators. A single device can be part of both PAN and LAN IoT networks. Such a device could aggregate information from a PAN IoT network into a LAN IoT network, which could then connect to a WAN or the internet through an IoT gateway.
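The aggregation pattern just described can be sketched as follows; the device names, reading values and payload shape are illustrative only:

```python
# Sketch of a dual-homed device: it collects many small readings from PAN
# sensors and forwards one combined payload over the LAN towards an IoT
# gateway, instead of relaying every tiny message individually.
import json

def aggregate_pan_readings(readings):
    """Combine small PAN sensor readings into one LAN/WAN-bound payload."""
    values = [r["value"] for r in readings]
    return {
        "device_count": len(readings),
        "avg": sum(values) / len(values),
        "min": min(values),
        "max": max(values),
    }

pan_readings = [
    {"sensor": "fitness-band-1", "value": 72},
    {"sensor": "fitness-band-2", "value": 68},
    {"sensor": "headset-temp", "value": 70},
]

# One compact JSON message heads to the gateway rather than three chatty ones.
payload = json.dumps(aggregate_pan_readings(pan_readings))
print(payload)
```

The design choice here is the usual IoT trade-off: aggregation saves battery and airtime on the constrained PAN side at the cost of per-reading granularity upstream.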
Per Ericsson’s Mobility Report, June 2019, wide-area IoT devices, numbering at about one billion, make up a little more than one-tenth of all IoT deployments today and are growing at a 27 percent CAGR to reach 4.5 billion devices by 2024. Short-range IoT devices make up about 90 percent of deployments and are growing at a 15 percent CAGR to about 18 billion devices by 2024. Short-range IoT devices are going to continue to exist in the market in the foreseeable future.
Wide-area IoT devices are going to see exponential growth, especially with the introduction of the 5G service-based architecture (SBA) core, which will scale much further than the 4G LTE packet core. Similarly, we expect LPWAN deployments based on unlicensed spectrum technologies such as SigFox and LoRaWAN to continue to exist and proliferate, albeit at a much lower rate than cellular IoT.
Current Cellular IoT Technologies Bridge 4G LTE and 5G NSA
Cellular IoT technologies, NB-IoT and LTE-M, are deployed as part of the LTE network today and are expected to work with 5G networks using the non-standalone (NSA) architecture. In 5G NSA, the radio interface will still be LTE-based, while the core network will be 5G SBA. As 5G new radio (NR) technology for IoT is developed in Release 16 and beyond, we will start seeing 5G NR-based IoT devices deployed in standalone (SA) mode with the 5G SBA packet core.
What is the difference between NB-IoT and LTE-M?
Before we go any further, one question that usually comes up is why we need NB-IoT and LTE-M as two different technologies in Release 13. LTE-M supports lower device complexity, massive connection density, low device power consumption, low latency and provides extended coverage, while allowing the reuse of the LTE installed base. NB-IoT is characterized by improved indoor coverage, support of massive number of low throughput devices, low delay sensitivity, ultra-low device cost, low device power consumption and optimized network architecture.
This makes them sound awfully similar. However, there are some notable differences: 1) LTE-M supports higher bandwidth (1.4MHz, with data rates of up to 1Mbps), while NB-IoT supports lower bandwidth (200kHz, with data rates below 100Kbps); 2) LTE-M supports device mobility, while NB-IoT does not; 3) LTE-M supports voice traffic using VoLTE, while NB-IoT is restricted to data only; 4) NB-IoT works in a lower frequency band, making it excellent for indoor use, while LTE-M works at a higher frequency band and may not be as good indoors; 5) finally, LTE-M has much lower latency (50-100ms), compared to NB-IoT's 1.5 to 10 seconds (roughly 30 to 100 times that of LTE-M).
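Those differences boil down to a simple selection rule. The following sketch, with thresholds derived only from the figures listed above, suggests a technology for a given device profile:

```python
# Decision sketch: choose between LTE-M and NB-IoT from device requirements.
# Thresholds are approximate and taken from the differences described above.

def choose_cellular_iot(needs_mobility, needs_voice, max_latency_ms):
    # LTE-M is required for mobility, VoLTE voice, or latencies under ~1.5s,
    # the floor of NB-IoT's typical latency range.
    if needs_mobility or needs_voice or max_latency_ms < 1500:
        return "LTE-M"
    # Otherwise NB-IoT's lower cost, lower power and better indoor
    # penetration make it the natural fit.
    return "NB-IoT"

# An asset tracker on a moving vehicle needs mobility:
print(choose_cellular_iot(True, False, 5000))    # LTE-M
# A basement water meter sending one reading a day:
print(choose_cellular_iot(False, False, 10000))  # NB-IoT
```

Real network planning also weighs coverage, module cost and operator support, but the core trade-off is the one encoded here.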
LTE-M and NB-IoT integrate into the existing LTE network with the addition of one new entity: the Service Capability Exposure Function (SCEF), introduced in Release 13. To help meet the low power requirements of LTE-M and NB-IoT, the power-hungry protocol for establishing IP data bearers has been replaced by extending the non-access stratum (NAS) protocol to allow small amounts of data to be transferred over the control plane. In this case the IP stack is not necessary, so this type of transfer has been named Non-IP Data Delivery (NIDD); it is shown with the large red dashed line in Figure 1 below.
3GPP defined SCEF to be the interface for small data transfers and control messaging between enterprises' IoT application servers and the operator's core network. SCEF provides APIs to the enterprises for small data transfers and control messages, and uses 3GPP-defined interfaces with the network elements in the operator's core network to perform its functions. Release 13 also supports IoT IP data delivery with the NAS control plane, shown in Figure 1 below with the small green dashed line.
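As an illustration only (the field names below are invented for the sketch and are not taken from the 3GPP specification), an enterprise application server's NIDD downlink request towards the SCEF's API would carry little more than a device identifier and a small encoded payload:

```python
# Hypothetical sketch of a NIDD downlink request body that an enterprise IoT
# application server might send to the SCEF's northbound API. Field names are
# illustrative; the real interface is defined by 3GPP.
import base64
import json

def build_nidd_request(external_device_id, payload: bytes):
    """Package a small non-IP payload for a single device."""
    return json.dumps({
        "externalId": external_device_id,           # identifies the device
        "data": base64.b64encode(payload).decode()  # the tiny NIDD payload
    })

# e.g. two command bytes destined for a smart meter:
req = build_nidd_request("meter123@example.com", b"\x01\x2a")
print(req)
```

The shape of the message underlines the point of NIDD: payloads are so small that a full IP stack on the device buys nothing and costs power.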
Figure 1: Cellular IoT in the 4G LTE Network
In a 5G SBA core, the Network Exposure Function (NEF) provides a small data delivery service similar to the Non-IP Data Delivery (NIDD) service of the 4G SCEF, exposing APIs to application servers. Figure 2 below shows existing NB-IoT and LTE-M networks connecting to the 5G SBA core in non-standalone mode. NEF is the interface for small data transfers and control messaging between enterprises' IoT application servers and the operator's 5G core network. The NIDD data transfer is shown with the large red dashed line in Figure 2 below; IoT IP data delivery with the NAS control plane is shown with the small green dashed line.
Figure 2: Cellular IoT with the 5G SBA Core
In summary, currently deployed LTE-M and NB-IoT devices will connect to the 5G SBA core in non-standalone mode as the 5G SBA core gets deployed. Most cellular IoT deployment today is on the 4G network, providing a device density of about 60K devices per square kilometre. Massive IoT deployment requirements in 5G call for device density to be scaled to a minimum of one million devices per square kilometre. 3GPP is working on enhancements to the 5G technical specifications to reach this density in Release 16 (expected to be completed by June 2020) and Release 17 (expected to be completed by the end of 2021 or early 2022).
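A quick worked example puts those density figures in perspective, estimating how many devices fall inside one small cell's footprint at each target (the cell radius is an illustrative assumption):

```python
# Worked example of the density targets above: devices supportable within one
# cell's circular footprint at 4G-era (~60K/km^2) vs. 5G massive-IoT
# (1M/km^2) densities. The 0.5 km radius is an illustrative small urban cell.
import math

def devices_in_cell(radius_km, density_per_km2):
    """Devices inside a circular cell footprint at a given density."""
    return int(math.pi * radius_km ** 2 * density_per_km2)

cell_radius_km = 0.5
print(devices_in_cell(cell_radius_km, 60_000))     # roughly 47 thousand
print(devices_in_cell(cell_radius_km, 1_000_000))  # roughly 785 thousand
```

A sixteen-fold jump per cell is why the Release 16/17 work targets the core's signalling and session-handling capacity, not just the radio.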
The UK 5G launch season is in full swing, but it is a pale shadow of what we need - 4G still reigns supreme.
5G has arrived – or has it? The new services are surprisingly limited in geographic coverage, performance and features, focussing on broadband-only applications. The massive IoT connectivity application of the service has been delayed to 2023. The current incarnation from all carriers except Three UK does not offer the promised 'millimetre wave' alternative to replace leased lines. Even the much-hyped mobile broadband speeds have been downgraded from Gigabits to burst rates in the hundreds of Mbps at best. Add in the lack of reliability, spotty coverage and inconsistency in commercial approach, and 5G is no more than a work in progress.
With a significant number of infrastructural and operational challenges to be overcome before 5G can become a business reality for UK companies, Nick Sacke, Head of Products and IoT, Comms365 explains it is time to set the ‘glass half empty’ promises of 5G to one side and leverage the proven quality, consistency and reliability of existing wireless networks, especially 4G, to support business communications infrastructure and growth.
The promise of 5G has been compelling for many reasons. From businesses looking to achieve widespread IoT deployments, to those seeking a viable broadband and leased line replacement alternative, to companies struggling in areas of rural connectivity deprivation, on paper 5G appears to have all the answers. The recent 5G roll out announcements, however, have been something of a disappointment for all. Limited to just six cities initially (EE), and with variable accessibility even within these areas, the 5G rollout is a promise rather than a reality. It will take several years before 5G offers ubiquitous accessibility, and there are no firm plans to support rural areas and manufacturing; instead, a set of innovation challenges is funding consortia projects to look at ways to address the problem. Even then, a number of key features of the service are still to be clarified.
The reality today is that 5G – where it is available – is providing enhanced mobile broadband and no more. For those with compatible devices – an issue in itself given the lack of available devices and the Huawei situation – 5G will enable voice calls and broadband internet access. Even then, the promised speeds are not being delivered – customers can expect 150-200Mbps at best, and on the Vodafone commercial plans we see ‘guarantees’ of 2Mbps, 10Mbps or ’the fastest available’ – this falls short of the Gigabit speeds promised. Furthermore, consistency is a very concerning issue, with both speed and coverage variable within the launch city locations. On the plus side, Vodafone has offered an ‘Unlimited’ data plan that will begin to chip away at one of the big pillars of operator revenues, i.e. the mobile data costs.
So why have EE, Vodafone and Three rushed to announce 5G networks that are still more of a half-promise than a reality? The answer is a land grab, to try and get to market first with something, rather than nothing. The impact for businesses that want to make investments in high speed wireless technology, leveraging the value of repeatable, consistent, widespread and easy to use services, is that 5G is already a significant disappointment. Carriers will need to raise their game significantly if they want businesses to invest their communications budgets in the new technology.
While the Tier 1 network providers are promising to rapidly expand the 5G network range – with EE planning to upgrade more than 100 sites to 5G every month – this is very much a work in progress. On the plus side, the 5G network will address the capacity issues facing overloaded 4G networks, enabling millions of additional connections on existing spectrum.
But what about the other key aspects of the 5G network offering? 5G has been touted as a viable alternative to leased lines and a chance for companies to avoid expensive fibre or copper-based Ethernet connections. Unfortunately, the promised Fixed Wireless Access (FWA) based on 'millimetre wave' (essentially very high speed connections between two points) requires significant infrastructure change that the network providers are struggling to deliver. Rather than using towers, FWA is a very short-range service and, as such, demands very high antenna density, with small cells (antennas) deployed on buildings, street furniture and lamp posts as little as 10 metres apart. Network providers have overestimated the willingness of local authorities and building owners to grant the planning permission required to install antennas on lampposts and buildings. Without antenna density, FWA is not a viable, scalable option for business connectivity; at best, companies will have to wait three years or more before 5G offers a viable wireless leased line alternative. At the time of writing, Three UK has just launched its FWA offering for home broadband in a few postcode areas in London, and already there are accounts of intermittent signal problems impacting performance, which lends weight to the argument that more antenna density is required to achieve stable, repeatable service coverage.
At worst, of course, the continuing concerns regarding the potential health implications of 5G networks could further delay installation. Local authorities will remain wary about exposing the public to risk; unless and until the 5G industry can address, in a concerted and focused way, the persistent claims that running high-frequency networks in high-density areas poses a health risk, planners may meet resistance from schools, hospitals, and community building managers.
There are other shortcomings. 5G services today do not include any Service Level Agreements, undermining any business confidence in the quality and repeatability of the service. There seems to be no ‘network slicing’ (the technique to separate traffic types), making it impossible to prioritise network traffic – such as IoT. Indeed, the entire IoT aspect of 5G has been shelved for now, with both EE and Vodafone confirming that IoT will not be part of the initial service. There is no clarity regarding support for IoT devices in the future, the ability to upgrade or migrate from current to 5G networks or any commercial information that would help both Managed Service Providers and businesses build 5G into their future IoT strategies.
What to do: Use what we have already - mature 4G services
So what are the options? 5G is disappointing, but companies cannot afford to postpone much-needed network investments in wireless primary and backup services indefinitely. The good news is that 4G networks are now mature, meaning they are both widely available and reliable. The arrival of 5G will address the burgeoning capacity issue for 4G, which is great news; and recent market price adjustments have taken 4G out of the 'last resort' category and made it a viable option for primary and resilient connectivity.
4G is proven to support VoIP and Unified Communication streaming; it can also be used for machine to machine communication. Software Defined Networking (SDN) enables 4G to be blended with other networks to deliver primary connections that deliver a reliable and affordable leased line alternative. Furthermore, IoT is deliverable today using the unlicensed spectrum and other standards, including NarrowBand IoT and LoRaWAN, to enable mass IoT deployments (which will be incorporated later into the emerging 5G standard, future proofing investment).
Critically, all of these services come with SLAs; networks are reliable and accessible. Essentially, it is possible today to meet business needs for affordable and consistent primary and secondary connectivity services with the existing 4G network infrastructure.
5G technology looks good on paper and there have been significant deployments in the US and other countries. But there remain a number of very significant infrastructure challenges that continue to undermine 5G value and impact on our business landscape in the short to medium term.
As the 5G network plans and service offerings stand today, businesses will struggle to justify investment in the new technology. However, while waiting for the promise of 5G to be realised, businesses can extract significant value from 4G today. And with further price disruption expected within the 4G market, the cost model will become ever more compelling, for primary, secondary and IoT connectivity.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we've uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World and continued in the December issue; now, as the New Year dawns, we have a stack more predictions for you. This is Part 2.
Industry predictions for 2020 from Sazzala Reddy, co-founder and CTO of Datrium (on Cloud):
Ransomware will innovate faster than mechanisms to prevent it.
Ransomware is plaguing the enterprise and it's getting worse, fast. Thanks to its insane profitability and the proliferation of state and non-state actors, cybercrime (including ransomware attacks) is predicted to cost the world $6 trillion annually by 2021. According to the State of Enterprise Data Resiliency and Disaster Recovery 2019 report, nearly 90% of companies consider ransomware a critical threat to enterprise business. Ransomware will be a massive threat to all organisations for the foreseeable future because it is very challenging to detect or prevent, exacerbated by the furious pace of innovation. Prevention would be the ideal course of action; however, organisations must prepare for when defences fail, since they will fail. While the current recommendation from some experts is to simply pay the ransom, there is an alternative approach: every business should investigate deploying a quick data recovery infrastructure that can instantly roll back the IT environment to its pre-ransomware state and recover from an attack unharmed. Ransomware recovery will become a budget line item for the majority of CIOs in 2020.
Mainstream enterprises will finally embrace DR to the cloud.
Businesses are clamouring for better disaster recovery solutions in the face of escalating threats from natural and human-generated disasters. Using the cloud for DR has been theoretically interesting but physically impractical due to the huge expense of storing large amounts of data in the cloud and the costs and slowness of moving it across the wire in either direction. In 2020, mainstream business will become open to leveraging the cloud as a DR site and will start shutting down their physical DR sites because new cloud DR technologies will make it possible to leverage on-demand cloud resources during a disaster while keeping cloud costs low during the state of normal business operations. While there will be many options for customers to choose from in 2020, they must take caution and make sure to verify claims surrounding recovery point objective (RPO) and recovery time objective (RTO). 2020 will be the Wild West of cloud DR performance claims.
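Verifying a recovery point objective (RPO) claim reduces to simple arithmetic on replication timestamps: the worst-case data loss is the age of the newest recoverable copy at disaster time. A sketch, with hypothetical snapshot times:

```python
# Sketch of checking an achieved RPO against a vendor's claim. Snapshot times
# and the disaster time are hypothetical; a real check would pull timestamps
# from the DR product's replication logs.
from datetime import datetime, timedelta

def achieved_rpo(disaster_time, snapshot_times):
    """Worst-case data loss window: time since the last completed snapshot."""
    recoverable = [t for t in snapshot_times if t <= disaster_time]
    if not recoverable:
        raise ValueError("no recoverable copy exists before the disaster")
    return disaster_time - max(recoverable)

snaps = [datetime(2020, 1, 1, h) for h in (0, 6, 12, 18)]  # 6-hourly snapshots
disaster = datetime(2020, 1, 1, 15, 30)

print(achieved_rpo(disaster, snaps))  # 3:30:00 -- fails a claimed 1-hour RPO
```

The same discipline applies to RTO claims: measure an actual failover, don't take the datasheet's word for it.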
Edge computing hype will wane because the cloud will be enough for most use cases.
IoT spans a wide variety of use cases. Self-driving cars produce tons of data and need to operate in disconnected mode with local intelligence, so they will require edge computing. However, the smart meter in your house, which does not produce much data, can stream small amounts of data directly to the cloud. What's consistent across both of these use cases is that a global economy demands that data be collected in smaller hubs in different regions and finally aggregated to a central hub. Despite the hype around edge computing, the fact is that the cloud is well suited to this purpose, because cloud vendors have already built out data centres across the globe.
Serverless will be the future of programming.
Serverless computing – the idea that one can write some code and it simply runs somewhere in the multicloud with a pay-per-use cost model – is extremely compelling. By reducing barriers, it enables significant speed improvements in developing new applications. However, most programmers are trained in classical monolithic, linear patterns of software development. Going serverless requires a mind shift, which is always hard and takes a long time. In 2020 we'll see the emergence of new programming tools around serverless that leverage classical programming techniques (because this is easier for humans) but automatically convert them into a nonlinear serverless execution model, harnessing the huge potential of serverless computing.
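The serverless model being described can be sketched as a stateless, per-event function. The handler signature below mirrors common FaaS platforms (e.g. AWS Lambda), but the event shape and behaviour are illustrative:

```python
# Minimal serverless-style handler sketch: a pure function of its input event,
# holding no server state, invoked (and billed) per event by the platform.

def handler(event, context=None):
    """Hypothetical thumbnail-request handler."""
    width = int(event.get("width", 100))
    height = int(event.get("height", 100))
    # In a real function this would enqueue or perform the resize work.
    return {"statusCode": 200, "body": f"thumbnail {width}x{height} queued"}

# The platform, not the programmer, decides where and when this runs:
print(handler({"width": 320, "height": 240}))
```

The mind shift the paragraph mentions is visible even here: there is no main loop, no server to start, and no state to carry between invocations.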
2020 Boomi predictions
1. Businesses to wise up in 2020 and stop jumping on the latest tech train without evaluating ROIs — Ed Macosky, SVP of Product & Solutions, Boomi, a Dell Technologies business
For the past few years, companies have scrambled to follow the latest tech trends, forgoing ROI assessments in frantic attempts to modernize. $1.3 trillion was spent on digital transformation last year, but it’s estimated 70% of those investments went to waste. Moving every application and dataset into the cloud or applying serverless computing to every workload isn’t always the best move — and I anticipate we’ll see businesses finally applying lessons learned from their overeagerness in 2020.
It’s not fiscally responsible to incorporate every last tech trend; none of them are magic bullets to digital transformation. Businesses are going to have to be more strategic, customizing their plans to their business objectives and culture, and placing emphasis on accelerated time to value over long-term, moonshot ideas.
2. Digital transformation strategies for the cloud will pull an about-face to a hybrid IT environment — Ed Macosky, SVP of Product & Solutions, Boomi, a Dell Technologies business
Companies that rushed to move all their business processes to the cloud are now finding it more expensive or cumbersome than anticipated. Over the next year, we’re going to see many businesses return to the hybrid model. Despite cloud computing’s recent leaps and bounds, it still can’t do everything on-prem can, creating breakpoints across environments.
3. Is iPaaS a thing of the past? The future of data integration in 2020 and beyond — Steve Wood, Chief Product Officer, Boomi, a Dell Technologies business
The term iPaaS (integration platform as a service) first came around when we announced AtomSphere in 2008. Now, as we enter 2020, I believe we'll see a shift in this area as the market consolidates and continues to commoditize: Gartner predicts that by 2023, up to two-thirds of existing iPaaS vendors will merge, be acquired or exit the market. Within the next year, I believe there will be a new category and term used to define the unification of applications, people, processes, systems and devices: the data unification platform.
4. How organizations are dealing with their data isn’t going to cut it in 2020 — companies need an integration strategy — Steve Wood, Chief Product Officer, Boomi, a Dell Technologies business
Global data regulations, combined with data silos, will have organizations scrambling to rethink their data management strategies in 2020 and look toward data integration — or risk being left behind. Savvy businesses are already turning toward this to glean more accurate and robust insights, streamline operations, and improve business outcomes. This new strategy reduces the time and resources required to constantly reroute data to one place. As companies put dollars and resources behind edge computing and IoT next year as well, updating their data strategy is critical so they don't risk losing out on the benefits of next-gen technologies or falling behind competitors.
5. Companies will rely more on metadata than data to provide insights — Steve Wood, Chief Product Officer, Boomi, a Dell Technologies business
Overzealous data analyses have brought many companies face to face with privacy lawsuits from consumers and governments alike, which in turn has led to even stricter data governance laws. Understandably concerned about making similar mistakes, businesses will begin turning to metadata for insights in 2020, rather than analyzing actual data.
By harvesting data’s attributes — including its movement, volume, naming conventions and other properties — companies can surface concerns around access to PII and other sensitive information. Metadata lends itself well to data privacy, and with the right machine learning and artificial intelligence modelling, it can still provide critical information to the C-suite, such as lead generation changes, third-party data access, potential breaches and more.
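To make the idea concrete, here is a minimal sketch of metadata-only PII flagging: it inspects nothing but column names and row counts, never the data values themselves. The column names, hint list and schema are hypothetical, not a Boomi feature.

```python
# Hypothetical sketch: flag likely PII columns from metadata alone.
# Only attribute names are inspected; no actual data values are read.

PII_HINTS = {"name", "email", "phone", "ssn", "dob", "address"}

def flag_pii_columns(table_metadata):
    """Return column names whose naming convention suggests PII.

    table_metadata: dict mapping column name -> row count (or any
    other metadata attribute); only the names are examined.
    """
    flagged = []
    for column in table_metadata:
        tokens = column.lower().replace("-", "_").split("_")
        if PII_HINTS.intersection(tokens):
            flagged.append(column)
    return flagged

metadata = {"customer_email": 120_000, "order_total": 120_000, "DOB": 119_874}
print(flag_pii_columns(metadata))  # -> ['customer_email', 'DOB']
```

A real system would weigh many more attributes (movement between systems, access patterns, volume anomalies), but the principle is the same: the metadata alone is enough to raise a flag.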
“As consumers, we now demand that organisations, particularly customer service departments, hold our data securely and do not misuse it – i.e. share it without our permission. For customer service departments, this means training their AI algorithms to report back to employees on what they did and how they did it – that’s called XAI (the X standing for eXplainable).
“This new frontier is to take the outbound part of communications and automate it in a way that isn’t creepy but is, in fact, helpful. AI must be used where it is best, and people where empathy is needed. For example, when a customer’s washing machine has broken, what experience does he or she want regarding the repair? Mostly, they want to know about progress. In this case, AI works best – it can learn which channel the customer likes best, tell them what’s happening, let them know when it’s going to happen, tell them if anything is stopping it from happening, and then ask them if it all went well. Nice short messages, written or spoken, at a frequency the AI learns is best, using words the AI works out are nicest. On the other hand, with a product like life insurance, the first port of call with the company following a bereavement should always be a well-trained and empathetic human.
“This new boundary between human and computer interaction is something customer service departments have to get right as technology becomes ever more advanced. The question all customer service departments should be asking is: how do we, as humans, respond to automated computer systems, and how can we, as employees, use this technology carefully and responsibly to improve job performance and provide better customer service?”
Stijn Christiaens, Collibra’s CTO and Co-Founder, comments:
“Gartner has estimated that by 2020 there will be more than 10,000 Chief Data Officers (CDOs). Many of them will be breaking into the role as first-generation CDOs. If CDOs continue to add offensive capabilities (such as making money from data) on top of the table-stakes defensive capabilities (such as complying with data regulations), the role will keep growing. These first-generation CDOs will be change agents who lead the way in the evolution from data management to Data Intelligence.”
“Data Intelligence is about understanding the positive effects data can have on the business and offering the ability to create that impact. In this new paradigm, Data Citizens will use data to solve complex problems, implement ideas to drive bottom-line results and transform experiences. Moving forward into the next decade data will no longer be the purview of database admins or data architects – it will belong to every knowledge worker.”
Myke Lyons, Collibra CISO, adds:
"For certain security use cases, like malware protection, AI is already the industry standard. It is growing quickly in the risk calculation of threats and network security to help drive accurate responses to incidents. It's impossible and unnecessary to protect everything in an enterprise at the same level, so the next expansion that we will see in 2020 will be in the discovery, classification, and tagging of critical data to help enterprises protect their most sensitive classified information – something that can only be handled with AI for large enterprises."
Digital Workforce is the largest independent RPA and intelligent automation service provider in the world. James Ewing, Regional Director for UK & Ireland, has over 15 years’ experience in the enterprise tech sector and offers some thoughts on automation:
The last year has seen the automation industry continue its meteoric rise – so much so that the Information Services Group (ISG) has predicted that by the end of 2020, over 90% of European businesses will have adopted RPA to some extent. As automation spreads, we will move to a state of ‘hyperautomation’, in which not only the extent of automation increases but also the sophistication of the automation tools themselves.
Hyperautomation has been named by Gartner as a Top 10 Strategic Technology Trend for 2020. With it, we will see the increasing prominence of Intelligent Process Automation (IPA): the combination of robotic process automation (RPA), artificial intelligence (AI) and machine learning (ML). The result will be a shift towards increasingly AI-driven decision making, where AI provides humans with the intelligent insight they need to make decisions. This won’t replace humans in the workplace; rather, it will augment the roles humans take on. IPA will automate mundane but critical business processes and provide intelligence from the data, giving humans the opportunity to focus on more specialist, creative and innovative tasks.
Those who automate at a rapid rate will eventually – some possibly as soon as next year – have ‘digital twins’ of their businesses. Those who fall behind, and are not vigilant in keeping up with trends, risk losing out to competitors who do automate their services.
Another trend we are starting to see is that businesses automating their services struggle to scale their robots past the initial stages. This is because they struggle with the maintenance side of RPA: to keep it running successfully after launch, it must be checked on and improved regularly. Employing a provider to deliver that service would ease the scaling problem and allow growth and development through automating more and more services.
Despite all the future trends of Intelligent Automation and now hyperautomation, many organisations are either yet to start or have ‘stalled’ or ‘stagnated’ in their deployments. 2020 will see organisations kick-starting their journeys, or solving their inhibitors to growth, and rapidly deploying the first phases or waves of RPA.
In the case of RPA, the saying that “too many cooks spoil the broth” does not apply, because RPA is not a ‘standalone’ software technology: it suits being part of a larger, ‘one of everything’ vendor. Over the past year we have seen – and will continue to see – RPA vendors prime themselves for market consolidation, ‘right-sizing’ themselves for potential investment in, or even acquisition, by big players such as Microsoft, IBM and Oracle.
We also anticipate that as organisations look to scale and accelerate their use of IPA tools, they will begin to invest in more innovative and flexible operating models, such as the cloud-based Robot-as-a-Service (RaaS) operating model. The RaaS model is the latest iteration of Software-as-a-Service and allows organisations to pay for digital workers by the minute – meaning they can scale up or down, cost effectively, with ease. This should enable those already automating to overcome the great scalability barrier that many organisations have faced in the last year.
“Content is king” is a phrase we’re used to hearing, meaning that data is all-powerful. Today, however, it would be more fitting to say ‘context is king’ – after all, what is data without context?
By Bart Schouw, Chief Evangelist in the Office of the CTO at Software AG.
Politicians know about the importance of context more than most of us – if something they say is taken out of context by the wrong person, the consequences can be severe. The same rule applies in analytics: if data is captured, stored and analysed in a vacuum, it’s difficult to derive any meaningful insights.
Data is one of the most powerful assets we have in the digital era. Yet, it remains redundant if we lack contextual information. Hence, “context is king”.
We rely on context to receive and reveal the truth; when things are taken out of context, misinformation spreads. In data analytics terms, context is the key to avoiding the collection of unusable data.
In other words, if an error event occurs on a factory floor every day at the same time, and there is nothing apparently wrong with the machinery, how does a data scientist get to the root of the problem? By adding context.
Take, for example, Covestro, which makes high-tech polymer materials used in the automotive and construction industries. In its quest to improve its production process, Covestro took a different tack. It knew that hiring an army of data scientists would be a challenge – the right ones are hard to find – and that if they left, the company would lose their knowledge.
Instead it decided to arm its factory engineers with advanced self-service analytics. Covestro’s knowledge workers analyse the production process daily with the help of embedded AI and ML models. They improve the effectiveness of those models by feeding in events that they deem relevant; those events act as context for the models, vastly enhancing their effectiveness in detecting and predicting faults and failures.
Here is how it works: if something happens in the factory that the engineers deem important, it is logged into what they call the context hub. The engineers can even automate some of the logging once they work out that certain events are significant for the analytics and happen regularly – things like maintenance events or production situations, such as the heat of an engine remaining above a certain level for longer than 10 minutes.
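The automated part of that logging can be sketched very simply. The following is a hypothetical illustration, not Covestro’s implementation: the “context hub” is just a list, and the threshold, window and readings are invented for the example.

```python
# Hypothetical sketch of an automated context-logging rule: record a
# context event when engine temperature stays above a threshold for
# longer than 10 minutes. Threshold and readings are invented.

THRESHOLD_C = 90.0   # assumed temperature limit, degrees Celsius
MAX_MINUTES = 10     # sustained-overheat window from the example above

def detect_context_events(readings):
    """readings: list of (minute, temperature_c) samples, one per minute.
    Returns context events for runs above THRESHOLD_C lasting longer
    than MAX_MINUTES."""
    events = []
    run_start = None
    for minute, temp in readings:
        if temp > THRESHOLD_C:
            if run_start is None:
                run_start = minute
            if minute - run_start >= MAX_MINUTES:
                events.append({"type": "overheat",
                               "start": run_start, "end": minute})
                run_start = None  # log each sustained run only once
        else:
            run_start = None
    return events
```

Events produced this way would be written to the context hub, where the AI/ML models can pick them up as features alongside the raw production data.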
In my eyes, those engineers are the front-runners of the next generation of workers. Introducing the Contextualist!
The Contextualist isn’t a technical data scientist, and as such the role will only really thrive if the technology used can be simplified to a point that the analyst’s and developer’s roles are reduced to a minimum. This is going to happen soon, I believe.
Self-service analytics enter the picture
Software companies are putting their efforts and resources toward making existing tech more easily accessible in order to increase mainstream adoption. Currently ML, AI and other technologies are so difficult to implement that only the really big companies can afford the very expensive specialist workforce required.
Simplification will come through self-service interfaces, making it easier to apply the technology in a horizontal (generic) way, or through solution accelerators, where domain relevant knowledge and decisions can be applied in a vertical way.
There’s no doubt that new technologies are helping humans do their jobs more effectively, but in order for this collaboration to be its most efficient, a context hub is needed. If the job that you’re doing relies on the effectiveness of AI to augment decisions, then it becomes vital that it is being fed with the relevant contextual information. This will ultimately help organisations improve decision making in the long-term.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 3.
Top five technology trends to impact the digital infrastructure landscape in 2020 - Equinix predicts hybrid multicloud, artificial intelligence, cybersecurity, data regulation and sustainability will be key factors influencing how organizations undergo digital transformation:
Equinix’s top five technology trend predictions for 2020 point toward the critical digital transformation that organizations are undertaking to lead in the new digital era. Equinix’s expansive footprint across more than 50 global markets, and its position as a leading meeting and interconnection point for ecosystems of networks, clouds, enterprises and nearly 10,000 customers, give it a unique and holistic lens to view critical digital infrastructure trends.
Equinix’s 2020 technology trend predictions include:
1) Distributed infrastructure and edge computing will accelerate hybrid multicloud adoption
There is a seismic shift underway across many industries as businesses are embracing edge computing and hybrid multicloud architectures. Increasingly, businesses are moving computing from centralized data centers to a distributed infrastructure and toward the edge, where data exchange and interconnection between businesses and cloud services are growing at an exponential rate.
The advent of edge computing has also become a foundational enabler for other emerging technologies such as 5G mobile communications, which will allow internet of things (IoT) and other edge devices to take advantage of faster connectivity to data and compute resources with single-digit-millisecond network latency.
According to analyst firm IDC, by 2023, more than 50% of new enterprise infrastructure deployments will be at the edge rather than corporate data centers, up from less than 10% today. And by 2024, the number of apps at the edge will increase 800%. The IDC report says that, to prepare, businesses must modernize IT to become virtualized, containerized and software-defined to support the edge. They should also consider new data center partners that can bolster edge build-out and prioritize infrastructure optimization and application communication costs.
As a result, in 2020, Equinix anticipates edge computing as a key driver in accelerating hybrid multicloud adoption across every business segment worldwide. The third annual Global Interconnection Index (GXI), a market study published by Equinix, estimates that between 2018 and 2022, private interconnection between enterprises and cloud & IT service providers will grow annually by 112%. The report predicts that traditional cloud computing architectures, which are highly centralized, will shift as enterprises look to extend cloud computing to the edge to solve for challenges introduced by the highly distributed nature of modern digital business applications.
The key challenges that the combination of edge computing and hybrid multicloud adoption will solve include:
· Lower latency and bandwidth savings—Proximate high-speed, low-latency connections (from under 60 down to under 20 milliseconds) are necessary for companies to materially close the “distance gap” between their application and data workloads and cloud service providers (CSPs). With agile and scalable cloud environments closer to the users at the edge, data access and application response times can be faster and cost savings from reduced data transport can be realized.
· Enterprise consumption of hybrid multicloud—Enterprises generally determine which cloud platform to place their applications on by which CSP delivers the best service for a specific workload. This freedom of choice makes it easy and practical for IT organizations to experiment with different cloud platforms to see which delivers the best quality of service (QoS) at the best price. Additionally, more than ever before, enterprises require the flexibility of retaining control and securely running business-critical applications in-house and want the flexibility of leveraging both private and public hybrid cloud environments, depending on specific use cases.
· Political and regulatory factors—With more frequent and complex incidents of security and privacy breaches, many countries are regulating where and how data can be used. These privacy and data sovereignty compliance requirements will lead to more distributed data centers and cloud services that keep data local to a specific geographic region or country.
2) AI and IoT will drive new interconnection and data processing requirements at the edge
Equinix predicts that enterprises will accelerate the adoption of AI and machine learning (ML) for a broader set of use cases, requiring increasingly complex and more real-time-sensitive processing of large data sets originating from multiple sources (sensors, IoT, wearables, etc.). An airplane with thousands of equipment sensors, an autonomous vehicle producing telematics data, or a smart hospital monitoring patients’ well-being can each generate several terabytes of data a day. About 75% of enterprise AI/analytics applications will use 10 external data sources on average.
To meet the scale and agility requirements of the above, Equinix believes businesses will continue to leverage public cloud service providers, while most will likely find ways to use an optimal set of AI/ML capabilities from multiple CSPs—effectively deploying a distributed, hybrid architecture for their AI/ML data processing.
Yet Equinix believes for many use cases, an additional set of stringent requirements related to latency, performance, privacy and security will require that some of the AI/ML data and processing (both inference and model training) be proximate to data creation and consumption sources. Equinix predicts this will create an impetus toward new architectures and the increased adoption of vendor-neutral, richly interconnected, multicloud-adjacent data centers at the edge, which deliver improved control, auditability, compliance and security of AI/ML data, and low-latency connectivity to remote data and compute infrastructures.
Furthermore, Equinix predicts that greater interconnection and data processing capabilities will pave the way for new digital data marketplaces, where data providers and buyers can transact easily and securely at scale within vendor-neutral data centers at the edge.
3) The rise in cybersecurity threats will require new data management capabilities
The World Economic Forum has ranked breaches in cybersecurity as one of the top risks facing our global community. No company or individual is immune to the cybersecurity challenges we face today or will face in the future. The financial loss attributed to cyberattacks continues to impact economies worldwide and is estimated to reach $6 trillion annually by 2021.
With the increase in cybersecurity attacks and data privacy and protection regulations, most companies are now moving toward accessing cloud services over private networks and storing their encryption keys in a cloud-based Hardware Security Module (HSM) at a location separate from where their data resides. This HSM-as-a-Service model allows them to increase the level of control over their data, to strengthen the resiliency of operations, and to support a hybrid technology architecture.
In 2020, Equinix predicts that new data processing capabilities such as multiparty secure computation, fully homomorphic encryption (operating on encrypted data) and secure enclaves (where even cloud operators cannot peer into the code being executed by a cloud consumer) will move toward mainstream and will allow enterprises to run their computation in a secure manner.
4) Data regulation will influence enterprise IT strategies
Today, many enterprises buy and sell data in order to gain a competitive advantage, but these enterprises must adhere to government regulations for personal data privacy and protection. What started with the European Union’s General Data Protection Regulation (GDPR) is now extending into other local regulatory frameworks, such as the California Consumer Privacy Act (CCPA) among many others, putting more pressure on enterprises to ensure data compliance. In fact, 121 countries have either already announced or are in the process of formulating data sovereignty laws that prevent the movement of their citizens’ personal data outside the country’s boundaries.
In 2020, Equinix believes we will see further complexity in protecting personal data as global trends toward stricter or new data privacy regulations continue to gain momentum, making it more difficult for global companies distributed across multiple markets to navigate. In a recent survey commissioned by Equinix of over 2,450 IT decision-makers across the world, 69% of the global respondents listed “complying with data protection regulations” as a top priority for their business, while 43% of them reported “changing regulatory requirements around data privacy” as a threat to their company. In the UK, these figures were 65% and 35% respectively.
In 2020, Equinix predicts IT strategies will increasingly focus on data privacy, with continued application of the secure discovery, classification and encryption of personally identifiable information (PII). Equinix believes HSMs will be an integral part of a data security architecture and strategy for encrypting PII and providing an exceptionally high level of security for safeguarding data.
5) Digital transformation will provide a foundation for a more sustainable world
According to an Equinix Survey, 42% of IT decision-makers globally agree that the “greenness” of a company’s suppliers has a direct impact on their buying decisions, compared to 30% in the UK. Equinix anticipates that with increasing pressures on the world’s resources and the increasing desire by many companies to cut emissions, digital transformation could begin to set the world’s economy on a progressively sustainable footing.
In 2020, sustainability will likely be an initiative for world-class organizations as stakeholders increasingly look to digital businesses to lead and innovate in areas of environmental responsibility and sustainability. Equinix further predicts that digital and technology innovations will provide companies with the opportunity to overcome barriers ranging from the geographic dispersion of supply chains to the complexity of materials and of deconstructing products. Machine-to-machine communication and data analytics enable companies to match supply and demand for underused assets and products. “The cloud,” in combination with mobile, can dematerialize products or even entire industries. Equinix anticipates that as businesses depend on data center resources to connect with customers and run many aspects of their operations, they will look to vendor-neutral colocation data center providers who are committed, vocal and proven champions for advancing environmental sustainability.
Five technology trends poised to transform the future of work, according to DXC Technology:
Five technology trends are poised to transform the future of work beginning in 2020, DXC Technology has announced as part of an annual forecast. The rapid adoption of emerging technologies such as artificial intelligence (AI) and machine learning (ML) – coupled with trusted data ecosystems, empowered interconnected teams and tech-evangelist leaders – promises to produce new levels of workforce efficiency, productivity and growth across enterprises.
“The notion of accelerated productivity will force enterprises to rethink their technology decisions and investments across the enterprise technology stack, which, in turn, will drive a sea change in how enterprises are led and structured, make informed decisions and engage employees and customers,” said Dan Hushon, senior vice president and chief technology officer, DXC.
“Tech-evangelist leaders will define new interactions between AI and people to create high-performing teams and shape digital strategies that unlock an organization’s full potential – securely and confidently modernizing applications, optimizing data architectures and moving workloads to the cloud to produce new and better business outcomes.”
Design thinking shifts from IT services for people to IT services for machines
The thinking behind systems design is shifting as IT services are increasingly being built for machine-to-machine interaction, and as processing moves closer to where data resides. This will further expand “The Matrix” – the pervasive, intelligent IT infrastructure beyond the cloud that includes edge computing, internet of things (IoT) platforms, machine intelligence, augmented reality/virtual reality and more. It will usher in new design choices and transformational architectures, and push companies to more aggressively pursue IT modernization.
“Microprocessors capable of decisions in nanoseconds, stream and batch processing architectures and analytics moving to the network edge (where the data is) – all of this will enable enterprises to make better, faster, data-driven decisions more cost-effectively,” Hushon added.
Teams, not superstars, are the high performers
In 2020, companies will recognize that achieving their full potential means developing and nurturing a network of high-performing, interconnected teams consisting of multidimensional individuals, rather than siloed groups of single superstars.
Enterprises will restructure to expand team linkages across the organization. The shift from superstar individuals to high-performing teams will require new strategies for talent acquisition and development.
According to Hushon, “Enterprises will put greater emphasis on communication, adaptability and decision-making empowerment; double-deep expertise in business and technology; and collaboration tools that promote productivity and learning.”
Data’s value increases in ecosystems
Enterprises are pooling data in ecosystems to achieve outcomes that benefit both the individual and enterprise. Data ecosystems will flourish as they adopt trust mechanisms that validate an individual’s right-to-share and an enterprise’s right-to-consume data. Self-sovereign identity standards and blockchain-based consent with trading partners, for example, are helping to facilitate responsible data sharing and drive the rapid growth of data exchanges.
“As these capabilities become more pervasive, manufacturers, service providers and consumers will be more willing to share data in exchanges and ecosystems,” Hushon explains. “In turn, CEOs will seek to identify and pursue ecosystem-centric business models and trading partners that deploy trusted and compliant data-sharing practices.”
AI redefines professional services
The pervasive use of AI and ML in business is revolutionizing professions such as legal, accounting, healthcare and education by democratizing access to data and expert services. AI is extending customization and personalized services to a broad base of customers through low-cost intelligent agents. Additionally, AI benefits professionals in their decision-making because it can provide new insights, manage information overload and reduce human error.
Hushon noted that while AI and ML democratize professional services, organizations should stay vigilant to guard against the potential loss of critical skills while using increasingly sophisticated, AI-powered decision support systems.
“As these decision-support systems become more sophisticated, businesses need to continue to build critical skills in organizations,” said Hushon. “Additionally, enterprises should protect against unintended consequences by training people to quickly detect and correct improper bias or unsafe behavior from AI. Overall, AI will illuminate intelligence hidden in systems, empower consumers and complement professional expertise.”
New wave of tech-savvy leaders accelerates business transformation
A shift in business leadership will gain momentum in 2020 as technology-driven markets proliferate and new leaders advocate for technologies that can improve enterprise speed, agility, productivity and innovation advantage.
“Emerging technology evangelists will work at the CXO level to shape digital strategy. At the same time, they will spearhead major initiatives with smart products, mergers and acquisitions, intellectual property development and learning initiatives for accelerated business transformations, value and outcomes,” Hushon concludes.
Machine learning is computationally demanding, not just in terms of processing power but also in terms of the underlying graph query language and architecture of the system. We look at how these challenges can be addressed with native graph databases.
By Richard Henderson, Solution Architect, TigerGraph EMEA.
New developments in machine learning are being powered by deep link graph analytics which support unsupervised learning of graph patterns, feature enrichment for supervised learning and explainable models and results. It’s a potent combination that will serve enterprises well for years to come.
We see machine learning (ML) being used for a range of complex computing tasks, including fraud detection, personalised recommendations, predictive analytics, identification of user groups and influential users, reporting weaknesses or bottlenecks in operations and supply chains, and more.
But ML is computationally demanding, and graph-based machine learning no less so. With every hop, or level of connected data, the amount of data in the search expands exponentially, requiring massively parallel computation to traverse it. This is too expensive for key-value databases, which require a large number of separate table lookups, and for relational database management systems (RDBMS), which must create table joins for every query. Even a standard graph database may not be able to handle deep-link analytics on large graphs.
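A back-of-the-envelope calculation shows why each extra hop is so expensive. Assuming an average of b connections per node (the branching factor here is an illustrative assumption, not a measured figure), a k-hop traversal touches on the order of b to the power k candidate paths:

```python
# Illustrative hop-expansion arithmetic: with an assumed average of
# `branching` connections per node, the number of candidate paths at
# each hop depth grows geometrically.

def frontier_sizes(branching, hops):
    """Approximate number of paths explored at each hop depth."""
    return [branching ** k for k in range(1, hops + 1)]

# 100 neighbours per node, 3 hops deep:
print(frontier_sizes(100, 3))  # -> [100, 10000, 1000000]
```

A relational system pays a join (or a key-value store a lookup) for every one of those paths, which is why deep-link queries need massively parallel, graph-native traversal instead.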
One solution is a native graph database featuring massively parallel and distributed processing.
Unsupervised machine learning
Applying graph database capabilities to ML is a relatively new, but ultimately not very surprising, development: the Google Knowledge Graph, which popularised the concept of extracting actionable information based on patterns of relationships in data, was introduced in 2012, and graphs are known to be ideal for storing, connecting and drawing inferences from complex data.
That it didn’t happen sooner is down to the fact that, until recently, graph databases didn’t support the algorithms for deep-link analytics and struggled with very large datasets. But using ML algorithms in a native graph database opens new doors to these unsupervised methods, making possible the use of whole classes of graph algorithms to extract meaningful business intelligence, including:
● community detection
● label propagation
● betweenness centrality
● closeness centrality
● similarity of neighbourhoods
These algorithms share a common requirement: the ability to gather data and analyse it while traversing large numbers of nodes and edges. This is a powerful feature of modern graph databases. Without it, many of these classes of algorithms would simply not be feasible to run.
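To show the flavour of one algorithm from the list above, here is a toy label propagation sketch for community detection on a small in-memory graph. A native graph database would run the same logic in parallel over billions of edges; the node names, tie-breaking rule and graph below are invented for the example.

```python
# Toy label propagation sketch for community detection. Each node
# starts in its own community and repeatedly adopts the most common
# label among its neighbours, keeping its current label on ties.

def label_propagation(adjacency, rounds=20):
    """adjacency: dict node -> list of neighbour nodes."""
    labels = {node: node for node in adjacency}
    for _ in range(rounds):
        changed = False
        for node in sorted(adjacency):
            counts = {}
            for nb in adjacency[node]:
                counts[labels[nb]] = counts.get(labels[nb], 0) + 1
            best = max(counts.values())
            candidates = sorted(l for l, c in counts.items() if c == best)
            if labels[node] not in candidates:
                labels[node] = candidates[0]
                changed = True
        if not changed:   # converged
            break
    return labels

# Two triangles joined by a single bridge edge (a-z):
graph = {
    "a": ["b", "c", "z"], "b": ["a", "c"], "c": ["a", "b"],
    "x": ["y", "z"], "y": ["x", "z"], "z": ["x", "y", "a"],
}
labels = label_propagation(graph)
# a, b, c converge to one community; x, y, z to another.
```

The expensive part in production is exactly what the text describes: gathering neighbour labels means traversing huge numbers of nodes and edges on every pass.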
Now these algorithms are finding uses in business to tackle a range of ‘difficult’ problems including fraud detection, identifying user groups and communities and reporting weaknesses or bottlenecks in operations and supply chains.
Supervised machine learning
Graph is also giving a boost to supervised machine learning because of its ability to support the analysis of a much richer set of data features – allowing you to deploy more sophisticated ML algorithms.
Consider the problem of detecting spam phone calls on a massive mobile phone network. This was precisely the problem that China Mobile wanted to solve. It has more than 900 million subscribers who make over two billion phone calls a week, but a tiny percentage of those are unwanted or fraudulent calls, which the operator was keen to disrupt.
The approach was to analyse the data features of the phone that was initiating the call to determine whether it met the risk criteria for being fraudulent and then send a warning to the recipient’s phone – while it was still ringing – to warn them that the caller might be a scammer. The recipient could then decide whether to answer or not.
One could use a simple set of data features to detect phones associated with fraudulent calls, but China Mobile found that relying on duration of phone call and percentage of rejected calls to raise a warning flag resulted in too many legitimate calls being flagged as fraudulent – i.e., false positives.
China Mobile chose to broaden the scope considerably and monitor 118 data features to identify ‘good’ and ‘bad’ phones. The ML algorithms had to be powerful enough to analyse all of these data features and fast enough to do it in the time it took the network to connect a new call. Using ML, it would be possible to classify a caller as good or bad based on their relationships to other phones on the network, which could be summarised in three key properties:
● Stable group – based on how many phones a given phone calls and receives calls from on a regular basis. Relevant factors include the number of phones it regularly connects with, the frequency of interactions to and from each phone and the duration of the relationship with each phone.
● In-group connections – the degree of connectedness between the phones that the target phone is in regular contact with.
● 3-step friend relationships – the degree of extended connectedness between the target phone and other phones. Does the given phone have connections with other phones that have connections with other phones that in turn initiate calls to the first phone (forming a sort of friendship loop)?
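The three properties above can be sketched on a toy call graph. This is purely illustrative, with invented phone labels and crude definitions, not China Mobile's actual implementation:

```python
# Toy directed call graph: caller -> set of callees (hypothetical phones).
calls = {
    "P1": {"P2", "P3"},
    "P2": {"P1", "P3"},
    "P3": {"P1"},
    "SPAM": {"P1", "P2", "P3", "P4"},   # calls many phones, is called by none
    "P4": set(),
}

def stable_group(phone):
    """Contacts with traffic in BOTH directions (a crude 'regular contact' test)."""
    outgoing = calls.get(phone, set())
    incoming = {p for p, dests in calls.items() if phone in dests}
    return outgoing & incoming

def in_group_connectedness(phone):
    """Fraction of ordered pairs within the stable group that call each other."""
    group = stable_group(phone)
    if len(group) < 2:
        return 0.0
    links = sum(1 for a in group for b in group
                if a != b and b in calls.get(a, set()))
    return links / (len(group) * (len(group) - 1))

def has_3step_loop(phone):
    """Does phone -> x -> y -> phone exist, forming a friendship loop?"""
    for x in calls.get(phone, set()):
        for y in calls.get(x, set()):
            if phone in calls.get(y, set()):
                return True
    return False
```

On this toy data the scam phone scores zero on all three metrics, while legitimate phones show mutual contacts, interconnected groups and closed loops, mirroring the intuition described above.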
It turns out that bad phones score consistently and reliably low on these metrics, and it’s difficult for scammers to redress or hide these features of the phones they are using. Using data on known good and bad phones, the ML algorithms can be trained to recognise suspicious patterns of behaviour with a high level of confidence.
But it’s one thing to model these metrics and another challenge altogether to implement them across a network of nearly one billion phones in real time. The real-time element was crucial because there is no point warning the recipient of a phone call that it might be fraudulent if you can’t do it while the phone is still ringing – an important consideration in China Mobile’s choice of graph database.
A native graph database not only has the query language to traverse many connections and filter and aggregate the results but also the computational power and underlying system architecture to do this in real time.
A criticism of neural networks and deep-learning networks is that they don’t provide insight into causal factors – how did you get this output from those inputs? Without that, you don’t know what factors the ML system is associating with the given output, which erodes confidence in the system’s ability to maintain consistent results over time.
This underscores the importance of explainable models in ML. The objective is to be able to highlight the key variables associated with a result, and it turns out that graph analytics is well suited to compute and show the evidence behind ML decisions.
Explainable ML boosts user confidence in the result. In an online retail environment, the likelihood that a consumer will respond to a product recommendation is higher if the recommendation is accompanied by a reason, such as people you like also liked this product or this product is similar to a previous purchase.
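The pattern of pairing a recommendation with its supporting evidence can be sketched in a few lines. This is a hypothetical toy example with invented customers and products, not a production recommender:

```python
# Hypothetical purchase history per customer.
purchases = {
    "alice": {"book", "lamp"},
    "bob":   {"book", "lamp", "desk"},
    "carol": {"book", "desk"},
}

def recommend_with_reason(customer):
    """Suggest the item most often held by co-purchasers, plus the evidence
    that explains why it was suggested."""
    mine = purchases[customer]
    counts = {}
    for other, items in purchases.items():
        if other != customer and mine & items:      # shares at least one item
            for item in items - mine:
                counts[item] = counts.get(item, 0) + 1
    if not counts:
        return None
    best = max(counts, key=counts.get)
    reason = (f"{counts[best]} customer(s) who bought items you bought "
              f"also bought {best}")
    return best, reason
```

The returned reason string is the explainability payload: the same connection data that produced the score also justifies it to the end user.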
In the more business critical scenario of fraud detection, explainable ML can be a regulatory or audit requirement, and it is also more helpful to fraud investigators if they can see the connections that caused a transaction to be flagged as suspicious rather than just receiving a numerical fraud score.
Graph databases represent data on networked objects in a way that closely mirrors reality, opening new doors to supervised and unsupervised machine learning techniques. They also expose the underlying decision-making process in a way that neural networks do not, satisfying the need for explainable ML models. It’s no surprise, then, that businesses are turning to graph to solve deep-link data analysis challenges.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 4.
A host of predictions from the world of information security:
Aaron Zander, Head of IT at HackerOne:
Which industry sector is ripe for a massive breach next year?
Government, healthcare and finance are still very "easy" targets. This isn't going to stop any time soon. 2019 felt like a good year, with more companies really starting to invest in security, but it still seems like a small inflection, not the tipping point. Personally, I'm keeping my eye on DNA databases; we have no idea what the value of DNA data will be, but I know that in our lifetime it will probably become one of our most valuable identifiers, and right now we pay other people to tell us trivial things about our history and give it away for free with no real protections.
How are the massive fines we've seen this year (e.g. BA) going to impact the security decisions businesses make next year?
2020 is the year we will see these fines really pay out. The oil industry has taught me not to believe the initial lawsuit, when the appeals process and stalling tactics can whittle away millions, or billions, in fines. Almost every company that had a major breach in the last year has fully recovered in stock value. I want to see more executive heads roll, more fines, and maybe criminal charges in 2020. Negligence with my data should be considered criminal negligence.
If businesses make just one investment next year, what should it be?
If you don't have a backup and recovery process documented AND tested, do that. Ransomware is still devastating banks and hospitals and governments because they never invested in security and IT, and they didn't invest in backups either.
Do you see companies becoming more or less open to public disclosure in 2020? Why or why not?
I feel like we're trending to being more open to public disclosures: it proves you can do security, and it builds trust. We, as the public, and the media, need to stop punishing people for being secure and being honest. We don't want cover-ups, we should celebrate disclosure.
Which InfoSec news story most caught your attention this year and why?
I'm going to say the CapitalOne disclosure, not because it was the biggest, or the worst, or the most interesting, but because it was a line in the sand, drawn by one of the first financial institutions to say "we care about security and we care about transparency". Vulnerability Disclosure Programs get a lot of flak as a "half measure" from bug bounty hunters, and many companies out there just see them as a place to collect low-signal reports. However, CapitalOne proved that giving the general public a way to securely report a security issue to actual security team members is incredibly significant for the chance that someone finds a medium, high, or critical issue. Furthermore, by disclosing the issue, it positioned itself as a trusted leader in the space.
1. In 2019 we have seen strong growth in multi-cloud adoption, with more than 73% of organisations using two or more cloud providers. Organisations and business units are choosing the best provider for their individual use cases. This will bring added attack surfaces and, with the lack of skills and lack of homogeneity in cloud controls, we will witness more cloud breaches, mainly due to cloud misconfigurations.
2. Cloud providers will continue to push into security, with integrated solutions such as Azure Security Center, AWS Security Hub and GCP Security Command Center. Cloud providers will increase market share with customers that have little legacy architecture, but will continue to struggle with multi-cloud and complex hybrid architectures.
3. Containers and shift-left security will continue their path to widespread adoption. The next phase will be the adoption of security by design through Infrastructure as Code, such as AWS CloudFormation, Azure Resource Manager and GCP Cloud Deployment Manager.
· What will be the top five cybersecurity threats to businesses in 2020? Will ransomware and BEC attacks still be the biggest threats or will any new ones come to light?
o Supply chain attacks are a constantly developing threat. Although, overall, they seem limited to more advanced and determined adversaries, the risk is evolving. What to do when you struggle to catch the big fish? Poison its bait! Target a supplier that has far fewer security controls in place and from that ‘island’ you can jump straight onto your target. From a defensive perspective this is a difficult thing to prevent. The larger the organisation, the harder it is to enforce security and perform business impact assessments for each and every supplier. 2020 might just be the year that gives us more large-scale examples of this threat.
o I think ransomware is a prevalent threat and still something that should be taken seriously in 2020. We see that large organisations are well aware of the risk and taking the necessary precautions. Looking at the number of municipalities, hospitals and small businesses that have fallen prey to ransomware this year, we clearly see a shift towards the public sector and SMEs. As these targets generally have weaker security, chances are that a greater number will fall victim and actually pay the ransom, keeping ransomware very profitable for adversaries. It is worth noting that ransomware still, more often than not, seems to rely mainly on the human element… Which brings us to the next point: phishing.
o Business email compromise, and phishing in general, is ever evolving and will most likely continue to grow in both volume and sophistication. Over the past year we have seen an increase in advanced phishing methods targeting applications secured with two-factor authentication (2FA), and almost all reported phishing websites now appear to use a secure HTTPS connection. Although the adoption of 2FA and HTTPS is a good trend, we see that end users still fall prey to phishing. Hopefully 2020 will also be the year of increased support and adoption for hardware authentication devices.
o In line with phishing, SMS phishing (or smishing) seems to be on the rise. More and more smishing campaigns are being executed by adversaries, most of which take us full circle to where we were ten or so years ago with email: the sender can easily be spoofed, and adversaries rely on the inherent trust users have in this type of message. Most smishing campaigns don’t seem to focus much on the content of the text message: as long as the content puts some pressure on the victim, and the company name used as the sender matches the victim’s profile, they will click. The included hyperlinks often don’t even mask the fact that they point to an illicit webpage: ‘https://resetyouroutpost24password.evilhackerwebsite.com’… right!
· What impact will GDPR have in 2020? Will we see larger fines than those against BA and Marriott?
o Hopefully we will see the effects of GDPR. We seem to have passed the ‘peak of inflated expectations’ (to put it in Gartner terms), where each and every vendor trades on ‘GDPR fear’. In 2020 we will hopefully see realistic fines and proportionate action on violations of GDPR.
· What will be the leading cause of data breaches in 2020?
o The human element will most likely remain the leading cause of data breaches.
· How will the most successful cybercriminals operate in 2020? State-sponsored hacking attacks? As part of cybercrime rings? Lone warriors?
o Cybercrime is constantly growing; with new phishing and ransomware attacks (and associated tools), I expect cyber-criminals to have the biggest impact next year. Looking at the global political situation, nation-state attackers are also likely to make some headlines next year. However, with these actors it might also happen without ever making the news. Only time will tell!
Appsec World of Predictions
1. Websites will continue to be hacked! Some of these hacks will result in hefty GDPR-related fines. Many will likely come through third-party components. Magecart will continue to feature highly in the successful hacks that compromise organisations’ financial data.
2. DevSecOps will continue to gain traction within organisations, both large enterprises and smaller companies. The tools that enable security to be easily baked into the CI/CD process will become more readily available. To further enable DevSecOps, education will receive increased attention as developers are pushed to be both security champions and coders. The need for organisations to have a well-developed, embedded education programme covering the key aspects of secure coding, the OWASP Top 10 and so on will become more apparent with the increased adoption of DevSecOps.
3. Cloud adoption across the entire DevOps SDLC will continue to increase as organisations see benefits in using cloud throughout the software development lifecycle, through ease of use, lower compute costs and other advantages. This continual increase will see cloud becoming a bigger target for threat actors in 2020.
4. Despite the adoption of shift-left and Dev(Sec)Ops, we will still see web breaches being one of the largest reasons attacks are successful. This will be especially true as organisations continue to develop applications quickly to meet ever-changing market demands. Sadly, the OWASP Top 10 is still fairly static in its top issues, and despite the training and education available to help developers improve secure coding, we will still see the same kinds of issues across many applications.
5. To combat the continual breach of applications, and the ever increasing demands on time brought on by DevSecOps practises, organisations will look for a more continuous way to assess critical applications to give them greater visibility of the application throughout the lifecycle, irrespective of where and when it is deployed or updated; continually feeding back into the development backlog for efficient management and handling.
1. Organisations will continue to adopt risk-based prioritisation for vulnerability management and remediation. As pressure increases on organisations to remediate quickly, this approach helps focus efforts on what to remediate and when, moving from “patch everything critical” to “patch the vulnerabilities that pose a true risk to my business”.
2. Predictive risk prioritisation will continue to gain traction as vendors build predictive models to further enhance risk-based prioritisation of vulnerabilities. These models will attempt to guide organisations on which vulnerabilities are likely to be weaponised and used next. Through 2020, organisations will increasingly adopt these types of services to build more effective vulnerability management programmes.
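A risk-based priority queue of the kind described might be sketched like this, with invented findings and a simple multiplicative score standing in for a vendor's predictive model:

```python
# Hypothetical findings: CVSS severity, predicted exploit likelihood
# (what a predictive model would supply), and business asset criticality.
findings = [
    {"id": "CVE-A", "cvss": 9.8, "exploit_likelihood": 0.05, "asset_criticality": 0.2},
    {"id": "CVE-B", "cvss": 7.5, "exploit_likelihood": 0.90, "asset_criticality": 1.0},
    {"id": "CVE-C", "cvss": 9.1, "exploit_likelihood": 0.40, "asset_criticality": 0.5},
]

def risk_score(finding):
    """Blend severity with likelihood and business context, rather than
    patching on severity alone."""
    return (finding["cvss"]
            * finding["exploit_likelihood"]
            * finding["asset_criticality"])

# The remediation queue: highest true risk first, not highest CVSS first.
queue = sorted(findings, key=risk_score, reverse=True)
```

Note how the CVSS 9.8 finding on a low-value, rarely-exploited asset drops to the bottom of the queue, which is exactly the shift from "patch all criticals" to risk-based remediation.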
Jonathan Deveaux, head of enterprise data protection with data security company comforte AG:
New terminology coming:
One term many technology professionals in the U.S. will be hearing a lot is “DSAR”. What is a DSAR? A DSAR is a “Data Subject Access Request”: a request from a consumer to an organization for the details of how their personal data is being used within that organization. Additional requests arising from DSARs could be to delete their data, or to disallow the sale of their data. Technology professionals can look within their organization today and ask how often end users are requesting an ‘audit’ of their data. The question is, could they provide this information if asked today? Get ready for this term, as upcoming data privacy laws (such as the CCPA, going into effect January 1, 2020) may require organizations to respond to DSARs within a certain timeframe.
The Return of PCI DSS:
For the past two years, data privacy regulations and laws have been getting much of the attention in compliance. The Payment Card Industry Data Security Standard (PCI DSS) has been a principal model for the security of payment cardholder data. In 14 years, no organization that was or is 100% PCI DSS compliant has experienced a data breach of its payment card data. Many organizations, however, have difficulty achieving 100% compliance and therefore declare certain compensating data security controls that are in place while they work towards the full PCI DSS requirements. Word is getting out that when PCI DSS v4.0 is finalized towards the end of 2020, the use of compensating controls as a compliance method will no longer be allowed. The PCI Security Standards Council will provide more guidance on this in the coming year.
The convergence of data security technologies:
Companies are subject to various data security and data privacy regulations that demand different approaches to how data should be protected. Until now, the capabilities to meet these different regulatory requirements have only been available by mixing products from different vendors. At the same time, several recent surveys have shown that skills shortages and the complexity of current security solution portfolios are among the top challenges for CISOs. The market is asking for simplification and ease of operations. As a result, the convergence of protection methods like tokenization, format-preserving encryption, and data masking onto single data security platforms will get much attention in 2020.
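Two of the techniques mentioned, tokenization and data masking, can be sketched in a few lines of plain Python. Format-preserving encryption is omitted, as it requires a vetted cryptographic library; the vault and card number here are hypothetical:

```python
import secrets

# In-memory token vault; a real system persists this mapping in a
# separately secured store, which is the whole point of tokenization.
_vault = {}

def tokenize(value):
    """Replace a sensitive value with a random token; the mapping back
    to the real value lives only in the vault."""
    token = "tok_" + secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token):
    """Reversible only for systems with access to the vault."""
    return _vault[token]

def mask_pan(pan):
    """Irreversible masking: keep only the last four digits."""
    return "*" * (len(pan) - 4) + pan[-4:]
```

Tokenization is reversible via the vault (for systems that genuinely need the real value), while masking is one-way and suits display or analytics use cases; a converged platform offers both behind one interface.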
Theresa Lanowitz, head of evangelism and communications at AT&T Cybersecurity:
As healthcare, retail, manufacturing and financial organisations are modernizing their services through digital transformation, faster and more reliable performance is needed to meet growing customer demands. This is where 5G will have the most impact, but it could also result in a wider attack surface due to an increase in endpoints connected to 5G networks. Therefore, the security practices of organisations will have to adapt and make sure their security policies keep pace with the speed and agility of 5G. Those companies not familiar with the multifaceted nature of 5G, edge deployments and the capacity for virtualization should be encouraged to team up with a managed security services provider. In this new epoch, collaboration is a foundation for the ongoing journey to cybersecurity resiliency.
Martin Jartelius, CSO at Outpost24:
We will likely see a few more huge fines, used as examples, but we will also keep seeing incidents happen on a very regular basis without much progress towards improvement. Instead of going after the everyday simple offenders, the main focus has been on those suffering incidents rather than on those guilty of the mismanagement of personal data that made those breaches possible. This is very much in line with what we predicted, and hence the legislation is not addressing the problems the legislators set out to address.
2. What will be the leading cause of data breaches in 2020?
Outdated software and over-permissive user access, combined with the well-known human factor of failing to adhere to hardening and configuration routines.
3. What cybersecurity mistakes will organisations continue to make in 2020?
They will keep looking at the latest and greatest threats, when most breaches happen due to old forgotten systems, outdated software and poor access management that magnifies the consequences when individual users are breached. In short, a misguided focus on what’s “new and cool” rather than a responsible clean-up of the mistakes we are largely already aware of, or could find with a minimum of risk review and assessment.
4. What will be the top five cybersecurity threats to businesses in 2020? Will ransomware and BEC attacks still be the biggest threats or will any new ones come to light?
Ransomware will live on as a threat that is perceived as disastrous. We must just recall that every file encrypted could just as well be a file stolen by a threat actor – ransomware is simply a shortcut to monetising breaches without having to sift through all the data to find what is of value, identify a buyer and monetise it properly. It’s faster and more efficient, which is why it remains such a visible problem.
We can safely assume that espionage is also very frequent, creating competitive disadvantage.
As understanding of cloud infrastructure and communication keeps rising in the hacker community, we will also see more targeted attacks against those solutions.
Further, we will see an increasing number of breaches due to the increased use of, and dependence on, exposed systems such as external source repositories, external ISMS and management platforms, and more.
And lastly, expecting the solutions we buy to be safe, without investigating whether they are, leads to exposure where we perceive the risk as transferred. Responsibility may be transferred, but the risk is not: regardless of whose responsibility it is, you will be the one affected when your data is lost.
5. What cyber-defensive tools will organisations not be able to live without in 2020?
Tools to map new devices, track them over time, identify risks to them and their users, and identify data and its exposure, followed by tools to help prioritise the risks across this now rather enormous attack surface.
Then come the basics: the tools you need to manage access, and to isolate, log, track and monitor. Those are must-haves for defence, but to defend efficiently, knowing what there is to defend is a must, as is being able to spot the holes in one’s layers of defence, especially as both our environments and our protection solutions get increasingly complex.
6. How will the most successful cybercriminals operate in 2020? State-sponsored hacking attacks? As part of cybercrime rings? Lone warriors?
Success is a matter of measuring results against objectives, and all of these threat actors will keep being successful. Of course, the hardest to defend against is the state actor.
We can also be certain that threat actors who simply buy access to exploit kits and crypto-malware, combine them and run those tools will keep being successful from their own perspective. And those are the ones we will see most of.
We will also see the lone wolves, of course, and those who hack out of curiosity, especially the opportunistic ones. SCADA, ICS, cloud and servers, mobile and web applications: all will break, because there are many targets and many attackers. The best we can do is decrease our attack surface and avoid being one of the easy victims through complacency.
Boost your data in three steps.
By Benjamin Ross, Director, Delphix.
Data exists everywhere within an enterprise, including databases, data warehouses, data lakes, CRM tools, and ERP systems. But in order to leverage that information, it is critical to understand how to access, refine, apply, and manage that data to make smarter business decisions.
While it’s more straightforward for companies that started off digitally native to think data first, it is a lot harder for legacy companies that are just starting out on their digital transformation journeys.
One company that is truly excelling in its data strategy is none other than media-services provider and production company Netflix. Unlike its media rivals, which are legacy behemoths, Netflix uses data to drive every decision, from content development to its creative and marketing strategies. The data Netflix collects helps the company glean deep insights into every customer’s preferences and usage habits in real time, empowering it to make almost all strategic and tactical business and operational decisions based on data.
There’s no doubt Netflix is a modern data company in today’s digital era, and its impact is felt across multiple industries. How can companies take a leaf out of Netflix’s data philosophy and begin their journey towards being truly digital?
Step 1: Ensure your data is accessible, easy to discover, and easy to process for everyone
In most organisations, the access and availability of data is controlled and managed through a centralised group of database administrators (DBAs) with no self-service capabilities for practitioners who need data. Data access and availability remains a set of ticket-driven manual processes with excessive workloads being put on the DBAs to do repetitive low-value tasks. But data should be accessible by those who need it, when they need it, and where they need it, in real-time. Companies should be embracing technologies and policies that empower their data practitioners to get the data they need, when they need it.
In addition, most companies lack even basic inventories of their data. The absence of good data classification makes it difficult, or in most cases impossible, to manage and govern data based on its risk-value profile and category. All the data ends up being treated in the same manner, with the same governance overhead and restrictions, which makes it less usable and harder to process than it could be, and of course makes it difficult to find the right data. You simply can’t manage what you don’t understand.
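A minimal sketch of what such a data inventory might look like, with hypothetical datasets, classification labels and a policy table, so that governance follows the risk-value profile instead of one-size-fits-all rules:

```python
# Hypothetical data inventory: each dataset carries a classification and owner.
inventory = [
    {"dataset": "crm_contacts",  "classification": "personal",   "owner": "sales"},
    {"dataset": "weblogs_raw",   "classification": "internal",   "owner": "ops"},
    {"dataset": "card_payments", "classification": "restricted", "owner": "finance"},
]

# Governance controls keyed on classification, not applied uniformly.
policy = {
    "restricted": {"encrypt_at_rest": True,  "self_service": False},
    "personal":   {"encrypt_at_rest": True,  "self_service": True},
    "internal":   {"encrypt_at_rest": False, "self_service": True},
}

def controls_for(dataset_name):
    """Look up a dataset in the inventory and return the controls its
    classification demands."""
    entry = next(d for d in inventory if d["dataset"] == dataset_name)
    return policy[entry["classification"]]
```

Even a table this simple lets low-risk data flow to self-service practitioners while reserving the heavy governance overhead for the data that actually warrants it.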
Step 2: Remember that the longer you take to find the data, the less valuable it becomes
As organisations evolve to newer, modern technology stacks without completely retiring older ones (we’re still waiting for the mainframe to go away, as predicted in the 1990s), or bring in an entirely new set of platforms through acquisitions, data is sprawled across heterogeneous sets of data stores – many of which are not compatible with each other, and there may even be multiple versions of the same database in use in different parts of the organisation. Spreading data in this way creates challenges around data protection and governance, which is a pain point for the business. Multiple data systems storing the same information can lead to expensive data and infrastructure sprawl.
This sprawl of data store technologies means that it takes businesses much longer to find the data, and the sprawl expands further with the addition of the cloud-native, fit-for-purpose data stores that developers are beginning to adopt as they build new cloud-native applications, which have different needs from traditional databases. Spreading the data creates a lack of visibility and expertise, slow manual processes and issues around compliance and regulation.
Step 3: Visualise your data - whether your dataset is large or small, being able to visualise it makes it easier to explain
With the rise of DevOps, monitoring and logging of all software components in your tech stack has become critical. Typically, monitoring tools are used to facilitate feedback between the operations and development teams during software development. If there are consistent errors in certain components, the teams are able to communicate and collaborate to resolve them. In the same spirit, DataOps is emerging as an approach that spans change across process, technology and people to help an organisation become a data company, like Netflix, and address its data-related challenges.
It's important that organisations have a proactive incident response policy like this in place and use it when things go wrong — because they will go wrong. Data never lies, so organisations shouldn't base their decisions on hunches.
Data and visualisation tools, like Splunk, take threats we cannot perceive directly and make them accessible and easier to explain to our human sensory system through correlation, time mapping, and a graphical display that echoes our own visual systems. This means data must become a real-time resource, available at the fingertips of every user in a self-service format they can consume as insights. Driven by this need, we see an overwhelming demand to shift toward a way of working that enables data agility and accessibility.
Data is the winning ingredient
If there’s one thing companies can take away from Netflix’s data philosophy, it is the need to shift towards a data-driven culture.
To drive innovation and stay ahead of the competition, today’s enterprises need to master data. Organisations must take a holistic approach to nurturing a data-driven culture by rethinking their data management and governance processes as well as their data-related technology stacks to succeed in today’s digital-first business landscape.
A foundational change in how data is accessed, managed, secured, and leveraged across the enterprise is essential to dramatically modernising your data strategy while achieving regulatory compliance with today’s new privacy laws and guarding against the growing number of data breaches.
Data is the key to today’s businesses and modern-day software development. Leveraging data to drive innovation, identify new business models and add new value-added services and offerings delivers meaningful value to customers.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 5.
2020 predictions from Michael Glenn, Market Intelligence, Exasol:
People will stop talking about big data, but enterprise data strategy will still be a top priority for enterprises, proven by the growth of the CDO role:
86% of respondents in Exasol’s 2019 Cloud Survey have a Chief Data Officer (CDO) within their company, which solidifies the status of data strategy as a mission-critical initiative in business today. But in 2020 we will see the term ‘big data’ drift away as companies mature beyond this buzzwordy lexicon. Instead they will use use-case-specific terms to frame their data analytics efforts.
For example, instead of saying “we do big data”, they will say “we’re working with customer demographics, credit card statements, transactions and point of sale data, online and mobile transfers and payments, and credit bureau data to discover similarities to define tens of thousands of micro-segmentations in the customer base. We then build ‘next product to purchase’ models that increase sales and customer retention.”
Data warehouse modernisation projects utilising containers will expand rapidly
To date, cloud has primarily been used to build new apps and rehost infrastructure. However, in 2020 we expect enterprises will increasingly leverage cloud to modernise existing business apps, processes and data environments.
In 2020, we expect more data warehouse modernisation programs to be deployed in a containerised hybrid / multi-cloud environment helping organisations become more agile and deliver a more frictionless deployment and management experience. This investment will be driven by the need to speed up data accessibility, improve the timeliness of insights, lessen support and maintenance costs and future proof existing data environments.
A container-based approach allows organisations to reap the benefits of “cloud-native” as quickly as possible in the enterprise. Containers can help these companies manage data in a hybrid cloud setup in ways that other approaches cannot. Moving data to the public cloud can be extremely risky and expensive; one reason is data gravity, the tendency to leave data where it currently resides.
Containers often equate to agility, but they also increase portability. By building out services and data stores within containers, businesses can more easily move them to the public cloud, either all at once or a few at a time as part of a migration strategy. Containers also provide flexibility in terms of maintaining a similar architecture across all your on-premises and cloud applications, while retaining the ability to customise rollouts in geographical regions.
DATA SCIENCE & AI
Widespread adoption of artificial intelligence will appear only in the most advanced firms
In 2020 we will continue to see AI investments gather speed, but for most companies this will only be in narrow use cases that allow them to pick off the low-hanging fruit in their industries. For example, CPG firms are more likely to invest in physical robotics for the factory floor, while telcos will invest in customer-facing virtual agents.
The top performers will look to use AI to generate value more broadly across business lines and functions. For example, sentiment analysis can be used not only to gain a deep understanding of customer complaints, but also to inform marketing content and micro-segmentation for sophisticated sales strategies. Shared sentiment around an issue will stand alongside spending patterns to determine next-to-buy models and deep marketing personalisation.
A barrier to broad adoption of AI is a lack of training data. For large tech firms like Google, Apple and Amazon, gathering data is far less arduous than it is for most companies. Because of the breadth and depth of their products and services, they have a near-endless supply of diverse data streams, creating the perfect environment for their data scientists to train their algorithms. For smaller companies, access to comparable datasets is limited or simply too expensive.
In 2020 we will see this demand for data satisfied by a growing availability of synthetic datasets, allowing less advanced or smaller companies to make meaningful strides in their AI journey. Synthetic data is data that is generated programmatically: for example, realistic images of objects in arbitrary scenes rendered using video game engines, or audio generated by a speech synthesis model from known text. The two most common strategies for synthetic data usage we will see are:
1. Taking observations from real statistical distributions and reproducing fake data according to these patterns.
2. Creating a model to explain observed behaviour, then generating random data from this model. This strategy is useful for understanding the effect that interactions between agents have on the system as a whole.
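The first strategy can be sketched in a few lines of Python. The “real” observations below are invented purely for illustration, and a normal distribution is assumed for simplicity; real pipelines would fit richer, multivariate distributions.

```python
import random
import statistics

# A small sample of "real" observations (say, transaction amounts);
# the values are invented purely for this illustration.
real = [23.5, 31.0, 27.2, 29.8, 25.1, 33.4, 26.7, 30.2]

# Strategy 1, step one: estimate the parameters of a simple
# distribution from the real data.
mu = statistics.mean(real)      # estimated mean
sigma = statistics.stdev(real)  # estimated standard deviation

# Step two: reproduce fake data according to that pattern.
random.seed(42)  # fixed seed so the example is reproducible
synthetic = [random.gauss(mu, sigma) for _ in range(1000)]
```

The synthetic sample can be made as large as the training task requires, while carrying no record that maps back to a real individual — which is precisely why it appeals to companies without Google-scale data streams.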
Companies that considered their data storage capacities to be minimal will come to the realisation that they need a sophisticated solution to house their synthetic data if they are to compete on the hard-hitting elements of machine learning.
I think 2020 may be the year when the realisation finally dawns that ‘cloud’ is not a single concept. It means different things to different people, and making the decision ‘we need to move services to cloud’ is just the first step in what can be a long journey. It may include private cloud as an intermediate step en route to most organisations’ ultimate goal, which is a small number of integrated SaaS services that deliver everything their organisation needs. Azure and AWS offer over 600 services between them. Organisations looking to move services to cloud need to define exactly what they want to do so that they can find the most appropriate solution – and understand that this may not actually be cloud at all. While cloud is pretty good for most applications, it’s not the best answer to everything and other options are still valid.
I hope they will also realise that cloud almost certainly won’t save them money unless they fundamentally reengineer their organisation. While it takes away in-house IT operations, it creates the need for new skills, such as billing management and managing one or more cloud providers to ensure they deliver the agreed service.
Second, whilst the IoT genuinely provides new capabilities, I believe that the hype around edge computing will die down as people realise that it is simply on-premise computing re-imagined – the next step in the regular waves of centralisation and decentralisation which have characterised IT over the last 40 years. One moment we all think that the best place for intelligence in the network is at the edge, and then technology changes and the most logical place for that intelligence becomes the centre instead. The growth of edge computing certainly doesn’t mean that cloud is dying. Each organisation will need to consider its own use case and choose the most appropriate solution, depending on how much real time processing they need.
Looking into my crystal ball, I also think that data generation, management and storage will continue to be a major issue for most organisations, both in the cloud and on-premises. Data is still growing exponentially; the majority is now machine generated (data about data), and it’s very rarely, if ever, deleted. But governance, risk and compliance will still be the number one concern of most CIOs next year.
Due to the compelling cost savings available, I believe SD-WAN should replace private MPLS services for most organisations within the next three to five years.
Finally, I think organisations will continue to look for reasons to adopt AI, but as with ERP/cloud/[insert next ’paradigm shift’ here], they need to realise that implementing AI in their organisation is a business process and culture issue, not a technology issue. The technology doesn’t care…
Greenlight Commerce’s Managing Director, Kevin Murray, offers the following observations:
“This year has been one of turbulence and unrest for the UK, with the prospect of Brexit bringing about great uncertainty for British businesses. Within the retail industry specifically, Brexit has created a stagnant environment, in which UK businesses have been focusing on ‘business as usual’ operations, rather than making any significant organisational changes.
Last year we predicted that 2019 would be a year of going back to basics within the UK retail landscape, and we predict that the coming year will largely be the same. With the delay of Brexit, and the impending uncertainty around any agreements with the EU, it is unclear what the outcome will be for trading deals and regulations. As such, this period of plateau will continue throughout the next few months for many retailers.
The recent announcement that retailers’ stock levels have risen to their highest point relative to the volume of expected sales is just one example of the actions UK retailers are taking out of fear of a no-deal Brexit. Confusion around what will happen after Brexit has led many businesses to begin stockpiling goods.
Once the retail industry has a clear roadmap of what is to come then there will be a massive need to adapt to these changes. Businesses will need to adapt quickly if they want to remain on top. Investing in intelligent systems and the customer experience will be imperative for the success of a business to continue, and we think the leaders within the retail world will prioritise this.
Smaller businesses can usually move more quickly than their larger counterparts and should use this to their advantage to quickly establish themselves online and invest in customer engagement, in order to keep up to date with the competition.
Large businesses will also have more scope and confidence after Brexit, whereas smaller businesses will be fearful. Because large businesses are already established, the suppliers who provide them with items to trade will not falter and will want to continue working with them.
Large companies will also remain more stable as they have the confidence to broker trading deals with countries after Brexit that won’t affect their sales or cost them sums that could eventually bankrupt their brand. Take Dyson: they know they are a luxury product and that their items will be purchased with or without Brexit, so they can bargain and get a deal that is best for their brand. We can only hope that this coming year will provide some clarity for the retail industry, and that retailers can get themselves back on track and able to live up to customers’ standards, even after the Brexit result.”
Six industry experts share their vision for healthcare IT in 2020
Advanced technologies have had a significant impact on the development of the healthcare industry. Artificial Intelligence (AI) and Machine Learning (ML) in particular have enabled significant breakthroughs in life science and healthcare research and treatments, whether that’s automating critical but repetitive tasks to free up time for clinicians, automatic speech recognition for faster disease diagnosis, or the ability to create synthetic controls for clinical trials.
But with 75 percent of healthcare enterprises planning to execute an AI strategy next year, there’s a far greater opportunity around the corner to further unleash its potential. Here, six experts from leading healthcare organisations including Brainomix, AiCure, HeartFlow, Cambridge Cognition, Oxford Brain Diagnostics and Zebra Medical Vision share their views on what 2020 holds for the industry.
“As highlighted earlier this year, the NHS aims to become a world leader in AI and machine learning in the next five years. In 2020, we expect to see this become more apparent in practical terms, with AI technologies becoming the predominant driving force behind imaging diagnostics.
With around 780,000 people suffering a stroke each year in Europe, and 7.4 million people living with heart and circulatory diseases in the UK, it is imperative we find ways to reduce the burden on healthcare organisations and improve time to disease detection. The number of MRI and CT scans, for example, is already on the rise, and AI has the ability to read scans as accurately as an expert physician. Utilising these new technologies to review scans for any disease can reduce patient wait times and ease the burden on medical staff. There will be greater recognition next year of the value of AI in augmenting human performance.”
“The greatest challenges in deploying AI solutions in healthcare vary widely by application. In 2020 (and beyond), it comes down to ensuring that back-end processes gain greater efficiencies. From an administrative standpoint, making it easier for AI to integrate with existing technology infrastructure will certainly help adoption. From a societal standpoint, building greater trust in AI and protecting personal healthcare data will continue to be among the omnipresent challenges.
Within the clinical trials industry specifically, we can expect to see a number of key challenges in 2020 which technology - including AI - will help address.
Once identified and recruited, one of the biggest challenges in clinical trials is keeping subjects engaged and adherent to treatment. Medication non-adherence has been shown to increase variance, lower study power and reduce the magnitude of treatment effects. AI will play a critical role in understanding how a drug is performing in real time and how patients are responding in clinical research, including their medication adherence and behaviour.
The adoption of new technologies in 2020 and beyond has the potential to provide clinicians with improvements in overall patient engagement, outcomes, quality of life and practicality of use, and to reduce clinical development time and associated costs.”
“For me, 2020 will accelerate the development of the digital healthcare industry: a hybrid sector where medicine and cutting-edge technology converge to propel patient care forward. We’re starting to see more interest and investment in this fascinating field. It’s an exciting time to be leading a company like HeartFlow, which is truly bilingual in healthcare and technology. Right now, we’re able to use medical imaging and AI to give physicians unprecedented insight into potentially life-threatening restrictions on blood flow within the body. But we’ve only just scratched the surface of what integration between information technology, computers and healthcare can achieve, and the expectations are high. I look forward to seeing how these challenges are met in the year ahead.”
Digital biomarkers are the new frontier. The upward trajectory of digital capabilities over the last decade, combined with the widespread adoption of devices, has augmented biological markers with digital measures of disease progression.
In our field, it is now possible to use AI to enrich cognitive test scores with metrics that indicate cognitive effort, such as the unique features of a patient’s voice that reveal when they are finding it particularly challenging to perform a task. Patients who are ostensibly performing within normal ranges but struggling to maintain that performance are likely suffering from the early stages of decline and could benefit from interventions that might slow or prevent further neurodegeneration.
Over the next year, we expect to see improvements in the precision of digital biomarkers for rapidly detecting neurodegenerative conditions such as Alzheimer’s disease. The ultimate goal is to integrate digital biomarkers into clinical care and improve patient outcomes.
“Dementia remains highly complex in nature, and tackling it requires extensive collaboration. Urgent action to address these challenges is needed today: by 2050, 152 million people will have the disease globally.
Unlocking new biomarkers, leveraging smarter science and deploying funds where they are needed most may give the industry a chance to defeat this terrible condition. We must re-focus our efforts and move quickly now towards examining the disease much earlier, allowing novel biomarkers to measure the progression more accurately and develop specific and targeted drug treatments for the range of dementias that exist.
National level support to develop more holistic brain health and screening programmes will demystify the brain, rationalise the fear of dementia, and ensure patients and families have the opportunity to embrace interventions in clinical trials earlier in their lives.”
“With two billion people joining the middle class, an ageing population and a growing shortage of medical experts, AI will be critical in enabling communities to provide productive and consistent health services. From medical imaging analysis to sensors and smart alerts, we are going to witness increasingly personalised care.
In 2020 we will see AI deployed across hundreds of health networks globally, impacting millions of patients’ lives. AI has the power to transform patient care and empower radiologists to help with patient diagnosis. Our mission is to teach the Zebra software how to automatically interpret and formulate insights from medical images. A single AI solution that integrates seamlessly into existing workflows at an affordable rate will support radiologists in delivering better patient care. Our platform allows healthcare institutions such as Intermountain Health, the University of Virginia and Apollo Hospitals to identify patients at risk of disease and initiate preventative treatment pathways.”
AR has moved from the gaming room to the boardroom and the factory floor, and is set to revolutionise the way we work within the next five years.
By Robert Hoyle Brown, Vice President, Cognizant’s Centre for the Future of Work.
While augmented reality (AR) is primarily seen as a consumer technology, it is fast earning a seat at the business table through its ability to drive more efficient operations by bridging the gap between customer expectations and services.
AR adoption is growing rapidly: research from Cognizant’s Centre for the Future of Work found that nearly half (49 per cent) of senior executives believe AR will become a mature technology that is accepted, established, and in widespread use within the next 18 months to three years. Additionally, one-third of the 300 respondents said they have already fully implemented AR initiatives and are capturing value as a result.
In fact, according to Gartner, 25 per cent of organisations will have deployed AR to production by 2020. It also estimates that the technology will create 2.3 million jobs during 2020.
With emerging technologies, it is often the case that companies will look for evidence of their success in the consumer market first. While the release of the Google Glass AR headset in 2013 did not lead to widespread adoption, many would argue that it was too far ahead of its time. Apple’s rumoured move to release its first pair of AR smart glasses in early 2020, together with Adobe’s recent Aero AR app launch and the success of Sephora’s Visual Artist AR tool, are signs that the global market could finally be ready for the technology.
What would a world with AR-enhanced enterprises look like and what are the remaining barriers to getting there?
Delivering process transformation
Soon, AR could reshape the way we work thanks to its ability to meld people, places, and time into one immersive experience. Creating these experiences will drive employee engagement, productivity, and efficiency, which are the keys to success in the face of increased competition.
Where technology has digitised previously repetitive manual and paper-based work processes, AR will take the next step by rewiring business process journeys to allow information to be exchanged while on the move. It will remove “look away” activities that involve checking information and toggling between multiple documents or screens. For example, factory workers can use AR to display construction schematics, assembly or repair instructions in front of their eyes rather than having to check and recheck a static document.
Furthermore, 82 per cent of respondents surveyed by Cognizant believe that redesigning business processes is the most notable benefit of AR, empowering employees to take a more analytical approach to work and driving time efficiencies. One of the main benefits of AR is that it enables employees to work more quickly, and the rollout of 5G will drive further AR adoption. The improved network latency will reduce the slow data speeds that can lead to non-fluid AR experiences, which will be integral to creating connectivity across the business.
Concerns and barriers are not showstoppers
Given the current talent shortage in many roles involving digital expertise, companies may be concerned about their ability to acquire the necessary people to make AR a success. Eighty-five per cent of respondents identified user experience and user interface as the most necessary skill. However, demand for user experience and user interface designers, especially people experienced in the field, is already high even though AR is still in its infancy. There is also a need for advances in natural language processing, which is expected to drive adoption, as voice is an easier format than hand gestures for many.
The value of AR within businesses is also becoming clearer. According to 62 per cent of the senior executive respondents in our survey, brand reputation is expected to be the top qualitative benefit of AR adoption. Companies understand the need for their brand to stay up-to-date with emerging consumer demographics and see AR as a way of achieving this.
When faced with the need to manage the perception that AR is a technology reserved for the consumer market, it seems as though its success in the consumer setting is driving more interest from businesses. For 65 per cent of respondents, the “fun and games” aspect of AR is serious business. They believe gaming engines such as Unity or Epic’s Unreal Engine will be the external suppliers of choice for AR capabilities. Dismissed until recently as “just a game thing”, these engines offer massive computing power and beautiful image rendering that will transpose best practices from the immersive world of gaming to catalyse mission-critical enterprise processes as fit-for-purpose in an AR-enabled world.
AR coming to an enterprise near you in 2020
AR has the potential to revolutionise the way we work. Consumer familiarity with the concept of AR on a mobile phone (think: Snapchat filters and Pokemon GO!), along with its benefits, will make its transition into the workplace much smoother. Organisations can use knowledge around acceptance in this space to understand how AR will create more personalised interactions, augmenting the business experience in 2020 and beyond. Like the advent of the smartphone over a decade ago, this moment requires all of us to think differently because, with AR, the journey becomes the process – and the process becomes the journey.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 6.
Who will be the winners and losers in data protection and compliance in 2020?
asks Frank Krieger, VP, governance, risk and compliance, iland:
At this time of year, it is customary to look ahead and see what the future holds. As 2020 looms on the horizon, it is safe to say that there is plenty of noise in the security, data privacy and compliance arenas. But when we dial it down, there are other key trends that will influence the year ahead. For some professionals and industries, 2020 has the potential to be extremely positive. For others, not so much. Here’s my take on the winners, losers and growth areas for next year.
The winners: Data Protection Officers in high demand
The end of the decade has seen regulators get serious about data privacy. The implementation of the California Consumer Privacy Act (CCPA) on 1 January 2020 puts legislation in place on both sides of the Atlantic - requiring businesses to take a global approach to data protection. Between CCPA and GDPR, demand will skyrocket for skilled DPOs who can devise effective programmes that satisfy the legislation as it is written and account for the different ways local authorities will interpret it.
The profile of DPOs has risen as data privacy has become a board-level issue. Rising salaries have tempted professionals from the legal and governance sectors to upskill in this area, but there is still a talent shortage that will cause salaries to continue on an upward trend.
A related issue that will trouble global organisations headquartered in the UK is Brexit. In the event of no-deal they will need to ensure they have a DPO resident in the EU to meet their compliance obligations.
The losers: unprepared businesses that get caught in the middle of a battle between cyber criminals and cyber insurers
Businesses that have been relying on cyber insurance pay-outs to get out of trouble when ransomware strikes are in for a tough time in 2020. Cyber criminals have become much smarter, targeting organisations that have delayed investing in effective security, backup and disaster recovery.
These organizations are seen as easy targets to breach, especially if they are relying on cyber insurance as protection against ransom payments. However, time is running out for this risky strategy.
Cyber insurers, following an unwelcome spike in pay-outs, are tightening the criteria under which they will issue cyber insurance policies. Companies that can’t prove that they have robust security and backup processes to mitigate ransomware attacks will find it increasingly expensive and difficult, if not impossible, to gain cyber insurance coverage.
This is a major evolution as cyber insurers refuse to continue paying the price for poor client preparedness. As a result, the business case for investment in effective security, backup and disaster recovery will become compelling, even for organisations with smaller budgets, because one thing is certain: ransomware attacks will not let up any time soon.
The growth area: demand for audits and certifications will continue to increase
Related to the point above is a growing demand for audits and certifications in the coming year as businesses must prove that they have their security practices in order to satisfy insurers, investors and board members. We will see more organisations needing ISO/IEC 27001 and SOC 2 certifications and relevant industry standards. The professional audit and compliance sector will grow as a result and businesses that don’t have their own compliance staff may start employing compliance-as-a-service consultants to meet obligations.
The next decade will see businesses adapting to the new normal of a tough cyber threat environment and stricter data privacy regulations. There will be winners and losers, but I’d hope to see more companies putting measures in place that reduce their vulnerability to ransomware and improve their compliance position at the same time.
2020 predictions from Steve Haighway, COO Europe, IPsoft:
The hybrid workforce
“By 2025, I predict that the desk-based workforce will be 50:50 human and digital workers – and over the coming few years we’re going to see rapid adoption of AI-powered digital workers as we create this hybrid workplace.
“Digital workers are already transforming the way that we interact with technology at work, for example, with Allstate using Amelia as a whisper agent to support customer service representatives in finding the right information to respond to customer queries. And their capabilities are only going to become more impressive over the coming year, with businesses using AI to augment an increasing number of roles, and independently manage and execute ever-more complex and emotionally human-like tasks.
“In 2020, expect to see digital workers acting as your Mortgage Advisor or as a Recruitment Specialist in your office. These digital workers will take on time and information intensive tasks, improving the speed and experience of customer interactions, while reducing the administrative burden on employees.”
Humanising Virtual Assistants
“In 2020, businesses will recognise that looks do count. While consumers are used to the disembodied voices of Siri, Alexa and Cortana, next year we’re going to see a big shift towards using human-like avatars for virtual assistants.
“Many companies will be driven by the need to develop even stronger bonds with their customers as traditional brand loyalty wanes. Ultimately, this plays into our natural desire to care about those we interact with daily, person or machine. Companies want us to enjoy the experience of their virtual assistants, form an attachment with them, and return.
“Humanising virtual assistants works not only because we, as humans, like to connect with other human beings, but also because we find it easier to emotionally disengage from things that differ greatly from us or are unseen. That’s why we’re more likely to want to engage fully with an avatar that feels familiar to us and displays the emotional and empathetic reactions that are key to human conversation.
“Of course, it’s not a silver bullet: slapping an impressive avatar on an RPA solution won’t solve the limited user experience that ultimately frustrates consumers. However, early adopters will find that a sophisticated avatar fronting leading-edge AI assistants equipped with emotional intelligence will boost customer satisfaction, improve brand loyalty and help re-establish personable customer service in a digital era.”
RPA will reveal itself as the new dot-com bubble
“Looking back at 2019, I’d call it the year of the RPA bubble. The RPA market has been rapidly growing over the past couple of years, with analysts at Everest Group predicting it will exceed $2.9bn by 2021, reflecting the staggering valuations of some of its key players like UiPath, which was valued at $6.4bn earlier this year.
“However, in 2020, I expect this bubble will burst. RPA, while good at automating processes, is limited and will never drive genuine business transformation. This is reminiscent of the dot-com boom, where the firms that listed on the stock exchange did little more than consume vast amounts of investor cash despite showing little prospect of achieving profit. The current expectations for the RPA market are similarly unrealistic – traditional metrics of performance are once again being overlooked and big spending is being seen as a sign of rapid progress.
“The true opportunities for driving return on AI investment will be achieved where systems are able to understand context, intent and natural language, and use this knowledge to interact with users as well as execute on their requests. Anything short of this will eventually have to be ripped out and replaced, as its reliance on keyword-driven instructions and limited, static decision trees becomes a roadblock to business transformation.”
“While robotic process automation (RPA) has been the hyped technology of 2019, its potential is ultimately limited and it will never drive genuine business transformation. With enterprises increasingly recognising its finite opportunity, next year we are likely to see a large uptake in hyperautomation: a powerful blend of RPA, intelligent business management software and artificial intelligence (AI) used to automate processes in a way that is significantly more impactful than standalone automation technologies.
“Hyperautomation has the ability to revolutionise a company’s entire ecosystem. The sophisticated combination of disconnected technologies massively expands an organisation’s automation capability. This enables businesses to easily automate mundane, repetitive work, so that employees can focus on tasks that are both valuable for the business and fulfilling for the worker.
“However, the most successful enterprise implementations of hyperautomation will be where solutions are built to mirror human intelligence: combining digital emotional intelligence with natural language understanding will be key to making automation a natural part of our working lives.”
HR in 2020
“One of the greatest changes we’ll see in HR in 2020 will be the vast range of administrative tasks which will become automated. This is good news: lightening this burden means that HR managers will be empowered to work more strategically and drive more value back into the business and internal stakeholders.
“For example, the workflow of both the employee and HR manager when arranging leave is unnecessarily fiddly and time-consuming. This process – like many admin-based tasks – can be simplified by introducing an AI-powered digital colleague. Interacting directly with a digital colleague, like Amelia, the employee will be able to directly state “please create a holiday request for August 19 through August 21”. The necessary information to make the decision is automatically sourced and the request is then passed on to the HR manager who can simply speak or type the response of “approved” or “rejected,” which is registered by the digital colleague who then notifies the employee. This can all be done without logging into HR programmes, with the human-like interface delivering a better user experience to the employee and providing HR managers with the information they need to quickly respond to requests.
“Benefits enrolment or changes, training requests, timesheet submissions, leave of absence requests… all of these processes will increasingly be managed by digital colleagues in 2020, which in turn will result in significantly reducing the administrative burden on HR managers.”
As we go into 2020, the rise of the ethical consumer is set to shake up traditional business models. ‘The Greta effect’ has put environmental concerns at the forefront of many people’s spending habits, not just for food or fashion, but for technology too. At the same time, rising public awareness of issues like e-waste and the environmental cost of production have sparked another less expected trend – consumer fatigue with technology consumption overall.
Our appetite for the latest tech is clearly waning. Sales of the Apple iPhone were down 9% in the last quarter and crowds are failing to flock to launch events in the way that they used to. In part, this is because tech giants like Google, Microsoft and Apple are not showing the same level of innovation with each launch, and instead seem content to keep the pace of change steady and their products’ lifespan short.
However, the rise of the responsible consumer is likely to change all this, as the shift will force many manufacturers to rethink their business models. After all, it’s no longer guaranteed that advertising spending sprees will boost sales in the way they have done previously, which means that big tech will have to prepare for people buying less.
Millennials are driving this change – their spending habits are more likely to be influenced by Marie Kondo than advertising or promotions. Yet, this does not detract from their reliance on technology now and in the future. Consumers will continue to buy and upgrade, but at a slower pace and with higher expectations. As a result, manufacturers will have to adapt to this shift by forging genuine ethical profiles.
Research by Klyk Tech estimates that extending the life of smartphones by just one year in London alone could save as much CO2 as taking 55,000 cars off the road. In addition, a recent UN report found that the world currently produces as much as 50 million tonnes of electrical waste (e-waste) every year. Against this backdrop, it’s clear that the way forward must focus on extending product life and using recyclable material.
While zero-carbon is simply not possible when it comes to tech production, companies making moves to reduce their footprint should be trumpeted. Apple, for example, is now committed to using 100% renewable energy during production and used 100% recycled materials for its new iPhone Pro. However, this clearly hasn’t gone far enough – changes in the design of the new model mean that its carbon profile is 14% higher than last year’s iPhone XS. In addition, Apple continues to bring out products that are not eco-friendly; its recent AirPods Pro launch is a prime example, as the product is so difficult to recycle.
Meanwhile, new kids on the block like Fairphone, an eco-friendly, fair-trade phone producer, continue to make waves in the market. Whilst companies like these may lack the muscle of industry giants, they nonetheless offer a compelling proposition and are likely to play an influential role in shaping consumers’ expectations in the coming years.
For all these reasons, 2020 may well herald the beginning of the end for fast tech. Ultimately, this is good news, as it will not only positively impact the environment, but also shake up any complacent players in the market.
Hardware innovation that delivers on ethical expectations needs to be at the heart of this change, but the responsibility shouldn’t lie with manufacturers alone. Consumers must play their part too, by recycling used tech or selling their unwanted devices through online platforms. With both sides changing their habits and expectations together, 2020 is set to be an interesting year for tech.”
Making the most of data – why it’s key for public sector organisations, according to Nick Smee, CEO, Yotta:
“As we look ahead to 2020, there is a growing recognition that data is the most valuable resource public sector organisations have at their disposal. The advance of IoT is leading to an escalation in the volume of accessible data sources.
Collecting data is not enough, however. Providers also need to filter it to deliver intelligence their customers can act on. Data visualisation will continue to be key, therefore, as it can extract granular detail from the data while ensuring it makes sense to every stakeholder – no matter their expertise.
There are still challenges surrounding infrastructure of course – particularly when upgrading it to ensure data can get to the right places. But we are seeing a step change in the way public sector organisations, including councils, approach digital technologies. Many councils are on a journey to digital transformation – and typically it is driven by a realisation that they have to become more customer-centric. That’s where connected asset management technology can play a part as an enabler to the uptake of digital systems. Councils have to invest but they end up increasing customer satisfaction as a direct result.
But while software can open up opportunities, it can’t ensure organisations capitalise on them. Challenges will continue throughout 2020. One of the biggest is cultural: the siloed management of data sources and asset classes. For example, a highways department is tasked with managing a myriad of assets, many of which will be connected, yet too often the data for each class of asset is viewed in isolation and valuable opportunities for better asset management are lost. More silos will need to be broken down and data shared if digital transformation is to be achieved.
Throughout 2020 and beyond, organisations will also need to be cognisant that more of their public is smartphone-enabled. Councils increasingly need to be providing instant messaging communications and apps for all their services. Any council that does not is missing out on a great opportunity to make its citizens happier and failing to improve efficiencies for itself.”
Widespread use and support mean iSCSI will still be widely deployed for at least the next few years, but its growth prospects beyond that are very unclear.
By John F. Kim, Chair, SNIA Networking Storage Forum.
What is iSCSI?
iSCSI is a block storage protocol for storage networking. It stands for “Internet Small Computer Systems Interface” and runs the very common SCSI storage protocol across a network connection which is usually Ethernet with TCP. You can read more about iSCSI at the “What Is” section of the SNIA website. (The SCSI storage protocol is also used to access block storage as part of the SAS, SRP, and FCP protocols, which run over various physical connections.)
Originally, the SCSI protocol was used only for local storage, meaning individual disk drives or direct-attached storage (DAS). Then around 1993, Fibre Channel came along and enabled SCSI to run, via the Fibre Channel Protocol (FCP), on top of a Fibre Channel Storage Area Network (FC-SAN). iSCSI was submitted as a standard in 2000 and grew in popularity as it was supported by more operating systems, first requiring dedicated iSCSI HBAs but later using a software iSCSI initiator that ran on top of any type of Ethernet NIC.
The dedicated iSCSI HBAs gave iSCSI faster performance that was closer to Fibre Channel performance at the time, while the software iSCSI initiator made it easy to use iSCSI from many servers without buying special HBAs for each server. Probably the biggest boost to iSCSI adoption was when Microsoft Windows Server 2008 included a software iSCSI initiator (starting in 2008, of course).
iSCSI and Related Technology Milestones
Fibre Channel work; ANSI approval of the FC standard in 1994
Arrival of first Fibre Channel products, carrying SCSI over FC
First 1G FC products
iSCSI technology developed
iSCSI standard submitted for approval
First 2G FC products
Solaris, Windows, NetWare, and HP-UX add iSCSI support
First iSCSI HBA (1GbE)
First 10GbE NIC (10GbE shipments didn’t really take off until 2010)
First 4G and 8G FC products
iSCSI Extensions for RDMA (iSER) standard; VMware adds iSCSI support
FreeBSD, MacOS, and OpenBSD add iSCSI
10G Ethernet high-volume shipments begin
NVMe 1.0 standard released; First 16G FC availability
iSER added to Linux targets TGT (2008), LIO (2013) and SCST (2014)
Availability of 25G and 100G Ethernet products
NVMe-oF 1.0 standard released; first 32G FC availability
VMware ESXi previews iSER (GA in 2018); Linux kernel adds NVMe-oF
NVMe-oF able to run on TCP (in addition to RDMA and Fibre Channel)
First shipment of 200G Ethernet products (and 400G Ethernet switches)
iSCSI Use in the Enterprise
In the enterprise, iSCSI has been used mostly for so-called “secondary” block storage, meaning storage for applications that are important but not mission-critical, and storage that must deliver good—but not great—performance. Generally, the most critical applications needing the fastest storage performance used FC-SAN, which ran on a physically separate storage network. FC speeds stayed ahead of iSCSI speeds until 2011, when 10GbE reached high volumes in servers and storage arrays, equaling 8GFC performance. Starting in 2016, Ethernet (and iSCSI) speeds pulled ahead as 25G and 100G Ethernet adoption far outpaced 32GFC adoption.
The fact that iSCSI runs on Ethernet and can be deployed without specialized hardware has made it very popular in clouds and cloud storage, so its usage has blossomed with the growth of cloud. Today, iSCSI is the most popular way to run the SCSI protocol over Ethernet networks. The rapid growth of faster Ethernet speeds such as 25G, 50G and 100G (replacing 1G, 10G and 40G Ethernet), along with increasing support for congestion management and traffic QoS on Ethernet switches, have greatly improved the performance, reliability, and predictability of iSCSI as a storage protocol.
Other Storage Protocols Threaten iSCSI
However, the emergence of NVMe over Fabrics™ (NVMe-oF) now threatens to displace iSCSI for high-performance block storage access to flash storage. Simultaneously, the growing use of file and object storage poses a threat both to iSCSI and to FC-SAN.
NVMe-oF is more efficient than iSCSI (and NVMe is more efficient than SCSI). It was designed as a leaner protocol for solid state (flash or other non-volatile memory) so it eliminates the SCSI layer from the protocol stack and delivers lower latency than iSCSI. Note that NVMe-oF can run over Ethernet TCP, Ethernet RDMA, Fibre Channel, or InfiniBand fabrics, with the RDMA options delivering the lowest latency, but all versions of NVMe-oF (including on FC or TCP) deliver faster performance than iSCSI on the same-speed connection. So now the fastest flash arrays and fastest applications (on Linux) are moving to NVMe-oF.
iSCSI Advantage: Broad Support
Probably the biggest advantage iSCSI still holds today is that it’s widely supported by all major operating systems and hypervisors. NVMe-oF is currently only fully supported on Linux, and perhaps only one-third of enterprise storage arrays today support NVMe-oF on the “front end,” meaning from server to storage (and a few of those support NVMe-oF only on Fibre Channel, not yet on Ethernet). However, VMware has announced plans to support an NVMe-oF initiator and another vendor has independently developed a Windows Server NVMe-oF initiator. In addition, some specialized SmartNICs are able to take NVMe-oF storage and make it look like a local NVMe SSD, meaning it can be used by nearly any OS or hypervisor. (While only Linux fully supports an NVMe-oF initiator today, nearly every modern OS and hypervisor does support local NVMe SSDs.)
iSCSI Advantage: Hardware acceleration options
iSCSI runs most commonly on top of Ethernet using the TCP protocol, but can also run on top of InfiniBand. It can run on standard network cards or on specialized Host Bus Adapters (HBAs) that take advantage of either RDMA (using iSER) or an iSCSI hardware offload and/or a TCP Offload Engine (TOE). iSCSI is still supported by almost all enterprise storage arrays. With an iSCSI hardware offload, the adapter (HBA) offloads the iSCSI initiator function from the server CPU; with a TOE, the adapter offloads the TCP processing from the server kernel and CPU. Use of a TOE has fallen out of favor in some circles due to limitations that arise from handling all TCP tasks in hardware, but other forms of stateless TCP offload are still popular, can be used to improve iSCSI performance, and are supported by most enterprise Ethernet adapters.
But… NVMe-oF can also be accelerated in the network adapters, can also use RDMA, and can run over a wider variety of networks than iSCSI.
iSCSI Limitation – Block access only
iSCSI can only support block storage access, but file and object storage capacity are growing more rapidly than block storage capacity because so much of the new content today—audio, video, photos, log files, documents, AI/ML data, etc.—is more easily stored and used as files or objects instead of blocks. File and object storage enable easier sharing of data across multiple users and applications than block storage.
If use of file and object storage continues to grow faster than use of block storage, it could limit the growth rates of all block storage, including iSCSI, Fibre Channel, and NVMe-oF.
iSCSI – An Uncertain Future
On the one hand, iSCSI use is being driven by the growth of cloud deployments that need block storage on Ethernet. On the other hand, it’s being displaced by NVMe-oF in areas that need the fastest performance, and also challenged by file and object for the storage of multi-media content, big data, and AI/ML projects. Widespread use and support—and the current OS limitations of NVMe-oF—mean iSCSI will still be widely deployed for at least the next few years, but its growth prospects beyond that are very unclear.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 6.
2020: Five Artificial Intelligence Trends for Engineers and Scientists, by Jos Martin, Senior Engineering Manager, MathWorks:
2020 will be the year of the “AI-Driven System” with rapid growth of AI in industrial applications. What are the five trends that will help make AI more pervasive in 2020?
1. Workforce skills and data quality barriers start to abate
As AI becomes more prevalent in industry, more engineers and scientists – not just data scientists – will work on AI projects. They now have access to existing deep learning models and accessible research from the community, which gives them a significant advantage over starting from scratch. While many initial AI models were focused on image analysis, many now also incorporate more sensor data, including time-series data, text and radar.
Engineers and scientists will greatly influence the success of a project because of their inherent knowledge of the data, which is an advantage over data scientists not as familiar with the domain area. With tools such as automated labelling, they can use their domain knowledge to rapidly curate large, high-quality datasets. The greater the availability of high-quality data, the more accurate the AI model is likely to be, and therefore the greater the likelihood of success.
2. The rise of AI-driven systems increases design complexity
As AI is trained to work with more sensor types (IMUs, Lidar, Radar, etc.), engineers are driving AI into a wide range of systems, including autonomous vehicles, aircraft engines, industrial plants, and wind turbines. These are complex, multidomain systems where behaviour of the AI model has a substantial impact on the overall system performance. In this world, developing an AI model is not the finish line, it is merely a step along the way.
Designers are looking to model-based design tools for simulation, integration, and continuous testing of these AI-driven systems. Simulation enables designers to understand how the AI interacts with the rest of the system. Integration allows designers to try design ideas within a complete system context. Continuous testing allows designers to quickly find weaknesses in the AI training datasets or design flaws in other components. Model-based design represents an end-to-end workflow that tames the complexity of designing AI-driven systems.
3. AI becomes easier to deploy to low power, low cost embedded devices
AI has typically used 32-bit floating-point math as available in high performance computing systems, including GPUs, clusters, and data centres. This allowed for more accurate results and easier training of models, but it ruled out low cost, low power devices that use fixed-point math. Recent advances in software tools now support AI inference models with different levels of fixed-point math. This enables the deployment of AI on those low power, low cost devices and opens up a new frontier for engineers to incorporate AI in their designs. Examples include low-cost Electronic Control Units (ECUs) in vehicles and other embedded industrial applications.
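The trade-off is easy to see in miniature. The sketch below, which is illustrative and not any particular toolchain's quantisation scheme, maps 32-bit floating-point weights onto 8-bit signed fixed-point values (here with 5 fractional bits) and measures the rounding error introduced:

```python
def quantize(x, frac_bits=5, word_bits=8):
    # Signed fixed-point: the stored integer represents value * 2**frac_bits,
    # clipped to the range an 8-bit word can hold.
    scale = 1 << frac_bits
    lo, hi = -(1 << (word_bits - 1)), (1 << (word_bits - 1)) - 1
    return [max(lo, min(hi, round(v * scale))) for v in x]

def dequantize(q, frac_bits=5):
    scale = 1 << frac_bits
    return [v / scale for v in q]

weights = [0.7071, -1.5, 2.0, 0.01]
q = quantize(weights)            # small integers a low-cost ECU can store
restored = dequantize(q)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, restored, max_err)
```

The worst-case error stays below half a quantisation step (2**-6 here); choosing the split between integer and fractional bits to suit the weight range is exactly what the newer software tools automate.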
4. Reinforcement Learning moves from gaming to real-world industrial applications
In 2020, reinforcement learning will go from playing games to enabling real-world industrial applications, particularly automated driving, autonomous systems, control design, and robotics. We’ll see successes where reinforcement learning (RL) is used as a component to improve a larger system. Key enablers are easier tools for engineers to build and train RL policies, the generation of large amounts of simulation data for training, easy integration of RL agents into system simulation tools, and code generation for embedded hardware. An example is improving driver performance in an autonomous driving system. AI can enhance the controller in this system by adding an RL agent to improve and optimise performance – such as faster speed, minimal fuel consumption, or response time. This can be incorporated in a full autonomous driving system model that includes a vehicle dynamics model, an environment model, camera sensor models, and image processing algorithms.
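The build-train-deploy cycle those tools automate can be shown in a toy form. The following sketch is a textbook Q-learning loop steering an agent along a five-state line to a goal; it illustrates the idea only and bears no relation to any production driving stack:

```python
import random

# States 0..4 on a line; action 0 = move left, 1 = move right; reward 1 at state 4.
N, GOAL = 5, 4

def step(s, a):
    s2 = max(0, min(N - 1, s + (1 if a else -1)))
    return s2, (1.0 if s2 == GOAL else 0.0), s2 == GOAL

random.seed(0)
Q = [[0.0, 0.0] for _ in range(N)]       # action-value table
alpha, gamma, eps = 0.5, 0.9, 0.2
for _ in range(500):                      # training episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy exploration, then the standard Q-learning update
        a = random.randrange(2) if random.random() < eps else max((0, 1), key=lambda x: Q[s][x])
        s2, r, done = step(s, a)
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) * (not done) - Q[s][a])
        s = s2

policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(N)]
print(policy)   # the learned policy moves right in every state before the goal
```

In an industrial setting the `step` function is replaced by a system simulation (vehicle dynamics, environment and sensor models), which is precisely why easy integration of RL agents into simulation tools matters.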
5. Simulation lowers a primary barrier to successful AI adoption – lack of data quality
Data quality is a top barrier to successful adoption of AI – per analyst surveys. Simulation will help lower this barrier in 2020. We know training accurate AI models requires lots of data. While you often have lots of data for normal system operation, what you really need is data from anomalies or critical failure conditions. This is especially true for predictive maintenance applications, such as accurately predicting remaining useful life for a pump on an industrial site. Since creating failure data from physical equipment would be destructive and expensive, the best approach is to generate data from simulations representing failure behaviour and use the synthesised data to train an accurate AI model. Simulation will quickly become a key enabler for AI-driven systems.
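The approach can be sketched in a few lines. Here synthetic pump vibration data, with an assumed linear degradation drift standing in for a physics-based failure model, is used to check a crude trend detector; every name, constant and threshold below is illustrative rather than drawn from any real deployment:

```python
import math, random

random.seed(42)

def simulate_pump(hours, failing=False):
    """Synthesize vibration readings; a failing pump drifts upward before breakdown."""
    readings = []
    for t in range(hours):
        base = 1.0 + 0.05 * math.sin(t / 6)     # normal operating oscillation
        drift = 0.02 * t if failing else 0.0    # assumed degradation trend
        readings.append(base + drift + random.gauss(0, 0.03))
    return readings

# Generate a labelled dataset without destroying any real equipment.
normal = [simulate_pump(48) for _ in range(20)]
faulty = [simulate_pump(48, failing=True) for _ in range(20)]

def predicted_faulty(readings):
    # A rising-trend rule stands in for a trained AI model.
    half = len(readings) // 2
    return sum(readings[half:]) / half - sum(readings[:half]) / half > 0.2

accuracy = (sum(not predicted_faulty(r) for r in normal) +
            sum(predicted_faulty(r) for r in faulty)) / 40
print(accuracy)
```

In practice the simulator would be a physics model of the pump and the detector a trained network, but the shape of the workflow, synthesise failure behaviour then fit against it, is the same.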
“All companies face the challenge of security awareness among employees, contractors and customers. Without the support from all users, technological efforts will be hampered in their effectiveness.
“Security awareness isn’t just about teaching employees what to do with phishing emails – there’s so much more, including developing products with security in mind.
“Multi-directional communication is extremely important in a security program, meaning working from the top-down, bottom-up, and side-to-side to get your messaging across. And yes, it’s true. Security is everyone’s responsibility.
“People learn differently – some are more receptive to visual, listening, or the ‘hands-on’ approach, and some people are attracted to different types of content – funny, serious, the historical background or whatever it may be. And at the same time, providing consistent communication is the key to a strong awareness programme.
“A major challenge for larger companies is maintaining control over the employee/worker identity lifecycle. In terms of culture, it’s a journey to influence behavior change for thousands of employees. Organisations need support from everyone from interns to the C-suite and Board to drive adoption and create a culture of security. At the end of the day, employees want to do the right thing – it’s just a matter of constant education and communication.
“When it comes to industries like finance or healthcare, the key is to establish and maintain control over BYOD and Bring-Your-Own-App policies and mentality without impacting employee productivity.”
Some 2020 predictions from David Richardson, senior director of product management at Lookout:
1. Mobile Will Become the Primary Phishing Attack Vector -- Lookout expects credential phishing attempts targeting mobile devices to become more common than traditional email-based attacks. Traditional secure email gateways block potential phishing emails and malicious URLs, which works for protecting corporate email from account takeover attacks, but neglects mobile attack vectors, including personal email, social networking, and other mobile-centric messaging platforms such as secure messaging apps and SMS/MMS. Moreover, mobile devices are targeted not only because of these new avenues but also because of the personal nature of the device and its user interface. Enterprises must realize that when it comes to social engineering in a post-perimeter world, corporate email is not the only, or even the primary, attack vector used.
2. 2FA is dead. Long live MFA. -- Authentication will move from two-factor to multi-factor, including biometrics in 2020. Most companies have implemented one time authorization codes (OTAC) to provide two-factor authentication (2FA), but Lookout, and others in the industry, have already seen OTAC targeted by advanced phishing attacks. To protect against credential theft and to address regulatory compliance, enterprises are increasingly adopting MFA and biometrics using mobile devices. This new approach strengthens authentication and improves user experience, but it is critical that the mobile device is free from compromise.
3. Threat Actors will Leverage Machine Learning to Operate Autonomously -- One example of where we may see attackers implement machine learning is into the execution of phishing campaigns. Phishing lures and landing pages will be A/B tested by AI algorithms to improve conversion rates, while new domains will be generated and registered by AI algorithms. These enhancements will allow attacks to move faster than most existing solutions could detect them.
4. 2020 Election Hacking Will Focus on Mobile - As cyberattacks have evolved to target mobile devices because of their nature and form factor, so will cyberattacks in the 2020 Presidential Election. Spear phishing campaigns are moving beyond the traditional email-based phishing attacks we saw in the 2016 election cycle to advanced attacks that involve encrypted messaging apps, social media and fake voice calls. Before the next election is over, we will likely see some kind of compromise as the result of a social engineering or mobile phishing attack, particularly as presidential campaigns embrace mobile devices in their canvassing efforts.
5. Partnerships Are the New Consolidation - Within the past decade there have been many mergers and acquisitions within the security industry. That trend will likely continue, but now vendors will also tightly integrate their solutions to improve enterprise security. And, as we move into 2020 and beyond, a new trend is emerging that will see security vendors forming alliances -- even with those they consider their competitors -- and strategically collaborating to combat threats for the greater good. A recent example of this is the App Defense Alliance, which was launched in late 2019 to combat malicious apps on Google Play. These alliances also have a positive effect on AI solutions, as the corpus of data grows for Machine Learning algorithms to ingest.
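Returning to prediction 2: the one-time authorization codes (OTAC) most 2FA deployments generate are TOTP codes as standardised in RFC 6238. A minimal sketch of the algorithm, checked against the RFC's published test vector, shows why an intercepted code is only briefly useful yet still phishable in real time:

```python
import hmac, hashlib, struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """One-time code per RFC 6238 (HMAC-SHA1 variant)."""
    counter = struct.pack(">Q", timestamp // step)       # 30-second time window
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test key; at T=59 the 8-digit reference value is 94287082.
print(totp(b"12345678901234567890", 59, digits=8))   # -> 94287082
```

Because the code is valid for a whole time step, an advanced phishing page can relay it to the real service before it expires, which is exactly the weakness that pushes enterprises towards additional factors such as biometrics.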
Megaport 2020 Forecast: Multicloudy Skies with a Certain Chance of SDN, says John Veizades, VP of Engineering and Product at Megaport:
In recent years, there’s been a lot of hype around “multicloud” and what it means for businesses. Enterprises have struggled to understand where multicloud fits into their ever-evolving business strategies. In 2018, many customers went all-in with one cloud. This began shifting in 2019, and we expect 2020 to be a breakout year for greater multicloud adoption.
Enterprises Increasingly Turn to Multicloud
In 2020, more enterprises than ever before will embrace multicloud for their business-critical applications. They will hedge their bets by having the same application available from several cloud providers. They will place a premium on the ability to switch among these different clouds both quickly and efficiently. While in previous years, many enterprises went all-in with a single cloud vendor, they no longer want to be beholden to one cloud vendor. Organisations of all sizes are committing to multicloud, while they expect the user experience to feel seamless across clouds.
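Switching among clouds quickly is usually achieved behind a common interface, so the application never speaks to one provider's SDK directly. A minimal sketch of the pattern follows, with hypothetical in-memory classes standing in for real provider SDKs:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Common interface so the application can switch clouds without code changes."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class CloudA(ObjectStore):              # stand-in for one provider's SDK
    def __init__(self): self._d = {}
    def put(self, key, data): self._d[key] = data
    def get(self, key): return self._d[key]

class CloudB(ObjectStore):              # stand-in for a second provider's SDK
    def __init__(self): self._d = {}
    def put(self, key, data): self._d[key] = data
    def get(self, key): return self._d[key]

def failover_read(stores, key):
    """Try each cloud in turn: the 'switch quickly and efficiently' idea."""
    for s in stores:
        try:
            return s.get(key)
        except KeyError:
            continue
    raise KeyError(key)

primary, backup = CloudA(), CloudB()
backup.put("report.pdf", b"%PDF...")    # replicated copy held in the second cloud
print(failover_read([primary, backup], "report.pdf"))
```

The design choice is the abstraction itself: once every cloud sits behind `ObjectStore`, hedging bets across providers becomes a deployment decision rather than a rewrite.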
Data Deluge Tests Cloud Connectivity
Enterprises have been warned about the coming “data deluge” for the past 15+ years, with industry pundits annually increasing the potential data volumes as technological advances, like AI, create new demands for greater amounts of data that were previously unimaginable. Like a closet in a spare bedroom, clouds offer convenient places for data. This will test cloud connectivity in 2020, as enterprises look for the best ways to access, manage and derive intelligence out of their data in the cloud. Next year, look for signs of a rising level of data reaching “mega tsunami” proportions, and enterprises laser-focusing on scalable cloud connectivity to harness this data.
Blame it on the Gig Economy
As consumers, we want what we want when we want it, without unnecessary waste. When we work at enterprises, we don’t leave our consumer mindset at the door. A trend that should increase in 2020 is continued demand for alternatives to “old-school” vendor lock-in and long-term networking contracts. As business needs change, network availability must be flexible and change with them, and costs must fall as organisations pay only for what they need and use. These trends aren’t new, but they’re spreading to more vertical industries as technology makes it possible for organisations of all sizes to get what they want, when they want it, at a price that they are willing to pay.
No More “Versus” Between On-Prem and Cloud
Companies in sectors that deal with high volumes of critical, sensitive data have traditionally felt the need to keep control and ownership of that data. In 2020, there will continue to be a need for industries like media and entertainment or financial services to maintain control and ownership of critical data by keeping it on-prem instead of in the cloud, while still executing compute on that data using cloud resources. This will mean that the tie between on-prem data and cloud compute will become increasingly important, along with the latency of that connection, as lower latency allows companies to gather better insights from more data. In 2020, we’ll see a need to bring data centres closer, in latency terms, to cloud resources, as this will become more valuable to customers than ever.
Software-Defined Networking Will Change the Game for Data Centres
Technology has accelerated our expectations about the speed of business. We expect nearly everything to happen “instantly,” and that includes the time required to connect to data centres. In previous years, connecting to a data centre was measured in days. In 2019, enterprise-to-data centre and data centre-to-data centre connections happened in minutes. Software-defined networks (SDNs) are changing the data centre experience for many organisations. We believe 2020 will be a step closer to the point where no human intervention is needed in data centres to connect customers to the data sources they need. Some may see this as lost jobs. We see this as freeing up time to do something more productive.
Malcolm Isaacs, senior solutions manager, application delivery management, Micro Focus, says that security will become more proactive, with security response, rollback, and recovery testing embedded in the DevOps pipeline…
“With attacks and exploits becoming increasingly sophisticated and launched by anyone, from amateur hackers to state-sponsored experts, organisations must be prepared to take immediate action in the event that security is compromised. This is a necessity regardless of the organisation, as it is clear that any and every organisation is vulnerable to attack.
“While preventative security testing is now becoming a standard part of the DevOps pipeline, organisations will be looking at how they can prepare a quick and effective response in the event that security is breached. Teams will start adopting a proactive response to security breaches, ensuring that they can control further damage, rollback their systems, and restore corrupted data. We will see this type of testing becoming increasingly incorporated into DevOps pipelines and embedded into continuous testing processes, to minimise the impact and cost of a breach.”
“As 5G technologies begin to roll out, the pace at which we see breaches occur will accelerate. To combat this, organisations will need to refocus on driving security integrations across the business, moving to a centralised environment. Due to the continued skill gap present in the industry, organisations will move to adopt AI and behavioural analytics, which will drive automation to augment and fill security gaps and drastically improve response times and the accuracy of threat identification.”
“In the early days of GDPR, and similar regulations like the CCPA, the focus within the boardroom was naturally on mitigating risk. CISOs were tasked with protecting the organisation from fines, sanctions, lawsuits, erosion of shareholder value, and more. However, as these risks are starting to fade with a couple of years of investment, CIOs and others across the enterprise are starting to see the hidden benefits of those investments – and are now looking to extend them to not just protect, but also to drive value. With greater access to information and the ability to apply analytics (naturally within the guidelines of the privacy regulation), deeper insights are now available to help frame emerging opportunities and business models, to identify under-funded parts of the organisation, and to help drive more-effective collaboration.”
The main phases in the lifecycle of a building are construction, operations and occupancy. Each is undergoing a technological revolution as buildings become digital environments.
By Stuart McKay, Senior Business Development Manager, Enterprise, Panduit Europe.
New Intelligent Building concepts are changing the game for construction professionals, architects, design engineers and interior installation organisations. Ethernet and Power over Ethernet (PoE) are an integral part of Building Information Modelling (BIM) and Building Management Systems (BMS), and support the development of working environments that provide a platform for increased worker productivity, health, safety and connectivity.
Across the property value chain there is recognition that digital infrastructure must be embraced to improve sustainability while providing the types of intelligent connected buildings that people, businesses and communities require.
New standards are being developed as digital becomes a new value proposition for the building sector. Ethernet and specifically PoE infrastructure, which combines data communications and power delivery over a single Ethernet cable, is becoming the dominant technology platform.
Ethernet cable infrastructure for combined power and data communications addresses smart building connectivity, power delivery and network topology, and supports thermal management requirements. A new white paper from Panduit, ‘Power over Ethernet – Fixed Network Foundation Layer for 21st Century Smart Buildings’, illustrates how PoE will streamline processes and improve building performance, while creating fully connected and environmentally sustainable working environments.
Professionals are turning to digital infrastructure to reduce energy use, lower greenhouse gas emissions and meet energy efficiency requirements. New generations of digitally native architects and designers are leading the efforts to attain net zero emission buildings. They recognise that technology capable of reducing installation costs and driving operational efficiency will be embraced.
BIM & BMS – Building Information Models and Building Management Systems
As BIM models advance and become interoperable, they are driving ever more technology integration into building operation. Ultimately the goal is to develop interoperability between BIM, BMS, IT, IoT and manufacturing based on open standards. As this evolves, one estimate is that more than one billion sensors and connected devices will be deployed globally in buildings by 2021, alongside the billions of mobile connected devices brought into buildings by tenants, workers and visitors.
Figure 1: Building / BMS Network (From PoE Whitepaper)
This integration of devices into buildings will require cabling infrastructure which meets or exceeds the latest communications, power and thermal standards, and which does not add to network complexity and cost. The latest PoE++ standards from the IEEE are IEEE 802.3bt Type 3 and IEEE 802.3bt Type 4. This is driving the adoption of PoE for integrated communications and power as the network standard in new builds and in replacement networks in refurbishments. The latest standards provide PoE delivering up to 90W over twisted pair cables, enough to power the latest lighting, wireless access points and more.
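As a rough illustration of what these power levels mean in practice, the sketch below checks a set of powered devices against both the per-port limit and a switch’s total PoE budget. The per-port maxima are the IEEE 802.3bt PSE figures (60 W for Type 3, 90 W for Type 4); the device wattages and switch budget are illustrative assumptions, not vendor data.

```python
# Sketch: checking PoE device power draw against an 802.3bt switch budget.
# Per-port maxima are the PSE (switch-side) figures from IEEE 802.3bt;
# the device wattages used below are illustrative assumptions.

PSE_MAX_W = {"type3": 60.0, "type4": 90.0}

def fits_budget(devices, switch_budget_w, port_type="type4"):
    """Return True if every device fits the per-port limit and the
    combined draw fits the switch's total PoE budget."""
    per_port_max = PSE_MAX_W[port_type]
    if any(watts > per_port_max for _, watts in devices):
        return False
    return sum(watts for _, watts in devices) <= switch_budget_w

devices = [("wireless_ap", 25.5), ("ptz_camera", 51.0), ("led_panel", 45.0)]
print(fits_budget(devices, switch_budget_w=370.0))  # True: 121.5 W in total
```

A planning tool would of course also model cable bundling and heat rise, which is exactly the thermal concern the standards address.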
Big changes in how buildings and occupants interact are being enabled by PoE cable infrastructure. The Royal Institution of Chartered Surveyors says: “By combining BIM with the Internet of Things we can start to gain a living picture of our buildings. For the first time in history, the planning of the architects and designers can be verified and compared with the use of the building, with tracking over time as the use of the space and its occupiers changes. Bringing together all this data, we can gain new and better understanding of how our buildings actually work.”
Ethernet represents the core infrastructure standard for combined communications and power which will provide plug and play connectivity for easy deployment of app-based sensors and devices which will make up the digital building.
Power and Data Network Infrastructure for the Long Term
The pace of change in consumer technology and endpoint IT continues to accelerate. So, it is vital to choose a standards-based technology with a clear roadmap. Ethernet is the local area network communications de facto standard for data. PoE runs over the same copper cable as the LAN, providing decades of stable, high performance network infrastructure.
Buildings are becoming digital, with an infrastructure that supports all of the various applications needed to optimise operations. Building Management Systems are becoming more advanced: the applications which run and manage HVAC, water, lighting and security are constantly developing, becoming more responsive and smarter. This expansion of capabilities has meant the control of these applications being transferred onto the data infrastructure plane. Now, PoE is extending to powering and communicating with an ever-growing variety of physical devices.
Figure 2: What is Driving Digital Buildings – From PoE Whitepaper
In a security context, public and private buildings, whether single or multi-use, require ever more security and access control. Schools, hospitals, government and other public buildings must balance the needs for accessibility and security by ensuring a building’s physical security at access points and through the expansion of video monitoring. Another change is that the future direction of buildings in urban environments is upwards. That means major changes to modern buildings as multi-use and shared environments become ever more common.
Low voltage cameras, embedded sensors, kiosks, wireless access points, physical access points, digital signage and displays are being deployed in ever greater numbers. Market forecasts show an explosion in the number of devices about to enter buildings. Wireless access point numbers alone are forecast to expand by 30% per year until 2027. Kiosks and digital screens will become ubiquitous in multi-use environments. According to a report on the Global Kiosk Market, published by KBV research, the global kiosk market will reach $5.4 billion by 2024, at a growth rate of 26.4% CAGR.
High performance cable infrastructure is the base layer for these smart technologies. All these different devices need to run over a single network which provides power and data. With lower installation costs, fewer hazards and more flexibility, PoE is a proven, viable, cost-effective solution.
A single network has positive implications for ease of installation, lower maintenance, better performance and interoperability between different building elements. PoE is a simple means to power digital building infrastructure devices over the same network. Factors affecting cable infrastructure cost and operations include locations, runs, distances, connections and cable fill, and these key factors affect performance and OpEx management in digital buildings.
Figure 3 Distributed Building Services – Millions / Outlets in Commercial Buildings by Type of Products – 2015-2021
New topologies for network infrastructure will provide direct connections from Telecom Rooms (TR) to the device, allowing the switch to directly control the power and data communication to the device. Common challenges within the TR are crowded cable pathways and cable bulk: as the number of connections increases cable and port densities, cable runs and access become a serious concern. Solutions such as smaller cable diameters, angled connectors, and high-density patching help address these issues. These innovative solutions release space that would otherwise be used for cable management, making room for active gear or switches.
Panduit’s Category 6A cable with MaTriX technology features an integrated tape that gives the cable advanced thermal properties for handling the heat rise within the cable caused by PoE. A digital building requires PoE cables in new physical environments: in ceiling voids and under floor spaces. To operate effectively at scale, cable infrastructure flexibility becomes ever more important.
In the built environment, too many networks remain siloed and separate. This approach is no longer suitable for the new era of intelligent buildings. The new generation of architects and engineers have the responsibility for delivery of the next generation of smart digital buildings. Understanding the power and benefits of future proof PoE cable infrastructure is at the core of physical, economic and digital innovation.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 7.
NTT Ltd. ‘Future Disrupted’ predictions for 2020: Data, automation and IoT will enable virtual societies and change the way we live & work
Leading technology services company says connected cities and societies will be a focus in 2020 as it launches its ‘Future Disrupted’ predictions, looking at technology trends for the next 12 months and beyond:
NTT Ltd. has published its Future Disrupted: 2020 technology trends predictions, identifying the most critical technology trends companies need to be aware of next year and the steps they need to take to address them. Formed from key insights from its technology experts, the company outlines the trends that will shape the business technology landscape throughout 2020 across six key areas: Disruptive Technologies, Cybersecurity, Workplace, Infrastructure, Business, and Technology Services. NTT Ltd. CTO Ettienne Reinecke predicts mainstream adoption of disruptive technologies in 2020 will finally see data, automation and internet of things (IoT) technologies come together to create connected cities and societies.
The company predicts that 2020 will finally see all the hype words of the past decade come together to create completely connected environments that are capable of running themselves autonomously to build more intelligent cities, workplaces and businesses – and on a secure basis. Data, AI and secure by design will be at the heart of this movement, empowering devices to talk to one another and act on that information without human intervention. Smart cities and IoT will become the norm as they improve productivity, growth and innovation across entire regions.
NTT Ltd. is the newly-formed company bringing together 40,000 people from across 31 brands – including NTT Communications, Dimension Data and NTT Security – to serve 10,000 clients from around the world. Using the insights gathered from its global client base, NTT Ltd. is able to better understand the future and shape the most effective intelligent technology solutions for its customers. The Future Disrupted: 2020 Technology Trends looks at the way businesses need to prepare for tomorrow over the next year.
Commenting on the predictions, Ettienne Reinecke said: “The industry has been talking about different technologies, including the cloud, data, AI and security in different siloes. But 2020 is the year that will change. Next year, we’ll see complete end-to-end computing come to the fore, bringing to life fully intelligent environments that are completely connected and will have a big impact on the world we live in.
“We will see most cities and societies starting to follow in the footsteps of Las Vegas City, which has become intelligent in the way it shares data across the region, improving situational awareness through video and sound data. With IoT technology on a secure infrastructure, it’s created a safer environment to live in, improving living conditions and, ultimately, saving lives. Projects like these need a variety of different technology capabilities to come together in order to achieve great things, so building fully connected environments will be the key focus point next year.”
The predictions have been compiled by NTT experts, who have identified key trends for the next twelve months as well as the disruptive technologies we can expect in the future – and the steps businesses can take in 2020 to take full advantage of them.
Quote from Ettienne: “Technology is already changing quickly, but this is the slowest pace of change we’ll ever see. It’s clear too that we’ve never before had so much powerful technology at our disposal – technology we can use to answer questions and solve problems in our societies, businesses, and communities. There is a huge opportunity to use any and every tool out there to support innovation initiatives in every field and truly transform our future world for the better.”
Some of the disruptive technologies from the predictions include:
The most significant development to affect the IT landscape in the year ahead will be a rise in data breaches and a subsequent increase in the adoption of cybersecurity measures by organisations in an attempt to protect themselves, their networks and devices, from malicious attacks.
In 2020 we will see as many, if not more, data breaches and attacks than in 2019 as a result of the evolution in the way hackers operate (transitioning from ‘lone hackers’ to organised criminal conglomerates), human error and a struggle to find the balance between ‘just enough’ and ‘too much’ security. Finding a sweet spot when it comes to security and privacy controls is a challenge, and those organisations on the side of too little security will inevitably find themselves in the data breach cross-hairs.
As we shift towards new technological applications in 2020, including the adoption of technologies such as cloud and IoT, organisations will also experience an increased risk of suffering cyberattacks if they fail to prioritise the security of their public key infrastructure (PKI). This is according to the findings of the 2019 Global PKI and IoT Trends Study by nCipher Security and the Ponemon Institute, which also found altering a device’s original function through malware or other attacks to be the top perceived threat to IoT, according to 68 per cent of 1,800 IT respondents across the globe. While hackers continue to find new ways to access confidential data in 2020, organisations will begin to acknowledge their responsibility in preventing such breaches and look to upscale their cybersecurity portfolios.
The information captured and generated by connected IoT devices will continue to be used by businesses to help them run more efficiently, improve business processes, and make real-time decisions in 2020. However, there will be no point in collecting and analysing this IoT-generated data and making business decisions based on it if the devices themselves can’t be trusted. Such is the urgency to address this that the Interior, Homeland Security and Public Safety Ministers of the Five Eyes partner nations – including the UK – released a statement of intent regarding IoT security. In it, they agreed to collaborate with industries and standards bodies to provide better protection to device users.
Organisations will also continue to embrace technologies that perform well on-premises, in private cloud and in public cloud environments. They will favor multi-cloud, multi-deployment environments because they offer the best technology, and because they’re secure – whether on-premises or in the cloud.
Stuart Reed, VP Cyber – Redesign, Nominet – malware & CISO roles:
“In 2020, we will see the cyber industry redesigned in some key areas. Malware will undoubtedly evolve, and ransomware will become more sophisticated, potentially even teaching businesses new ways to take payments and create customer service that encourages the victim to part with their money. That said, it will still be the simple attacks that cause the most damage, because organisations have a lot of work to do on ensuring they are utilising every layer of defence within their reach.
“We’ll also see the role of the CISO redesigned in 2020, as their work-life imbalance worsens and the role needs to change to meet the demands of the modern cyberscape; for example, becoming more of a strategic resource for the business on mitigating risk and facilitating business transformation safely.”
Mark Burdett, Head of Product Delivery, Nominet - ML & AI enhanced cyber attacks
“2020 will see machine learning and artificial intelligence used to create distributed and targeted malware and attacks. An attacker using machine learning algorithms can create a suite of botnets or worm-style malware that gathers data from multiple attempts to breach commercial sites, ultimately generating more sophisticated attacks that could be targeted at Critical National Infrastructure or Governments.
“Using data from breaches, vulnerabilities, successful and failed attacks - the ‘next generation’ of malware can be created. It will make fewer obvious attacks but be more successful by using tactics proven to work. This would make pattern matching or DOS/brute-force security measures less and less effective.
“Protecting against this style of attack requires analysis of network patterns, command and control, and a large-scale dataset of attacks to see these attempts happening across multiple sites and networks, rather than a single instance or victim.”
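The cross-site analysis described in the quote can be illustrated with a toy example (this is not Nominet’s system, just a minimal sketch): aggregate each source’s activity across many monitored sites, then flag sources that are outliers against a robust baseline. A single site would see only one or two probes and notice nothing.

```python
# Sketch: flagging coordinated probing across many sites by comparing each
# source's cross-site footprint to a robust baseline (median + MAD).
# The events and the threshold k are illustrative assumptions.
from collections import defaultdict
from statistics import median

def suspicious_sources(events, k=3.0):
    """events: (source_ip, site) tuples. Return sources whose cross-site
    activity is an outlier relative to the whole population of sources."""
    per_source = defaultdict(set)
    for src, site in events:
        per_source[src].add(site)
    counts = {src: len(sites) for src, sites in per_source.items()}
    med = median(counts.values())
    mad = median(abs(c - med) for c in counts.values()) or 1.0
    return {src for src, c in counts.items() if (c - med) / mad > k}

# One source probes 40 different sites; two others each touch one site.
events = [("10.0.0.9", f"site{i}") for i in range(40)] + \
         [("192.168.1.5", "site1"), ("192.168.1.6", "site2")]
print(suspicious_sources(events))  # {'10.0.0.9'}
```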
Dave Polton, VP solutions, Nominet - Digital Transformation
“In 2019 we’ve seen a number of digital transformation projects not fully meeting expectations. Consequently, in 2020 I expect to see a number of small but significant step changes in the way businesses approach digitalisation projects, particularly with regards to security. Focus will be on improving user experience and generating value from smaller more tangible projects. For cyber security, this will mean solutions that make day-to-day operations easier and that are proven to mitigate the impact of security incidents. As a result of this, in 2020, security teams will need to be looking to communicate the value they bring in terms of the bottom line.”
Robert Meyers, Channel Solutions Architect, CISM, Fellow of Information Privacy at One Identity, comments:
"There will be a major regulatory push both in the US and especially in the EU to punish negligent data handling. This is already enshrined in GDPR, and the incoming European Commission is moving fast to act on this: post-breach, if it is found that the data protection mechanisms are negligent, the fine can increase significantly. This poses a problem for all kinds of customers who handle consumer data (small and large), as it puts the onus on the data breach victim to prove that they were proactive in defending personal data. So, while regulated industries (finance, healthcare, etc) are already quite familiar with identity/privilege management, a completely new class of companies is now forced to learn the importance of proactive data protection.
One typical example is the Equifax-story: the new lawsuit documents show that a critical server had admin/admin as login credentials – this is incredibly negligent, and in the EU this would multiply the fine.
Additionally, there has now been a lot of time for activists, like data privacy activist Max Schrems, to mount challenges to those who don’t hold data privacy to the same or legal levels. For this reason, we will likely see penalties being imposed as a result of activists enforcing the spirit behind the regulation (GDPR). Over the next year, we will also see how Brexit will impact the enforcement of GDPR and the complications of splitting up the supervisory authority. The only true and certain forecast that can be made is that next year will certainly be a tumultuous one in the privacy and compliance world."
Megatrend for e-retail: Climate Change and Digitalisation, by Jil Maassen, Senior Strategy Consultant EMEA, Optimizely:
We were told we had twelve years to save the future of our planet; this year, that became eighteen months. Millions of people have taken action across the globe in the name of Extinction Rebellion, making sustainability and the future of the planet more mainstream than ever before. Taking notable action to become sustainable needs to be more than a new year’s resolution for the technology industry, for our own future’s sake.
The online world of retail has meant that physical assets are at our fingertips, but at what cost? As the likes of Amazon and ASOS continue to ensure we can purchase and receive physical things in an instant, it turns out that the environment is suffering for our on-demand consumer mindset.
Earlier this year, ASOS found itself at the centre of a serial returning debate, where customers buy more than they need on the premise that they can return items that they don’t want. Serial returning is of course even more damaging to the environment than instant deliveries, due to emissions and additional packaging. But, from a business perspective, it’s clear that banning returns completely, or even making them a difficult process, would ultimately hurt retailers’ profits. To find the right balance, retailers need to get to the bottom of why a return is happening in the first place. They can start by analysing customer purchasing behaviours; for example, whether consumers are purchasing one item in different sizes and colours with the sole purpose of returning the excess.
With this knowledge, simple changes and tests can be made to online platforms to mature the personalisation of recommended sizes. Quality over quantity is key, and it’s important to remember that the goal should not be to get users to buy more for the sake of buying more. Helping users to buy the ‘right’ thing will increase consumer trust and reduce the number of items purchased with the intent of return. Steps such as these towards running a more eco-friendly business will become an even bigger trend in 2020. Retailers focusing on their carbon footprint will ultimately be the ones winning consumer trust and increasing customer satisfaction and retention, as it becomes a primary concern for eco-conscious consumers to shop ethically.
This transcends beyond just e-retailers, too. Very few consumers see the exact same experience when they log on to their Netflix homepage, and the reason behind that is personalisation. Hundreds, if not thousands, of small experiments are being run every month, from the size of the imagery to where the ‘press play’ button is on the screen, or how products are laid out on screen. By trialing and then re-trialing these features, businesses can see first hand what provides customers with the most seamless and pain-free experience.
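The experimentation loop described above comes down to comparing variants statistically. A minimal sketch, with made-up numbers, using a standard two-proportion z-test on conversion rates:

```python
# Sketch: deciding between two page variants with a two-proportion z-test.
# The visit and conversion counts are invented for illustration.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B lifts conversion from 4.0% to 5.2% over 5,000 visits each.
z, p = z_test(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z={z:.2f}, p={p:.4f}")
```

In practice platforms run many such tests concurrently and correct for multiple comparisons, but the underlying decision is this simple.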
Providing sustainable, tailored experiences, and understanding that delivering on this brings increased customer acquisition and loyalty, is absolutely crucial for businesses to see success in the next year.
The growth of connected devices is driving a fundamental change in how people and enterprises engage with each other in the digital world. But for this to be truly effective, these devices cannot rely solely on existing cloud infrastructure that was built to support millions of siloed apps. Instead, a decentralized approach that fully leverages these devices’ capabilities is essential.
By Fay Arjomandi, Founder and CPO, mimic.
Today, all the most popular consumer and enterprise applications are hosted in data centers. From Google and Facebook to YouTube and Instagram, this cloud-centric methodology has become essential for enabling our connected lifestyles. Underpinning this is a hierarchical client-server architecture that sees most servers located in data centers scattered around the globe. For years, this has been the optimum way of hosting applications that provide access to content and information to client devices (e.g. smartphones or tablets). However, new trends point to this quickly becoming a less efficient way of managing data.
Firstly, there has been an explosion of computing devices and embedded computing in all things. This has grown the ‘edge’, often referred to as the Internet of Things, where devices are connected to centralized servers in data centers through gateways and hubs. However, with edge devices having more computational power than servers of just a decade ago, the edge is becoming progressively more powerful.
Secondly, with the advent of social media on mobile devices, orders of magnitude more personal multimedia content is generated on these edge devices. People are creating and sharing thousands of times more content than what major studios and broadcasters are hosting on central servers in the cloud. Today, most of the data generated on (edge) devices is sent back to servers on the central cloud for processing and to facilitate sharing.
The third trend is the decomposition of solutions. The emergence of APIs and microservices and their automated deployment are contributing to a serverless backend environment. Instead, the cloud is used to scale resources to fit demand either through volume or geography.
A new way
The current fixed and hierarchical client-server approach makes central cloud resources and network connectivity the bottleneck for future growth. Sending data from hundreds of billions of client devices to tens of millions of centralized cloud servers wastes bandwidth and energy and it has serious social and economic implications.
Furthermore, developers are reliant on cloud service providers who have access to the apps and the data stored or processed in their servers. Essentially, a handful of large companies have to manage the majority of consumer and enterprise data. And despite all the sophisticated security measures, storing data and hosting applications on third-party resources exposes the owners of the information to risks.
This challenge of exponential growth of computing at the edge has opened up a massive opportunity. Enabling any computing device to act as a cloud server when it makes sense to do so can create a hybrid edge cloud that scales organically with new capable devices. In this way, central cloud resources, which require significant real estate and power and are bandwidth hungry, can offload much of their burden onto edge devices. Many microservices can be hosted on edge devices instead of a centralized server, making them faster and more flexible to changing user requirements.
Acting as servers when feasible, edge devices can perform many of the functions of the servers in central cloud. This creates a hybrid edge cloud that is significantly more powerful than the centralized cloud.
For example, there are currently over 80 million Sony PlayStation 4 (PS4) consoles in people’s homes. This represents more than 600 million processor cores and 40,000 petabytes of storage – in aggregate, greater computing, storage, and memory resources than the entire Amazon Web Services (AWS) infrastructure.
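The arithmetic behind those headline figures is easy to check, assuming 8 CPU cores and the original 500 GB drive per console (both true of the launch PS4):

```python
# Sanity check of the PS4 fleet figures, assuming 8 cores and 500 GB
# of storage per console.
consoles = 80e6
cores = consoles * 8                  # 640 million cores
storage_pb = consoles * 500 / 1e6     # GB -> PB: 40,000 PB
print(int(cores), int(storage_pb))
```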
And the PS4 is only one type of device. There are billions of smartphones, PCs, set-top-boxes, game consoles, streaming players, routers, tablets, and other computing devices that can potentially act as cloud servers.
The benefits of such a hybrid edge cloud architecture are phenomenal: reduced cloud hosting costs, lower communication bandwidth and latency, improved network efficiency, and reduced energy consumption and carbon emissions. Moreover, this new approach increases data privacy and gives consumers and enterprises better control over their data.
Cynics might argue that a change as fundamental as this will require significant investment in overhauling network infrastructure. Yet, if a software-driven model is adopted, no change will be needed to the low-level design of edge devices.
All that is required is a downloadable software development kit (SDK) that runs on top of existing operating systems. No changes to hardware or operating systems are necessary. Instead, developers have the power to decentralize the existing cloud infrastructure.
Ultimately, the hybrid edge cloud environment will provide consumers and enterprises with more control over their personal data, minimize the cost of hosting and delivery of applications and services, and improve network performance.
Decentralization is the next revolution in cloud computing and an essential element to drive the Fourth Industrial Revolution.
Part 9.
What 2020 will bring for the data centre, according to Chris Adams, President and CEO, Park Place Technologies:
The time of year is upon us when the IT industry takes a step back, assesses the state of our technologies, and plans the next stage of digital transformation. After enduring countless prognostications about the impending death of the data center, we IT leaders now find ourselves instead navigating toward a new balance between data center and cloud as we reach toward the edge.
Even as some enterprises strive to get out of the data center business altogether, the vast majority of companies (98 percent) are keeping on-premises systems. In fact, Gartner expects spending on in-house data centers to remain steady in 2020.
That’s not to deny the changes on the horizon. The industry is steering toward a different data center incarnation, which will be increasingly software-defined and, ultimately, capable of taking care of itself. This vision will not be achieved in the next few months, but two key developments will compel rapid progress toward the autonomous, software-defined data center (SDDC):
Although edge computing is unlikely to hit high gear in 2020, lacking the business models to support it, enterprises will be preparing their infrastructure for a more distributed future and enhanced automation. Network function virtualization (NFV) will advance, and the SD-WAN stampede will continue.
The real newsmaker, however, will again be artificial intelligence (AI). The vast and growing volumes of IoT and consumer data will provide plenty of “food” for AI “thought.” Applications based on classic machine learning algorithms will give way to more powerful deep learning solutions. At the same time, AI will fuel a resurgence in the data center, as enterprises realize the shortcomings of processing massive data sets in the cloud and invest in high-density, AI-ready installations on site.
AI will not only be a workload in the data center, however; it will also aid management of the data center itself. The availability of increasingly sophisticated monitoring and maintenance solutions is already being felt. The next stage will be the collapse of existing siloes, such as between hardware and O/S monitoring or asset discovery and support.
Armed with unified dashboards and predictive AI, managers will gain unprecedented visibility. By year’s end, many enterprises will even be contemplating the next leap forward—turning the reins of data center decision-making over to the computers. As AI capabilities blossom and the need for human intervention fades, tiny adjustments made in real-time will optimize data center performance and efficiency in incredibly granular fashion.
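A minimal sketch of the kind of tiny, continuous adjustment described above: a proportional controller nudging a cooling setpoint toward a target inlet temperature. The gain, limits and temperatures are illustrative assumptions, not a real DCIM algorithm.

```python
# Sketch: a proportional control loop making small, real-time adjustments
# to a cooling setpoint based on measured inlet temperature. Gains and
# limits are illustrative assumptions.

def adjust_setpoint(setpoint_c, inlet_c, target_c=24.0, gain=0.2,
                    lo=16.0, hi=27.0):
    """Nudge the cooling setpoint by a small step proportional to the
    temperature error, clamped to the unit's safe operating range."""
    step = gain * (target_c - inlet_c)
    return max(lo, min(hi, setpoint_c + step))

sp = 22.0
for inlet in (26.1, 25.4, 24.6, 24.1):  # inlet temp drifting back to target
    sp = adjust_setpoint(sp, inlet)
print(round(sp, 2))
```

Production systems layer machine-learned models over loops like this, but the principle of continuous small corrections is the same.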
This autonomous, self-healing, software-defined data center will not be achieved in 2020. But with the integration of greater automation, agile networks managed as code, and powerful AI guiding optimization efforts, the transition will be fully underway.
Increased User Expectations
As users experience assistants like Alexa and Siri, and cars that drive themselves, expectations of what applications can do have greatly increased. These expectations will continue to grow in 2020 and beyond. Users expect a store’s website or app to be able to identify a picture of an item and guide them to where the item and its accessories are in the store. These expectations extend to consumers of the information too, such as a restaurant owner, who should rightfully expect the website built for them to help their business by keeping the site fresh. The site should drive business to the restaurant by determining the sentiment of reviews and automatically displaying the most positive recent ones on the restaurant’s front page.
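The review example can be sketched in a few lines, assuming some sentiment scorer is available. Here a toy word list stands in for a real model, and the review data is invented:

```python
# Sketch: surface the most positive recent reviews for a front page.
# The word-list scorer is a toy stand-in for a real sentiment model.
from datetime import date

POSITIVE = {"great", "delicious", "friendly", "excellent"}
NEGATIVE = {"slow", "cold", "rude", "bland"}

def score(text):
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def front_page_reviews(reviews, since, top_n=2):
    """Keep recent, positively-scored reviews, best first."""
    recent = [r for r in reviews if r["date"] >= since and score(r["text"]) > 0]
    return sorted(recent, key=lambda r: score(r["text"]), reverse=True)[:top_n]

reviews = [
    {"date": date(2019, 12, 20), "text": "Great food and friendly staff"},
    {"date": date(2019, 12, 22), "text": "Service was slow and the soup cold"},
    {"date": date(2019, 11, 1),  "text": "Excellent but that was a while ago"},
]
picks = front_page_reviews(reviews, since=date(2019, 12, 1))
print([r["text"] for r in picks])  # ['Great food and friendly staff']
```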
AI/ML Goes Small Scale
We can expect to see more AI/ML on smaller platforms, from phones to IoT devices. The hardware needed to run AI/ML solutions is shrinking in size and power requirements, making it possible to bring the power and intelligence of AI/ML to smaller and smaller devices, and allowing the creation of new classes of intelligent applications and devices that can be deployed everywhere.
AI/ML Expanding the Cloud
In the race for the cloud market, the major providers (Amazon AWS, Microsoft Azure, Google Cloud) are doubling down on their AI/ML offerings. Prices are decreasing, and the number and power of services available in the cloud are ever increasing. In addition, the number of low-cost or free cloud-based facilities and compute engines for AI/ML developers and researchers is increasing. This removes many of the hardware barriers that have prevented developers in smaller companies, or in locales with limited infrastructure, from building advanced ML models and AI applications.
AI/ML Becoming Easier to Use
As AI/ML gets more powerful, it is also becoming easier to use. Pre-trained models that perform tasks such as language translation, sentiment classification and object detection are becoming readily available. With minimal coding, these can be incorporated into applications and retrained to solve specific problems. This makes it possible, for example, to create an English-to-Swahili translator quickly by taking a pre-trained translation model and passing it sets of equivalent phrases in the two languages.
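The retraining workflow can be sketched in Python. Note that this is a toy stand-in: the "model" below is a simple phrase table, whereas real fine-tuning (e.g. with a neural translation model from a library such as Hugging Face Transformers) updates model weights. The class name and vocabulary are invented for illustration.

```python
class PretrainedTranslator:
    """Toy stand-in for a pre-trained translation model.
    Real fine-tuning adjusts network weights, not a lookup table."""

    def __init__(self, base_vocab):
        # "Pre-trained" general-purpose knowledge shipped with the model.
        self.vocab = dict(base_vocab)

    def fine_tune(self, phrase_pairs):
        # Adapt the model with sets of equivalent phrases in the two languages.
        for src, tgt in phrase_pairs:
            self.vocab[src.lower()] = tgt

    def translate(self, sentence):
        # Unknown words pass through unchanged.
        return " ".join(self.vocab.get(w, w) for w in sentence.lower().split())

# Start from a general model, then adapt it with domain-specific pairs.
model = PretrainedTranslator({"hello": "habari", "friend": "rafiki"})
model.fine_tune([("water", "maji"), ("food", "chakula")])
print(model.translate("hello friend water"))  # habari rafiki maji
```

The workflow shape is what matters: load something pre-trained, supply a small set of paired examples, and get a specialised model without building one from scratch.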
The Increasing Need for AI/ML Education
To keep up with these trends, education in AI and ML is critical. The need for education extends beyond the people developing AI/ML applications to C-suite executives, product managers, and other management personnel. All must understand what AI and ML technologies can do, and where their limits lie. The level of AI/ML knowledge required is, of course, even greater for people involved in creating products. Whether they are web developers, database specialists, or infrastructure analysts, they need to know how to incorporate AI and ML into the products and services they create.
Increasing cloud investment
In 2019, more companies than ever adopted cloud computing and increased their investment in the cloud. In 2020, this trend will likely continue. More companies will see the benefits of the cloud and realize that they could never get the same security, performance and availability gains themselves. This new adoption, together with increased economies of scale, will lower prices for cloud storage and services even further.
Easier to use services
Additionally, 2020 will be the year when the major cloud providers offer more, and easier-to-use, AI services. These will provide drag-and-drop modelling features and more out-of-the-box, pre-trained data models, making the adoption and use of AI accessible to the average developer.
Tackling specific problems
On top of that, in 2020, the major cloud vendors will likely start providing solutions that tackle specific problems, in areas such as climate change and self-driving vehicles. These new solutions can be implemented without much technical expertise and will have a major impact in those problem areas.
1. Cloud repatriation
As more and more organisations adopt the hybrid cloud, we’ll eventually see a trend of cloud repatriation, which is what happens when companies don’t take the time to invest properly in migrating to the cloud. Many organisations are suddenly realising that they’re spending significantly more than they would have had they simply remained on-premises. Repatriating is not easy either: it’s costly and time consuming. Cloud repatriation shouldn’t need to be a ‘thing’ at all. The best approach is to analyse the data and workloads a company has before contemplating the move to cloud, in order to understand the costs and potential service impacts involved. It’s important to work with a solution that analyses the behaviour of machines, applications and workloads to figure out what will work best in which cloud. Having a cloud strategy in place ahead of time is essential, but it’s equally important to understand exactly what that strategy is: a ‘Cloud First’ strategy needs to be examined to ensure that it is the right thing for the business, and not just hopping on the bandwagon.
2. Make way for Hybrid: Traditional datacenter to disappear
Over the next several years, we can expect to see the traditional datacenter disappear as cloud services, IoT, and other innovations limit the advantages that traditional on-premise datacenters can offer. Computing workloads will need to be located based on business needs rather than physical locations and as a result, companies will begin to move to the hybrid cloud in order to provide a more flexible infrastructure. Many organisations are already seeing the benefits of the hybrid cloud, and highly sensitive businesses like hospitals and medical organisations have found success with this model since it allows them to maintain control over sensitive data, such as patient records, by keeping it on-premises while moving less sensitive data and workloads to the cloud.
However, lawmakers have grown increasingly concerned that these cloud-computing systems, which many companies (including banks and hospitals) are using to replace traditional datacenters, have security problems that are poorly understood. Even so, we can expect continued growth in cloud migration, along with high demand for keeping business service availability transparent to changes in the back-end infrastructure.
3. AI and machine learning will enable solutions to work out data protection on their own
At the moment, many data protection solutions are driven by schedules, not intelligence; processes are service-led and declarative. However, customers want organisations to protect their data without impacting performance or production. Next year, AI and machine learning will enable organisations to protect data in a way that impacts neither. AI and machine learning will give organisations predictability, enabling them to see when lulls in productivity occur, when data is most active and where data can safely be transferred. Effectively, organisations will end up with a data protection solution tailored to the infrastructure and to the rate of data change apparent in specific user workspaces.
Industry predictions for 2020, from Ben Gitenstein, Vice President of Product Management, Qumulo (on AI):
NVMe file storage will be adopted broadly for performance starved, low-latency applications in 2020
In 2020, the all-flash market for file storage in particular will be dominated by the players best able to deliver value to their customers – those that can serve the entire enterprise. NVMe is a communications protocol developed specifically for all-flash storage. NVMe enables faster performance and greater density compared to legacy protocols. It is geared for enterprise workloads that require top performance, such as real-time data analytics, online trading platforms and other latency-sensitive workloads, and it provides significant performance and financial benefits over flash accessed via legacy protocols. By the end of 2020, enterprise-grade storage will be flash, and within flash it will all be NVMe. Qumulo anticipated this trend years ago, built all-flash products and integrated NVMe.
Data-driven businesses will have to shift some workloads to the cloud for data processing, machine learning (ML) and artificial intelligence-(AI) driven workloads
Every major enterprise in the world is going to become a hybrid enterprise. Across all major vertical markets, including M&E, transportation, bio and pharma, customers are using large volumes of unstructured data to accomplish their mission. Yet, under tremendous downward pressure, IT budgets don’t grow at the same rate as the rest of the business. The public cloud offers a way to solve that problem with its elastic compute and elastic resources. It has democratized machine learning and other services by making it easy to share data with the public cloud, without a VPN. Organizations that want to be competitive have to be hybrid.
Scale-out File Storage will become the preferred technology for on-prem unstructured data active archives.
Modern file storage solutions deliver performance and economics in a single-tier solution managed by intelligent caching. Object storage is not the best fit for on-premises customers who want simplicity, performance for their applications, and cost-effective retention. Object storage was developed as a precursor to webscale technology and as the storage medium for web technologies; it was meant for datasets that approach the exabyte level and are geographically distributed. In 2020, we believe the on-premises object storage market will evaporate and become wholly file based.
Single tier solutions delivering the simplicity users desire will gain favour over multiple tier storage solutions
Enterprise organizations rely on tiering systems to manage the location of data and to reach the blended price point they need for the entire system. Storage media are reaching a cost point where customers should be able simply to put their data in the system and have the system place it on the right tier. Some data belongs on flash and some on disk, but even then the customer shouldn’t have to manage that placement; the system should manage it itself. No dedicated human should have to manage these systems. Technologies built to take a file system and break it into tiers will become increasingly obsolete, as the underlying economics dictate.
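What "the system places data on the right tier" might look like can be sketched with a deliberately simplified placement policy. The rule below (access recency alone, with a one-week flash window) is an invented assumption; real systems weigh many more signals, such as access frequency, file size and cache pressure.

```python
import time

FLASH_WINDOW = 7 * 24 * 3600  # assumption: data touched in the last week stays on flash

def choose_tier(last_access_ts, now=None):
    """Hypothetical policy: the system, not an operator, picks the tier
    from observed access recency."""
    now = now or time.time()
    return "flash" if now - last_access_ts < FLASH_WINDOW else "disk"

def rebalance(catalog, now=None):
    """Return {object_name: tier} for every object in the catalog."""
    return {name: choose_tier(ts, now) for name, ts in catalog.items()}

now = 1_000_000_000
catalog = {
    "hot.log": now - 3600,                # touched an hour ago
    "cold.bak": now - 30 * 24 * 3600,     # untouched for a month
}
placement = rebalance(catalog, now)
```

The design point is that the policy runs continuously inside the system, so nobody has to decide per-file where data lives.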
As the cybersecurity industry heads into not only a new year but a new decade, many of the threats we’ll see in 2020 will be a continuation of previous years’ developments, only accelerated, with three key trends set to stand out:
1. Unabated Magecart evolution
Magecart has for years threatened the ability of consumers worldwide to shop safely online, by stealthily intercepting their credit card data via their browsers. In 2020, its credit card skimming tactics will continue to evolve and remain a headline issue worldwide. Third-party web services embedded in web applications remain massive targets and, unless businesses radically change how they manage them, will continue to be targeted. It is also worth keeping an eye out for possible side moves, from card skimming to more general form skimming, as this would be an easy step for Magecart threat actors to make.
2. Uptick in cryptojacking
2019 saw a decline in cryptomining, which is not surprising, since the price of cryptocurrency has also decreased following its zeitgeist moment in 2018. Prices have been creeping up however, and it is likely that they will move past the breakeven point for currency miners to once again start cryptojacking, or secretly using someone’s computing power to carry out the cryptomining task. With successful miners able to go undetected for months, this is an issue that is likely to linger throughout the year, making it important for organisations to actively monitor for these threats.
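The "breakeven point" mentioned above is simple arithmetic: mining pays when the value of coins produced exceeds the electricity burned producing them. The sketch below uses entirely invented figures (real profitability depends on the coin, network difficulty and hardware).

```python
def mining_breakeven_price(hashrate_hs, coins_per_hash, power_watts, price_per_kwh):
    """Coin price at which hourly mining revenue equals hourly electricity
    cost. All inputs are illustrative assumptions, not market data."""
    coins_per_hour = hashrate_hs * 3600 * coins_per_hash
    cost_per_hour = (power_watts / 1000) * price_per_kwh
    return cost_per_hour / coins_per_hour

# Hypothetical miner: 1 kH/s, 1e-9 coins per hash, 100 W, $0.15/kWh.
breakeven = mining_breakeven_price(1000, 1e-9, 100, 0.15)
```

This also explains the appeal of cryptojacking: the attacker pays nothing for electricity, so the stolen mining is profitable at any coin price, well below the legitimate breakeven.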
3. Taking an attacker’s and a consumer’s view in monitoring internet facing assets
Going into 2020, more and more businesses will realise that a key focus of their security strategy needs to be broadening visibility into their internet-facing attack surface, viewing these assets from the attacker’s as well as the consumer’s perspective. The recent compromise of thousands of misconfigured Amazon S3 buckets highlights the industrial scale at which cybercriminals operate once common weaknesses are found. And with so many third-party services forming part of today’s web applications, the only way to see the complete picture of an application and its potential compromise is to view what is downloaded to the user’s web browser. By adding these two views on top of existing network monitoring, organisations stand a far greater chance of early identification and remediation of evolving threats.
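One simple way to take that consumer's-eye view is to compare the script origins actually loaded in the user's browser against an approved list, flagging anything unexpected as a possible injected skimmer. The hostnames below are invented for illustration.

```python
from urllib.parse import urlparse

# Assumed allowlist of first- and third-party script origins.
ALLOWED_ORIGINS = {"shop.example.com", "cdn.example.com", "js.payments.example"}

def unexpected_scripts(script_urls):
    """Flag script sources observed in the rendered page whose origin
    is not on the approved list."""
    flagged = []
    for url in script_urls:
        host = urlparse(url).netloc
        if host not in ALLOWED_ORIGINS:
            flagged.append(url)
    return flagged

observed = [
    "https://cdn.example.com/app.js",
    "https://evil-cdn.example.net/skim.js",  # not on the allowlist
]
alerts = unexpected_scripts(observed)
```

In practice this browser-side inventory complements, rather than replaces, server-side and network monitoring, since skimmers are often only visible in the assembled page.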
Staying secure in 2020
Overall, 2020 will see organisations continue to shift digital interactions closer to customers and launch innovative methods for marketing, advertising, and selling their products online. While this will continue to bring great rewards for businesses, it will also increase risk over the coming year. Cybercriminals always move where the money is, whether it is surging online sales or the expansive cryptocurrency market. It is critical that the cybersecurity industry responds to this development by working closely with businesses to develop new ways to keep the data of both organisations and consumers secure.
Advancements in IoT and edge-computing technologies will fuel the demand for micro-datacentre and micro-cloud environments. To support this growth and the increasing use of edge services, telecoms companies will need to consider new, versatile network architectures and solution offerings.
By Joe Hughes, CEO and Founder of MTG.
Advancements in IoT and edge-computing technologies will fuel demand for low power networks, micro-datacentre and micro-cloud environments. To be prepared for these emergent trends and to accommodate new technologies, telecoms and datacentre providers must consider adopting versatile network architectures and product offerings.
IoT and Smart Cities are becoming mainstream as more public and private sector organisations are becoming comfortable with the technology and are beginning to benefit from real-time data and decision-making capabilities.
Early iterations of IoT devices were primitive, with basic functionality and limited processing power – capable of simple measurements and wireless communication, but with little capacity for analysis or processing. Recent developments in embedded systems technology have greatly improved the performance, power characteristics and capabilities of these devices.
Modern IoT chipsets now feature ultra-low power components, artificial intelligence, encryption, visual imaging and advanced-wireless systems. These enhancements have given rise to the concept of ‘edge computing’, where more advanced forms of analysis and processing are performed on the device itself.
With processing now happening at the network edge, the need for high-bandwidth links and continual connectivity back to a central datacentre environment has been reduced.
Edge computing devices will benefit from low-power wireless network access (such as LoRa), as opposed to relatively power-hungry networks such as 5G.
Given the low power demands of IoT and edge devices, many telecoms operators are looking at a strategy of co-existence: dense, high-bandwidth 5G sites alongside long-range, ultra-low power wireless technologies such as LoRa, NB-IoT and Sigfox. In the case of IoT, often less is more. The Isle of Man is a great example of how pervasive wireless networks can create opportunities in the economy. The Island was one of the first in the world to deploy 3G and it has near-total coverage for 4G.
The datacentre and IT industry has undergone a technological pendulum swing, with a recent shift towards cloud and centralisation that almost emulates the era of the mainframe. As computational power shifts to the network edge, we are seeing datacentre, storage and compute providers follow suit.
Recognising the shift towards edge computing, many datacentres, cloud and hardware vendors have introduced micro-datacentre and micro-cloud solutions. Like their IoT counterparts, these micro-datacentre environments are also being deployed at the network edge; in containers, rooftops and public spaces.
The combination of edge-compute, low power WANs and micro-datacentres has allowed firms to deploy emerging services close to end-users. Gartner identified this trend in 2018, where they recognised that workload placement is now based on business need, not constrained by physical location.
“By 2025, 80% of enterprises will have shut down their traditional data centre, versus 10% today.”
The benefits of edge-compute are not just technical. As data can be collected, processed and formatted at the network edge, this can eliminate the need for personal data transmission and in the case of cloud – eliminate cross-border data transmission and storage, an important consideration in the age of privacy and GDPR.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 10.
Rackspace CTO: five cloud predictions for 2020
Lee James, CTO, EMEA at Rackspace:
We’ve seen a massive maturity curve in cloud adoption across Europe over the last few years. Although organisations are now reaping the benefits of the cloud, there are undoubtedly new challenges that have been introduced into the market. New stakeholders are having to upskill fast and learn about its potential and risks, working practices are having to evolve beyond current training levels, and many are walking a tightrope between hype and opportunities for tangible innovation.
In 2020 some of these challenges will evolve. Here are the five cloud trends that I predict will affect businesses over the coming year.
Hybrid becomes the new multi-cloud, again
While the popularity of multi-cloud is undisputed, with 81 per cent of companies using cloud technologies in some way, many firms are still making investments in their private cloud solutions. This is for a number of reasons, such as security posture, ongoing data centre leases, or simply because it’s the best platform for the application or, in some cases, the business. Indeed, even the UK Government plans to revise its “cloud first” policy to “cloud right” (or something similar) early next year, acknowledging that public cloud isn’t right for everyone or every use case.
Reflecting this trend, we’ve seen the cloud giants respond with private cloud solutions that link directly into their public cloud solutions, such as Azure Arc, Google Cloud Anthos, and AWS Outposts.
In 2020, there’s going to be significant competition between the three biggest cloud hyperscalers and VMware as they all explore and deliver on how Kubernetes will unlock their potential to be the hybrid, multi-cloud provider of choice for customers. For customers, it’s ultimately going to come down to which fits and works best, as well as what gives the best bang for their buck. But this sets us up for an exciting year of new product and service announcements as each of the major cloud companies tries to establish itself as the cloud broker of choice.
Cloud native personas will shift
Over the past 12 to 18 months we’ve seen a rise in the cloud native persona. It’s perhaps unsurprising when looking at the successes that cloud native companies have had. The innovation, collaboration, and growth mindset have enabled some of the most cutting-edge customer experiences. Spotify is my favourite example, where separate squads all work together to deliver different components of the mobile application that seamlessly integrate for a simple and intuitive user experience.
However, the new cloud native persona is very different to the traditional approach to software management. To take advantage of this trend in 2020, it’s important that business and IT leaders educate their teams to move away from an off-the-shelf model and structure them in smaller development teams to test and build. We also need to make sure that this approach filters through into traditional education: we need the next generation to challenge our traditional ways of working, while ensuring that the right emphasis remains on the platform management and security posture to underpin a successful platform.
The continued Edge & IoT build up
In 2020, we’re going to see continued deployments of Edge and Internet of Things (IoT) across multiple industry sectors. I predict specific verticals, such as the Industrial Edge and the Retail Edge, will lead the way as the main sectors driving adoption and, importantly, value. For example, 5G is now available across a few cities in the UK, and 5G networks can deliver up to a 90 per cent reduction in power consumption, guaranteeing up to 10 years of battery life for low-power IoT devices. This means, for example, that more retailers will have access to smart shelves like the ones Amazon implemented in its Amazon Go stores. This technology uses dozens of sensors to provide real-time inventory visibility and update pricing according to demand.
I am also looking forward to the 2020 conferences in London where, with 5G, I expect to see a huge ramp-up of augmented and virtual reality demonstrations. These experiences use a lot of processing power and cellular data, but with the increased capacity of 5G networks, retailers will be able to create richer, more detailed experiences when integrating their physical and digital worlds. Real-time product placement via apps sounds fun, but I am reminded of scenes from the movie The Fifth Element – are we ready for Hyper Reality experiences? This wonderful video from Keiichi Matsuda, a design consultant, shows a vision of how Edge and IoT can deliver Hyper Reality across many experiences. It looks fun, but also at times overpowering.
Expect to see many experiments and expect to see GDPR, data management, and security play a huge part in the decision making.
Security at the fore
The rapid adoption of cloud across different business applications means that an enterprise can have as many as 18 different personas interacting, engaging and in many cases managing cloud services. But the question is: do they all know how to manage the different security risks involved?
As we shift more towards a hybrid cloud approach where workloads are balanced across multiple public cloud platforms and on-premise environments, a range of new security considerations emerge.
Now, security custodians must ensure that the 18 different personas understand and adhere to security, as well as ensure that it is always the default question asked no matter what service or platform is developed or used. So, in 2020, it will be more important than ever for the CISO to become the Customer Information Security Officer and work across more departments to ensure that security is always the first consideration, and that innovation never introduces undue risk to the business.
Cloud maturity will change the market in the 2020s
The state of cloud has changed beyond recognition over the past decade. Looking back, 2010 was the year cloud computing went from concept to reality. Ten years on, adoption rates are high and growing, with businesses transforming processes and creating previously unimaginable value from its flexibility and scalability.
Ten years ago, companies such as Airbnb, Fitbit, WhatsApp and Instagram didn’t exist. Look how these companies have enhanced our lives, and all of them run on cloud platforms, responding and reacting to the changing demands of their customer base. As more data is consumed, analysed and acted upon, cloud platforms will have to grow: Microsoft Azure, for instance, has increased its compute capacity in the UK by 1,500 per cent since 2016, and that growth will have to continue to meet new demand.
Looking Ahead to the 2020 IT Networking Landscape
with Alan Hayward, Marketing and Sales Manager at SEH Technology UK & Ireland:
Networking solutions are a critical requirement of every business’ IT department. Network technology connects a breadth of components, including computers, software, communications systems and hardware devices on a network. As businesses continue to digitise, move IT infrastructure to the cloud, and invest in emerging technologies, there is huge pressure to build professional network infrastructures that can sustain and support digital business models cost-effectively.
In 2020, we can expect to see the IT networking landscape continue to develop and adapt to keep up with business and market demands.
1. Hybrid and multi-cloud adoption
When selecting where to run their networking infrastructure, businesses want the best of both worlds: on-premises data centres and the cloud. This means they will require networking solutions that work consistently across both environments. Driven by an increasing number of web-enabled products like IoT and mobile devices, businesses will look to hybrid and multi-cloud environments. This will allow businesses to move towards deploying applications in the data centre, with on-demand cloud providing excess capacity.
2. Dongle servers unlocking virtual environments
With the continuous rise in threats to virtualised environments, including theft of confidential information, data alteration and data loss, businesses will continue investing in dongle servers. A dongle server is a virtual cable extension via the network, offering safe and simple dongle management plus maximum network availability through two network connections. The point-to-point connection between the user and the dongle server is encrypted, removing the potential for unauthorised access to the virtual environment.
3. Improving networking automation with machine learning
Many businesses are looking to automate mundane network operations, including analytics, management and security. As a result, we can expect to see an increase in the availability of machine learning driven analytics. This in turn will help businesses spot when a network component is in the initial stages of failure and predict when those initial stages will appear. It will also assist businesses in spotting anomalies in network behaviour to help cybersecurity teams reduce security breaches.
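A minimal stand-in for the ML-driven analytics described above is a rolling z-score detector over a metric stream: flag any point that deviates sharply from its recent history. The window size, threshold and metric are all illustrative assumptions; real network analytics products use far richer models.

```python
from statistics import mean, stdev

def anomalies(samples, window=5, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the trailing `window` of samples."""
    flagged = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Hypothetical link error rate per hour: steady, then a sudden spike.
error_rates = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 9.5]
suspect = anomalies(error_rates)
```

The same shape of check, applied to error rates, retransmissions or latency, is how a system can notice the early stages of a component failure before it becomes an outage.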
In 2020, we will also see SD-WAN infrastructure continue to be one of the fastest-growing segments of the professional network infrastructure market, driven by a variety of factors. Firstly, traditional business networks are increasingly failing to meet the evolving needs of today's modern digital businesses. Secondly, businesses are becoming increasingly interested in easier management of multiple connection types across their networks to improve application performance and end-user experience.
Overall, it will be an exciting year for the IT networking landscape, with big changes ahead in the realm of network automation, hybrid and multi-cloud environments, and continued dongle server adoption in virtual environments.
Machine learning and artificial intelligence will deliver cost savings through greater cloud efficiencies
Enterprises are looking for application and cloud service providers to help them operate more efficiently through the use of machine learning (ML) and artificial intelligence (AI) to deliver more effective resource management. Achieving this will require the environment or application to understand when it needs more resources and then automatically scale those resources up to meet the increased demand. Conversely, the technology will need to understand when specific resources are no longer needed and safely turn them off to minimize costs. Today, such dynamic resource allocation can be unreliable or must rely on an inefficient manual process, forcing cloud customers either to spend more than necessary or to fall short of meeting service levels during periods of peak demand.
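The core of such dynamic allocation is a sizing decision made continuously from observed demand. Here is a hedged sketch: a hypothetical rule that sizes a fleet so average utilisation lands near a target, growing under load and shrinking in quiet periods (the target and thresholds are invented; real autoscalers also add cooldowns and predictive signals).

```python
import math

def scale_decision(utilisation, instances, target=0.6, min_instances=1):
    """Hypothetical sizing rule: choose a fleet size that brings average
    utilisation back towards `target`."""
    desired = math.ceil(instances * utilisation / target)
    return max(min_instances, desired)

peak = scale_decision(0.9, 4)   # heavy load on 4 instances: grow the fleet
quiet = scale_decision(0.1, 4)  # quiet period: shrink to the minimum
```

Running a rule like this on a tight loop, rather than waiting for a human ticket, is what lets the environment both meet peak demand and switch off idle capacity to cut cost.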
DevOps will transition companies to cloud-native implementations.
Enterprises will seek to take full advantage of the cloud’s agility by re-architecting their application/technology stacks to optimize them specifically for the cloud environment. IT departments regularly use a “lift and shift” approach to migrating applications to the cloud, but the effort still requires some changes to ensure meeting desired service levels owing to some differences between private and public infrastructures. After the initial wave of migration to the cloud is optimized, DevOps will drive re-architecting their application/technology stacks to a cloud-native implementation to take further advantage of the cloud’s greater efficiency, reliability, scalability and affordability.
Application vendors will architect HA and DR into their core solutions.
Application vendors will endeavor to deliver greater value and higher reliability by integrating core high availability (HA) and disaster recovery (DR) features into their solutions. Most applications today require the customer to provide these protections separately, and most organizations do this for all their applications with a general-purpose HA/DR solution. With HA and/or DR built into an application as a standard feature, customers will be able to simply deploy it on any platform in a private, purely public or hybrid cloud environment. This will be especially beneficial for smaller organizations that normally lack the expertise or resources needed to implement and operate configurations capable of eliminating all single points of failure. For cloud-native implementations, the application vendor will want to take full advantage of the resiliency afforded by the cloud’s multiple availability zones and regions.
DBaaS and cloud will become the preferred platform for database deployments.
IT organizations have traditionally chosen to implement critical databases and applications in their own datacenters, where the staff retains full control over the environment. As the platforms offered by cloud service providers (CSPs) have matured, the cloud has become commercially viable for hosting critical applications, as well as Database-as-a-Service (DBaaS). This viability is true even for complete suites, such as SAP, that span virtually all of an organization’s departments and all of its business functions. This change will put greater focus on reliability, availability and performance of the applications, and make the cloud more strategically important to companies. For CSPs who deliver greater resilience through availability zones and geographic diversity, it will be a way to secure long-term engagements with customers.
Resellers and system integrators will play an increasingly vital role as critical applications move to the cloud.
As the migration of enterprise applications to the cloud accelerates and matures, the need to ensure mission-critical high availability (HA) will create opportunities for resellers and system integrators. This window of opportunity is forming as enterprises seek more robust HA solutions that have yet to be fully integrated into the application and system software. Some system integrators may have the expertise and resources needed to leverage open source software in their Linux offerings. But an increasing percentage will choose to integrate solutions purpose-built to provide HA and disaster recovery protections, as these have proven to be more dependable for the customer, while also being just as (if not more) profitable for the integrator.
1. Building Ecosystems
Winning retailers will build ecosystems by accelerating the use of value-adding partners, seeking those who can supplement their product with a unique service, and adding new channels through which to sell their products. As Gartner said, these ecosystems create “connections between partners, employees and even competitors… built into vibrant networks that can unlock value for all.”
2. Support Networks
Retailers will use IoT to innovate and differentiate but also realise that their IoT projects need to be connected. By connecting IoT initiatives together they will benefit from the “network effect” that transforms individual silos, projects and initiatives to enable new insights and innovation. This can then support and automate decision-making to provide a relevant and timely response.
3. Sitting at the Top Table
Technology will take its seat at the organisational top table in the New Year, after years in which the importance of IT in enabling new digital business models was underestimated. Technology leadership will start to bring innovation to the rest of the business, as visionary CIOs contribute new ideas. Retailers will need a flexible IT architecture and approach to support this.
Environmental responsibility will become a differentiator for retailers. An increased focus on the environment informs consumer choices, and retailers will respond accordingly. Amazon has ordered a fleet of electric vehicles in response to concerns over its supply chain’s carbon footprint. We have seen retailers discontinue plastic cotton buds and plastic straws due to concerns about ocean litter.
Smart retailers will see technology as a tool to help. This could be IoT and real-time AI response to supply chain issues, improving data visibility that drives decisions or using process mining to understand and eliminate delays that have an adverse environmental impact.
5. Blurred Lines
The blurring of industry lines will continue as consumer goods companies seek to bypass the traditional retailer. The “direct to consumer” business model focuses on delighting customers without any intermediary involved. Think Dollar Shave Club (since acquired by Unilever), offering razors direct to the customer for as little as $1.00 per month. Even Lego is considering offering a rental service for its toy bricks.
Although technology shockwaves continue to disrupt the retail industry, we see retailers fighting back in 2020 – and there are business models out there we have not even begun to imagine.
Data is accumulating faster than ever before and emerging technologies, such as 5G, IoT and cloud computing, are causing the amount of data we produce to spiral out of control. But against the backdrop of a climate crisis, what do the next twelve months have in store for the data centre sector? Peter Westwood, Data Centre Director at SPIE UK, the smart engineering company, provides his suggestions:
Hybrid cloud has many inherent qualities that make it attractive for businesses looking to modernise their IT, including cost optimisation, agility and continuous product development. Cross-sector appeal will also drive the hybrid cloud hype cycle, with Gartner predicting 90 per cent of organisations will adopt hybrid infrastructure management by 2020.
By John Young, Solution Architecture and Engineering, Sungard Availability Services.
While the ostensible benefits of hybrid cloud solutions are undeniably positive from the perspective of business value, we’ve seen a number of organisations that have had their IT modernisation strategies curtailed by migration issues. For example, running mission-critical applications and infrastructure across multiple cloud environments risks creating hidden interdependencies between disparate systems. As a result, IT can become more difficult to control or fix in the event of a failure, adding chaos and confusion to what can already be costly disruption.
But in the era of digital transformation, organisations cannot afford to be hesitant. Knowing how and when things can go wrong is the first step to mapping out a resilient hybrid cloud migration strategy. This starts with three simple questions: firstly, how can workflow streams be kept uninterruptible across two environments? Secondly, which applications should be hosted in a public cloud and which should stay on-premise? And finally, how can hybrid workloads ensure continued compliance, especially in the age of GDPR?
Multiple clouds, one uninterruptible business structure
At times, an organisation will need to migrate data from on-premise hardware to the cloud or vice versa. This requires a new IT governance model to be created, with policies and procedures attuned to where applications and data reside. An efficient migration strategy reduces the risk of data being lost during the process and the potential for other people to access information that they shouldn’t be able to see. A consistent toolset across hybrid IT deployments is needed to grant the ability to provision, view, access and manage public and private cloud resources with a single set of credentials.
The flexibility of a hybrid cloud service model also extends to options for provider or customer management of the on-premise private cloud environment. The most sophisticated cloud providers offer management solutions that span both cloud environments. A self-service portal enables the customer to manage their on-premise private cloud environment to reduce costs and to post service requests to the public cloud. In-house IT can also collaborate with operational support from public cloud service providers, taking full control of architecture, deployment, monitoring and change control in hybrid cloud solutions. They also resolve incidents to keep critical applications and workloads up and running.
Choosing which applications to run in private and / or public cloud environments
It’s often thought that applications and data that, if rendered inoperable or stolen, could affect the organisation’s ability to function are best suited to the private cloud. These so-called ‘mission-critical’ applications vary from industry to industry but are generally responsible for supporting the basic transactional activity between an organisation and any number of components in its network (i.e. customers and/or end users, products and services, network endpoints, etc.). That’s not to say, however, that public cloud providers should be discounted entirely for hosting mission-critical applications. Cloud and managed service providers may layer their solutions with specific components that protect the stability of the applications they host, such as co-located data centres or managed security services.
When deciding which applications to move to the public cloud, an organisation should start with the less mission-critical ones (such as infrastructure services, messaging, web applications for collaboration, and database applications). These are good candidates for public clouds because they are less likely to cause widespread disruption to a business if they are knocked offline and can be cost-effectively maintained at the cloud provider’s data centre.
Fully understand and address the hurdle of regulatory compliance
In terms of business priorities, organisations endeavour to be more flexible, more available and more omnipresent in order to remain competitive. While hybrid and public cloud solutions are the natural choice for businesses seeking these benefits, a distributed model of data storage presents a challenge to one of the key facets of GDPR compliance: knowing exactly where data is. As a result, businesses looking to migrate data from on-prem data centres to the public or hybrid cloud must have the diligence to ensure visibility is not sacrificed in the process.
The need for visibility in distributed cloud systems is driving demand for so-called ‘sovereign’ cloud solutions, which provide the fundamental benefit of ensuring all data is stored on servers located on UK soil. Already popular within the public sector for their enhanced security, these solutions are now seeing uptake in the private sector too, driven by the GDPR along with other factors such as cybersecurity and the uncertainty surrounding data transfers after Brexit. Managed sovereign cloud will help close the widening gap between operational flexibility and regulatory compliance, and give businesses peace of mind when migrating to the cloud.
Taking a resilient approach to migration
Fundamentally, there’s one main question organisations need to answer: how do we stay resilient through periods of change? When it comes to hybrid cloud, only when the threats to business continuity have been addressed can migration begin and concerns surrounding performance, flexibility and control be put to bed. Cloud computing is not a one-size-fits-all solution, and simply following suit and choosing the wrong one can backfire, with the potential to cause real damage to the bottom line. Only through a thorough examination of the options, and insight from qualified experts, will organisations successfully embark upon the right cloud journey for their operations today and beyond.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 11.
Digital transformation is possibly the most significant initiative for the global enterprise. Taking advantage of mobility, the internet, smart software and automation can transform businesses from laggards to market leaders, or at least keep them in the race. However, a new survey highlights that around 6% of digital transformation projects have been running for more than four years, and 12% report either neutral or unsuccessful results. So, what is derailing these transformative projects? And what can organisations embarking on, or mid-way through, a project do to help deliver success?
By Sarfraz Ali, Senior Director of Market Development at Smartsheet:
Everywhere you look, businesses are investing in the power of change – whether through introducing new tools, digitalising legacy processes, or developing fresh people strategies. But digital transformation isn’t just about shaking up old ways of working.
Businesses are facing fast-moving competition, fickle consumers, changing legislation, and uncertain political climates, meaning they need to be more agile than ever before. In order to retain a market advantage, transformation projects are now more critical than ever. After all, a staggering 83% of senior managers agree that staying abreast of technology changes is vital to keeping up with the competition.
In a highly competitive, globalised marketplace, almost every business is feeling the pressure. An independent research agency, Loudhouse, conducted interviews with 622 senior managers who have had significant involvement with digital transformation projects. Respondents came from the United Kingdom, Germany, and the United States, across a range of roles, industries, and business sizes. The research discovered a worryingly high percentage of digital transformation projects aren’t creating significant business impact. Projects are meeting their stated objectives, but they’re not going on to benefit the wider organisation.
What, and who, is driving change?
Digital transformation is widespread, but it’s also a relatively recent trend. Of the businesses surveyed, 59% had started their projects in the last two years, with only 6% having begun over four years ago. In terms of what transformation involves, each project surveyed was as unique as the business behind it. But there are many commonalities in motivation, as well as a growing sense that external factors, not internal pressure, are the main reason for launching a project.
External factors outside a business’ control are a powerful source of motivation. In the survey, 83% feel keeping up with technology advancements is a leading motivator, with 89% saying that this has had either a significant or revolutionary influence on business change. Industry-wide changes, such as regulatory shifts, are the second most influential macro factor (79%). In the UK, political drivers such as Brexit (73%) and increasing costs (73%) follow close behind.
In terms of who was driving these projects, it was unsurprising that nearly seven in ten (68%) businesses report that IT is in control, followed by operations (52%), marketing (43%) and finance (42%). Interestingly, only a quarter of businesses report that the board of directors is leading the charge, suggesting that digital transformation can come from any level or department across the business.
Although the types of projects varied widely, when it comes to analysing the overall success of their transformation projects, businesses remain positive with 88% of respondents stating that they feel their project has been highly or reasonably successful, when charted against the targets they hoped to achieve. 42% of respondents were happy to say their project had gone or was going better than expected.
However, these feelings are hard to square with the evidence on whether digital transformation programmes are actually having a significant impact on the wider organisation. When asked to rate their project’s impact, under half (49%) feel their project has had a significant one.
When delving further into the data, optimism is high at the beginning of the project but issues become apparent once the sourcing of suppliers phase commences. 71% say the project is running better than expected at the development phase but this drops to 22% once it is time to source suppliers.
Another area where there is a correlation between expectation and success is stakeholder buy-in. Four in ten respondents feel that stakeholder support was lacking during their transformation project. And just as optimism fades during the life of a project, so does stakeholder interest: 73% say they had the best possible support at the beginning, but later this falls as low as 48%. This drop-off suggests the onset of delays is closely linked with diminishing stakeholder support, as these individuals have the clout to drive projects forward. A further quarter (26%) say board-level support could have been better during their project.
Overall, it seems stakeholder support is directly aligned to the success of a project, with consistent stakeholder support being particularly vital – because when this drops off, delays kick in. As such, it’s clear a lack of stakeholder engagement is a serious roadblock and should be avoided where possible.
The survey also found a correlation between project success and an organisation’s attitude to, and use of, tools to help the process. Virtually all organisations (96%) say that ‘stakeholder collaboration is important for a successful transformation project’, so the board, or an elected board champion, should be as invested as possible.
Nine in ten (88%) agree that ‘a successful digital transformation programme can only work if everyone in the business works together’, which means spreading the message about the vision and planned process of the project is crucial. Once a project is complete, the impact of collaboration becomes even more apparent; it’s cited as the number one factor that contributed to the success of a recently completed project.
The Right Tools
Almost half (49%) of businesses with recently completed transformation programmes feel technology played a key part in their project’s success. The majority (81%) say they have a specific solution in place to support their project. 39% use a single unified platform, 35% use integrated solutions (a mix of different tools), and 7% use a specialist but standalone solution.
This use of tools to help manage major projects offered a significant boost to success rates, with organisations saying the overall success of using a unified solution is almost three times higher than using integrated software: 67% of organisations using a single platform say the project fully runs to schedule, versus 40% using an integrated approach.
This correlation between unified solutions and success ultimately seems to come down to standardisation. Those using disparate technologies may find those tools don’t speak to each other effectively, leading to duplicated work or miscommunication. With a single, standardised tool, this risk can be mitigated.
The data provides a valuable insight into what’s happening within a swathe of digital transformation efforts and paints a coherent picture of the factors that drive success which include:
· Seamless collaboration across every part of the project is critical.
· Project management tools can help but are more effective as part of a unified solution.
· Organisations need to ensure that stakeholders are bought into, and kept engaged with, the project, even through challenging stages such as vendor selection, to keep it moving forward.
Alongside sufficient budget, these are the top factors cited by organisations that had successful projects and significant beneficial impact to the wider business.
Predictions 2020 from Marcus Harvey, Sales Director EMEA at Targus:
Change is a critical factor in the fast-paced world of technology. New products and trends come and go so quickly that it’s hard to keep up, but if an organisation doesn’t want to risk becoming a dinosaur, then keeping on top of current developments is a must.
Here are my five predictions for what will be the biggest technology gamechangers to look out for in 2020.
Prediction 1: Holding superfast tech in the palm of your hand
At up to 100 times the speed of its 4G predecessor, 5G is the next big evolution in telecommunications infrastructure. It promises superfast download speeds and lower latency. While the exact rollout date for 5G is hazy, the knowledge that it is coming means businesses must prepare to accommodate the technology in both work and leisure time.
Smartphones having superfast capabilities means we’ll see them doubling up as the go-to workplace device of choice; portability will be the order of the day. This will result in a domino effect for the bags and computer accessories industry, with a significant increase in demand for products to house and protect these devices as they become more prevalent and even more valuable.
Businesses will need to adapt to the needs of new employees who choose smartphones and other portable tech as their preferred work devices. Laptop sleeves, tablet cases and laptop bags will surge in popularity; lightweight functionality will be the biggest requirement for any modern worker.
In addition, docking stations at work will need to accommodate a wide variety of smartphones to ensure the workplace setup is as seamless as a laptop experience.
Prediction 2: Security will be the priority
With regulations such as GDPR coming into play, data protection was a key defining factor for industries across the board in 2019. This brought discussions around cybersecurity to the forefront, with organisations and consumers alike being urged to take better care of their data.
Yet, cybersecurity has often been viewed in a silo – as a separate entity to physical security when the two are intrinsically linked.
One only needs to look at the example of Heathrow Airport to understand how physical security threats are just as much, if not more, threatening than cyber ones. Heathrow Airport was stung with a £120,000 fine last October by the Information Commissioner's Office after a staff member lost a USB stick which was later found by a member of the public. This goes to show that such a seemingly small mistake can have huge and costly implications.
Physical security trails behind cyber security in terms of budget and attention, but the penny will drop in 2020 when organisations realise that a lost memory stick or mislaid smartphone is all it takes to open the floodgates to crime and fraud, which can potentially cripple any business.
2020 will see a bigger focus on physical security and the bags and computer accessories industry will need to step up to the plate to ensure physical security is top-of-mind for organisations.
Prediction 3: Seeing the world through green-tinted glasses
Following through on sustainability promises will be a big industry gamechanger in 2020. We’ve seen enough pledge-making and greenwashing from companies misrepresenting their sustainability initiatives. Now discerning customers want to see action from companies in all different areas. Resellers and enterprises are also demanding it of suppliers; they’re enforcing high standards across the board as their own environmental policies ripple down the supply chain.
Being a smaller company means it’s easier to keep short supply chains in check, but as a larger corporation it can be difficult to keep it green at all levels. The onus is on the organisation to drive this change and transparency is key here – customers appreciate and expect companies who are honest about their sustainability journey.
Sustainability is no longer just a USP: it's now part of the table stakes. It’ll be interesting to see how this plays out in 2020 and beyond.
Prediction 4: Meeting the needs of the next generation
Flexible and remote working will be the name of the game in 2020. Gen Z, being the new working generation, will make decisions based on what technology their employers provide and what policies they implement. Being connected and agile will be key for employers looking to attract the right talent.
Setting up a workspace with connected functionality will be a must. Businesses need hardware that has cloud capabilities, matching the pace of the evolving work ecosystem. The right docking station, for instance, can help you get all your devices and gadgets synced up and under control. The same way a laptop is an improvement in portability versus a PC, a docking station is an improvement versus a desk cluttered by a mess of wires.
Connected desks are the way forward and this will become increasingly commonplace in 2020.
Prediction 5: The role of IT suppliers will change
As organisations, both big and small, turn to leaner models to achieve optimum business efficiency, the amount of time and budget available for training and development can be limited. Reliance on fewer people to get more done means that businesses will increasingly lean on their IT suppliers to educate and guide them on the best-suited technology for integration into, and outside of, the workspace.
Enterprises purchasing IT devices and accessories for the workplace face a wealth of choices which can ultimately be quite confusing. Often there is just too much choice and too many vendors and this can result in analysis paralysis.
In 2020, we’ll see IT suppliers existing as an extension of the organisation, imparting valuable advice and helping businesses determine how the technology can best add value to their operations.
Everyone wants to remove white noise and the weariness decision-makers face when there are too many choices that aren't quite the right fit. In the world of bags and computer peripherals, offering valuable advice and supporting businesses throughout their IT purchase journey will be key.
It’s the most wonderful time of the year, says Craig Smith, Vice President of IoT & Analytics, at Tech Data:
The end of the year is a time for the channel to take stock – although perhaps not in a literal sense given the shift in the market to SaaS. For many, thoughts are preoccupied with meeting sales targets and tying up loose ends before the festive season begins. However, the end of the year is also a good time to reflect on the year just gone and the preparations needed for the year ahead.
The past year has again brought a great deal of change. IT investment, distribution and consumption structures have continued to undergo significant shifts as we move away from linear channels supplying products from point to point.
It is crucial that partners use their time at the end of the year to identify the areas where they can add higher value to end customers through their products and services.
Critical to creating this higher value is identifying how to create business solutions with next generation technologies such as hybrid cloud, IoT and analytics, machine learning, AI and cyber security.
Plugging the network
2020 will see 5G mature as European rollout gathers momentum. Businesses are incredibly interested in potential commercial use cases and this in turn creates opportunities for the channel. It is not the only new networking technology on the market, however. Having been launched in 2019, Wi-Fi 6 will also become more commonplace in the coming year. Those partners that can bridge the gap between the two to deliver high speed connectivity for businesses whilst helping to manage costs and optimise investment in networking should have a successful 2020.
Analytics drives efficiencies
The next 12 months will be a tipping point for analytics and those that invest in it, whilst those that fail to grasp the opportunity will find themselves falling behind. Businesses from all verticals have realised the benefits of real-time insight, which is why we’ve seen market consolidation of analytics capabilities, particularly Salesforce’s acquisition of Tableau. Analytics creates a great opportunity for the channel, who can not only help customers navigate the complex business of choosing the right solution but also help them manage all the associated data and keep it secure.
AI, AI, Captain!
Robotic Process Automation (RPA) has been around for a while but interest in it has grown thanks to businesses looking for technology that will achieve operating efficiencies. There will be considerable growth in the RPA market as businesses look to use the technology to augment their workforce, but that also means someone needs to look after the RPA and businesses don’t always have those skills.
Of chief importance is security. No business wants to score a cyber security own goal by deploying RPA in an unsecure manner and leaving themselves open to being hacked, so the channel has a crucial role to play here.
Everything as a Service
The shift to as-a-Service has been taking place for a number of years but in 2020 it will continue to gain momentum in the channel, even in the most hardware driven industries. With the development of onsite, off-site, cloud and hybrid, on-premises-as-a-Service will become as commonplace as the various Software-as-a-Service offerings.
The ‘Splinternet’ becomes more splintered
In 2019, Russia passed its ‘Sovereign Internet’ law to block off its Internet from the rest of the world, and Iran implemented a near-total Internet shutdown. In 2020, this “Splinternet” trend of a fragmented Internet will accelerate, as more countries attempt to restrict their Internet through government control over flows of traffic and internet-based services. The most likely candidates to extend these restrictions? Turkey, Turkmenistan, and Saudi Arabia.
AWS Global Accelerator...finally globally accelerates
When AWS launched its Global Accelerator in November 2018, the intent was to let customers use the AWS private backbone network for a fee, rather than the public Internet, which is AWS’ default behavior. While there are many examples of performance gains in various regions around the world, a recent ThousandEyes report found several examples where the Internet actually performs faster and more reliably than Global Accelerator, or where the differences are negligible. In 2020, AWS Global Accelerator will mature, and performance will greatly increase to the levels expected when it first launched.
A Chinese ISP causes major global collateral damage
The Great Firewall doesn’t just isolate Internet users in China, the way many people think. A major Chinese ISP will demonstrate the impact of Chinese government censorship far beyond its borders, as hundreds of sites and services around the world get knocked offline for a significant period of time as a result of routing policies meant to only impact users within China.
DNS Snafus will be responsible for the most outages in 2020
Many things can be responsible for outages, including natural disasters, attacks, or simple human errors. DNS is a fragile, often-overlooked piece of infrastructure that has been a target for major attacks; past DNS attacks, such as the one on Dyn, have had a huge blast radius, causing widespread outages with devastating impact on businesses. BGP is another weak point in the fabric of the Internet that has been subject to attack. User error, such as “fat fingering”, can also result in outages, as can internal misconfigurations or infrastructure failures, with symptoms that manifest on the network layer. BGP-related outages caused major collateral damage in 2019, leading many ISPs to adopt better Internet routing security measures, which will dramatically reduce these issues in 2020. Similarly, DDoS attacks will decline overall, particularly in the US and Europe. Ironically, often-overlooked DNS services may be ripe for a major service disruption or compromise that could cause ripple effects across the wider Internet.
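Part of DNS’s fragility is how small and simple the protocol unit itself is. As a purely illustrative sketch (the hostname and query ID below are arbitrary), hand-encoding a DNS A-record query per RFC 1035 takes only a few lines of Python:

```python
import struct

def build_dns_query(hostname, query_id=0x1234):
    """Encode a minimal DNS A-record query (RFC 1035, no EDNS).

    Illustrative only: a real resolver client would also randomise the
    query ID and parse the response, but the on-the-wire format itself
    is this small.
    """
    # Header: ID, flags (recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question name: length-prefixed labels, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question
```

Sent over UDP to port 53 of any resolver, this 29-byte packet (for example.com) is the entire request; that economy is part of why DNS is so easy to take for granted, and so easy to disrupt.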
The Internet becomes more important than ever before
This may seem like a no-brainer, but as the cloud becomes the new data center and the Internet the new network, the number of enterprises that rely on the cloud increases every year, and with it the reliance of the world’s biggest brands on keeping their businesses online and the revenue chugging in. Fortunately, faster remediation of service outages is improving the overall quality and performance of the global Internet, making worldwide connectivity more reliable than ever.
Backbone networks increase dramatically
As the amount of Internet traffic grows by the minute with every TikTok video, business traffic is competing against cat videos on a network that it wasn’t designed for. Just as the ThousandEyes Cloud Performance Benchmark report found Google Cloud and Microsoft Azure preferring to use their own private backbone networks (with AWS and IBM also offering this option), we’ll see more SaaS companies and cloud-based service providers creating private backbone networks to optimize their own network traffic instead of relying on the unpredictable public Internet.
The future of cloud in 2020 and beyond
By Guerney Hunt, Chair of Trusted Computing Group’s Cloud Work Group and Research Staff Member at IBM:
Just like this year, 2020 will bring about more changes for the cloud market. Broadly speaking, cloud is represented primarily by three types of services; Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).
Obviously, the specific issues affecting cloud vary with the type of service being adopted. Nevertheless, cloud adoption and exploration will continue to increase in 2020, but the rate at which it is increasing is likely to drop slightly. A growing number of companies will adopt an internal cloud infrastructure as a way of providing some of their IT infrastructure. Increasing adoption of internal clouds will be more prevalent in mid to large-sized enterprises.
There will also be an increase in hybrid cloud infrastructures as enterprises look for the increased flexibility that comes with a cloud while concurrently maintaining control over sensitive assets. The costs associated with a potential breach in the cloud provider that exposes a company’s assets could be astronomical and irreversible. Consequently, companies will be looking for cloud providers whose infrastructure operates seamlessly with their own in-house infrastructures.
Multi cloud infrastructures will become increasingly popular. Companies will split their computations and cloud-based assets across multiple providers. Depending on the functions placed in the cloud, a multi cloud architecture can be used to mitigate the risks associated with a failure at a single cloud provider. For a dynamically managed cloud deployment, a multi cloud environment may give the user better control. These value propositions will contribute to the growth and adoption of multi cloud infrastructures in 2020 and beyond.
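The risk-mitigation logic here reduces to priority-ordered failover across providers. A minimal sketch, with hypothetical provider names and a stubbed health check standing in for a real readiness probe:

```python
def first_available(providers, is_healthy):
    """Return the first provider in priority order that passes its
    health check; raise if every provider is down.

    `is_healthy` stands in for a real probe, e.g. an HTTP check
    against the provider's endpoint.
    """
    for name in providers:
        if is_healthy(name):
            return name
    raise RuntimeError("no cloud provider available")

# Hypothetical priority list; the stub marks the first provider as down.
priority = ["cloud-a", "cloud-b", "cloud-c"]
active = first_available(priority, lambda p: p != "cloud-a")
print(active)  # cloud-b
```

The value of a dynamically managed deployment is exactly this: the decision of where a workload runs becomes a policy evaluated at runtime, rather than a contract fixed at procurement time.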
As enterprises adopt cloud infrastructures, security remains an important concern affecting adoption. 2020 will see Trusted Execution Environment (TEE) availability become more important for both customers and cloud vendors. TEEs utilize hardware and software to protect data and code. When properly implemented they allow the cloud user to place more trust in the infrastructure while concurrently reducing the exposure of the cloud provider to security breaches. As TEEs become more capable, they will help accelerate cloud adoption.
Trust is very closely linked to security. Customers will increasingly look for some form of certification or attestation that their cloud infrastructure is trustworthy. Prior to selecting a vendor, they will analyze more carefully the vendor's operations, solutions and procedures to assure themselves that they meet the requirements of their business. They will require providers to demonstrate effective (and provable) means, including operational procedures, of protecting their business assets in the cloud at all times.
By Geng Lin, Chief Technology Officer, F5 Networks.
In recent years, enterprises in every industry sector have been embarking on a digital transformation journey in one way or another. Business enterprises are taking advantage of the proliferation of digital technologies to define new business models or to improve business productivity with existing models.
Key digital propellers such as the Internet (as a ubiquitous reachability platform), applications and open source proficiency (as a skill set platform), cloud (as a pervasive computing and data platform), and, of late, AI/ML (as an insight discovery platform) help enterprise businesses to improve business productivity and customer experiences.
While the pace of digital transformation varies based on the business and the sector it is in, overall, the journey of digital transformation has three stages.
The steady rise in leveraging application, business telemetry and data analytics enables organizations to scale digitally. Adopting an agile development methodology to quickly iterate modifications has shortened the lifecycle of “code to users.” In digital enterprises, the “code” embodies the business flow and the speed of change in “code to users” represents business agility. In this new era of digital economy, applications have become the lifeblood of the global economy. Every business is becoming an application business and every industry is becoming an application-centric industry.
As IT infrastructure automation and application-driven DevOps processes have been largely established across the industry, we envision that a layer of distributed application services that unifies application infrastructure, telemetry, and analytics services is emerging. The scale, agility, and complexity of digital enterprises demand that their applications have self-awareness and the ability to automatically adjust to operating and business conditions. This will breed a new generation of application services to collect, analyze, and act on the telemetry generated by apps and their infrastructure. These capabilities create new business uses. End-to-end instrumentation from code to customer will enable application services to emit that telemetry and act on insights produced through AI-driven analytics. These distributed application services will help application owners to improve application performance, security, operability, and adaptability without significant development effort.
Ahead of 2020, DW asked a whole host of individuals working in the tech space for their thoughts and observations on the business trends and technology developments likely to be major features of the year ahead. No one gave us the kitchen sink, but otherwise it seems that we’ve uncovered a broad spectrum of valuable, expert opinion to help end users plan their digital transformations over the next 12 months or so. The first series of predictions articles appeared in the November issue of Digitalisation World, continued in the December issue, and now we have a stack more predictions for you as the New Year dawns. Part 12.
2020 tech forecast: what will cause blue skies versus perfect storms?
2020 is just around the corner, and businesses everywhere are beginning to plan for the year ahead. There are many trends expected to grow next year, including contact centre technology, cloud, convergence and the Internet of Things (IoT) becoming a lot more widespread. But with this popularisation must come education.
With this in mind, four experts have come together to give their advice and predictions on what trends will become more mainstream in 2020 and the years beyond.
The advent of the ‘Golden Age of Voice’ in the contact centre
2019 was a big year for contact centres, which is evident from Forrester’s Three Customer Service Megatrends in 2019. Martin Taylor, CEO of Content Guru comments, “We’ve heard a great deal in recent years about the predicted downfall of voice communication channels, but what we’re seeing now is, in fact, the complete opposite. The resurgence of voice-led interactions driven by home assistants, and the fact that customers still overwhelmingly prefer to speak to another human for important queries, is ushering in a new Golden Age of Voice. I believe that 2020 will be the year Natural Language Processing (NLP) steps out of the Proof of Concept (PoC) stage and goes mainstream in the UC industry.
“Over the past year or so, NLP has cemented itself as a tool that opens up unprecedented insight into voice data, especially in the contact centre scenario. The results are far richer than those gleaned from the metadata analysis we have been restricted to until recently and a much larger quantity of data can be analysed.
“Advancements in sentiment tracking will be the next big step for NLP in the UC space and will continue to pave the way for monumental gains in the UC industry in 2020. This is where a sophisticated mix of keywords, tone of voice, and volume creates a much deeper picture of the caller and their needs, helping to ensure the most vulnerable customers are prioritised and all enquiries are dealt with far more efficiently.”
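The "mix of keywords, tone of voice, and volume" Taylor describes can be pictured as a weighted blend of signals. This is a toy sketch only: the keyword lists, weights, and the assumption that tone and volume arrive pre-normalised to [-1, 1] from upstream audio analysis are all invented for illustration, and bear no resemblance to a production NLP pipeline.

```python
def sentiment_score(transcript, tone, volume):
    """Blend keyword, tone, and volume signals into one score in roughly [-1, 1].

    `tone` and `volume` are assumed already normalised to [-1, 1] by an
    (hypothetical) upstream audio-analysis stage; weights are arbitrary.
    """
    negative = {"complaint", "cancel", "frustrated", "unacceptable"}
    positive = {"thanks", "great", "resolved", "happy"}
    words = transcript.lower().split()
    keyword = sum((w in positive) - (w in negative) for w in words)
    keyword = max(-1.0, min(1.0, keyword / 3))  # clamp the keyword signal
    return 0.5 * keyword + 0.3 * tone + 0.2 * volume
```

A real system would use trained models for each signal; the point of the sketch is simply that combining channels yields a richer picture than transcript keywords alone.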
“AI has already started to feel like an old conversation. But the reality is that it’s only just off the starting blocks in many industries. In the last year, contact centres and organisations focused on customer engagement have moved beyond the AI hype into practical implementation. There are tangible examples of AI applications already in full swing in the contact centre industry, ranging from Natural Language Processing (NLP) to image recognition.
“Research from industry-leading analyst Gartner suggests that in 2020, 80% of customer service interactions will be handled, at least partly, by AI. This is hardly surprising, as around a quarter of customer interactions are already handled through an automated chatbot, and the customer engagement technology sector is constantly expanding the very definition of what AI is and what it can do. As it becomes the key business differentiator, organisations that stay ahead of the curve are seeing happy, loyal and engaged customers and higher profits, by turning AI hype into tangible business success. Moving beyond the hype and towards result-driven applications of AI will be critical to the success of any company wanting to survive in this competitive landscape in 2020.”
Digital transformation with more cloud and convergence
Steve Blow, Tech Evangelist at Zerto, thinks the biggest trend in 2020 will be the continuance of digital transformation:
“If you think of how much digital transformation has changed everyday life so far - such as being able to pay for your car tax online, rather than at the post office - there’s no denying that it will continue to make a positive and valuable impact throughout 2020. However, there’s definitely a long way to go, and more that organisations could be doing to reap the benefits of digitising business operations. Yes, there is some fatigue around companies ‘beginning their digital transformation journey’, but 2020 will be more than just moving to the cloud. Digital transformation is about how you deliver and provide services, and this is what will continue to drive organisations next year.”
One main driver, Steve believes, will be the ongoing adoption of cloud:
“In 2020 we will continue to see the adoption of cloud,” he says. “As the old tech saying goes, ‘one cloud doesn’t fit all’, so I fully expect that businesses will continue to want more mobility and choice when it comes to their cloud. Microsoft and AWS will continue to go strong as more and more organisations look to pursue their ‘cloud-first’ strategies, adopting the cloud and its services. Because of this, having the ability to move to a cloud, from a cloud, and back again will become more and more valuable.”
In order to do this, Steve thinks convergence will be the key:
“Next year will be the year of convergence. At the end of the day, people want fewer tools that do more, reducing the overhead. IT teams don’t want to manage infrastructure anymore; they want to offload everything and are utilising the cloud to deliver that. On the flip side, though, they will want the applications they are still having to manage to move from a multi-point solution to a single-point solution that delivers multiple outcomes.”
He also has a word of warning around security concerns, however:
“It goes without saying that cyber threats, especially ransomware attacks, will continue to grow - especially as criminals are making considerable money from them,” says Steve. “I predict we will see an increase in attacks that are more targeted than they are random. This will cost organisations money, not just from the ransom side of things, but from a data loss perspective. Data loss is money loss, and the only way businesses can prevent data loss is to make sure that they are constantly protecting it.”
The Internet of Things: evolutionary, not revolutionary
“According to Statista, the global IoT market will explode from $2.9 trillion in 2014 to $8.9 trillion in 2020,” points out Alan Conboy, Office of the CTO at Scale Computing. “That means companies will be collecting data and insights from nearly everything we touch from the moment we wake up and likely even while we sleep. As evidence of this, technologists have seen that the rise of edge computing and IoT was accelerated by the birth of the iPhone.
“Seeing what Apple has achieved, we are going to see a much broader perspective on this ability to put reasonable amounts of compute into a tiny form factor and move that into dedicated functions. In 2020, as a result, we can expect to see evolutionary expansion in the IoT space, not revolutionary. It will continue to evolve, driven by a need for more efficient, more compelling, cost effective solutions, with edge computing at the forefront.
“We are living in a world that is increasingly data-driven, and that data is being generated outside of the four walls of the traditional data center. With 2020 approaching, organisations are taking a much deeper look at their company’s cloud usage. Cloud was originally positioned as the answer to all problems, but now the question is at what cost?
“More organisations are turning to hybrid cloud and edge computing strategies, turning to solutions that process data at the source of its creation. In 2020, organisations will rely on hybrid environments, with edge computing collecting, processing and reducing vast quantities of data, which are then uploaded to a centralised data center or the cloud.”
Public safety networks: enhancing communication and collaboration
2019 brought major reform for first responder technology due to growing public safety networks, the need for interoperability and the state of both man-made and natural disasters - plus the proliferation of 5G connectivity in the market. Estee Woods, Director, Public Sector & Public Safety Marketing at Cradlepoint, says “All of these factors combined have created a qualifying moment as we enter 2020.
“Public safety networks are considered mission-critical in enhancing interoperability, communications and successful team collaboration, since first responders are constantly on the move. Cellular has created an era of mobile broadband networks, leading to a proliferation of mobile offices and pervasive connectivity that have enabled a new standard in communication for emergency personnel.
“This draws a parallel to 2007 and 2008 when the smartphone was launched. The years following saw an explosion of new technology that was rapidly consumed and changed the way we live and work. It also changed the way we, as humans, interact with each other and see the world.
“We believe that 2020 and 2021 will see a similar, accelerated impact on the public safety market. It will likely result in the largest rate of change in the industry since the smartphone boom. Some standout technologies will include:
- Unmanned aerial systems (UAS) will come into their own, for the safe and fast delivery of defibrillators and emergency medications. Legislation will need to be at the forefront to really eliminate agency risk and enable this untapped potential to be mapped out.
- BOT operations will help automate public safety responses. Automated workflows, called ‘BOTS’, integrate human operations with computer operations and can help public safety by serving as the virtual eyes and ears on the street. BOTS can be activated by spoken words and create alerts and workflows via the cloud. This topic will gain momentum in workplace automation in 2020.
- 5G, AI, LMR + LTE and UAS will be the biggest movers and shakers in public safety. Smart states will replace smart cities, and anything that can be connected will be connected. Where there is no cellular/broadband coverage, private LTE networks will fill the gap. Cloud and edge computing, along with 5G, will enable AI technology and machine learning.
- MR systems – 20 years ago, most first responders relied on CAD and LMR ‘technologies.’ But that’s changing. New tech like LMR, LTE and PTT will need interoperability, which continues to be the biggest roadblock with the highest amount of risk if it can’t be overcome. Data interoperability needs to be mapped out, including LMR, P25, PTT, OTT, access control, 911 and data sharing, so these points can all intermingle. Integrating cellular networks into your overall mission-critical comms plan will be essential in making this possible.”
All in all, 2019 has been a great year in terms of developing technology. 2020, however, is set to kick up much more of a storm, so businesses had best prepare.
2020 trends from Mikael Sandberg, Chairman at VXFIBER:
The IT landscape is fast evolving through various technologies, from big data, AI, connectivity and security, to driverless cars and more sophisticated cloud storage – which make up the smart cities of the future in the UK and beyond. The common theme that unites all these developments is full-fibre infrastructure, and the connectivity needed for all these applications to work.
Full-fibre internet is the answer the IT landscape needs to truly evolve, and for smart cities to really take off. Fibre is set to be a huge priority for the UK as it looks to 2020, particularly with political parties competing over their policies to help achieve full-fibre rollout, from Jeremy Corbyn’s proposals to nationalise BT OpenReach, and Sajid Javid’s suggestion to put an additional £5 billion for connectivity upgrades in the UK, as outlined at the Conservative Party conference earlier this year.
However, we are at a tipping point, with the UK currently at 8% full-fibre coverage, and only 1% connected nationwide. This is put into perspective when compared to countries such as Spain, Portugal, Lithuania and Romania, which are all achieving over 90% fibre coverage. Not only that, Sweden, which has approximately 70% full-fibre coverage (50% connected), has a plan to reach 100% - but its government has predicted this won’t be possible until 2025.
We predict that the IT and tech landscape - and beyond - will start to become more aware of the value of investing in fibre; it is not just high-speed internet, but also an investment in property, family and business. Indeed, with high capacity tech such as mobile and cloud gaming playing a more integral role than ever before in our everyday lives, we really need to be able to support ongoing technological innovation.
One key example emerging from developments in trends such as AI and IoT, is driverless cars. In a similar way to how we need to make people aware of the realities of why we need fibre, we also need to look at the complex processes behind the more glamorous innovations being talked about in the broader tech landscape.
For driverless cars to become a reality, we need to continue to collect a huge amount of data from the environments around us to be able to support the development of applications needed. Not only that, AI and algorithms are being developed in parallel to data being collected, proving that the final autonomous vehicle isn’t as close to being part of everyday reality as some people may think.
2020 is set to be the transition year for these (currently) abstract ideas developing into a finalised product – we’ll see more 5G trials and AI applied in limited settings, although it’s likely that we won’t see any major autonomous vehicle trials on UK roads just yet, despite gathering momentum. So, although we’ll see more investment in data and AI, it’s essential that this goes hand in hand with the development of fibre networks. While people do tend to jump to the exciting end product of the concept of a self-driving car, we need to focus on the infrastructure needed to support its development.
This is why 2020 needs to be the year that we should put momentum behind driving awareness of the urgent need for full-fibre infrastructure to advance the UK tech landscape. With internet speeds having tripled between 2013 and 2018, and the number of homes able to connect to an ultrafast line having hit 54 per cent, up from 36 per cent in 2017, it’s clear that this is certainly within our grasp.
Ransomware has become big business, generating estimated revenues of $1 billion a year for malicious threat actors. The victims of ransomware continue to stack up as criminals develop new, more creative ways to infiltrate IT environments, seize data and hold organisations to ransom.
As we approach a new decade, Simon Jelley, VP of product management at Veritas, explores how ransomware is likely to continue evolving in the year ahead.
“Public sector, healthcare providers and manufacturers to be singled out by ransomware attackers
“We haven’t yet seen ransomware reach its peak, but we will see it become more niche and target specific sectors in the year ahead.
“Until recently, ransomware attackers took a scattergun approach to their crimes. The Ryuk attacks and 2017’s WannaCry typified an approach that focused on large attack volumes designed to net enough victims to make the effort worthwhile. Now, we’re about to see attackers get more selective and focus on those industries where they can get the highest return on investment.
“The public sector, healthcare and manufacturing industries are all emerging as some of the most likely targets. It’s not necessarily because these sectors have a traditionally soft security posture or are particularly cash-rich, it’s because they rely so heavily on mission-critical information for their day-to-day operations. Cybercriminals know that if their attacks halt essential services, organisations will have less time to make a decision and will be more willing to pay the ransom. The stakes of a successful attack are much higher, so the chances of a victim paying up are so much greater.
“As attackers grow more selective over their targets, organisations in healthcare, manufacturing and the public sector need to be aware that the threats they are facing from savvy ransomware criminals will only get more severe. To keep pace and prepare for a worst-case scenario, it will be imperative to improve visibility over all their data assets and leverage greater automation to ensure their data is backed up and recoverable across a rapidly expanding number of locations and IT environments.”
“Ransomware attackers to target intellectual property
“What do successful businesses do once they have established themselves in a market? They diversify. Ransomware is no different. Just as businesses today are seeking new revenue streams, ransomware attackers are looking to boost their profits with new data exfiltration techniques.
“In 2020, ransomware variants will emerge that combine the usual data lock-out with data exfiltration capabilities. What makes this type of attack so devastating is that it is aimed at the most lucrative data - intellectual property (IP).
“Where once the goal was mainly to bypass defences and encrypt as much data as possible, we will soon see examples of ransomware attacks going after incredibly high-value information, such as product prototypes, schematics and designs.
“If a ransomware attack can deny an organisation access to the prototypes of a new car or phone, the attackers could also take this information outside the walls of the organisation and sell it to competitors on the black market. Ransomware will no longer be a matter of data denied, it will be a case of data compromised.
“With businesses needing to remain agile to stay ahead of the competition, losing access to critical IP slams the brakes on product development and other crucial projects that feed into the revenue stream. We can expect attackers to tune their ransomware to seek out and capture this information specifically. That’s why it’s so important for businesses to have the right data protection measures in place for their most business-critical data.”
“Social engineering attack methods will evolve to target the wider supply chain
“Cybercriminals have long relied on social engineering as one of their most successful modes of attack. By fooling employees to share information or download their malware, ransomware attackers acquire the credentials they need to capture a company’s most important digital assets. However, in response to improved, more rigorous company policies, their techniques will evolve.
“We’re already seeing the beginnings of a secondary illegal market for stolen credentials. On the dark web, ransomware is fuelling the rise of a burgeoning market that makes it quick and easy for cybercriminals to gain remote access to corporate systems.
“This boom is being supported by a shifting attack strategy that will only become more embedded in 2020. Ransomware attackers will increasingly target their efforts, not on existing employees, but on adjacent targets and other accounts with access to the systems of their intended victim. This includes outside contractors, freelancers, partners and approved vendors.
“Thankfully, there’s a solution. In response to adjacent attacks, we are likely to see IT and cybersecurity teams given a larger role in the procurement process to ensure supplier integrity. Before onboarding a new supplier, an organisation must be confident they have comparable data protection measures and policies. Very soon, data responsibility won’t just be for internal consumption, it will be how organisations do business and choose who they work with.”
“Always have a backup plan
“To defend your organisation from ransomware in 2020, it’s crucial to take a proactive approach to prevention, supported by a system of layered data protection solutions and policies. This must include ransomware resiliency solutions that offer enhanced protection of business-critical data against ransomware attacks, coupled with a data protection education programme for employees at all levels of the business. Any gap in your defences is a weakness cybercriminals will exploit, so comprehensive protection is a must.
“However, the only thing that can assure protection in the long term is a sound backup strategy. No ransomware defence is perfect, so a successful attack becomes a matter of when rather than if. Organisations need to create isolated, offline backup copies of their data to keep it out of reach of any successful attack. Organisations then need to proactively monitor and restrict backup credentials, while running backups frequently to shrink the risk of potential data loss.
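The "run backups frequently and monitor them" advice reduces, at its simplest, to a freshness check over each backup copy. A minimal sketch, in which the 24-hour window and the name-to-timestamp inventory format are invented for illustration:

```python
import time

def stale_backups(backups, max_age_hours=24, now=None):
    """Return the names of backup copies older than the allowed window.

    `backups` maps a copy name (e.g. "offsite") to the UNIX timestamp of
    its last successful completion. `max_age_hours` is a policy choice.
    """
    now = time.time() if now is None else now
    limit = max_age_hours * 3600
    return [name for name, ts in backups.items() if now - ts > limit]
```

A real monitoring job would also verify that the copies are restorable and genuinely isolated from production credentials, which no timestamp check can prove.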
“Last but by no means least, businesses should then test and retest their ransomware defences regularly. The coming years will be a period of great innovation and evolution in ransomware variants and attack methods, so stress testing will be critical to ensuring your backup strategy keeps pace and can deliver when it counts.”
Reasons to be wary in 2020
Cory Nachreiner, CTO at WatchGuard Technologies, provides some security predictions as we head into the next decade:
The team at our WatchGuard Threat Lab has been tracking the trends over the last 12 months and doing some creative thinking to flag up some of the top cyber attacks we can expect to see in 2020. Even though the threats on the horizon won’t be any less intense, complicated or difficult to manage, we do see a move to simplify security in order to mitigate the risks.
Ransomware targets the cloud
The billion-dollar ransomware industry will continue to evolve, but its focus will be on the cloud. Attackers are also showing a preference for more targeted onslaughts against businesses that don’t function with any downtime, such as healthcare, central and local government and industrial controls. As businesses of all sizes are increasingly moving their servers and data to the cloud, we expect to see this ‘safe’ haven start to crumble as ransomware attacks cloud-based assets including file stores, S3 buckets and virtual environments.
GDPR goes global
Companies across Europe have already been fined millions of euros for GDPR violations. Meanwhile, the US has no real equivalent. But as companies like Facebook leak more and more of our personal data, which has been used in everything from election manipulation to unethical bounty hunting, US citizens are starting to lobby for greater protection along the lines of the California Consumer Privacy Act, which comes into force next year.
Multi factor authentication becomes the norm
Most businesses are still terrible at validating online identities. Previously considered too expensive and cumbersome for midmarket organisations, cloud-based Multi Factor Authentication (MFA) using easy app-based models has become more available and simpler to deploy and use for organisations of all sizes. Mobile phones have also removed the expensive need for hardware tokens. At long last, enterprise-wide MFA will become the de facto standard among midsized companies next year.
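The app-based models replacing hardware tokens typically build on time-based one-time passwords (TOTP, RFC 6238), which need nothing more exotic than a shared secret and an HMAC. A sketch using only the Python standard library, with parameter defaults following the RFC's common choices:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, period=30, digits=6, now=None):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if now is None else now) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Both the phone app and the server compute the same code from the shared secret and the current time, so no network round-trip is needed; real deployments should use a vetted library and compare codes in constant time on the server side.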
A quarter of all breaches will happen outside the perimeter
The more widespread practices of flexible and mobile working often mean operating outside the traditional network boundary, which has been a key part of a layered security defence. And as mobile devices can often mask the signs of phishing attacks and other threats, we think that a quarter of all data breaches in the next 12 months will involve telecommuters, mobile devices and off-premises assets.
The cyber security skills gap widens
Demand for skilled cyber security professionals keeps growing, yet there are no recruitment or educational changes on the horizon that could increase the supply, so we predict the skills gap will widen by an additional 15% next year.
Attackers will find new vulnerabilities in the 5G/Wi-Fi handover to access voice and/or data of 5G mobiles
Security researchers have already exposed flaws in the cellular to Wi-Fi switch when people use their devices in public Wi-Fi hubs, so it’s very likely we will see a large 5G to Wi-Fi vulnerability exposed in 2020, which could potentially allow attackers to access the voice and/or data of 5G mobile phones when they start to increase in use and operability.
At the moment, we don’t see enough real results or use cases emerging, even though everyone agrees that there is huge potential. I polled some of our experts to find out why this might be happening and identified five critical barriers that need to be overcome for AI to move into the mainstream.
1: You need a lot of data to build models
AI-based models probably need even more data than most because they need to learn for themselves. The explosion in the volume of data available from the Internet of Things, and particularly the increase in streaming data, should have made AI models far more ubiquitous. However, far too many organisations have not really caught up with this.
Joao Oliveira: "Streaming data is still an untapped resource for many, meaning that they are missing out on the power of analytics.”
Arturo Salazar: “Many have failed to appreciate that they may need to go hunting for new sources of data."
Ivor Moan: "Data preparation is crucial for getting value from data. Skimp on the data preparation, and you will find that you cannot rely on your analytical results.”
In other words, organisations that do not pay enough attention to data, and especially data quality, will fail to capitalise on AI.
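The data-preparation point above can be made concrete with even a rudimentary quality report: before any model training, count the missing values and exact duplicates lurking in the input. The record format and checks here are illustrative only; real pipelines run far richer profiling.

```python
def data_quality_report(records, required_fields):
    """Count missing values per required field and exact duplicate records.

    `records` is a list of dicts; a value of None or "" counts as missing.
    """
    missing = {f: 0 for f in required_fields}
    seen, duplicates = set(), 0
    for rec in records:
        for f in required_fields:
            if rec.get(f) in (None, ""):
                missing[f] += 1
        key = tuple(sorted(rec.items()))  # canonical form for duplicate detection
        duplicates += key in seen
        seen.add(key)
    return {"missing": missing, "duplicates": duplicates}
```

Even this much surfaces the gaps that, left unaddressed, make analytical results unreliable.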
2: Many models fail to move into production
Developing a model is relatively straightforward; moving it into production is not. Deployment often involves a number of different individuals and departments. Indeed, development and deployment are so different that they almost look like two separate life cycles, a bit like a salmon’s move from river to sea (and back again), according to James Ochai Brown.
Different skills are required, both project management-related and technological.
3: Users need to be able to understand insights from models
The best model in the world is only useful if it leads to action. That means that users have to be able to understand and interpret the outputs from the model to provide useful insights. Data visualisation is a crucial part of this process.
Data scientists, therefore, need to think about how they will demonstrate the outputs from their models to maximise the impact on viewers, for example, by asking themselves about the audience, and the message they wish to convey.
4: The real benefit of AI is business model transformation – but that is a long-term project
According to Christer Bodell, any organisation can introduce a single AI-based project or model, and it will improve current business practices. Getting real benefits from digitalisation requires something more: a genuine change in philosophy and a commitment to innovation that is likely to result in business model transformation.
This requires a longer-term change in thinking, says Andreas Godde, and particularly a focus on how value is generated for customers.
5: You need more users to increase use of AI – but they need to see others using it first
There is a book by Geoffrey Moore called Crossing the Chasm. It describes the difficulty of moving from "early adopters," those who adopt a product because it is new and interesting, to the "early majority," who will use products because they are useful. The early majority need to see products being used by people like them, to solve real-world problems. AI faces a similar challenge – that at present, people do not see enough real-world examples of AI being used to solve problems.
Generate some of those, and particularly examples that show how AI can improve customer experience, and both companies and customers will be more prepared to accept AI and move across the chasm.
An efficient supply chain can benefit businesses in a number of ways, such as enabling a longer-term view of business costs or increased productivity. Therefore, enhancing the efficiency of digital processes within the supply chain can result in greater profits. However, the focus may soon shift to accuracy when supply chain companies fully realise the benefits that this factor can hold. Accuracy is a hugely valuable business metric that mustn’t be overlooked, as its importance only increases within the digital supply chain.
Increasing inventory visibility
Nobody can predict the future, and this rings true for predicting the demand for goods. Inventory management and the costs associated with it can prove a huge burden for all supply chains, and especially for those of retailers. However, by championing accuracy within inventory management, businesses can work towards maintaining balance and preserving the bottom line.
Of course, balancing inventory levels in just the right manner, not too high and not too low, is easier said than done. When inventory levels exceed justifiable consumer interest, businesses risk incurring excessive carrying costs through storage or perished and unsaleable goods, such as in the case of a grocery supply chain. However, having too few goods to hand is just as problematic. A stockout can lead to lost revenue and disgruntled customers, whose orders have not been satisfied in a timely manner. Even the most loyal buyers can be forced to rethink their commitment to a retailer if they are subject to issues stemming from poor inventory management.
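Balancing inventory in "just the right manner" is often approached quantitatively. As an illustration only (the rule and figures below are a common textbook approach, not drawn from this article), safety stock can be sized from demand variability and supplier lead time:

```python
import math

def safety_stock(daily_demand_std: float, lead_time_days: float, z: float = 1.65) -> float:
    """Classic safety-stock rule: a z-score for the target service level
    times demand standard deviation, scaled by the square root of lead time.
    z = 1.65 corresponds to roughly a 95% service level."""
    return z * daily_demand_std * math.sqrt(lead_time_days)

def reorder_point(avg_daily_demand: float, lead_time_days: float, ss: float) -> float:
    """Reorder when on-hand stock falls to expected lead-time demand plus safety stock."""
    return avg_daily_demand * lead_time_days + ss

# Illustrative numbers: 100 units/day average demand, std dev 20, 4-day lead time.
ss = safety_stock(daily_demand_std=20, lead_time_days=4)                 # 66 units
rop = reorder_point(avg_daily_demand=100, lead_time_days=4, ss=ss)       # 466 units
```

The sketch shows why accurate demand data matters: the buffer (and hence carrying cost) grows directly with the uncertainty in the demand estimate.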
Accurately managing the flow of inventory is core to avoiding such issues and maintaining a good relationship with customers. This can be achieved by capitalising on IT solutions that assist in the seamless convergence of supply chain systems. Order management systems (OMS) and more futuristic digital technologies, such as blockchain, can be used to minimise the siloes that cause friction within the supply chain. Unifying the end-to-end processes of a supply chain will not only allow goods to be more accurately tracked, traced, and monitored, but will also facilitate the sharing of data between partners. Seamless data sharing is crucial for allowing the clear visibility of goods up and down the chain, which is necessary for accurately predicting demand.
Aiming for perfect order accuracy
Performance metrics such as the Perfect Order Rate, which describes the proportion of orders completed without error, are heavily influenced by accuracy and are a valuable form of measurement for businesses. By striving for accuracy in order to achieve a higher Perfect Order Rate, businesses can fully assess the productivity of the systems they have in place, gaining cost control, increased customer satisfaction and greater efficiency.
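The Perfect Order Rate is commonly calculated as the product of the success rates of each fulfilment stage, since an order is "perfect" only if every stage succeeds. A minimal sketch (the stage names and percentages are illustrative, not from this article):

```python
def perfect_order_rate(stage_rates):
    """Multiply per-stage success rates (e.g. delivered on time, complete,
    damage-free, correctly documented) to get the overall perfect-order rate."""
    rate = 1.0
    for r in stage_rates:
        rate *= r
    return rate

# Four stages each running at 98% accuracy still yield only ~92% perfect orders,
# which is why small per-stage inaccuracies compound so quickly.
por = perfect_order_rate([0.98, 0.98, 0.98, 0.98])
```

This multiplicative structure is the reason accuracy improvements anywhere in the chain lift the headline metric.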
Particularly successful in increasing transparency and accuracy are Order Management Systems and Enterprise Resource Planning (ERP). When connected with the work of supply chain employees, these systems can greatly reduce inaccuracies and improve the quality of employees’ work by minimising the likelihood of errors. In the simplest form, digitising data entry with barcode scanners and enhancing warehouse picking with voice-enabled technology is far more accurate than using a manual, paper-based system. When combined with RFID tags, continually scanned throughout the supply chain, the risks of missed orders or incorrect delivery locations will be greatly reduced.
Investing in the future
While investing in supply chain convergence and digital transformation can seem costly upfront, the benefits are unquestionable. Cost savings accrue in the long run, and accuracy gains are experienced immediately through the use of new technologies and mobile solutions within the supply chain. By standardising and converging key supply chain processes, such as transportation and warehousing, the resulting increase in accuracy drives costs down. Mistakes cost both time and money, and investing in improved accuracy to mitigate potential issues is a great way to drive down costs, better satisfy customers and enhance productivity throughout the chain.
Increased visibility is just one benefit of this investment; harnessing real-time, accurate data from all partners to deliver an entirely demand-driven supply network is also key. As soon as businesses update their mobile device fleets with technology that enables real-time collaboration and data sharing across all aspects of the supply chain, they will see a positive impact on the bottom line. Attaining this sort of flexibility and communication by integrating supply chain processes is the future of the digital supply chain and the most efficient way for businesses to gain an edge over their competition.
Modernising and digitising supply chain processes is a real possibility for businesses thanks to recent technological advancements. Now, each stage of the process can be recorded and tracked, with any errors being flagged immediately and rectified. Fully utilising this level of accuracy is necessary to not only effectively manage consumer demand, but to drive down costs and improve decision-making.
Long thought of as a technique that had been rendered obsolete, liquid cooling is fast making a comeback in today’s data centres. As demands for data and digital services increase, many of its attributes are also suited to applications hosted in edge computing environments, especially those that are unmanned, where high reliability and infrequent maintenance visits are desirable. By Patrick Donovan, Senior Research Analyst, Schneider Electric Data Centre Science Centre.
For many, air cooling is the norm for keeping data centre IT equipment at optimal temperatures or within recommended operating limits. However, central processing units (CPUs), which typically generate between 70 and 80% of the heat in a server, are running hotter with the advent of high-powered, many-core parts, while graphical processing units (GPUs) are emerging at the higher end of the performance spectrum; the cooling effort required to maintain safe operation increases accordingly.
To continue to rely on fans pushes up the energy requirement, contrary to today’s concerns to maximise the electrical efficiency of IT installations. Furthermore, the greater the number of fans and the harder they have to work, the greater the noise. This kind of noise pollution can also become an issue for data centres at the edge of the network, those required to support emerging technologies such as 5G, which will be deployed in densely populated areas, close to people’s work spaces or homes.
In this case, liquid cooling offers an effective, efficient and environmentally beneficial alternative to air cooling and several options are available, depending on the application requirements and any cost considerations.
Types of liquid cooling
Direct-to-chip, or Cold Plate, cooling, uses liquid either in the form of water or a specially engineered dielectric, to cool what is effectively a heat sink on top of the CPU. A piped circuit brings the contained liquid to and from the cold plate, which is located directly on top of the CPU or memory modules. Fans are still required to move air throughout the server, so although the airflow infrastructure is reduced, it is not eliminated completely.
Cold Plate cooling requires manifolds to be installed at the back of the rack. These can be retrofitted without appreciably changing the size or layout of a rack, or affecting its overall footprint.
In Cold Plate systems, the water or dielectric is isolated from the main server boards and does not come into contact with the components. For more comprehensive cooling, fully immersed systems can be used, in which the entire circuitry of an individual server, or indeed an entire rack, is submerged in a dielectric fluid that is completely safe for the electronics.
In the second case, a sealed chassis, which can be configured as a normal rack-mounted system or as a standalone unit, contains the dielectric. Here the liquid cools either passively, via conduction and natural convection, or via forced convection, in which the liquid is actively pumped within the servers. Heat exchangers outside the server reject the heat trapped by the liquid.
Finally, in Tub Cooling architectures, a rack is laid on its side and immersed in a vessel containing dielectric coolant. Once again, all components are covered by the cooling medium, which captures the heat and transfers it to a water loop via a heat exchanger.
Tradeoffs and benefits of liquid cooling at the edge
Liquid techniques can greatly improve the effectiveness of a data centre’s cooling effort. Direct-to-chip cooling can account for between 50 and 80% of IT heat capture, with immersive techniques capable of removing more than 95% of the heat. Although there are increased costs – the piped water circuitry in the first case, and the dielectrics themselves, if used – these can be offset by the reduced requirement for fans with direct-to-chip cooling, and their complete elimination with immersive techniques.
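Using the capture figures above, it is straightforward to estimate how much residual heat the air-handling system must still remove. A simple sketch, assuming a hypothetical 10 kW rack (the rack load is illustrative, not from the article):

```python
def residual_air_load_kw(rack_load_kw: float, liquid_capture_fraction: float) -> float:
    """Heat left for air cooling after the liquid loop captures its share."""
    return rack_load_kw * (1.0 - liquid_capture_fraction)

rack_kw = 10.0  # assumed rack load for illustration

# Direct-to-chip at the low end of its 50-80% capture range: 3 kW still needs air.
d2c = residual_air_load_kw(rack_kw, 0.70)

# Immersion at 95% capture: only 0.5 kW remains, which is why fans can be eliminated.
immersion = residual_air_load_kw(rack_kw, 0.95)
```

The gap between the two residual loads is what drives the fan, energy and noise savings the article describes.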
Other trade-offs must be considered too, of course. Deployment in a new green-field site, designed to incorporate liquid cooling from the start, will be far less costly in terms of capital expenditure (CapEx) than retrofitting an existing air-cooled facility.
However, servicing liquid-cooled systems presents certified data centre service partners and Managed Service Providers (MSPs) with a new set of challenges. Handling procedures will be far more intricate: great care must be taken not to contaminate IT equipment with water, if that is the selected coolant, or to lose more of the expensive dielectric than is absolutely necessary through careless handling.
IT equipment immersed in fluid is inherently better protected from contamination, vibration and noise than air-cooled equipment, so overall system reliability is likely to be increased, especially in harsh environments. This also applies to unmanned edge computing facilities deployed in densely populated or urbanised environments as the demand for data, digital services and technology increases.
In smart cities, applications including real-time public transport information, public service announcement kiosks and autonomous driving systems will require a multitude of resilient, ultra-low-latency and unobtrusive IT deployments that must continue to work reliably without being seen or heard. For these environments, liquid cooling systems present a near-perfect solution.
Although servicing liquid cooling systems can be expensive, it is well to remember that their inherent reliability due to immersion will result in longer times between service calls and fewer overall unplanned maintenance events.
As the sheer number of applications in edge computing environments increases, liquid cooling, it seems, is a highly valuable, efficient and cost-effective way of meeting the demand for today’s digital services.
By Steve Hone, DCA CEO
As we move into a new year, the DCA looks back at some of its Corporate Members’ significant moments from 2019. We received numerous contributions and have been unable to include all the information in our foreword, so we have provided some of the headlines – full details are provided within the Journal, listed below.
The DCA’s membership base is varied – providers of products and services to the sector, Data Centre Owners and Operators, Consultants and Training Organisations. The DCA exists to support all its members and to raise both their profiles and the profile of the industry.
To find out more about the DCA and how to become part of the Data Centre community visit www.dca-global.org
Techbuyer is a global leader in the buying, refurbishing and selling of data centre equipment. As well as selling brand-new IT parts, the company buys used parts such as servers, memory and storage and turns them into low-cost, quality refurbished IT equipment. Techbuyer has been active on many fronts related to the data centre sector and remains a very proactive member of the DCA. MD Mick Payne details a two-year Knowledge Transfer Project in partnership with UEL, presentations at DCW Frankfurt, panel sessions at Data Centre Transformation and DCD London, the launch of a technical facility in France and a listing in the Sunday Times Fast Track 200. Techbuyer has also been working with Schools Africa, a wonderful organisation, using its IT Asset Disposition (ITAD) service to provide second-life equipment to African schools.
Datum delivers environmentally intelligent colocation and data centre services underpinned by a highly secure, highly available, carrier-neutral environment. Datum is a founding member of the DCA and has been a member since 2010. Financially, Datum is showing above-market revenue growth, and in June 2019 this success was reflected in its calendar year 2018 accounts, which showed its first EBITDA profit. Datum was also thrilled when the results of a client survey (published in July) revealed it had achieved a world-class Net Promoter Score® (NPS®) of 91 – an excellent score that demonstrates a very high likelihood that Datum clients would recommend the company to their peers or colleagues. In December 2019, Datum was shortlisted for the Best Performing Company (Data Centre & Hosting Services) award in the 2020 Megabuyte Emerging Stars awards.
Teledata provides colocation services to a range of sectors throughout the UK. They have made several exciting announcements this year including investment of a six-figure sum into the CloudActiv platform, which provides customers with truly resilient cloud hosting and effective disaster recovery of business-critical data and applications. They have also invested £1.5 Million in a new battery storage system to enhance energy efficiencies and £125K to update logical and physical security systems at their facility close to Airport City in Wythenshawe. Teledata also became a PoP (Point of Presence) for London Internet Exchange (LINX).
Procurri is a leading global independent provider of IT lifecycle services and data centre equipment with an extensive reach spanning more than 100 countries. Earlier this year Procurri announced the opening of a new office in Germany. Procurri’s newest European base is in Erlangen, Germany, Europe’s largest economy. This new location will service not just those doing business in Deutschland, but also those in the UK with European operations.
Interxion is a colocation data centre provider operating 50 facilities across Europe and beyond, and has been a long-standing member of the DCA. In November, Tech London Advocates (TLA) and Interxion launched a dedicated Working Group of investors, leaders, entrepreneurs, policymakers and mobile operators to address the challenges of, and campaign for the opportunities of, 5G technology in Britain. This community of ecosystem stakeholders is independent of government and led by the private sector to help deliver the potential of a full 5G roll-out to consumers, businesses and wider society. 5G technology promises radically faster download speeds and increased network reliability, and will enable advanced technologies including Artificial Intelligence, the Internet of Things (IoT) and driverless cars.
Submer Immersion Cooling designs, builds and installs Liquid Immersion Cooling (LIC) solutions for HPC, hyperscale, data centre, edge, AI, deep learning and blockchain applications. During the last months of 2019, Submer consolidated its presence in the Middle East and the US with new immersion cooling units in Saudi Arabia and California. The organisation also hired a new CTO, Scott Noteboom, who brings decades of experience in the data centre industry: a real visionary who will help Submer pave the way toward a future of sustainable data centres. Submer took part in SC19, the most important event in the supercomputing industry, presenting its immersion cooling solution for HPC, and announced a strategic partnership with Colt Data Centre Services, London (UK).
The EcoCooling-supported Boden Type DC project won ‘Non-profit Industry Initiative of the Year’ at the coveted DCD Awards in London in December. The Boden Type DC project was selected as a finalist by an independent panel of data centre experts from hundreds of entries submitted from across the world. Following the announcement of the finalists, there were months of deliberation until the winners were announced at the awards ceremony. On the night, EcoCooling was presented with the award in collaboration with H1 Systems, RISE and Boden Business Agency as part of a brilliant night of celebration. The DCA is delighted to have the continued support of the award-winning EcoCooling team.
CNet launched an Apprenticeship within the network cable infrastructure sector across England and Wales. The company has fully planned and prepared the content of the CNCI® Apprenticeship that provides the apprentice and employer with a full itinerary of activities to follow and implement. It introduces the concept of an ‘Apprenticeship in a Box’, designed to take care of the time-consuming planning often associated with Apprenticeships and on-going professional development.
Chatsworth Products is pleased to announce that its eConnect® Electronic Access Control (EAC) has been named ‘Data Centre ICT Security Product of the Year’ at the DCS Awards 2019. For the ‘Data Centre ICT Security Product of the Year’ Award, finalists were asked to demonstrate the tangible impact their product has on the market, its value to customers, as well as how it differentiates itself from other products currently available.
Partners Asperitas, Rittal and coolDC® were delighted to receive the prestigious ‘Energy Smart’ award at the Global DCD Awards dinner, held in London in December 2019. The award recognises the world’s most energy-aware and innovative approaches to building sustainable digital infrastructure, fellow finalists included Huawei, Digital Realty and Windcores.
Congratulations to all three of these organisations – all of whom are DCA Corporate Partners.
Goonhilly Earth Station, which recently joined the DCA as a Corporate Member, announced that it has inked a partnership agreement with the Australian Space Agency to collaborate and create new opportunities in the space economy in Australia, the UK and beyond. The new statement of strategic intent and cooperation aims to help progress the Australian space sector and make the benefits of space more accessible for businesses, governments and institutions.
You may also wish to visit the DCA news page, where you can browse all the news items from 2019 – https://dca-global.org/news
The DCS awards are designed to reward the product designers, manufacturers, suppliers and providers operating in the data centre arena. The winners were announced at a Central London Awards Ceremony on 16 May.
For the ‘Data Centre ICT Security Product of the Year’ Award, finalists were asked to demonstrate the tangible impact their product has on the market, its value to customers, as well as how it differentiates itself from other products currently available.
CPI’s patented technology integrates the functions of an intelligent rack power distribution unit (PDU) with electronic locking and environmental monitoring. Simply stated, eConnect EAC removes the need to power and network these devices separately, which offers significant deployment savings thanks to the technology’s ability to link up to 32 PDUs (16 cabinets with front and rear locks) under one IP address. Additionally, users can programme, monitor and control every cabinet access attempt remotely, and keep an electronic log entry for security and regulatory compliance purposes. The EAC solution is also compatible with most existing employee cards and supports dual-factor authentication methods.
“We are delighted that our eConnect EAC solution has been chosen by the voters of the DCS Awards as the best Data Centre ICT Security Product in 2019. We were confident in bringing this product to market, knowing it would meet the market’s demands for a simple, cost-effective power management and cabinet access solution,” says Julian Riley, CPI Regional Sales Director & General Manager for Europe.
“Electronic access control solutions are essential in addressing user access management issues within the data centre. Whilst electronic locking solutions for data centre cabinets have been available for some time, many of the products lack scalability and require the cost and complexity of having to deploy separate systems. eConnect EAC means that delivering intelligent security and dual-factor authentication to the cabinet is no longer out of reach for organisations needing to meet strict budgets.”
CNet is the first company to launch an Apprenticeship within the network cable infrastructure sector across England and Wales. The company has fully planned and prepared the content of the CNCI® Apprenticeship that provides the apprentice and employer with a full itinerary of activities to follow and implement. It introduces the concept of an ‘Apprenticeship in a Box’, designed to take care of the time-consuming planning often associated with Apprenticeships and on-going professional development.
The CNCI® Apprenticeship has been put together as the result of a close collaboration between major companies from the network cabling sector. It recognises network cable installation as a role and provides industry approved certification which standardises technical education for network cable installers. CNet is committed to the network cable installation sector and is looking to encourage more people to join the sector and inspire future career goals.
The Apprenticeship is available across England and Wales. The apprentice will benefit from on- and off-the-job training and activities such as mentoring, shadowing, internal training and specialist external education programmes. The Apprenticeship will take between 12 and 15 months to complete.
Apprentices must pass a practical assessment and professional discussion to successfully complete the CNCI® Apprenticeship, which ensures that the learner is fully competent and ready to work independently within the industry. With a team of professionally trained and CNCI® certified individuals, the risks to businesses are significantly reduced, and organisations can feel confident that their staff are competent to meet today's industry demands.
On successful completion of the Apprenticeship, learners will be able to confidently install, test and certify copper and fibre optic cable installations across a variety of environments, working to the correct standards and best practices. Learners will also be taught how to confidently and competently install Smart Building technology, including wireless access devices, CCTV cameras, door access controls and biometric security systems.
Apprentices also experience a wide variety of workspaces including potentially hazardous areas such as building sites, railways and highways. They will also be taught how to interpret detailed project plans to construct and fix network equipment cabinets, prepare cable pathways, and install cable support and containment systems.
The Apprenticeship is hugely beneficial as not only does it teach highly technical industry skills, but the learner will also gain a large variety of transferable skills that are valuable across any career going forward. This includes time-keeping, communication and organisation, customer service as well as how to participate in a variety of work environments, learning how to prioritise projects, work with a variety of people and to take responsibility and ownership of work.
The opportunities for career progression within the network cable sector are strong. Being CNCI® certified opens opportunities to venture into more specific areas of technology or other specialist areas. Individuals may choose to expand their knowledge and skills into wireless or more integrated technology installation, or progress to become a site manager or sought-after network infrastructure designer or enter the essential data centre sector as a technician.
Andrew Stevens, CNet Training’s CEO adds, “The network cable infrastructure sector is beset with significant skills shortages. The Certified Network Cable Installer (CNCI®) Apprenticeship has been long awaited and is a significant event within the sector. The CNet team has worked incredibly hard to get this off the ground and approved. It’s a massive step forward in the sector and I hope by offering this ‘Apprenticeship in a Box’ concept it will motivate companies looking for new junior recruits to train as everything is ready for the learners to just get going and start learning straight away. The Apprenticeship will inspire a generation and encourage them into the sector by offering a career opportunity following school education. A benefit of the Apprenticeship is that it will ensure that all learners are trained, educated and certified properly from day one and therefore gives them a great starting point to a great career in the sector.”
For more information on CNet Training’s programs, please go to https://www.cnet-training.com/cnciapprenticeship/ or call 01284 767100
Lincoln Discovery Centre Scoops Global DCD ‘Energy Smart’ Award!
With our partners – Asperitas and Rittal – coolDC® was delighted to receive the prestigious ‘Energy Smart’ award at the Global DCD Awards dinner, held at the Royal Lancaster Hotel, London, on 5 December 2019.
The award recognises the world’s most energy-aware and innovative approaches to building sustainable digital infrastructure and our fellow finalists included Huawei, Digital Realty and Windcores.
For coolDC, being Energy Smart involves a combination of factors aimed at improving overall energy efficiency whilst also re-imagining data centres as a source – and not just a user – of power. Built with the principal objectives of demonstrating reductions in energy and carbon footprints, our winning project – Lincoln Discovery Centre (LDC) – illustrates how both can be achieved via the intelligent deployment of what would otherwise be disparate technologies.
Eschewing traditional air-cooling in favour of liquid-based technologies, LDC utilises a combination of the most energy-efficient cooling solutions in a design that produces a collectively more efficient output. Offering a hybrid of complementary cooling solutions to clients whose needs range from standard applications to high-performance compute enables us to match cooling to IT infrastructure and application requirements. In short, we use only the power that is required.
In tandem with this, our design for LDC includes the installation of a gas-powered CHP plant. The Association for Decentralised Energy estimates that approximately 7% of energy is lost during transmission and distribution from the grid. Generating energy on-site consequently reduces these losses, as well as ensuring that the power generated is regulated according to the requirements of the IT and cooling solutions it is matched with.
We believe that energy efficiency is only part of the story, and that to be ‘energy smart’ we also need to create value from what would otherwise be wasted heat. In addition to operating at the upper end of the ASHRAE scale, we are deploying solutions that specifically increase the possibility of repurposing heat that would otherwise be wasted. Reusing heat recovered from Asperitas’s immersed cooling solution enables us to think in terms of carbon abatement and facilitates a conceptual shift from ‘energy consumption’ toward ‘energy production’.
Having already been awarded CEEDA Gold for both Design & Operate, as well as a European Code of Conduct Award for energy efficiency, LDC has proved that data centres don’t need to be large and centrally located to pack a powerful IT-punch.
What an amazing end to the year! Congratulations and thank you to our dedicated, hard-working team who are every bit as committed to creating a step-change in the data centre industry as we are.
Congratulations to all three of these organisations – all of whom are DCA Corporate Partners.
2019 was an exceptional year for Datum Datacentres as our strong growth continued to gather momentum. Datum is showing above-market revenue growth and, in June 2019, our success was reflected in our calendar year 2018 accounts, which showed our first EBITDA profit. This profit is a clear demonstration of the positive financial outcome from our operating decisions since our launch in 2012 – our firm belief in our service-enhanced offering has been rewarded and reflected in our ongoing success.
Still basking in the news of our EBITDA profit, we were thrilled when the results of our client survey (published in July) revealed that Datum had achieved a world-class category Net Promoter Score® (NPS®) of 91. An excellent Net Promoter Score, which demonstrates a very high likelihood that Datum clients would recommend Datum to their peers or colleagues, means so much to us because of our unwavering dedication to offering outstanding service.
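For readers unfamiliar with the metric, a Net Promoter Score is derived from the standard 0–10 "would you recommend us?" survey question: the percentage of promoters (scores of 9–10) minus the percentage of detractors (0–6). A minimal sketch with made-up responses (the sample data is illustrative, not Datum's survey results):

```python
def nps(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..+100 scale.
    Passives (7-8) count in the total but in neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 19 promoters, 1 passive, 0 detractors out of 20 responses -> NPS of 95
sample = [10] * 12 + [9] * 7 + [8]
score = nps(sample)
```

A score above 70 is generally considered world-class, which puts a result of 91 in context: virtually every respondent must be a promoter and almost none a detractor.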
Amazingly our year got even better when it was announced in December 2019 that we had been shortlisted for the Best Performing Company (Data Centre & Hosting Services) award in the 2020 Megabuyte Emerging Stars awards. The Emerging Stars awards are an independent ranking of the UK’s best-performing technology scale-ups as defined by Megabuyte’s proprietary Megabuyte Scorecard benchmarking methodology. They celebrate the achievements of the UK’s 25 best performing tech scale-ups against five financial criteria including size, growth and margins. What a wonderful way to end the year.
Where do we go from here?
After such an incredible year in 2019 - now with over 50 clients, revenue close to £5m and another year of 20% growth - we are keen to take our successes and use them as a springboard to further achievements in 2020. Clearly, we are hopeful that we will kick-start 2020 by winning the ‘Emerging Stars’ category at the Megabuyte awards in January, although we are well aware that we are up against some strong competition.
As is so often the way with a growing business, nothing ever stands still for long. The dust settled on our inimitable Marketing Manager’s retirement in October (although Lexie is still ‘on call’ for us, as needed, thankfully) and we were very pleased to welcome two new members of staff to our growing team during 2019: Melissa Munday and Aanisa Allali-Williams. Aanisa will be taking over as Business Administrator and Melissa will be stepping into Linda Martino’s shoes as Client Service Manager from January 2020 when Linda retires to the West Country (with both Lexie and Linda retiring to this neck of the woods within the space of one year we are starting to wonder whether this might be the destiny that ultimately awaits us all at Datum…). In light of our overarching focus on client service, both Melissa and Aanisa will be performing essential roles in ensuring that we continue to offer the best possible service to our clients and partners as we seek to achieve even greater things in 2020.
Looking ahead to the new year, we have some exciting growth plans up our sleeves. Alongside our existing ground floor data centre, we have prepared our upper floor shell and core as virgin space that is available for custom technical design. It goes without saying that clients who build out this space to their desired specification will continue to benefit from Datum’s existing high service levels, established connectivity, and its ultra-secure location within a List X park. Bring on 2020!
Datum’s service-enhanced colocation provides a proven platform for digital transformation projects from our environmentally efficient facility within the high-security Cody Technology Park in Farnborough. Our data centre services are underpinned by a highly available and highly secure, carrier-neutral environment. We can provide clients with Edge Pods, partial or full racks, or dedicated suites, along with custom builds for significant requirements. These offerings are backed by extensive supporting services including a 24/7 manned helpdesk, an advanced Datacentre Infrastructure Management (DCIM) system team and 24/7 technical support engineers. Additional support is offered to our clients through our range of datacentre engineering services.
We’re thrilled that the EcoCooling-supported Boden Type DC project won ‘Nonprofit Industry Initiative of the Year’ at the coveted DCD Awards in London last week.
The DCD Awards are a special night where the industry’s best data centre projects and most talented people are celebrated. This year’s gala ceremony took place at the Royal Lancaster Hotel in London on Thursday 5th December.
The Boden Type DC project was selected as a finalist by an independent panel of data centre experts from hundreds of entries submitted from across the world. Months of deliberation followed the announcement of the finalists before the winners were revealed at the awards ceremony. On the night, EcoCooling was presented with the award in collaboration with H1 Systems, RISE and the Boden Business Agency as part of a brilliant evening of celebration.
This award is designed to recognise the great initiatives that NGOs, professional bodies and academia put together to educate and influence the data centre sector. Funded by the EU’s Horizon 2020 programme, this prototype 500kW facility in the small town of Boden uses every trick in the book to lower its environmental impact.
A huge congratulations to all those involved in the project. It’s a fantastic achievement to be recognised at this level, and it showcases how important the project is for the future of the industry.
Find out more about the Boden Type DC project here.
Satellite communications innovator and space gateway Goonhilly Earth Station today announced that it has inked a partnership agreement with the Australian Space Agency to collaborate and create new opportunities in the space economy in Australia, the UK and beyond.
The new statement of strategic intent and cooperation aims to help progress the Australian space sector and make the benefits of space more accessible for businesses, governments and institutions.
One activity forming part of the agreement and already underway is Goonhilly’s involvement in the proposed SmartSat CRC (co-operative research centre) space research initiative. This consortium aims to enhance connectivity, navigation and monitoring capability across Australia and to maximise the country’s resources by solving major satellite system and advanced communications challenges.
Another is Goonhilly’s commitment to help develop Australian-based deep space communication assets. Goonhilly opened an office in Australia in 2018, run by industry veteran Bob Gough, and will invest further in infrastructure and facilities as part of its wider plan to support deep space projects globally.
The partnership will also help to make the benefits of space more accessible for Australian businesses. For example, Goonhilly’s Enterprise Zone status is unique in the space industry, offering businesses from Australia and beyond the chance to be part of a rich space ecosystem and to build a cost-effective European base from which to grow their business.
To help accelerate the growth of the Australian space economy, Goonhilly is also adopting its tried and tested UK business model, which is built around the development of close working relationships with local and national government, academia, business and other industry players.
Dr Megan Clark AC, Head of the Australian Space Agency said: “We welcome Goonhilly’s intent to invest in communications, and research and development in Australia. The agreement will provide greater opportunities for technology transfer and the creation of local skilled jobs in the space sector."
“Both Goonhilly and the Australian Space Agency have a shared commitment to increase the opportunities afforded by space exploration and development. Through this partnership we will enhance the capabilities and competitiveness of both Australian and UK industry, to forge productive international collaborations and promote investments in space,” commented Ian Jones, CEO of Goonhilly.
The two organisations share a common vision: to forge international collaborations and promote investments in space capabilities and capacities that help to accelerate the growth of the Australian space economy; to help improve the lives of all Australians through the development of innovative products and services; and to provide new opportunities by enhancing the capability and competitiveness of Australian industry.
About Australian Space Agency
The Australian Space Agency, established in July 2018, will transform and grow a globally respected Australian space industry that lifts the broader economy, inspires and improves the lives of all Australians. Priorities for the Agency include communications technologies, operations and ground stations; positioning, navigation and timing; space situational awareness and debris monitoring; research and development (including leapfrog technologies); earth observation services; and remote asset management in space and on earth.
Goonhilly is a global communications services hub and satellite station located in Cornwall, UK. It provides a comprehensive range of leading-edge connectivity and operational solutions to the space industry, GEO, MEO and LEO satellite fleet operators, broadcasters, as well as a wide diversity of enterprises seeking to grow their businesses on earth and in near and deep space. Customers include SES, Intelsat, Eutelsat and Inmarsat, as well as space agencies, governments, broadcasters and others. Since 2014 the partners in Goonhilly Earth Station Ltd. have been focused on building the company and investing in the site. Goonhilly has Enterprise Zone status, the UK government’s flagship programme for technology parks. www.goonhilly.org
Britain’s largest private-sector tech network brings together stakeholders to back cohesive roll-out of 5G
Tech London Advocates (TLA) and Interxion have launched a dedicated Working Group of investors, leaders, entrepreneurs, policymakers and mobile operators to address the challenges of, and campaign for the opportunities presented by, 5G technology in Britain.
This community of ecosystem stakeholders is independent of government and led by the private sector to help deliver the potential of full 5G roll-out to consumers, businesses and wider society. 5G technology promises radically faster download speeds and increased network reliability, and will enable advanced technologies including Artificial Intelligence, the Internet of Things (IoT) and driverless cars.
Earlier this year, research conducted by Barclays demonstrated that by 2025, superfast 5G networks could add £15.7 billion to the UK economy. Yet, there are critical hurdles to be addressed before 5G is available across Britain – notably extensive infrastructure investment, a clear, funded national strategy and increased knowledge amongst enterprises of 5G’s merits. Pre-election announcements from all major UK political parties promising ubiquitous broadband have brought additional focus on 5G and the UK’s digital strategy.
The TLA 5G Working Group, hosted by data centre experts, Interxion, will be launched at City Hall on Thursday 28th November 2019 and is open to TLA members. The inaugural members will convene to set the agenda for this not-for-profit community and hear from a number of industry leaders to understand the current state of 5G technology in the UK. The event will aim to address the disconnect between Mobile Network Operators (MNOs) claiming 5G networks are already accessible in major UK cities and the realities of transforming connectivity for mass economic benefit throughout Britain.
The TLA and Interxion 5G Working Group will be steered by a voluntary board of senior industry figures, including Russ Shaw, founder of Tech London Advocates and Global Tech Advocates, Kurtis Lindqvist, CEO of London Internet Exchange (LINX), Andrew Fray, Managing Director of Interxion, Mark Gilmour, VP of Colt, Nicolas Bombourg, Managing Director of Budde, and chaired by Caroline Puygrenier, Director of Strategy and Business Development at Interxion.
Russ Shaw, Founder of Tech London Advocates and Global Tech Advocates comments:
“The tech titans of the US and China have unquestionably stolen a march on the UK and are hurtling forwards with the roll-out of 5G for both consumers and businesses. For Britain’s thriving tech community to remain internationally competitive and at the forefront of global advancements, it is imperative that everyone can utilise the latest in digital infrastructure. The UK is leading in a number of verticals including artificial intelligence, smart cities and fintech – sectors demanding a 5G ecosystem that accelerates growth and doesn’t hinder it. This new collective of advocates will lay a stake in the ground for UK tech and campaign to unlock 5G in the UK.”
Caroline Puygrenier, Director of Strategy and Business Development at Interxion comments:
“5G technology has been hailed as an enabler for the 4th Industrial Revolution, supporting the digitalisation of enterprises. However, the main emerging use case so far has been consumer related, as we download and consume content at much faster speeds than with 4G. The timing is right to form a business community focussed on 5G, and we’re thrilled to be working alongside TLA on this initiative, to demonstrate that the technology has huge potential for almost every sector – from transport, logistics and manufacturing, to financial services, digital media and healthcare. Together, Interxion and TLA have the expertise, the technical infrastructure and membership network to support businesses across Britain to unlock crucial opportunities from 5G.”
Those who wish to attend the launch event can register their interest via Interxion’s website here.
Interxion (NYSE: INXN) is a leading provider of carrier and cloud-neutral colocation data centre services in Europe, serving a wide range of customers through 52 data centres in 11 European countries. Interxion’s uniformly designed, energy efficient data centres offer customers extensive security and uptime for their mission-critical applications.
With over 700 connectivity providers, 21 European Internet exchanges, and most leading cloud and digital media platforms across its footprint, Interxion has created connectivity, cloud, content and finance hubs that foster growing customer communities of interest. For more information, please visit www.interxion.com.
WHY WE’VE OPENED A NEW OFFICE IN GERMANY
By Mat Jordan, Procurri
Procurri’s newest European base is in Erlangen, Germany, Europe’s largest economy. This new location will service not just those doing business in Deutschland, but also those in the UK with European operations. Brits are still no further forward in knowing how their departure from the European Union will pan out, but Procurri customers can be sure of a steady supply chain of hardware and ‘always-on’ 24/7 service whatever turn the negotiations or eventual exit takes. Such reliability and steadfastness even in times of such uncertainty gives Procurri customers a hefty advantage over their competitors.
Procurri offers a full Supply Chain Management service, allowing customers to enhance and streamline their global distribution value chain through outsourcing it and taking advantage of Procurri’s experience and existing strategic relationships in the space. Furthermore, their unique reverse logistics framework allows for Hardware Resale to buy, sell or consign data centre equipment through asset trade-in and buy-back programmes.
Lifecycle-wise, Procurri’s new German presence will support all three existing service strands: Independent Maintenance Services, extending support for out-of-warranty and end-of-life IT equipment via a 24×7 global helpdesk network; Hardware Supply, covering new authorised parts, refurbished parts and complete systems; and ITAD (IT Asset Disposition), complementing the overall value chain through an end-to-end suite of offerings.
The new German Procurri warehousing operation complements the firm’s 14 locations across three regions (EMEA, the Americas and APJ), servicing over 100 countries worldwide. EMEA (Europe, the Middle East and Africa) is a key region, and this new European presence will strive to raise service levels even higher for those based or doing business there. Germany is well known for its business efficiency and manufacturing excellence, so Procurri sees the country as a natural fit for its purpose: offering unbeatable, all-encompassing service excellence that makes customers’ work easier and gives them an advantage their competitors can’t hope to match.
We also hired a new CTO with decades of experience in the data centre industry: a real visionary who will help Submer pave the way toward a future of sustainable data centres.
Last but not least, we announced a strategic partnership with Colt Data Centre Services, London (UK).
In November, we proudly deployed our Immersion Cooling solution at Retaam Solutions, an IT & Telecom Company based out of Riyadh, KSA.
Submer at Retaam Solutions HQ
During the same month, Submer and Evocative had their first Open House and Immersion Cooling Demo event at Evocative Data Center, San José, California. Submer and Evocative (a leading provider of Internet infrastructure, colocation, hosting and managed services) partnership “is just the first step towards promoting the next generation of efficient and high-performance data centers, a mission Evocative is very proud to be a part of”, as Evocative’s CEO, Arman Khalili said.
Submer’s SmartPodX will be at Evocative Data Center for the next several months. “The SmartPodX is such a monumental leap in cooling efficiency, you have to see it to believe it. That’s why we are partnering with industry visionaries like Evocative to host educational demonstrations across the globe,” stated Jeff Brown, Submer’s Managing Director in North America.
Submer and Evocative Team
The Open House and Immersion Cooling Demo was also the occasion for Submer to present our new CTO: Scott Noteboom. The newly appointed CTO, formerly Head of Infrastructure Strategy at Apple and Yahoo, will bring his experience to help drive strategic innovation across Submer's line of immersion cooling systems and autonomous infrastructure platform.
"As soon as I met Scott, I knew he was the perfect fit,'' stated Daniel Pope, CEO of Submer. "Bringing more compute power in smaller footprints with dramatically reduced power consumption is more than good business strategy. It’s a practical, economic, and environmental imperative. Scott gets it, no one understands the challenges and possibilities better, and there's no one better to help us drive that innovation forward."
At SC19, we launched the SubmerStrings, an exclusive, interactive initiative to help our audience check their knowledge around Immersion Cooling and related topics (such as what the PUE is, what the most important benefits of Immersion Cooling are, etc.) The SubmerStrings gave us some interesting results that confirmed:
1. Immersion Cooling is, quite literally, a technology on the rise. The demands of modern infrastructure have begun to surpass the capabilities of the traditional air-cooled rack, and liquid cooling technologies appear to be the only viable way to:
a. Guarantee high computational density.
b. Dissipate greater quantities of heat per rack compared to air-cooling.
c. Save on electricity and improve efficiency.
2. The importance of education to debunk many myths around Immersion Cooling and prepare the ground for the HPC and datacenter industry to adopt a disruptive technology such as Immersion Cooling.
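PUE, mentioned above as one of the SubmerStrings quiz topics, is the standard efficiency ratio the immersion cooling argument rests on: total facility energy divided by the energy consumed by IT equipment alone. A minimal sketch of the calculation follows; the kW figures are purely illustrative assumptions, not measurements from any Submer deployment.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt drawn by the facility goes to
    compute; cooling and other overheads push real sites above that.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical figures: an air-cooled room drawing 150 kW overall for
# 100 kW of IT load, versus an immersion-cooled deployment where the
# cooling overhead is far smaller.
print(round(pue(150.0, 100.0), 2))  # air-cooled example   -> 1.5
print(round(pue(105.0, 100.0), 2))  # immersion example    -> 1.05
```

The lower the ratio, the smaller the share of power spent on anything other than the servers themselves, which is why the ability of immersion systems to dissipate more heat per rack translates directly into electricity savings.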
Finally, Submer is proud to announce a strategic partnership with Colt Data Centre that led to the deployment of a demo unit at their HQ in London and to training sessions with Daniel Pope (Submer’s CEO) and Diarmuid Daltún (Submer’s CCO).
And that is not the end!
During the next few weeks, just to end 2019 with a big bang, we will announce another important partnership with a major data centre colocation provider! Stay tuned!
By Mick Payne, Managing Director
2019 has been another interesting and rewarding year for Techbuyer. We have committed to a number of long term research projects which should generate some useful information and tools for the sector going forwards. We were also recognised for some of our success in the past.
Began KTP with the University of East London
July saw the beginning of Techbuyer’s two-year Knowledge Transfer Partnership (KTP) with the University of East London. The research project aims to revolutionise the way data centre managers think about IT hardware configuration. Creating a tool that models data centre energy efficiency under optimal hardware configuration will enable us all to make the best use of resources. The first phase of the research yielded ground-breaking results on the performance of older generations against new, which we shared at DCW Frankfurt in November (see below).
Presentation at Data Centre World Frankfurt
Nour Rteil and Rich Kenny from our KTP team presented and demonstrated their benchmarking process and results for older and newer generation servers at DCW Frankfurt in November. Initial findings revealed a substantial energy efficiency increase from using a balanced memory configuration. The optimal memory configuration depends not only on the total memory capacity, but also on the number of channels populated and the types of DIMMs used.
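The intuition behind a "balanced" configuration is that the memory controller can interleave accesses across channels only when every channel is populated alike. A small sketch of that check follows; the channel count and DIMM sizes are illustrative assumptions, not parameters from the Techbuyer/UEL benchmarks.

```python
def is_balanced(channel_capacities_gb: list[int]) -> bool:
    """True if every memory channel holds the same capacity and none is empty.

    Equal, fully populated channels let the controller interleave across
    all of them at once, which is what drives the bandwidth (and hence
    efficiency) gains of a balanced layout.
    """
    return len(set(channel_capacities_gb)) == 1 and channel_capacities_gb[0] > 0

# Two hypothetical 6-channel layouts with the same 96 GB total capacity:
balanced   = [16, 16, 16, 16, 16, 16]   # capacity spread evenly
unbalanced = [32, 32, 32, 0, 0, 0]      # capacity piled onto three channels

print(is_balanced(balanced))     # True
print(is_balanced(unbalanced))   # False
```

Both layouts look identical on a spec sheet (96 GB total), which is exactly why total capacity alone is a poor guide to performance.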
Launched IT Asset Disposition Section of the company
Techbuyer kicked off the new year with the launch of a new section of the business devoted to IT Asset Disposition, a specialised service providing secure removal and certified data sanitisation to data centre customers. Simply put, it is a secure and environmentally responsible way to retire IT infrastructure.
Opened technical facility and sales office in France
Techbuyer continued the fast rate of change by opening a technical facility and sales office in Paris, France in February. The sixth of our international operations, our French operation also expanded the company’s product range to include PCs and laptops; an area in which the team there have a lot of expertise.
Listed in Sunday Times Fast Track 200
We were ranked mid-table in The Sunday Times Fast Track in February, a recognition of our fast-growing international sales. With facilities in the USA, Germany, New Zealand, Australia and France, Techbuyer has exported to over 100 countries and has a sales team that speaks seven languages. Techbuyer was rated 102nd overall and was one of the top ten in Yorkshire.
Electrical Review Award for WindCORES project
Techbuyer was delighted to win Sustainable Project of the Year at the Electrical Review Excellence Awards in London in June, just a few weeks after being shortlisted as Data Centre Energy Efficiency Project of the Year at the DCS Awards. The project in question was a partnership with German green energy company WestfalenWind, which furnished a wind-powered data centre inside a wind turbine with refurbished servers.
Partnership with IT Schools Africa
As a company, Techbuyer has a commitment to closing the loop on equipment use wherever we can. IT Schools Africa is a wonderful organisation that enables our IT Asset Disposition (ITAD) service to do that. Some of the equipment collected by our ITAD division – mice, keyboards and monitors for example – is not part of our core inventory and not profitable for us to sell. Rather than send it for recycling, we have made contact with IT Schools Africa, who test and refurbish all equipment before shipping for a second life in African schools.
Support of UN Sustainable Development Goals
In October, Techbuyer officially announced its support of the UN Sustainable Development Goals with nine targets set against Goal 3 (Good Health and Wellbeing), Goal 4 (Quality Education) and Goal 12 (Responsible Consumption and Production). Doing what we can in the areas where we can make a difference is a great morale booster for the company and pushes us to do more to support the global movement to end poverty, fight inequality and prevent climate change by 2030.
Panel discussion at DCA’s Data Centre Re-Transformation Manchester
It was an honour to be able to join the conversation on Building a Circular Economy at the DCA’s Data Centre Re-Transformation conference in Manchester in September. Speaking alongside Tech UK, Green IT, the Uptime Institute and BEIS, it was amazing to see the level of dedication in the industry towards finding an alternative to the traditional “take, make, waste” approach to data centre provisioning.
Panel discussion at DCD London
Techbuyer was invited to take part in the “Journey towards a circular economy for the data centre industry” panel discussion at DCD London in November along with other members of the CEDaCI project. Bringing together different participants in the value chain was a great way to begin the journey that will turn “e-waste” into “e-resource” in the coming years. With approximately 121 million server units set to be deployed between 2019 and 2025, finding better ways of reusing and recycling will be vital for the sector.
Manchester data centre provider launches new cloud platform to eliminate the risk of downtime
Data centre operator TeleData UK Ltd has officially launched its new cloud hosting platform, CloudActiv, with an event held at the Great John Street Hotel Rooftop Garden in Manchester last week.
Joined by over 50 guests, the firm announced that its new platform would be made available to both new and existing customers, and that it will not be charging any additional fees for the increased levels of resilience. Every cloud hosted client will benefit from the new platform’s capabilities at no additional cost.
TeleData invested a six figure sum into the CloudActiv platform, which provides customers with truly resilient cloud hosting and effective disaster recovery of business-critical data and applications.
Using Active-Active technology, TeleData has extended its public cloud platform over multiple data centres to provide constant sub-millisecond replication of data between two identical platforms. Active-Active architecture eliminates the risk of downtime by making it possible for customers’ data to be served from separate, synchronised hosting environments at any given time.
This delivers immediate ‘active-active’ failover to protect customers against any full-site data centre outage, providing built-in, automatic disaster recovery and removing the expense and complications of building and managing separate standby environments. This means that if one hosting location goes down, another instantly takes over, so that a customer’s business-critical resources remain up and running without any disruption of service.
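The failover behaviour described above can be sketched in a few lines: two synchronised sites, both able to serve, with each request routed to whichever is healthy. The site names and the health flag are assumptions made for illustration only; they are not details of TeleData's CloudActiv platform.

```python
class Site:
    """A hosting location that may or may not be available."""
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def serve(self, request: str) -> str:
        return f"{self.name} served {request}"

def route(request: str, sites: list[Site]) -> str:
    """Send the request to the first healthy site; fail over transparently.

    Because both sites hold a synchronised copy of the data, either can
    answer, so a full-site outage costs nothing but a routing decision.
    """
    for site in sites:
        if site.healthy:
            return site.serve(request)
    raise RuntimeError("no healthy site available")

dc_a, dc_b = Site("DC-A"), Site("DC-B")
print(route("GET /app", [dc_a, dc_b]))  # normally served by DC-A
dc_a.healthy = False                    # simulate a full-site outage
print(route("GET /app", [dc_a, dc_b]))  # DC-B takes over, no client change
```

The contrast with a traditional active/standby setup is that nothing needs to be "brought up" on failure: the second environment is already live and in sync, which is what removes the separate disaster-recovery expense.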
Commenting on the launch, TeleData Commercial Director Matthew Edgley said: “We’re taking significant steps to make the TeleData cloud one of the most resilient platforms on the market. Many providers today rely on a single data centre location, or second site options at significant extra cost, but this is usually far from the seamless solution that we believe cloud users should expect.
“We feel a steadfast responsibility to offer maximum uptime as standard, rather than as a configurable option at an increased price point, continuing to make us the go-to technical and commercial choice for business-critical cloud hosting”.
TeleData UK is Manchester’s only premium, independently owned data centre, providing enterprise class data centre and cloud hosting solutions to businesses across the UK.
Founded in 2007, TeleData UK provides colocation, server hosting, workplace recovery and data centre services to businesses across the UK. The firm’s solutions are designed to enable organisations to protect their applications, data and online presence from the damaging effects of downtime, and to make the most effective and efficient use of technology with secure and scalable hosting platforms tailored to business critical needs.