For as long as I’ve been involved in the world of IT, there’s always been a bottleneck (or three) that has slowed down the overall performance of the infrastructure designed to connect applications and users. Many moons ago, the very first storage service providers, with not a cloud on the horizon, found that their great idea had one major snag – latency. Yes, encouraging end users to offload their storage to a professional third-party organisation made a great deal of sense, but not if it took days to back up or retrieve the data. Of course, there are many other examples where either the compute, network or storage element of an IT infrastructure could not keep up with the other disciplines. And, right now, it seems that even the applications themselves are coming under scrutiny in terms of how they are written and how they perform on different devices – speed, flexibility and agility being the watchwords.
In this issue of Digitalisation World, we turn our attention to another important – if not the most important – component of the overall IT infrastructure: the data centre. With data centre managers finally realising that their marvellous facilities do not exist in splendid isolation, but actually to house and empower or facilitate IT hardware and software, data centre design, location, operations and management are making a significant, positive contribution to the digital world. As the articles in this digital publication demonstrate, whether it’s hyperscale, the edge, outsourcing, hybrid, 5G and IoT, AR and VR, blockchain or AI, the data centre is central to the successful implementation of virtually any IT project.
Alongside the data centre predictions articles, you’ll find the final tranche of more general 2019 predictions. DW received over 160 responses to our request for a little bit of crystal ball gazing – thanks to all those who responded – and we think that they make for very interesting reading.
84% of business leaders think artificial intelligence (AI) will be critical to their business in three years, but only 15% are aware of AI projects being fully implemented in their organisations.
Despite executives having high expectations for the impact that AI will have on their businesses, according to Cognizant’s new report, ‘Making AI Responsible – and Effective’, only half of companies have policies and procedures in place to identify and address the ethical considerations of its applications and implementations. The study analyses the responses of almost 1,000 executives across the financial services, technology, healthcare, retail, manufacturing, insurance and media & entertainment industries in Europe and the US.
Whilst executives are enthusiastic about the importance and potential benefits of AI to their business, many lack the strategic focus and consideration for the non-technical aspects that are critical to the success of AI, such as trust, transparency and ethics.
Optimism and enthusiasm
The research shows that business leaders are positive about the importance and potential benefits of AI. Roughly two-thirds (63%) say that AI is extremely or very important to their companies today, and 84% expect this will be the case three years from now. Lower costs, increased revenues and the ability to introduce new products or services, or to diversify, were cited as the key advantages for the future.
Companies that are growing much faster than the average business in their industry, in particular, expect major benefits in coming years, with 86% of executives of these fast-growing companies stating AI is extremely or very important to their company’s success, compared with 57% of those at their competitors with slower growth. These industry leaders say they plan to use AI to drive further growth, solidifying their leading positions and pulling even further away from the pack. This is reflected in their greater investment in key AI technologies, including computer vision (64% vs 47%), smart robotics/autonomous vehicles (63% vs 43%) and analysis of natural language (67% vs 42%).
Almost half of those companies (44%) undertaking at least one AI project expect to increase their employee headcount over the next three years as a result of the impact of AI projects. Retail and financial services industry executives were most likely to expect a boost to employment (56% and 49% respectively).
Disconnect between optimism and actual implementation
However, business leaders’ optimism is disconnected from the actual implementation in many companies. While two-thirds of executives said they knew about an AI project at their company, only 24% of that group – just 15% of all respondents – were aware of projects that were fully implemented.
Some 40% of respondents said that securing senior management commitment, buy-in by the business and even adequate budget were extremely or very challenging, indicating that many companies are not yet fully committed to AI’s central role in advancing business objectives.
Neglect of ethical considerations
Only half of companies have policies and procedures in place to identify and address ethical considerations – either in the initial design of AI applications or in their behaviour after the system is launched.
Sanjiv Gossain, European Head, Cognizant Digital Business, commented: ‘The challenge today is less about understanding technical questions and technology capabilities, and more about crafting a strategy, determining the governance structures and practices needed for responsible AI’.
According to Gossain, ‘companies need to pay more attention to the non-technical considerations of AI deployments, many of which are more critical and complex than those related to developing and running the technology itself. AI operates in the real world, and considering these factors will not only help companies determine which technologies can be used to advance business objectives, but also which have the potential to irritate customers, alienate employees, drive up R&D and deployment costs, and undermine brand reputation’.
The report provides three key recommendations to help companies take action and achieve the significant business benefits of AI:
Emerging markets are the most digitally mature, based on the latest Digital Transformation Index with 4,600 business leaders from 40+ countries.
Despite the relentless pace of disruption, the latest Dell Technologies Digital Transformation (DT) Index shows many businesses’ digital transformation programs are still in their infancy. This is evidenced by the 78 percent of business leaders who admit digital transformation should be more widespread throughout their organisation (UK: 71 percent). More than half of businesses (51 percent) believe they’ll struggle to meet changing customer demands within five years (UK: 22 percent), and almost one in three (30 percent) still worry their organisation will be left behind (UK: 19 percent).
Dell Technologies, in collaboration with Intel and Vanson Bourne, surveyed 4,600 business leaders (director to C-suite) from mid- to large-sized companies across the globe to score their organisations’ transformation efforts.
The study revealed that emerging markets are the most digitally mature, with India, Brazil and Thailand topping the global ranking. In contrast, developed markets are slipping behind: Japan, Denmark and France received the lowest digital maturity scores. The UK came in 19th place, in front of Germany (24th) but behind Russia (17th) and Spain (14th). What’s more, emerging markets are more confident in their ability to “disrupt rather than be disrupted” (53 percent), compared to just 40 percent in developed nations.
Behind the curve
The DT Index II builds on the first ever DT Index launched in 2016. The two-year comparison highlights that progress has been slow, with organisations struggling to keep up with the blistering pace of change. While the percentage of Digital Adopters has increased, there’s been no progress at the top. Almost four in 10 (39 percent) businesses are still spread across the two least digitally mature groups on the benchmark (Digital Laggards and Digital Followers).
“In the near future, every organisation will need to be a digital organisation, but our research indicates that the majority still have a long way to go,” says Michael Dell, chairman and CEO of Dell Technologies. “Organisations need to modernise their technology to participate in the unprecedented opportunity of digital transformation. The time to act is now.”
Barriers to transformation and confidence
The findings also suggest business leaders are on the verge of a confidence crisis, with 91 percent held back by persistent barriers.
Globally, the top five barriers to digital transformation success:
Almost half of all surveyed organisations (49 percent) believe they will struggle to prove they are trustworthy within the next five years. This figure drops significantly to 16 percent in the UK. Meanwhile, nearly a third (32 percent) don’t trust their own organisation to comply with regulations such as the EU General Data Protection Regulation (UK: 31 percent) and one in three (33 percent) don’t trust their own organisation to protect customer data (UK: 26 percent).
Plans to realise their digital future
Leaders around the world have reported common priorities and investments to aid future transformation, including an increased focus on workforce, security and IT. Forty-six percent are developing in-house digital skills and talent, by teaching all employees how to code for instance, up from 27 percent in 2016. For the UK this has increased from 27 percent in 2016 to 49 percent in 2018.
The top technology investments around the world for the next one to three years will be in:
How organisations fare in the future will depend on the steps they take today. For instance, Draper, a Dell Technologies customer, was traditionally focused on US Department of Defense research, but it’s starting to move into more commercial areas such as biomedical science.
“Technology enables us to keep solving the world’s toughest problems; from the infrastructure and services that underpin our innovation, to the experimental technologies that we wield to prevent disease for instance,” says Mike Crones, CIO, Draper. “We couldn’t push boundaries, and call ourselves an engineering and research firm, without being a fully transformed and modern company from the inside out.”
Only three in ten organisations make technology decisions in the boardroom, despite almost a third utilising strategic IT change to deliver increased revenues.
Coeus Consulting has published new research revealing that although the fate of many organisations depends on their ability to implement strategic change and to adopt disruptive technologies, a reported lack of business and IT alignment, coupled with a corporate fear of risk, means they risk losing out on crucial revenues and market share.
Just twenty-one percent of those surveyed stated they seek to implement new technology as soon as possible, with some of the main barriers to adoption being fear of disruption to core business (30%), lack of budget to adopt new technology (21%), and poorly planned adoption strategies (19%).
“While it is reassuring that organisations are at least attempting to keep up with disruptive technologies, it is somewhat concerning that they are not doing more. Monitoring advancements is the first step on the road, but only three in ten organisations make technology decisions in the boardroom. With technology now playing a vital role in every industry, organisations need to increase their understanding of technology and be prepared to take more calculated risks in order to reap the benefits and execute successful strategic change”, Keith Thomas, Head of IT Strategy Practice, Coeus Consulting commented.
Successful implementation rates are low among respondents which could explain these fears, with only seven percent noting that all of their organisation’s strategic IT change projects have met initial objectives over the past two years. The good news is that, of those from organisations that have a test and learn culture, and also set objective success or failure criteria for initiatives in advance, almost sixty percent report that their organisation investigates or adopts a different approach when initiatives don’t meet objective success criteria. “Organisations are blinkered to the market and must be willing to tread the fine line between adopting technologies quickly and rushing the process by investing in the wrong technology, otherwise they risk being overtaken by their competitors and will see declining revenues”, commented Ben Barry, Director, Coeus Consulting.
Aligned and informed organisational leadership is clearly an issue within organisations where at least some strategic IT change projects have not met initial objectives, with just over seventy percent citing business plans changing, senior management not buying into the change, or not taking enough risks as a reason for failure. “This is disconcerting: if those at board level are failing to see the benefits of strategic IT change, then implementation, adoption and deployment of new technologies is destined to fail. Businesses need to ensure board-level understanding of the importance of IT, as well as building stronger strategic IT change capabilities”, added Thomas.
“Consumer demand for new and improved offerings, paired with demand for digitalisation from the business, means that organisations not only need to increase the speed at which they are doing things, but must also match, or stay ahead of the offerings from disruptive and agile competitors”, Thomas noted.
Seeking to discover how organisations view the next wave of disruptive technology, the research found that almost a third (29%) of respondents believe artificial intelligence represents the most significant innovation set to impact their industry in the next two years, with data and analytics (18%) next in line. Despite their predictions on the next generation of technology, only 38% of respondents say they operate with dedicated teams monitoring the latest advancements. This suggests that more than sixty percent of organisations could be operating with little knowledge of innovations taking place outside their four walls.
Despite the current economic climate, funding seems to be a secondary issue. Last year’s research found that just over six in ten respondents (62%) predicted an increase in the size of their budget for the coming year. In actual fact, only fifty percent of respondents from this year’s survey reported an increase.
However, just over fifty percent of respondents reported that digital services are being funded from the IT budget in their company, and additional funding is also allocated from elsewhere. Indeed, approaching six in ten (57%) are anticipating an increase in their budget for the financial year 2019 to 2020. This indicates that business leaders appreciate the need for IT in their current and future operations to the point of allocating funding, but not always to the point of consistently aligning with their IT counterparts.
Increasing operational efficiency (49%), customer satisfaction (32%) and increasing revenues/sales (31%) top the list of drivers of strategic IT change projects, demonstrating that expectations around the business value of IT change are not yet being effectively met.
Businesses need to recognise the consequences that slowing IT spend, and ultimately, stagnating progress, could have on their business prospects. Taking unnecessary risks could lead to the downfall of an organisation, but in reality, spending on technology and taking a fail-fast, calculated approach to IT risk is now a necessity.
Benefits of digital technology are well-known, but a distinct lack of cohesion around how to effectively adopt it remains.
A new research report produced by SoftwareONE, a global leader in software and cloud portfolio management, has revealed that the majority of organisations (58 per cent) do not have a clearly defined strategy in place when it comes to adopting and integrating digital workspace technology. This indicates how, in many organisations, implementing and making use of such technology is still being carried out in something of a haphazard manner, meaning that they will struggle to truly maximise its potential unless they take steps to overhaul their strategic approach.
The findings of the research are summarised in SoftwareONE’s Building a Lean, Mean, Digital Machine report, which has been released today. The report also found that, despite the fact that almost all organisations (99 per cent) employ some form of digital workspace technology, respondents have encountered a host of challenges when it comes to using them. These include higher security risks (cited by 47 per cent) and a lack of employee knowledge in how best to use the solutions (45 per cent).
For Zak Virdi, UK Managing Director at SoftwareONE, these figures should serve as a wake-up call to businesses who want to make the most of digital workspace technologies, but have not given enough thought to how to implement them in a way that maximises productivity while minimising any potential issues.
Virdi said: “It’s clear that our working lives have been made easier in many ways by digital technology – cloud apps like Office 365 and Dropbox have become very much the norm, and online collaboration tools like Smartsheet are being gratefully adopted by workers the world over.
“But bringing new digital solutions into the business without a unified, cohesive strategy in place is likely to lead to problems in the long run. These tools are designed to connect employees more effectively, but they can have the opposite effect if, for instance, one department is using a particular tool but another one is completely unaware of its existence. Moreover, introducing new technologies that are not sanctioned for use by senior leaders – frequently known as Shadow IT – can lead to security issues that can be difficult to remedy.”
This need for more clearly defined strategies is supported by the fact that digitalisation is being pushed not just by senior management, but by rank-and-file employees too. Almost two-thirds of respondents (63 per cent) believe that digital evolution is being promoted by the most senior personnel, while 30 per cent said that it is being driven by regular employees. With so many different needs to meet, a well-functioning digital workspace can only be created if there is a structured plan put in place by senior management.
Virdi added: “Employees are pushing hard for change, and it’s not just those at the top who are demanding it. There’s clear evidence that this appetite for new tech is present throughout the business, which means organisations have to work out how to cater to a very large cross-section of the workforce. With this in mind, it’s paramount that the drive to digital is built on the bedrock of a well-planned strategy.
“This should take into account the requirements of everyone at the business: senior managers and board members might seem to be the ones pushing hard for digitalisation, but this is often due to the pressure they are getting from their employees anyway.”
He concluded: “Building the digital workspace is not simply a process of introducing technologies and hoping that they take hold; it’s about having a specific lifecycle plan for every new tool that is introduced. If businesses adopt and maintain this mindset, the long-term benefits will be significant.”
Thales says that the rush to digital transformation is putting sensitive data at risk for organizations worldwide according to its 2019 Thales Data Threat Report – Global Edition with research and analysis from IDC. As organizations embrace new technologies, such as multi-cloud deployments, they are struggling to implement proper data security.
Ninety-seven percent of the survey respondents reported their organization was already underway with some level of digital transformation and, with that, confirmed they are using and exposing sensitive data within these environments. Aggressive digital transformers are most at risk for data breaches, but alarmingly, the study finds that less than a third of respondents (only 30%) are using encryption within these environments. The study also found a few key areas where encryption adoption and usage are above average: IoT (42%), containers (47%) and big data (45%).
As companies move to the cloud or multi-cloud environments as part of their digital transformation journey, protecting their sensitive data is becoming increasingly complex. Nine out of 10 respondents are using, or will be using, some type of cloud environment, and 44% rated complexity of that environment as a perceived barrier to implementing proper data security measures. In fact, this complexity is ahead of staff needs, budget restraints and securing organizational buy-in.
Globally, 60% of organizations say they have been breached at some point in their history, with 30% experiencing a breach within the past year alone. In a year where breaches regularly appear in headlines, the U.S. had the highest number of breaches in the last three years (65%) as well as in the last year (36%).
The bottom line is that whatever technologies an organization deploys to make digital transformation happen, the easy and timely access to data puts this data at risk internally and externally. The majority of organizations, 86%, feel vulnerable to data threats. Unfortunately, this does not always translate into security best practices as evidenced by the less than 30% of respondents using encryption as part of their digital transformation strategy.
Tina Stewart, vice president of market strategy at Thales eSecurity says:
“Data security is vitally important. Organizations need to take a fresh look at how they implement a data security and encryption strategy in support of their transition to the cloud and meeting regulatory and compliance mandates. As our 2019 Thales Data Threat Report shows, we have now reached a point where almost every organization has been breached. As data breaches continue to be widespread and commonplace, enterprises around the globe can rely on Thales to secure their digital transformation in the face of these ongoing threats.”
When it comes to innovation, a focus on customers, talent and data are key for success.
Only one in seven businesses (14 percent) is able to realise the full potential of their innovation investments, according to research from Accenture, while the majority are missing out on significant opportunities to grow profits and increase market value.
Over the last five years, approximately £2.5tn was spent globally on innovation. Yet, the study shows it is not how much you spend that matters, it is how you spend it. The companies bucking the trend and seeing the biggest returns are investing in bold, watershed moves rather than incremental shifts.
The survey of C-suite executives found that:
Arabel Bailey, Managing Director UKI and Innovation Lead for Accenture, said:
“Fortune favours the bold when it comes to investing in innovation. The companies reaping the biggest rewards show a “go big or go home” mentality by investing in truly disruptive innovation projects. They don’t just tinker around the edges.
“The fact that return on investment overall is dropping is a worrying trend. Businesses are spending more than ever, but their inability to see proper returns is shocking. One of the reasons for this could be that many organisations still see innovation as a peripheral activity separate to the core business; an “ad-hoc creative process” rather than a set of practices that will fundamentally change their way of doing business. It’s like going jogging once a month and then expecting to be able to run a marathon.
“Equally, some companies chase the latest tech trends without thinking about how to connect what they’re spending to the biggest problems or opportunities in their business. We have developed a more formalised approach to helping companies make more of their investment. It means thinking hard about your company, your market, your customer, your workforce, and placing your innovation bets on the things that can help to address your biggest business challenges.”
The research revealed that there are seven key characteristics that can help companies to make more of their innovation spend. Companies need to be:
Use of blockchain technology to help secure IoT data, services and devices doubles in a year.
Gemalto reveals that only around half (48%) of businesses can detect if any of their IoT devices suffers a breach. This comes despite companies having an increased focus on IoT security:
With the number of connected devices set to top 20 billion by 2023, businesses must act quickly to ensure their IoT breach detection is as effective as possible.
Surveying 950 IT and business decision makers globally, Gemalto found that companies are calling on governments to intervene, with 79% asking for more robust guidelines on IoT security, and 59% seeking clarification on who is responsible for protecting IoT. Despite the fact that many governments have already enacted or announced the introduction of regulations specific to IoT security, most (95%) businesses believe there should be uniform regulations in place, a finding that is echoed by consumers: 95% expect IoT devices to be governed by security regulations.
“Given the increase in the number of IoT-enabled devices, it’s extremely worrying to see that businesses still can’t detect if they have been breached,” said Jason Hart, CTO, Data Protection at Gemalto. “With no consistent regulation guiding the industry, it’s no surprise the threats – and, in turn, vulnerability of businesses – are increasing. This will only continue unless governments step in now to help industry avoid losing control.”
Security remains a big challenge
With such a big task in hand, businesses are calling for governmental intervention because of the challenges they see in securing connected devices and IoT services, particularly around data privacy (38%) and the collection of large amounts of data (34%). Protecting an increasing amount of data is proving an issue, with only three in five (59%) of those using IoT and spending on IoT security admitting they encrypt all of their data.
Consumers are clearly not impressed with the efforts of the IoT industry, with 62% believing security needs to improve. When it comes to the biggest areas of concern, 54% fear a lack of privacy because of connected devices, followed closely by unauthorised parties like hackers controlling devices (51%) and lack of control over personal data (50%).
Blockchain gains pace as an IoT security tool
While the industry awaits regulation, it is seeking ways to address the issues itself, with blockchain emerging as a potential technology; adoption of blockchain has doubled from 9% to 19% in the last 12 months. What’s more, almost a quarter (23%) of respondents believe that blockchain technology would be an ideal solution for securing IoT devices, and 91% of organisations that don’t currently use the technology are likely to consider it in the future.
As blockchain technology finds its place in securing IoT devices, businesses continue to employ other methods to protect themselves against cybercriminals. The majority (71%) encrypt their data, while password protection (66%) and two factor authentication (38%) remain prominent.
Hart continues, “Businesses are clearly feeling the pressure of protecting the growing amount of data they collect and store. But while it’s positive they are attempting to address that by investing in more security, such as blockchain, they need direct guidance to ensure they’re not leaving themselves exposed. In order to get this, businesses need to be putting more pressure on the government to act, as it is them that will be hit if they suffer a breach.”
It appears that many organisations will begin the New Year by reviewing their security infrastructure and taking a ‘back to basics’ approach to information security. This is according to the latest in a series of social media polls conducted by Europe’s number one information security event, Infosecurity Europe 2019.
Asked what their ‘security mantra’ is for 2019, more than half (55 per cent) of respondents say they plan to ‘go back to basics’ while 45 per cent reveal they will invest in more technology. According to Gartner, worldwide spending on information security products and services is forecast to grow 8.7 per cent to $124 billion in 2019.
When it comes to complexity, two-thirds believe that securing devices and personal data will become more (rather than less) complicated over the next 12 months. With Forrester predicting that 85 per cent of businesses will implement or plan to implement IoT solutions in 2019, this level of complexity is only set to increase with more connected devices and systems coming online.
However, many organisations will be looking to reduce complexity in their security architecture this year by maximising what they already have in place. According to Infosecurity Europe’s poll, 60 per cent of respondents say that maximising existing technologies is more important than using fewer vendors (40 per cent).
Victoria Windsor, Group Content Manager at Infosecurity Group, admits: “CISOs are managing increasingly complex security architectures and looking to streamline operations and technology in the wake of a growing skills crisis, rising costs and a myriad of compliance requirements. With many of us starting the New Year with well-intended ‘new year, new you’ resolutions, it seems that many security professionals are doing the same.”
Attracting 8,500 responses, the Infosecurity Europe Twitter poll was conducted during the week of 7 January, the first week back for many workers, and a time when many take stock of both their personal and professional goals for the year.
Infosecurity Europe also asked its community of CISOs about their focus for 2019 and discovered that complexity is a major headache regardless of industry or size of operations.
Stephen Bonner, cyber risk partner, Deloitte highlights new and impactful challenges and advises security leaders to see the ‘big picture’. “It's often said that complexity is the enemy of security, and this remains as true today as it was twenty years ago. The difference today is that, in addition to technical complexity, companies now have to grapple with overlapping cyber security regulations, legacy technology, and intricate supply chains that stretch around the globe.
“These challenges can no longer be managed with point solutions. Security and IT leaders must consider how their technology fits into – and interacts with – the wider business and beyond. In other words, they must integrate ‘systems thinking’ into business as usual. Cyber security is now a core operational risk for many organisations, and an ability to see the big picture has rarely been so valuable.”
Nigel Stanley, Chief Technology Officer - Global OT and Industrial Cyber Security CoE at TÜV Rheinland Group, points to the challenges in the complex world of operational technology (OT), which covers everything from manufacturing plants to autonomous vehicles and power stations, and where control equipment is often old in IT terms and often overlooked when it comes to corporate cybersecurity. “The good news is that having a New Year stock take and further considering these security systems will help you understand the key areas of business risk and help to formulate a plan to address it. In my experience the uncomplicated process of changing default passwords, screen locking the engineering workstation and educating a workforce will be time well spent in 2019. My OT security world is getting more complicated each day as fresh challenges arise. As we run fast it seems the bad guys run even faster. I plan to get some new running shoes for 2019!”
For Paul Watts, CISO at Domino’s Pizza UK & Ireland, the speed of IoT development will become increasingly challenging: “Accrediting the security posture of IoT devices is challenging for enterprises, particularly in the absence of any regulatory landscape. I welcome the voluntary code of practice issued by the Department of Culture, Media and Sport late last year. However, whilst the market remains deregulated and global manufacturers are not compelled to comply, it will not go far enough, given the speed at which these products are coming onto the market, coupled with the insatiable appetite of consumers to adopt them at breakneck speed – usually without due consideration for safety, security and interoperability in so doing.”
The survey of 137 senior executives in 4Q18 showed that concerns about “talent shortages” now outweigh those around “accelerating privacy regulation” and “cloud computing”, which were the top two risks in the 3Q18 Emerging Risk Monitor (see Figure 1).
“Organizations face huge challenges from the pace of business change, accelerating privacy regulations and the digitalization of their industries,” said Matt Shinkman, managing vice president and risk practice leader at Gartner. “A common denominator here is that addressing these top business challenges involves hiring new talent that is in incredibly short supply.”
Figure 1. Top Five Risks by Overall Risk Score: 1Q18, 2Q18, 3Q18, 4Q18
AI = Artificial Intelligence
Source: Gartner (January 2019)
Sixty-three percent of respondents indicated that a talent shortage was a key concern for their organization. The financial services, industrial and manufacturing, consumer services, government and nonprofit, and retail and hospitality sectors showed particularly high levels of concern in this area, with more than two-thirds of respondents in each industry signaling this as one of their top five risks.
Gartner research indicates that companies need to shift from external hiring strategies towards training their current workforces and applying risk mitigation strategies for critical talent shortages.
“Organizations face this talent crunch at a time when they are already challenged by risks that are exacerbated by a lack of appropriate expertise,” said Mr. Shinkman. “Previous hiring strategies for coping with talent disruptions are insufficient in this environment, and risk managers have a key role to play in collaborating with HR in developing new approaches.”
Talent Shortage May Exacerbate Other Key Risks
Beyond a global talent shortage, organizational leaders are grappling with a series of interrelated risks from a rapidly transforming business environment. Accelerating privacy regulation remained a key concern, dropping into second place in this quarter’s survey. Respondents indicated that the pace of change facing their organizations had emerged as the third most prominent risk, while factors related to the pace and execution of digitalization rounded out the top five emerging risks in this quarter’s survey.
Mitigation strategies to address this set of risks often come at least partially through a sound talent strategy. For example, a key Gartner recommendation in more adequately managing data privacy regulations is the appointment of a data protection officer, while both GDPR regulations and digitalization bring with them a host of specialized talent needs impacting nearly every organizational function.
“Unfortunately for most organizations, the most critical talent needs are also the most rare and expensive to hire for,” said Mr. Shinkman. “Adding to this challenge is the fact that ongoing disruption will keep business strategies highly dynamic, adding complexity to ongoing talent needs. Most organizations would benefit from investing in their current workforce’s skill velocity and employability, while actively developing risk mitigation plans for their most critical areas.”
Gartner recommends that enterprise risk teams and HR leaders collaborate to clearly define ownership of key talent risk areas that their organization is facing. “Different parts of the organization often have different pieces of information about what is actually going on from a talent risk perspective,” according to Brian Kropp, group vice president of Gartner’s HR Practice. “The best organizations are moving away from traditional engagement surveys to understand their talent risks. Building robust talent data collection and analysis techniques to better listen to their employees and identify real-time risks is a key part of this process.”
Global IT spending to reach $3.8 trillion in 2019
Worldwide IT spending is projected to total $3.76 trillion in 2019, an increase of 3.2 percent from 2018, according to the latest forecast by Gartner, Inc.
“Despite uncertainty fueled by recession rumors, Brexit, and trade wars and tariffs, the likely scenario for IT spending in 2019 is growth,” said John-David Lovelock, research vice president at Gartner. “However, there are a lot of dynamic changes happening in regards to which segments will be driving growth in the future. Spending is moving from saturated segments such as mobile phones, PCs and on-premises data center infrastructure to cloud services and Internet of Things (IoT) devices. IoT devices, in particular, are starting to pick up the slack from devices. Where the devices segment is saturated, IoT is not.
“IT is no longer just a platform that enables organizations to run their business on. It is becoming the engine that moves the business,” added Mr. Lovelock. “As digital business and digital business ecosystems move forward, IT will be the thing that binds the business together.”
With the shift to cloud, a key driver of IT spending, enterprise software will continue to exhibit strong growth, with worldwide software spending projected to grow 8.5 percent in 2019. It will grow another 8.2 percent in 2020 to total $466 billion (see Table 1). Organizations are expected to increase spending on enterprise application software in 2019, with more of the budget shifting to software as a service (SaaS).
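For readers who want to sanity-check those software figures, the quoted 2020 total and the two growth rates pin down an implied base. A back-of-envelope sketch (the 2018 base below is derived from the forecast, not quoted by Gartner):

```python
# Work backwards from Gartner's figures: $466bn in 2020, after growth of
# 8.5% in 2019 and 8.2% in 2020, implies the 2018 and 2019 spend levels.
growth_2019 = 0.085
growth_2020 = 0.082
total_2020 = 466.0  # $bn, from the forecast

base_2018 = total_2020 / ((1 + growth_2019) * (1 + growth_2020))
spend_2019 = base_2018 * (1 + growth_2019)
print(f"Implied 2018 base: ${base_2018:.0f}bn")   # ~ $397bn
print(f"Implied 2019 spend: ${spend_2019:.0f}bn")  # ~ $431bn
```

The same 2019 figure falls out of dividing the 2020 total by 1.082 directly, which is a quick consistency check on the two growth rates.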
Table 1. Worldwide IT Spending Forecast (Billions of U.S. Dollars)
[Only fragments of the table survive here: its columns included 2020 Growth (%), and its segments included Data Center Systems; the full figures are not reproduced.]
Source: Gartner (January 2019)
Despite a slowdown in the mobile phone market, the devices segment is expected to grow 1.6 percent in 2019. The largest and most highly saturated smartphone markets, such as China, the United States and Western Europe, are driven by replacement cycles. With Samsung facing challenges bringing well-differentiated premium smartphones to market and Apple’s high price-to-value benefits for its flagship smartphones, consumers kept their current phones and drove the mobile phone market down 1.2 percent in 2018.
“In addition to buying behavior changes, we are also seeing skills of internal staff beginning to lag as organizations adopt new technologies, such as IoT devices, to drive digital business,” said Mr. Lovelock. “Nearly half of the IT workforce is in urgent need of developing skills or competencies to support their digital business initiatives. Skill requirements to keep up, such as artificial intelligence (AI), machine learning, API and services platform design and data science, are changing faster than we’ve ever seen before.”
Artificial intelligence (AI) is increasingly making its way into the workplace, with virtual personal assistants (VPAs) and other forms of chatbots now augmenting human performance in many organizations. Gartner, Inc. predicts that, by 2021, 70 percent of organizations will assist their employees’ productivity by integrating AI in the workplace. This development will prompt 10 percent of organizations to add a digital harassment policy to workplace regulation.
“Digital workplace leaders will proactively implement AI-based technologies such as virtual assistants or other NLP-based conversational agents and robots to support and augment employees’ tasks and productivity,” said Helen Poitevin, senior research director at Gartner. “However, the AI agents must be properly monitored to prevent digital harassment and frustrating user experiences.”
Past incidents have shown that poorly designed assistants cause frustration among employees, sometimes prompting bad behavior and abusive language toward the VPA. “This can create a toxic work environment, as the bad habits will eventually leak into interactions with co-workers,” said Ms. Poitevin.
Recent experiments have also shown that people’s abusive behavior toward AI technologies can translate into how they treat the humans around them. Organizations should consider this when establishing VPAs in the workplace and train the assistants to respond appropriately to aggressive language.
Ms. Poitevin added: “They should also clearly state that AI-enabled conversational agents should be treated with respect, and give them a personality to fuel likability and respect. Finally, digital workplace leaders should allow employees to report observed cases of policy violation.”
Back-Office Bank Employees Will Rely on AI for Nonroutine Work
Gartner predicts that, by 2020, 20 percent of operational bank staff engaged in back-office activities will rely on AI to do nonroutine work.
“Nonroutine tasks in the back offices of financial institutions are things like financial contract review or deal origination,” said Moutusi Sau, senior research director at Gartner. “While those tasks are complex and require manual intervention by human staff, AI technology can assist and augment the work of the staff by reducing errors and providing recommendation on the next best step.”
AI and automation have been applied successfully to routine work across banks and their value chains. “In some cases, we witnessed layoffs to reduce unneeded head count, and understandably back-office staff are worried their jobs will be replaced by machines,” said Ms. Sau.
However, AI has a bigger value-add than pure automation, which is augmentation. “The outlook for AI in banking is in favor of proactively controlling AI tools as helpers, and those can be used for reviewing documents or interpreting commercial-loan agreements. Digital workplace leaders and CIOs should also reassure workers that IT and business leaders will ‘deploy AI for good’,” concluded Ms. Sau.
Technology seen as the greatest opportunity for organizations despite concerns about Artificial Intelligence.
Sword GRC, a supplier of specialist risk management software and services, has published the latest findings from its annual survey of global risk managers. Almost 150 Risk Managers from highly risk-aware organizations worldwide were canvassed for their opinions. Overall, cybersecurity was seen as the biggest risk to business by a quarter of organizations. In the UK, Brexit and the resulting potential economic fall-out was cited as the biggest risk to business by 14% of Risk Managers. The most notable regional variation was in the US where 40% of organizations see cybersecurity as the most threatening risk. The most lucrative opportunities for business were the benefits and efficiencies achieved by harnessing technology followed by expansion into new markets or sectors.
The Risk Managers were also asked about their acknowledgement of, and preparations for, Black Swans (an event that is highly unlikely to materialize but, if it did, would have a substantial impact). In both the US and UK, a major terrorist attack on the business is seen as the most likely Black Swan (UK 29% and US 35%); however, in Australia/New Zealand, only 13% of Risk Managers thought one was likely. The next most likely Black Swan in the US is a natural disaster, with 48% of Risk Managers thinking it likely or highly likely. This figure was 33% in Australia and New Zealand, and in the UK, where there are fewer adverse weather events and no major fault lines in the earth’s crust, just 27%.
In the UK, Risk Managers were far more wary of Artificial Intelligence (AI) with 23% thinking it likely or highly likely that AI could get out of control. In the US this figure was 15%, and in Australia/New Zealand they clearly take a far more sanguine view with no one surveyed thinking AI was a risk.
Keith Ricketts, VP of Marketing at Sword GRC, commented: “We are delighted to see the Active Risk Manager Survey going from strength to strength with a record number of responses in 2018. As Risk continues to grow in importance and influence in the Boardroom, we have this year focused on the biggest threats and most lucrative opportunities facing business. That cybersecurity is now recognised as the single biggest risk for many organizations is no surprise to us, as it supports the anecdotal evidence we have seen working with our clients in some of the most risk-aware industries globally.
“Technology is a great enabler and that has never been more true. The feedback we have from our Risk Managers is that information technology is the key to almost every opportunity for business going forward, whether that is supporting expansion into new markets and geographies, streamlining processes to gain efficiency or harnessing big data and artificial intelligence to power product development and business performance.”
New data from Synergy Research Group shows that the number of large data centers operated by hyperscale providers rose by 11% in 2018 to reach 430 by year end.
In 2018 the Asia-Pac and EMEA regions featured most prominently in terms of new data centers that were opened, but despite that the US still accounts for 40% of the major cloud and internet data center sites. The next most popular locations are China, Japan, the UK, Australia and Germany, which collectively account for another 30% of the total. During 2018 new data centers were opened in 17 different countries with the US and Hong Kong having the largest number of additions. Among the hyperscale operators, Amazon and Google opened the most new data centers in 2018, together accounting for over half of the total. The research is based on an analysis of the data center footprint of 20 of the world’s major cloud and internet service firms, including the largest operators in SaaS, IaaS, PaaS, search, social networking, e-commerce and gaming.
On average each of the 20 firms had 22 data center sites. The companies with the broadest data center footprint are the leading cloud providers – Amazon, Microsoft, Google and IBM. Each has 55 or more data center locations with at least three in each of the four regions – North America, APAC, EMEA and Latin America. Alibaba and Oracle also have a notably broad data center presence. The remaining firms tend to have their data centers focused primarily in either the US (Apple, Facebook, Twitter, eBay, Yahoo) or China (Baidu, Tencent).
“Hyperscale growth goes on unabated, with company revenues growing by an average 24% per year and their capex growing by over 40% - much of which is going into building and equipping data centers,” said John Dinsdale, a Chief Analyst and Research Director at Synergy Research Group. “In addition to the 430 current hyperscale data centers we have visibility of a further 132 that are at various stages of planning or building. There is no end in sight to the data center building boom.”
Customers face uncertain times – the pace of change, digital transformation, security, Brexit, compliance and of course, being able to concentrate on their main line of work. They are turning to MSPs in particular for help, and those MSPs in turn need to be in a position to advise and provide a strategy. Standing still is not an option; managed services is a natural way to meet those extra customer needs without requiring them to commit to major capital spending and so is expected to continue to grow in 2019.
Worldwide IT spending is projected to total $3.76tn in 2019, an increase of 3.2% on 2018, according to the latest forecast by Gartner, with enterprise software up 8% in 2019 and up by a similar amount in 2020. The limiting factor, however, is resource to implement these changes, and this is where managed services plays its part.
Skills are in short supply. Customers’ management may have its heart set on using data analytics; it wants to ensure compliance with regulatory systems and to plan for the future, but it can’t do it on its own.
Managed services suppliers and providers are being asked to do much more than offer technology. Increasingly, they are being asked to talk customers through managing change for their staff, to draw up a management plan as well as an IT strategy, and to spot the gaps and match resources to new demands. Gartner’s January 2019 forecast identifies this as one of the key issues: half of the IT workforce is underskilled and cannot support digital initiatives. When it takes six months to hire externally and nine months to retrain, outsourcing to managed services looks more and more attractive. The key is presenting a skills and business alignment process.
“Despite uncertainty fuelled by recession rumours, Brexit, and trade wars and tariffs, the likely scenario for IT spending in 2019 is growth,” said John-David Lovelock, research vice president at Gartner. “However, there are a lot of dynamic changes happening with regards to which segments will be driving growth in the future. Spending is moving from saturated segments such as mobile phones, PCs and on-premises data centre infrastructure to cloud services and Internet of Things (IoT) devices.”
Organisations are expected to increase spending on enterprise application software in 2019, with more of the budget shifting to software as a service (SaaS) and managed services. With the shift to cloud, a key driver of IT spending, enterprise software will continue to exhibit strong growth, with worldwide software spending projected to grow 8.5% in 2019, Gartner says. It will grow another 8.2% in 2020 to total $466bn.
“In addition to buying behaviour changes, we are also seeing skills of internal staff beginning to lag as organisations adopt new technologies to drive digital business,” he says. “Nearly half of the IT workforce is in urgent need of developing skills or competencies to support their digital business initiatives. Skill requirements to keep up with technologies such as artificial intelligence (AI), machine learning, API and services platform design and data science are changing faster than we’ve ever seen before.”
The agenda for the 2019 European Managed Services and Hosting Summit, in Amsterdam on 23 May, aims to reflect these new pressures and build the skills of the managed services industry in addressing the wider issues of engagement with customers at a strategic level. Experts from all parts of the industry, plus thought leaders with ideas from other businesses and organisations will share experiences and help identify the trends in a rapidly-changing market.
Gartner’s VP of research Mark Paine will deliver a keynote at the MSH Summit in Amsterdam entitled “Working with customers and their chaotic buying processes”. He will examine how the changed customer buying process has become hard to monitor, hard to follow and can be abruptly foreshortened. And who, he asks, are the real customers anyway?
Angel Business Communications is seeking nominations for the 2019 Datacentre Solutions Awards (DCS Awards).
The DCS Awards are designed to reward the product designers, manufacturers, suppliers and providers operating in the data centre arena, and are updated each year to reflect this fast-moving industry. The Awards recognise the achievements of vendors and their business partners alike. This year they encompass a wider range of project, facilities and information technology categories, together with two individual categories, and are designed to address all the main areas of the datacentre market in Europe.
The DCS Awards team is delighted to announce Kohler Uninterruptible Power as the Headline Sponsor for this year’s event. Previously known as Uninterruptible Power Supplies Ltd (UPSL), a subsidiary of Kohler Co, and the exclusive supplier of PowerWAVE UPS, generator and emergency lighting products, UPSL is changing its name to Kohler Uninterruptible Power (KUP), effective March 4th, 2019.
UPSL’s name change is designed to ensure the company’s name reflects the true breadth of the business’ current offer, which now extends to UPS systems, generators, emergency lighting inverters, and class-leading 24/7 service, as well as highlighting its membership of Kohler Co. This is especially timely as next year Kohler will celebrate 100 years of supplying products for power generation and protection. Kohler Uninterruptible Power Ltd prides itself on delivering industry-leading power protection solutions and services.
The 2019 DCS Awards feature 26 categories across four groups. The Project Awards categories are open to end use implementations and services that have been available before 31st December 2018. The Innovation Awards categories are open to products and solutions that have been available and shipping in EMEA between 1st January and 31st December 2018. The Company nominees must have been present in the EMEA market prior to 1st June 2018. Individuals must have been employed in the EMEA region prior to 31st December 2018.
The editorial panel at Angel Business Communications will validate entries and announce the final short list to be forwarded for voting by the readership of the Digitalisation World stable of publications during April. The winners will be announced at a gala evening on 16th May at London’s Grange St Paul’s Hotel.
Nomination is free of charge, and entrants can submit up to four supporting documents to enhance their submission. The deadline for entries is 1st March 2019.
Please visit www.dcsawards.com for rules and entry criteria for each of the following categories:
DCS PROJECT AWARDS
Data Centre Energy Efficiency Project of the Year
New Design/Build Data Centre Project of the Year
Data Centre Consolidation/Upgrade/Refresh Project of the Year
Cloud Project of the Year
Managed Services Project of the Year
GDPR compliance Project of the Year
DCS INNOVATION AWARDS
Data Centre Facilities Innovation Awards
Data Centre Power Innovation of the Year
Data Centre PDU Innovation of the Year
Data Centre Cooling Innovation of the Year
Data Centre Intelligent Automation and Management Innovation of the Year
Data Centre Safety, Security & Fire Suppression Innovation of the Year
Data Centre Physical Connectivity Innovation of the Year
Data Centre ICT Innovation Awards
Data Centre ICT Storage Product of the Year
Data Centre ICT Security Product of the Year
Data Centre ICT Management Product of the Year
Data Centre ICT Networking Product of the Year
Data Centre ICT Automation Innovation of the Year
Open Source Innovation of the Year
Data Centre Managed Services Innovation of the Year
DCS Company Awards
Data Centre Hosting/co-location Supplier of the Year
Data Centre Cloud Vendor of the Year
Data Centre Facilities Vendor of the Year
Data Centre ICT Systems Vendor of the Year
Excellence in Data Centre Services Award
DCS Individual Awards
Data Centre Manager of the Year
Data Centre Engineer of the Year
Nomination Deadline: 1st March 2019
www.dcsawards.com
Energy Supply, Power and the Challenges Caused by a Bunch of Zeros and Ones.
By Steve Hone CEO and Founder of DCA Global, The Data Centre Trade Association
As Global Event Partners for Data Centre World, I have been doing a great deal of research on behalf of the team for the 6th Generation Data Centre Zone, which will be unveiled for the first time at DCW London in March. During this research I came across many interesting articles and stats, some new and some a little older, which are still very relevant today. The inspiration for this month’s foreword comes from that research, along with the “Powering Data Centres” theme for this month’s journal.
Although most of our focus remains firmly on energy usage within the data centre itself, growing consideration is also being given to the way the energy is generated in the first place and to the losses incurred over the network before it even reaches our data centres.
What can you do as a large power consumer to improve energy sustainability and lessen the carbon impact of these losses? As a business, this largely depends on the country you reside in. Assuming your data centre draws power directly from a national power grid, you can look to procure your supply from a generator with lower carbon impact (e.g. hydro, nuclear or geothermal). Where available, such contracts encourage further investment in non-fossil fuels and continue the move towards more sustainable systems. Transmission and distribution losses will tend to be the same, but by moving away from fossil fuels you can lower the overall carbon impact of powering your data centre.
Increased Power requirements and the resultant thermal challenges
Escalating compute requirements continue to create power and thermal challenges for today’s data centre managers. Although there are still many 2-4kW racks in circulation, the introduction of high-density racks and blade servers meant that data centre design had to change and evolve. These architectures are inherently more scalable, adaptable and manageable than traditional platforms, and they deliver much-needed relief in complex and crowded data centres. But they also introduce power and thermal loads that are substantially higher than those of the systems they replace. In many cases, they have already pushed the cooling infrastructures of older facilities beyond their limits.
The long-term solution to these challenges requires broad industry innovation and collaboration. This needs to come not only from the organisations whose job it is to cool things down when they get hot, but also from those generating the heat in the first place: the server manufacturers themselves.
PUE is like Marmite: love it or hate it, what can’t be denied is that it has helped focus us on improving the way we design and operate our data centres. Although there is always more we could do, many of the data centres that follow the best-practice guidelines of the EU Code of Conduct have already harvested the low-hanging fruit, and the only thing left to improve is the compute itself, which is often outside their control. As a result, I fear we may have reached the point where only a concerted effort on both sides of the “1” will make any further major dent in overall performance or energy savings. With the support of the DCA, progress is being made on this front; we have already seen the development of many liquid-based alternatives to traditional air cooling, and continued R&D into the next generation of processors designed to run both faster and, hopefully, cooler.
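As a reminder of what the metric actually captures, here is a minimal sketch of the PUE calculation (the figures used are illustrative only, not drawn from this article):

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# A perfect facility scores 1.0: every watt delivered goes to the compute
# "1s and 0s" rather than to cooling, power conversion or lighting.
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    return total_facility_kwh / it_kwh

legacy = pue(1500.0, 1000.0)   # 50% overhead on top of the IT load
modern = pue(1100.0, 1000.0)   # 10% overhead
print(f"Legacy facility PUE: {legacy:.2f}")  # 1.50
print(f"Modern facility PUE: {modern:.2f}")  # 1.10
```

The “both sides of the 1” point above follows directly from the ratio: once the overhead (numerator) approaches the IT load (denominator), further gains have to come from the compute itself.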
How can zeros and ones cause so much trouble?
Forbes reports that collectively we are creating 2.5 quintillion bytes of new data every day. If you consider that managing and transporting this much data consumed more energy worldwide last year than the whole of the United Kingdom, you quickly see how all those little zeros and ones soon add up. The shift to cloud is helping to relieve the pressure locally for many businesses as they migrate applications over to the hyperscalers to reduce costs, risk, IT complexity and their own carbon footprint. However, although these cloud providers are infinitely better equipped to handle your data, it’s important to recognise that the issue does not go away simply by outsourcing it: technically, you are just moving the problem from your back yard to someone else’s.
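To put that daily figure into perspective, a quick unit conversion (assuming the short-scale “quintillion”, i.e. 10^18):

```python
# 2.5 quintillion bytes/day converted into more familiar storage units
# (decimal SI prefixes: 1 EB = 1e18 bytes, 1 ZB = 1e21 bytes).
bytes_per_day = 2.5e18

eb_per_day = bytes_per_day / 1e18          # exabytes per day
zb_per_year = bytes_per_day * 365 / 1e21   # zettabytes per year
print(f"{eb_per_day:.1f} EB/day, or about {zb_per_year:.2f} ZB/year")
```

In other words, roughly 2.5 exabytes every day, approaching a zettabyte of new data per year.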
This exponential rise in the amount of data we are generating and storing is equally being driven by all of us on a personal level as well. A rapid increase in the popularity of streaming video, use of social media platforms and a complete reluctance to delete anything we think has intrinsic or sentimental value are all adding to the massive mountain of data we are generating. This apparently is also just the tip of the iceberg, and we haven’t even considered the additional processing power required for Artificial Intelligence (AI).
Experts forecast that an explosion of Artificial Intelligence and internet-connected devices is on its way, with IoT projected to exceed 20 billion devices and sensors by 2020. Although I don’t necessarily subscribe to the analysts’ timeline, you can’t ignore the potential impact this could have. We have roughly 10 billion internet-connected devices today, and doubling that to 20 billion will require a massive increase in our data centre infrastructure and the reservation of even more power to maintain service availability for a hungry and rapidly growing consumer base.
Thank you again for all the contributions made by DCA members this month. The theme for the next edition of the DCA Journal is “workforce sustainability” - already a big problem for our sector, with no quick fixes, and one that needs to be collectively addressed. Joe Kava, VP of data centres for Google, summed it up nicely in a keynote address at last year’s Data Centre World conference when he said, “The greatest threat we’re facing is the race for talent”, so please take advantage of this opportunity to publish your thoughts on this subject by contacting firstname.lastname@example.org for copy details and deadlines.
Energy storage at Johan Cruyff Arena in Amsterdam shows the way to Data Centres
By Robbert Hoeffnagel, Green IT Amsterdam
The European EV-Energy project is working hard to map and promote legislation and regulations of local and provincial governments that can accelerate what is officially called 'decarbonisation of the energy and mobility sector'. This also affects the integration of data centres and smart grids. A project on battery storage at the Johan Cruyff Arena in Amsterdam shows how this can be achieved in practice and the benefits that this can bring.
Last summer, the Johan Cruyff Arena in Amsterdam officially launched a battery system for storing electrical energy. This opening followed an earlier project in which a large part of the stadium’s roof was covered with solar panels. Generating energy through solar panels is interesting - especially if this energy can also be used immediately. For the Arena, however, many of the activities that take place there are planned for the evening hours. Storing the energy generated by the solar panels in batteries was therefore an important next step.
61 racks of batteries
It is therefore logical that last year’s opening of a hall with 61 racks full of batteries has already received significant attention. We are now more than six months on, and it is becoming increasingly clear how important this project is - especially for the data centre industry. As Figure 1 shows, this project is not only about storing energy in batteries. In order to justify the relatively high cost of batteries, a business case needs to be developed that is as broadly defined as possible. In other words, the batteries should be used in as many ways as possible so that the investment can be recouped. That is precisely what makes this project so relevant for data centres, which are now also discussing the possibilities that arise from integrating batteries and UPS systems with the energy networks of grid operators.
As shown in Figure 1, a subsidiary of the Johan Cruyff Arena - called Amsterdam Energy Arena BV - has invested in a room filled with 61 racks of batteries. These come from Nissan’s electric car, the Leaf. After a number of years, the capacity of these cars’ batteries drops from 100% to 80%. This decline means the batteries are no longer suitable for use in an electric car and need to be replaced. What to do with so many ‘useless’ car batteries? It turns out that they are still perfectly suitable for storing electrical energy in, for example, an energy storage system linked to solar panels. The Amsterdam Arena has now installed 61 racks with 590 battery packs - good for 3 MW of power and 2.8 MWh of storage.
Generating and using
What exactly does the Arena use the stored energy for? First of all (see Figure 2), to compensate for the mismatch between the moment of generation and the time of use. The 4,200 solar panels on the roof of the stadium generate electrical energy during the day, while sports matches and concerts, for example, demand energy in the evening hours. These are serious amounts of energy: if the Arena is running at full capacity in the evening, the energy stored in the batteries is sufficient to meet demand for about an hour. If not all systems are switched on, the Arena can stretch this period to three hours. Outside this period, energy has to be drawn from the grid.
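The one-hour and three-hour figures follow directly from the storage capacity quoted earlier. A minimal sketch of the arithmetic, assuming a constant load (the load figures are illustrative, not published Arena data):

```python
# Back-of-the-envelope check on the run times quoted above, using
# the 2.8 MWh storage figure from the article. The load values
# below are illustrative assumptions, not measured Arena data.

battery_energy_mwh = 2.8   # usable storage quoted for the 61 racks

def run_time_hours(load_mw):
    """Hours the battery can carry a constant load of `load_mw`."""
    return battery_energy_mwh / load_mw

# Full event load close to the battery's rating -> roughly one hour
print(round(run_time_hours(2.8), 1))       # 1.0 hour at a 2.8 MW draw

# With only a third of the systems on, the same energy lasts ~3 hours
print(round(run_time_hours(2.8 / 3), 1))   # 3.0 hours at ~0.93 MW
```

The same division also shows why, outside those windows, the stadium must fall back on the grid: the batteries hold hours of energy, not days.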
It is worth noting that it is of course not necessary to draw maximum electrical energy from the batteries every evening. At times when no events are planned, Amsterdam Energy Arena BV can use the storage capacity in other ways, as shown in Figure 2. Think of energy services delivered to the grid, which give the local grid operator more and better opportunities to keep the network in balance. This can be done by temporarily storing energy from the grid in the Arena's batteries, or by drawing energy from them and feeding that electricity back into the network.
However, the Amsterdam Energy Arena also provides other services; for example, electric or hybrid cars can be charged via bi-directional charging stations in the stadium. But the other way around is also possible: temporary energy storage in the batteries of these cars. Peak shaving is also possible. Depending on supply and demand, peaks and troughs in energy consumption can be absorbed by using energy from the batteries.
Another remarkable application: backup power during events. Many major artists who give concerts in venues such as the Amsterdam Arena do not rely on the backup energy supply of the venues where they perform; in their experience, too many places have problems with the quality and robustness of the network. They prefer to bring their own diesel generators to ensure an uninterrupted power supply during their events, with all the extra cost that entails. At the Arena this is no longer necessary, as artists can now call on the battery storage.
Future for data centres
With this energy storage system, the Johan Cruyff Arena in Amsterdam is an interesting example of what might be the future of many data centres. European projects such as EV Energy and CATALYST are working hard to enable the integration of data centres and smart grids, with batteries and UPS systems at the data centre connected to the grid via smart management software. The advantages for grid operators are clear: as with the Amsterdam Arena, they can use the storage capacity of a data centre - the batteries installed there - to help keep the network stable. And because data centres are likely to invest more in renewable energy generation, they may also be able to supply energy to the network. Peak shaving and a better organised form of backup power are also possible.
Of course, this also creates interesting opportunities for data centres. Until now, they have operated on a business model with only one financial pillar: selling space for processing data. In many commercial data centres in particular, margins on such projects tend to decline: the projects are getting bigger, but the margins are getting smaller. Integrating the data centre with the smart grid, however, makes it possible to add what we might call 'grid services' as a second financial pillar under the business model. Provided this is done on the basis of sound agreements, it will generate new turnover - initially modest in size, but with a relatively high margin.
The same applies, of course, to data centres that in the future want to supply residual heat to customers for a fee. These transactions will also have a relatively high margin and can therefore make an interesting financial contribution to the operation of data centres.
The battery storage project at the Johan Cruyff Arena (Figure 2) could very well serve as an example to the data centre industry. Although the storage capacity at the stadium is not yet sufficient to supply energy to external customers, the project shows that developing and delivering energy services offers interesting opportunities to data centres. The energy transition facing the data centre and ICT sector could thus offer unexpectedly great opportunities - not least financially.
The preservation of power is crucial to the operation of a datacentre and every facility will have many battery back-up systems to prevent a mains failure from being a catastrophic disaster.
These systems can be identified under two categories.
1) Critical Infrastructure Systems, such as:
2) Essential Facilities Systems, such as:
Clearly the battery underpins each standby power system, but its status as a sub-component prevents the battery manufacturer from being a direct part of the datacentre supply chain, either in the design and build phase or during operation of the facility.
With such an array of different systems using batteries, accountability for battery maintenance therefore falls within the equipment vendors' Service Level Agreements (SLAs) or with the Facility Management maintenance team.
Where SLAs make provision for vendor-neutral battery supply, generic battery warranty compliance is managed by the battery maintenance experts.
Where systems are not supported by SLAs, the risk of failure becomes the direct responsibility of the Facility Management team.
The complexities of successful battery maintenance are therefore key to preventing a power failure being blamed on the battery.
Thankfully, datacentres are designed with redundant power paths, which prevents one battery string from being a single point of failure. Nevertheless, the battery remains the primary frontline alternative source of power in a mains outage.
As shown in the diagram, a good battery is needed in every power outage; and while bad fuel is a single point of failure for the datacentre, the battery protects against the majority of power outages.
The frontline reliance on the battery to cover the short duration outages creates an equal importance of both preventative battery maintenance and the integrity of fuel quality. Together they are essential for the reliability of the datacentre.
The importance of maintenance
The battery is an indirect cost within any datacentre CAPEX budget, being a component of the emergency power system equipment. This component status prevents any direct relationship between the datacentre owner and the battery manufacturer, which means that protecting the battery investment - battery maintenance - is left to others in the supply chain.
OPEX budgets are issued alongside vendors' SLAs for the supplied emergency power system equipment. For commercial reasons this allows the battery to be a non-specific branded component, so the SLA will not be specific to the battery manufacturer's warranty conditions.
This creates the need for using battery experts to deliver specific preventative maintenance to manufacturer’s warranty conditions. The correct interpretation is crucial to ensure the battery is not being blamed when the lights go out.
The IT server equipment in datacentres cannot operate on the raw utility power supplied by the grid. Datacentres therefore depend on UPS systems to provide the quality, filtered mains power that IT equipment requires.
In addition, under the direction of Tier certifications, the redundancy and size of the UPS systems are dimensioned on the total IT load of the datahall. A Tier 3 datacentre with a 1,000 kW datahall will therefore have, as a minimum, a dual-path UPS design giving, say, 15 minutes of run-time protection.
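The Tier 3 example above can be sketched as back-of-the-envelope sizing arithmetic. The inverter efficiency below is an assumed figure for illustration, not a Tier requirement:

```python
# Illustrative sizing arithmetic for the Tier 3 example above:
# a 1000 kW datahall with a dual-path (2N) UPS design and a
# 15-minute autonomy. The inverter efficiency is an assumption.

it_load_kw = 1000.0       # datahall IT load
autonomy_min = 15.0       # required run time on battery
inverter_eff = 0.95       # assumed UPS inverter efficiency

# Energy each path's battery string must deliver at the DC bus
energy_kwh = it_load_kw * (autonomy_min / 60.0) / inverter_eff

# In a 2N design BOTH paths carry a full battery, so the facility
# installs twice this capacity even though one path carries the load.
total_installed_kwh = 2 * energy_kwh

print(f"per-path battery energy: {energy_kwh:.0f} kWh")    # 263 kWh
print(f"total installed (2N):    {total_installed_kwh:.0f} kWh")
```

The doubling in the last step is exactly why the battery estate, and hence its maintenance, is a bigger cost than the 15-minute figure alone suggests.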
The investment in the UPS system may span 10 years, while the battery may only be a 5-year product on cost grounds, and even a 10-year product may be changed during its life. Add the scenario of replacement batteries from a different brand, and the complexities of managing battery maintenance require specialist knowledge if we are going to keep the lights on.
The battery manufacturer will define end of life in terms of the design life capacity value of the battery, with the recommendation to perform a 3-hour constant current or constant power discharge test (with a DC load bank) to determine the battery capacity. This is normal in industries such as oil and gas, petrochemicals and power generation, where emergency backup systems are predominantly batteries and chargers.
Battery performance will be verified as part of the UPS autonomy run-time test, which is usually performed annually with an AC load bank; careful year-on-year analysis of the battery performance can help determine the end of life of the battery.
However, because the run time is short, the test engineer cannot measure the individual battery blocks during an autonomy test. There is therefore also a need to deploy specialist third-party battery experts to carry out more frequent maintenance work on the battery system.
Such maintenance will include taking impedance, conductance or resistance measurements on the individual battery blocks; the type of reading taken depends on the preference of the third-party expert.
These readings can also be captured by permanent battery monitoring systems, and collating all this data helps identify problems with the batteries.
Another important and somewhat overlooked part of maintenance is the visual inspection. In North America these inspections fall under NERC and FERC regulations, and they form part of good practice, making a valuable additional contribution to battery maintenance.
Ensuring the integrity of power to the datacentre requires both a robust preventative battery maintenance system and a transparency of reporting for warranty compliance and end of life replacement.
Commercially, battery suppliers will want closer relationships to manage replacements and warranty claims and to safeguard potential future sales. It therefore makes sense to create relationships between facility owners, vendors and battery suppliers to increase reliability of operation.
Battery failures in the datacentre originate from the exclusion of the battery manufacturer during the design and build phase, from the battery being encompassed within a generic SLA for emergency power system equipment, and from differing battery maintenance recommendations from third-party experts.
Creating a holistic maintenance programme that combines historical, present and future predictions for the performance of every battery system will avoid the blame being placed on the battery when the lights go out.
Paul is presently running a small team developing a vendor neutral low cost annual subscription battery maintenance software platform called BattLife.
BattLife addresses all the maintenance issues described in this article and is due for release Qtr1 2019.
By the IEC - International Electrotechnical Commission
A number of low voltage direct current (LVDC) trials are preparing the ground for a wider use of the technology, including for powering data centres.
Low voltage direct current (LVDC) is seen increasingly as an energy efficient method of delivering energy, as well as a way of reaching the millions of people without any access to electricity. It’s fully in line with the UN’s Sustainable Development Goal, of providing universal access to affordable, reliable and modern energy services by 2030.
In direct contrast to the conventional centralized model of electricity distribution via alternating current (AC), LVDC is a distributed way of transmitting and delivering power. Today, electricity is generated mostly in large utility plants and then transported through a network of high voltage overhead lines to substations. It is then converted into lower voltages before being distributed to individual households. With LVDC, power is produced very close to where it is consumed.
Using DC systems makes a lot of sense because most of the electrical loads in today's homes and buildings - for instance computers, mobile phones and LED lighting - already consume DC power. In addition, renewable energy sources such as wind and solar yield DC current. There is no need to convert from DC to AC and back to DC, sometimes several times, as a top-down AC transmission and distribution set-up requires. This makes DC more energy efficient and less costly to use.
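The efficiency argument is simply multiplication of per-stage losses. A minimal sketch, with assumed (not measured) per-stage efficiencies:

```python
# Why fewer conversions matter: each DC->AC or AC->DC stage loses
# a few percent. The per-stage efficiencies below are illustrative
# assumptions, not measured figures for any real installation.

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of a chain of conversion stages."""
    eff = 1.0
    for stage in stage_efficiencies:
        eff *= stage
    return eff

# Solar (DC) -> inverter to AC -> rectifier back to DC at the load
ac_path = chain_efficiency([0.96, 0.94])

# Solar (DC) delivered over an LVDC network via one DC-DC converter
dc_path = chain_efficiency([0.97])

print(f"AC round trip: {ac_path:.1%}")   # ~90.2%
print(f"LVDC path:     {dc_path:.1%}")   # 97.0%
```

Every extra conversion stage multiplies in another loss, which is why a top-down AC set-up with several conversions compares poorly with a short DC chain.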
LVDC used for powering data centres
The environmental gains from using a more energy-efficient system supplied from renewable sources make LVDC a viable alternative for use in developed countries as well as in remote and rural locations where there is little or no access to electricity.
“The potential benefits of LVDC already have been demonstrated by a number of pilot projects and niche studies in developed nations. For example, a pilot data centre run by ABB in Switzerland running on low direct current power has shown a 15% improvement on energy efficiency and 10% savings in capital costs compared to a typical AC set-up. This is interesting because data centres consume so much power,” comments Dr Abdullah Emhemed from the Institute of Energy and Environment at Strathclyde University in the UK. Dr Emhemed leads the University’s international activities on LVDC systems. He is also a member of a systems committee on LVDC and LVDC access inside the global standard-setting organization for the electro-technical industry, the IEC.
According to Emhemed, standardization work is required on “voltage levels, as well as safety and protection issues,” amongst other things, to help with the adoption of LVDC on a larger scale.
The IEC is working on the specification and ratification of these new standards. The remit of its LVDC systems committee is to identify gaps where international standards are needed.
Trial and error
Japan is one of the countries where DC trials have mushroomed. Several projects scattered across the country rely on DC power. They include the hybrid AC/DC Fukuoka Smart House inaugurated in 2012, which utilizes energy supplied from a number of different DC sources.
In Europe, one of the most advanced projects is in Finland. LVDC RULES began in October 2015. It is led by the Lappeenranta University of Technology (LUT) and financed by the Finnish Funding Agency for Technology and Innovation (TEKES). The project aims to take the final steps towards the industrial scale application of LVDC in public distribution networks by building on the data gathered from laboratories and research sites and transferring the technology into everyday use in Nordic distribution companies. The data is drawn from trials which started in Finland as early as 2008.
“The LVDC RULES project consortium has put together complete specifications for LVDC equipment optimized for public power distribution, especially in a Nordic environment,” explains Tero Kaipia, one of the researchers from LUT involved in the project. “The development of the equipment is in good progress and critical tests have been completed. Design methods and practical guidelines have been set up to facilitate the utilization of LVDC networks as part of a larger distribution infrastructure,” he adds.
While this project demonstrates a workable LVDC system, a number of key outstanding challenges have been identified by the researchers involved. Chief among them is the lack of appropriate Standards.
“Standardization at system and equipment level is an essential prerequisite for the wide-scale rollout of LVDC in Finland,” says Tero Kaipia. “Without standardization there will be incompatible components and it will be difficult to construct systems using components from different manufacturers. And most of all, the network companies will not buy LVDC systems, if the certified components and standard design guidelines are not available.”
In India LVDC is seen as one of the solutions for bringing electricity to the millions of homes which still have no or only intermittent access to power, as is the case in many other developing nations.
The Indian government’s Ministry of Power and the Rural Electrification Corporation (REC), a public Infrastructure finance company in India’s power sector, piloted a number of systems.
One of these projects is the Solar-DC initiative led by the Indian Institute of Technology Madras (IIT-M). As a result, an ecosystem for DC appliances and DC microgrid projects is emerging.
As part of this global drive, IIT-M has been working in collaboration with Telangana State Southern Power Distribution Company Ltd and REC to bring uninterrupted power to four hamlets in rural Telangana, which had been living without electricity for six to eight hours a day. The technology in this particular case comprises a 125 W solar panel, a 1 kWh battery, an inverterless controller unit and DC loads operating on a 48 V DC internal distribution line, installed in each small hamlet.
Other similar trials have also been taking place in the Indian states of Bihar, Assam, Rajasthan, Karnataka and Odisha, and in the city of Chennai.
“The Indian Bureau of Standards has adopted a 48V standard for electricity access suited to local needs. Further discussions are required to formulate a universally accepted IEC standard for electricity access,” says Vimal Mahendru, member of the IEC standardization management board (SMB) and Chair of the IEC systems committee on LVDC.
About the IEC
The IEC (www.iec.ch) is the International Standards and Conformity Assessment body for all fields of electrotechnology. The IEC enables global trade in electronics and electrical goods. Via the IEC worldwide platform, countries are able to participate in global value chains, and companies can develop the standards and conformity assessment systems they need so that safe, efficient products work anywhere in the world.
By Esworth Hercules, EMEA Sales Manager, NDSL Limited
Preventing Downtime: How battery monitoring mitigates the risk of an unplanned outage and delivers rapid ROI.
The most critical part of a UPS system is the battery; it is also the most likely to fail. Neglecting to monitor and maintain your battery could reduce the level of protection and increase that likelihood. UPS systems and their associated batteries are designed to be durable and dependable; however, maximising your batteries requires proper care and attention. Monitoring your batteries will help eliminate unplanned downtime due to battery failure.
Today's mission-critical data centres are designed and built to provide access to digital services 24/7/365. This entails backup systems for power, cooling and other essential services; one critical backup system providing continuity of power is the combination of UPSs, batteries and associated generators.
The “Maintenance Free” misnomer
Even though static UPS battery systems provide essential protection against power losses, their batteries require constant monitoring, which helps predict premature failure and extend battery life. “Maintenance free” batteries still require some maintenance as it is critical to check the battery health regularly. Without regular maintenance, your UPS battery may experience:
With proper monitoring the health of the battery can be determined, allowing for scheduled replacements without unexpected downtime or loss of backup power. Having a semi-permanent battery monitoring system greatly reduces the time and cost associated with battery maintenance.
The Consumer Demand
Today's “always-on” customers are demanding reliable, continuous power. Outages due to battery failure can be an expensive and disastrous event, potentially causing huge losses in revenue (or even worse, customers). But it is almost always avoidable. Ensuring that critical backup power is ready and available when needed is much simpler and cost-effective than many data centre operators realise.
This article provides an overview of why you need a battery monitoring system and highlights the benefits attained when one is installed, presenting the facts from which you can draw your own conclusions.
Why do you need to monitor your backup batteries?
Studies indicate that "UPS system failures and human error are the primary root causes of outages – an automated battery monitoring system can easily help prevent both." (Ponemon Institute, 2010, 2016).
“Batteries can fail for any number of reasons or from a combination of conditions. Batteries are the least reliable component of a backup power system and some studies attribute the percentage of critical power failure events due to batteries at 65% or higher.” (Ponemon Institute 2010, 2016)
Batteries can fail for a variety of reasons including, but not limited to:
A reliable way to ensure full power protection from a UPS system and to prevent unplanned downtime is to use a semi-permanent electronic battery monitoring system that monitors the batteries daily. Continuous monitoring of key operational parameters during the charging and discharging cycle is vital. These parameters need to include voltage, current, internal cell and ambient temperature, and ohmic value, captured in an automated battery monitoring system to track and alert users to potential individual failures or weaknesses. While many UPS systems are capable of monitoring voltage per string, this does not provide enough detail to ensure individual blocs are in a good state of health.
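A minimal sketch of the per-bloc checks such a system automates, using the parameters listed above. The threshold values are illustrative assumptions, not vendor settings:

```python
# Minimal sketch of per-bloc health checks on the parameters listed
# above (voltage, temperature, ohmic value). All thresholds are
# illustrative assumptions, not vendor-recommended settings.

def check_bloc(voltage_v, temp_c, ohmic_mohm, baseline_mohm):
    """Return a list of alerts for one battery bloc reading."""
    alerts = []
    if not 12.2 <= voltage_v <= 14.0:      # nominal 12 V VRLA bloc
        alerts.append("voltage out of range")
    if temp_c > 30.0:                      # elevated cell temperature
        alerts.append("over temperature")
    # A sustained rise in ohmic value against the bloc's own baseline
    # is the classic early indicator of impending failure.
    if ohmic_mohm > 1.3 * baseline_mohm:
        alerts.append("ohmic value >30% above baseline")
    return alerts

print(check_bloc(12.9, 24.0, 4.1, 4.0))   # healthy bloc: []
print(check_bloc(12.9, 24.0, 5.6, 4.0))   # ohmic drift flagged
```

The point of the per-bloc granularity is visible in the second call: string-level voltage would still look normal while one bloc's internal ohmic value has drifted far from its baseline.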
By automating the testing and measurement of your batteries' health and performance, you eliminate the risk of human error and the dangerous, unreliable task of hand-held testing. Batteries are typically filled with solutions (electrolytes) containing sulphuric acid, a very corrosive chemical that can permanently damage the eyes and cause serious chemical burns to the skin. Removing human interaction from battery maintenance eliminates that risk.
Investing in a daily battery monitoring system will not only help prevent unplanned outages, it will also quickly generate a high return on investment – in many cases in the first few years of deployment.
Battery monitoring = fewer Preventive Maintenance (PM) checks and extended battery life
A battery monitoring installation is partially justified by the reduced maintenance costs alone — not to mention protecting the company from downtime, lost customers and lost revenue.
To ensure uptime, some data centre operators routinely replace their batteries at a four or five-year interval, whether their batteries are demonstratively failing or not. With a battery monitoring system in place, this practice can be abolished. Operators would know the health of their batteries daily, allowing them to extend battery life and to replace blocs when the batteries begin to show signs of failure – delaying major expenses.
The graph below demonstrates the cost of quarterly testing compared with the cost of using a Cellwatch battery monitoring system plus reduced periodic testing. The battery monitoring system costs include the hardware, software and installation in the first year, demonstrating the savings in subsequent years.
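The shape of that cost comparison can be sketched with simple cumulative sums. All figures below are hypothetical assumptions for illustration, not Cellwatch pricing:

```python
# Illustrative ROI comparison (all figures are assumptions, not
# vendor pricing): four manual test visits a year versus a
# monitoring system bought in year 1 plus one retained annual visit.

manual_visit_cost = 2500.0   # assumed cost per quarterly test visit
monitor_capex = 15000.0      # assumed hardware + software + install
annual_inspection = 2500.0   # one retained yearly inspection visit

def cumulative_cost(years, quarterly=True):
    """Total spend after `years` under each maintenance regime."""
    if quarterly:
        return 4 * manual_visit_cost * years
    return monitor_capex + annual_inspection * years

for year in (1, 3, 5):
    print(year, cumulative_cost(year), cumulative_cost(year, False))
```

Under these assumed figures the monitoring route costs more in year one and is cheaper from year three onwards, which is the pattern the graph illustrates.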
What to look for in a UPS monitoring system
When evaluating battery monitoring systems, be sure to consider one that is modular and scalable and has a proven track record for accuracy, repeatability and reliability. The system you implement should also be capable of monitoring all backup batteries – UPS, Generator, Switchgear and Communications Gear – with one system. A suitable system should also be capable of storing battery trending data. This is important when comparing the present state to the initial baseline established when the battery was new or when the monitoring system was installed. Since communication to the device is vulnerable, storage at the monitoring site is critical.
It is important to establish battery management processes based on internationally recognised standards. These standards should be used as a benchmark to compare the monitoring processes and performance metrics to industry best practices. The IEEE Standard 1491, Guide for Selection and Use of Battery Monitoring Equipment in Stationary Applications is regarded as the key guiding practice for selecting reliable battery monitoring systems.
The standard recommends measuring ohmic value, which is a key parameter in accurately determining battery health. Every battery monitoring system takes the ohmic reading in a slightly different way. We recommend considering electronics that draw less than 2 amps during the ohmic value test. When comparing trending ohmic values, readings should be taken at the same measurement point using the same hardware.
Trending changes in ohmic values on a daily basis provides the most reliable mechanism for detecting existing and predicting future bloc failures. Since VRLA cells are known to fail in as little as two days, monitoring ohmic value quarterly or monthly will not catch the rapid failures that put critical loads at risk.
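A short sketch of why the sampling interval matters, using synthetic daily readings and the same kind of baseline comparison described above (the 30% threshold is an illustrative assumption):

```python
# Sketch of why daily trending catches rapid VRLA failures that
# quarterly or monthly sampling misses. Readings are synthetic and
# the 30% alarm threshold is an illustrative assumption.

daily_mohm = [4.0, 4.0, 4.1, 4.0, 4.9, 6.2]  # rapid rise on days 4-5
baseline = daily_mohm[0]

def first_alert_day(readings, baseline, threshold=1.3):
    """Index of the first reading above threshold * baseline, or None."""
    for day, value in enumerate(readings):
        if value > threshold * baseline:
            return day
    return None

print(first_alert_day(daily_mohm, baseline))  # alarms on day 5
# A quarterly visit falling anywhere outside this six-day window
# would have seen nothing until long after the bloc failed on load.
```

With a cell able to fail in two days, only a sampling interval of the same order as the failure window gives any chance of replacing the bloc before it is needed.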
See the example of a battery monitoring software user interface below. It is very clear which cell is in alarm.
The Cellwatch battery monitoring solution comes with an event driven centralised management system for easy monitoring of hundreds or thousands of blocs, as well as viewing all batteries in all sites.
Batteries are the least reliable component of the UPS system, and planned manual battery testing on a semi-annual, quarterly or even monthly basis does not ensure your batteries are fit for purpose. Like an annual roadworthiness test on your car, it is only good for that day - how meaningful is the result after you have hit a large pothole?
The IEEE recommends maintenance together with resistance or impedance testing, yet this still cannot tell the user what’s happening to the battery between visits. Proactively preventing downtime and managing data centre battery assets with continuous automated battery monitoring provides a number of meaningful benefits, including:
A power backup system that does not monitor battery condition is incomplete and increases the risk of a total system failure due to unexpected battery failure. Batteries can be faulty at initial installation, so relying on them without proper commissioning and ongoing testing is a risk. Investing in a battery monitoring system to manage these assets and ensure that critical batteries are healthy and ready to perform when a load is applied is the best defence, and will generate a quick ROI, justifying the financial investment.
With assistance from a data centre operator, a comprehensive ROI evaluation can be completed in a few days.
Many UPS power failures are not due to UPS problems but are actually caused by battery failure. In many cases, valve regulated lead acid (VRLA) batteries can fail within just a few days.
A proactive battery monitoring system which allows you to replace suspect cells before they fail, eliminating costly downtime and disruption due to battery failure, should be treated as a strategic asset, rather than a short-term cost burden. Our battery monitoring technology ensures that critical batteries are in a good state of health and will function when required.
Data Centre World, the leading data centre gathering in the world, will welcome tens of thousands of data centre professionals on 12th-13th March. This unmissable event will deliver focused streams highlighting the latest innovations in the data centre space, expanding its cutting-edge conference to cover advancements in DCIM, energy efficiency and cost management, facilities management, critical equipment and edge computing.
International visitors will benefit from a practical conference programme sharing invaluable insight and inspiration for all technology professionals, from data centre managers and IT directors to engineers, consultants and industry experts. Hosting a raft of panel discussions, roundtables, seminars, real-life case studies and workshops, Data Centre World is tailored to both individual development and business needs.
An A-list line-up of over 800 visionary speakers will be presenting at this year’s conference including: Robert Tozer, Professor at London South Bank University, John Shegerian, Executive Chairman at ERI, Armand Verstappen, Manager Site Operations at eBay, Vasiliki Georgiadou, Project Manager at Green IT Amsterdam, Ian Holford, Data Centre Regional Operations Manager at HM Government.
Data Centre World 2019 will also premiere the "6th Generation Data Centre", addressing the raft of new technologies that are pushing the boundaries and driving efficiencies in today's market. Innovation is non-negotiable for all IT professionals, but given the array of new developments in the data centre space, it's impossible to know them all. Data Centre World has assembled them in one place so you are armed with all the insight you need to take your infrastructure to the next level.
The exhibition floor will host over 700 industry-leading suppliers showcasing the latest data centre products and solutions including 2BM, Anord-Mardix, Corning, E+I Engineering, Excool, Edmundson Electrical, Huawei, Munters, Rittal, Riello UPS, Schneider Electric, Socomec, Stulz & GS Yuasa.
Data Centre World stands alongside other FREE to attend Techerati events, including Cloud Expo Europe, DevOps Live, Cloud & Cyber Security Expo, Smart IoT, Big Data World, AI Tech World and Blockchain Technology World – Techerati: Every emerging technology. One digital transformation journey.
Register for your FREE ticket and receive complimentary access to all 8 co-located events at www.datacentreworld.com/pressrelease
Data Centre World, Event Director, Rabinder Aulakh says, “Data Centre World, London 2019 promises to be the best instalment yet, with an insightful programme of world-class speakers and a show floor packed with leading international suppliers. We are really looking forward to this year’s show, particularly with the introduction of our brand-new feature the 6th Generation Data Centre, which will showcase the future of the Data Centre Industry.”
By Gareth Spinner – Director Noveus
The demand for data continues to grow apace, and the prerequisite of having power to supply the data centre is not diminishing; we therefore continue to face the fact that more power is going to be needed. The data centre model works very much on the basis that the maximum theoretical capacity must be secured to ensure funding is achieved and that the business model works for any operator; they cannot commit to customers or tenants if they have no agreement with the network provider.
This is the secure and safe method of operating. However, with the changing landscape to meet the requirements of the smart grid and the low carbon economy, the way networks are operated is changing, with regulators pushing to facilitate local energy networks, more low-carbon generation and leaner, more efficient running of the networks. The high level of small-scale generation and the potential growth of battery storage will require real-time system operation to be taken up by the local Distribution Network Operators, balancing loads at grid substation level rather than leaving it all with National Grid for the UK as a whole.
This local balancing, together with the need for flexible services and potentially Demand Side Management, creates a new energy market in which the local system operator will seek to contract with providers of generation and users of power to be flexible with their export and import. Is this something a data centre can do? Is willing to do? Or, under its service commitments to its customers, is able to do?
The added challenge for the data centre is the cost of energy; the non-commodity elements are becoming a much larger slice of the bill, and these network and tax charges are difficult to reduce unless the data centre takes a different approach. Flexibility may be a way for the data centre to influence its costs but, as we know, everything has a price.
Overseas it is also not unusual for Data Centres and large consumers of power to enter into arrangements with generators – for some by necessity, where the grid is less reliable; for others because it is better economically – with batteries then added as support once peak-lopping solutions are found. This is also possible in the UK, with the added benefit of grid cost avoidance through private wire connections.
The true panacea would be all capacity being socialised across all users, where users pay for what they use and the network company shuffles the demands to meet all customers’ needs. The challenge is having a robust network with monitoring and controls that gives the Data Centre absolute comfort that it can trust the Network Operator to do this; otherwise operators will want to retain control and their own 100% back-up.
With this huge challenge and a shifting global landscape on power, does the DC strategy of N+1 power still stand? Should the N+1 or N+2 instead be held at data storage level, with more thought given to having more small, flexible DC sites with resilience in data storage, and less worry about power? A reduced power requirement could deliver huge beneficial impacts on operating costs with no loss in service.
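To make that trade-off concrete, here is a minimal sketch of the provisioning arithmetic. The site sizes and UPS module ratings are hypothetical, chosen only to illustrate the question of where the redundancy is held:

```python
import math

def modules_needed(load_kw: float, module_kw: float, redundant: int = 1) -> int:
    """Modules required to carry the load (N), plus any redundant modules."""
    return math.ceil(load_kw / module_kw) + redundant

def secured_kw(load_kw: float, module_kw: float, redundant: int = 1) -> float:
    """Total power capacity that must be secured from the network provider."""
    return modules_needed(load_kw, module_kw, redundant) * module_kw

# Classic approach: one 2 MW site with 500 kW UPS modules, N+1 at power level.
classic = secured_kw(2000, 500, redundant=1)          # 5 modules -> 2500 kW

# Alternative: four 500 kW sites, resilience held in replicated data storage,
# so each site runs without a redundant power module (N+0).
distributed = 4 * secured_kw(500, 500, redundant=0)   # 4 x 500 kW -> 2000 kW

print(classic, distributed)
```

Under these illustrative numbers the distributed approach secures 500 kW less from the network, which is the kind of operating-cost saving the article alludes to – though in practice each small site would still need some headroom to absorb a failed peer’s workload.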
All of the new options also help achieve a much greener energy footprint, which is a welcome ethical bonus.
So, the power landscape is getting more and more interesting. www.noveusenergy.com
American organizations are forecast to spend 1.75 times more than European organizations on information technologies from 2018 through 2022. According to a recent study from IDC, Western Europe Risks Losing the Technology Race, China is also set to outpace Europe in key areas. While European entities will invest more into IT than their Chinese counterparts through 2022, the latter will invest 47% more into innovation accelerators.
"That Europe trails the US in its use of digital technology is often accepted as a given. What's worrisome is the size and potential widening of the gap between the two," says Mark Yates, Research Manager, Digital Transformation, IDC Europe. "And, of course, there are companies out there that are able to do more with less or that have not yet pushed their IT systems to their full potential. And there is a great deal that can be done by restructuring the organization and resetting business goals. But so much innovation today depends on use of cutting-edge technologies, that new spending will almost always be needed to remain competitive."
The most extreme technology example cited in the study is artificial intelligence (AI). Already used in security, customer service, and ecommerce, AI is being increasingly deployed to improve manufacturing, logistics, staff recruitment and management, and healthcare. Forward-looking enterprises consider AI to be crucial for reducing costs, facilitating revenue growth, and improving customer experience. AI is utterly dominated by U.S.-based organizations, which are expected to invest more than 4.5 times more than those based in Europe from 2018 to 2022, even though they have similar GDPs. (The ratio holds even if investments by government and IT firms are excluded.) The largest spending area cited in the study is the Internet of Things (IoT), where total spending is around 16 times higher than for AI, mainly because the immediate benefits are usually more apparent on balance sheets and cashflow statements. Again, both U.S.-based and China-based enterprises are forecast to invest more both in absolute terms and as a percentage of GDP through 2022.
"It's important to recognize that European goods and services are still in high demand," says Marc Dowd, Principal Client Advisor, IDC Europe. "Many European enterprises, especially in Central and Northern Europe, have done an exceptional job of streamlining operations, innovating new business, and maintaining high standards without a lot of cutting-edge technology. But that won't last. Whether deployed via the cloud or client-based systems, ERP and CRM solutions are essentially commodities. So too are a lot of tools used to ensure quality. European firms will need to up their technology game considerably over the next few years if they wish to stay competitive both globally and on the continent."
IDC's Western Europe Risks Losing the Technology Race uses IT investment forecasts to argue that Western European organizations may be falling behind businesses in other developed regions in their deployment of new technology. The document analyzes the reasons why WE businesses may be at risk and provides guidance for correcting the situation and ensuring European competitiveness.
The integrated systems market in Europe, the Middle East, and Africa (EMEA) showed significant growth in 3Q18, reporting $873 million in user value, with year-on-year growth of 20.1%, according to the latest International Data Corporation (IDC) Quarterly Converged Systems Tracker.
Traditional converged systems, made up of certified reference systems & integrated infrastructure and integrated platforms, had a flat performance in this quarter, but still accounted for over 60% of total sales in EMEA. Meanwhile, hyperconverged systems have continued to show strong growth, reporting $348 million in 3Q18, with 70.9% YoY growth.
"Hyperconverged continues to see increased adoption in the EMEA market as companies make use of these systems' simplicity, allowing IT managers to shift their focus higher up the stack," said Eckhardt Fischer, senior research analyst, European Infrastructure at IDC.
"Similar to last quarter, the growth in integrated systems has mainly been driven by a strong hyperconverged segment, while the more traditional certified reference systems & integrated infrastructure and integrated platforms segments saw a flat trend in U.S. dollar value terms, with little differentiation between segments," said Silvia Cosso, research manager, European Infrastructure at IDC. "Overall, demand is generated by the advantage of having pre-integrated, fine-tuned systems able to simplify the management and deployment of datacenter infrastructure, but their higher acquisition cost limits their deployment to specific use cases."
In the EMEA region, Western Europe's revenues account for about 78% of sales, of which the U.K., France, Germany, and the Nordic region still represent the lion's share, around 75%. The Western European hyperconverged market is still being driven strongly by a handful of vendors that have found greater success by focusing on the larger 500 companies.
Central and Eastern Europe, the Middle East, and Africa (CEMA) accounted for 22% share of EMEA market revenue in 3Q18, with Central and Eastern Europe (CEE) recording the strongest growth in EMEA. Sales of hyperconverged systems in CEMA exceeded both converged systems and integrated platforms sales for the first time and grew by 81% year-over-year.
"The majority of hyperconverged systems sales were driven by organizations from developing countries, mostly in the Middle East and Africa (MEA) region where the deployment of hyperconverged systems helps to address the lack of skilled IT resources as well as organizations' growing need for scalability and agility," said Jiri Helebrand, research manager, IDC CEMA.
Top 5 Vendor Storage Systems Value Table
A New Way of Purchasing Infrastructure and Applications
The following section provides definitions for the converged systems market. Given the overlap between converged systems and other infrastructure markets (including system infrastructure software), some definitions are taken directly from taxonomy documents covering enterprise storage systems, networking, servers, and system infrastructure software.
It should be noted that IDC's segmentation of this market was updated for the March 2017 Converged Systems Tracker. None of the segmentation updates have changed the size of the overall market. Specifically, the following changes have been made:
▪ A hyperconverged software vendor view has been added.
▪ Rack-scale hyperconverged solutions have been called out as a type of hyperconverged solution.
DW talks to Leo Craig, General Manager for Riello UPS Ltd, about the company’s positive start to 2019, battery storage, energy efficiency and smart grids, the edge, Brexit and beagles – yes, beagles!
You’ve started 2019 with something of a bang. Tell us about your new extended warranty and the thinking behind it.
Indeed, January saw us introduce an extended five-year warranty as standard on all our UPS up to and including 3kVA. Every other supplier only offers customers a one- or two-year warranty, three years at the very most, so it’s far above and beyond anything else that’s available in the market. And it’s not just the UPS itself, the warranty covers internal batteries too.
Basically, this is us putting our money where our mouth is. Because all our uninterruptible power supplies are designed and manufactured in-house, we have complete confidence in their quality and reliability. We know they’re up to the task, and this five-year guarantee is our way of providing customers with that additional peace of mind.
In some respects it’s similar to the unique Diamond maintenance contract we introduced a few months ago. Most other service providers wouldn’t run the risk of committing themselves to such a strict timeframe – an onsite emergency response in four hours with a guaranteed fix inside a further eight hours – for fear of failure.
For us, it’s about being completely clear about what we’re offering customers, and then living up to those promises, no ifs or buts.
That’s the aftercare side of the business covered, but what about product innovation? What can we expect from Riello UPS in the coming months?
There’s plenty in the pipeline! We’re increasing the power range of our super-efficient NextEnergy (NXE) UPS with a 400 kVA model. This new version uses the latest transformerless technologies to deliver the same 97% operational efficiency as our 250 kVA and 300 kVA versions, which have recently earned a place on the government’s Energy Technology List (ETL) of recommended products.
This’ll be followed by the launch of two rack-mounted versions of our best-selling Sentinel Pro series. The Sentinel Rack (SER) 1500 and 3000 ER models fit into standard 19 inch cabinets and sit at just 450mm deep, meaning they’re a great choice for upgrading power in server rooms with 600mm deep legacy racks. The 3000 VA version is fitted with a 6A battery charger to work with extra batteries to offer several hours’ extended runtime too.
Final development work and testing is well underway on the third generation of our Multi Sentry UPS, while we’re also busy with an upgrade to our award-winning Multi Power (MPW) modular systems, which will give our customers far greater flexibility when choosing the right modular UPS for their needs.
Riello UPS seems to be bucking the trend by continuing to grow in what is a relatively static marketplace. How are you doing this?
Here in the UK our sales have increased 25% since 2015, with our turnover now topping £22 million, while the wider Riello Elettronica group is now the second biggest UPS manufacturer in Europe.
There’s no secret formula behind this success. It boils down to putting the customer’s interests first and giving them the control. Sadly, too often in our industry those roles are reversed, with the supplier dictating the terms.
Fundamentally, it comes down to us providing the right product at the right price. The fact we have the largest stockholding of UPS in the UK obviously helps, as this means customers get their orders quickly.
It also depends on our sales and technical teams always being available to offer the right advice and training. And it means providing the right aftercare, whether that’s through initiatives like the new five-year warranty, or ensuring all our field service engineers complete a rigorous Certified Engineers Programme. It’s about showing customers you’ll go that extra mile for them.
Despite all the technological advances we’ve seen in recent years, deep down people still want to deal with people, and that’s another strongpoint for us. Many of our team have been with us for 10, 20 or even 30 years. That’s an incredible wealth of knowledge built up across the business. And that longevity also means we’ve been able to develop trusting, honest relationships with customers that date back for years.
That’s not to say we ever stand still though. We’re actively working to improve our presence in the IT reseller market and have agreed several new partnerships in the last few weeks alone. And while data centres will remain our bread and butter, we’re seeing rising demand from sectors such as healthcare and education too.
You continue to champion the cause of battery storage. Where are you with that at this present time and what does the future hold?
I’ll definitely keep banging that particular drum! The potential benefits of battery storage for data centres are obvious, and while it’s fair to say the industry hasn’t in the past embraced the possibilities with open arms, perhaps understandably so, the tide is slowly starting to turn.
There’s the realisation that the way we as a nation produce electricity is changing. The recent U-turns on building new nuclear plants by Hitachi and Toshiba are further proof of that. As we shift away from large-scale coal, nuclear, and thermal generation to renewables, it’s inevitable we move towards more dynamic smart grids, with diverse interconnected networks matching supply with demand in real-time.
This new energy mix will inevitably face issues with frequency stabilisation that will only be overcome with the help of battery storage and demand side response mechanisms. And as the price of lithium batteries continues to fall, the commercial case for data centres to get involved will only become stronger.
And does this rise of smart grids offer the opportunity to rethink the role of a UPS?
The possibilities are genuinely exciting. While it’s imperative for data centres to have a UPS protecting them against the threat of damaging downtime, how often is that crucial backup power actually called upon? In the main, it’s a reactive, underutilised asset.
But what if that UPS and its batteries were proactively working, and even earning money, for your data centre 24/7? That’s the reality of demand side response (DSR): it has the potential to transform a UPS into a ‘virtual power plant’.
Cheaper off-peak energy can be stored and used instead of more expensive peak-time mains supply, reducing a data centre’s power consumption and energy bills. Any surplus can even be sold back to the National Grid through DSR incentives such as Firm Frequency Response (FFR) or the assorted Reserve Services, which cover any unexpected drops in power generation.
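As an illustration of the peak-shaving arithmetic behind this – the tariffs, volumes and efficiency below are assumptions for the sake of the sketch, not quoted rates:

```python
# Illustrative figures only: assumed tariffs, shifted volume and battery losses.
peak_rate = 0.18             # GBP per kWh, peak-time mains (assumed)
offpeak_rate = 0.08          # GBP per kWh, overnight (assumed)
shifted_kwh_per_day = 2000   # energy discharged from UPS batteries at peak
round_trip_eff = 0.90        # charge/discharge losses of the battery system

# Daily saving: peak-time energy avoided, minus the (slightly larger)
# off-peak energy bought overnight to recharge the batteries.
cost_at_peak = shifted_kwh_per_day * peak_rate
cost_to_recharge = (shifted_kwh_per_day / round_trip_eff) * offpeak_rate
daily_saving = cost_at_peak - cost_to_recharge

print(round(daily_saving, 2))
```

Even before any FFR or Reserve Services revenue is added, the spread between the two tariffs comfortably outweighs the round-trip losses in this sketch.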
Energy efficiency and sustainability are obviously big priorities for Riello UPS?
They are. And they should be big priorities for everyone in our industry. Unless things change significantly, we’re facing up to an emerging electricity crisis. It’s not just up to the National Grid or the government to come up with a solution. We’ve all got to share responsibility and play our part.
For equipment manufacturers like ourselves, we always need to be on the lookout for ways we can improve the efficiency of our UPS and promote the most sustainable solutions. That’s why we were the first supplier in Europe to introduce an Eco Energy Level that rates our products based on their efficiency, enabling customers to easily make informed decisions.
For data centre operators it means getting off the fence and actually supporting schemes such as demand side response and battery storage, which not only make commercial sense but work for the greater good too.
In a wider context, we’re aware resources are precious and are proud of the steps we take as a business to continuously reduce our environmental impact. In the last year we’ve eliminated single-use plastics across our day-to-day operations and offset nearly 210 tonnes of carbon emissions, achieving carbon-neutral status. We’re also excited to renew our official partnership for a second season with the Audi Sport Abt Schaeffler team in the Formula E championship for electric cars, the pinnacle of electric vehicle technology and innovation.
Is 2019 the year when ‘the edge’ will truly take centre stage?
Just take a look around, whether you’re at home or in work, and the answer to this should be obvious. Whether it’s our smartphone-dominated personal lives, increased robotics and artificial intelligence on the factory floor, or automation taking on more and more daily tasks, as a society we’re increasingly reliant on data storage and processing power.
Our existing IT infrastructure is struggling to cope with this rapid digitalisation. Even data centres designed less than a decade or two ago weren’t built to handle today’s workload, which sees the average person interacting with a connected device nearly 5,000 times a day. And that’s before we even consider the rollout of superfast 5G wireless connectivity over the next couple of years.
That’s why moving towards the edge is inevitable: it’s the only realistic way to deliver the low-latency, real-time processing that much of our daily lives now depends on.
What impact is this likely to have for data centres in general and UPS in particular?
With the boom in hyperscale facilities and the remarkable growth of large-scale cloud providers like Amazon Web Services and Microsoft Azure, ‘big is beautiful’ has become the conventional wisdom that has dominated thinking in the industry.
But edge turns that theory on its head. It’s probably best described as a shift from data centres to centres of data: smaller, more flexible facilities, more often than not installed in places not originally built with the needs of a data centre in mind – think a car park, a corner of a warehouse or factory, or a disused office.
In recent months, we’ve worked on several edge projects using modular, containerised data centres. These are pre-built offsite before being transported to the end location. From the outside they might look like a big steel shipping container, but on the inside they include all the tech you’ll find in a standard server room – racks, cabinets, UPS systems, PDUs, cabling, and cooling systems.
Of course, these micro data centres rely on a clean and stable electricity supply as much as any enterprise or hyperscale facility does. In practice, modular UPS such as our Multi Power are the perfect partner. They deliver power in a compact footprint, so are ideal for the space-restricted layout of a containerised data centre. And the principle of modularity offers the in-built scalability to increase capacity or redundancy when required.
Does edge computing pose a threat to traditional enterprise data centres or hyperscales?
I wouldn’t say ‘threat’; it should be viewed more as a challenge or even an opportunity. While instantaneous, real-time processing requires the low latency that only edge computing can truly deliver, there are still plenty of less time-sensitive storage and processing tasks – trend analysis, for example – where a centralised data centre or the cloud will continue to make the most sense.
We’ve heard these predictions of the ‘death’ of the enterprise data centre many times before. They were wrong then and they’re wrong now. We need that healthy mix of edge, hyperscale, and cloud to meet society’s data-driven demands.
We can’t discuss the future without mentioning the dreaded ‘B word’. What impact do you think Brexit will have on our industry? Has Riello UPS taken any steps to ensure you’re prepared for any knock-on effects?
I honestly believe our politicians and many in the media are guilty of injecting too much ‘FUD’ (fear, uncertainty, and doubt) into the debate, which hasn’t been helpful for business confidence or society in general. Whatever happens on 29th March 2019, the world won’t stop turning and life will carry on.
That’s not to say we haven’t assessed the possible risks and proactively put contingency plans in place.
Normally we carry four weeks’ worth of stock for sales and servicing, but from January we doubled this to eight weeks. This means we’ll have ample supplies if, come the end of March, there are any delays shipping here from our manufacturing plants in Italy. We already do this to mitigate any issues during their summer shutdown period and are certain this increased volume will ensure we avoid any short-term problems.
Of course, there are other areas such as customs declarations where we may need to review and, where necessary, adjust our processes, but we’ll tackle those challenges as and when we have more details. On the whole, as part of a global business with representation in more than 80 countries, we are better-placed than most to manage the possible risks posed by Brexit and its immediate aftermath.
The start of trade show season is just around the corner. What can we expect from Riello UPS in the coming months?
Well, we’ve got a typically packed calendar, from major industry trade shows such as IP Expo through to networking events and roadshows with some of our partner resellers. It’s a great chance for our team to get out and about and meet lots of new contacts, as well as catching up with some familiar faces too.
Naturally, we’re all really looking forward to Data Centre World next month, it’s always one of the big highlights of the year. I can’t reveal too much, but it’s fair to say we’re pulling out all the stops this year, both on our stand D520 and our involvement in the show’s special “6th Generation Data Centre”, so it’s definitely worth paying us a visit if you’re attending!
Finally, we believe congratulations are in order – you’re a Guinness World Record holder?! Tell us more about that
Well that would be stretching it a little bit, but yes, I did play a very small part – along with my beloved beagle Ziggy – in setting a new world record for the largest ever single breed dog walk.
The event itself, Beaglelandia, took place back in April 2018 but we had a long wait until just a couple of weeks ago to get the official verification from the Guinness World Records people.
Obviously, it’s great for everyone involved to be part of setting a new record, and the event itself was lots of fun. But the most important thing about Beaglelandia was the £10,000 raised on the day for the charities Beagle Welfare and Unite to Care. Who knows how long the world record will last for? Chances are it’ll be beaten eventually. But that money is already making a massive difference, and will continue to do so.
Schneider Electric’s VP of Innovation and Data Centers discusses Edge Computing, Hyperscalers, 5G and the re-emergence of Liquid Cooling.
By Steven Carlini, Vice President Innovation and Data Center IT Division, CTO Office Schneider Electric.
The industry has seen some extremely interesting developments throughout 2018. One trend has taken centre stage as the vast majority of compute and storage continues to be funneled into the largest hyperscale and centralized data centres.
At Schneider Electric, we’ve seen a targeted move of the Internet giants to occupy colocation facilities as tenants, using these large facilities to deploy critical infrastructure and applications closer to customers. This need is driven by the insatiable demand for cloud computing and local compute, which is accompanied by the need to quickly reduce latency and Internet costs, resulting in the emergence of ‘regional’ edge computing – something that we describe as localized versions of ‘cloud stacks’.
At Schneider Electric we don’t believe this change will stop at the regional edge. As the trend continues to gain popularity, it’s likely that we will begin to see more of these ‘cloud stacks’ deployed in the most unlikely of places. After much deliberation on its definition, we believe 2019 is the year that we will really see the edge playing a dominant role, spilling into the Channel and creeping ever closer to users in more commercial and retail applications.
Hyperscalers need faster deployments
Given everything that’s happened in 2018, it’s clear that the demand for cloud computing will neither subside, nor slow down. At Schneider, we believe the next 12 months will see it accelerate further, meaning the Internet Giants will continue to build greater levels of compute capacity in the form of hyperscale data centres.
Market demand will mean these giants also need to build their facilities increasingly quickly. In some cases, 10MW to 100MW projects may even need to be designed, built and made operational in less than twelve months.
One key to accomplishing such aggressive timeframes is the use of prefabricated, modular power skids, which combine UPS, switchgear and management software in one predictable and factory built, pre-tested package. A great example of the use of prefabricated infrastructure in today’s data centres is Green Mountain’s recent choice to add 35MW of capacity to their Stavanger and Telemark sites.
Since the lead-time for this type of power equipment can in some regions take up to 12 months, having a solution built and ready to deploy eliminates any delays during the critical path of the design and construction phase. In this case it has enabled Green Mountain to complete the first element of their staged project at the Rjukan site by April 1st, 2019.
One might also consider that within the data halls of other colocation and hyperscale providers, compute capacity will also become more modular, allowing the user to simply roll new racks and IT infrastructure into place. Such a solution will need to be, in some respects, vendor-neutral, allowing racks and IT to be quickly deployed and thereby removing the complexity and any accompanying timing challenges for the user.
IT and Telco data centres will continue to collide
The discussion around 5G has continued to move forward, but in order for it to deliver on the promise of ‘sub 1 ms latency’, it will need a distributed cloud computing environment that will be scalable, resilient, and fault-tolerant. This distributed architecture will become virtualized in a new way - namely cloud based radio access networks (cRAN) - that move processing from base stations at cell sites to a group of virtualized servers running in an edge data centre. In that respect, we believe significant buildouts will need to occur on a global scale in order that metro core clouds are available throughout 2019 and thereafter.
These facilities could be classed as ‘regional data centres’, ranging from 500kW to 2MW in size. They will combine telco functionality, data routing and flow management, with IT functionality, data cache, processing and delivery.
While they will enable vast performance improvements, it’s unlikely that they alone will be able to deliver on the promise of sub 1ms latency due to their physical location. Due to the increase in urbanization it’s not easy to find the space for new (and large) data centres within today’s cities. It’s more likely that the world will begin to see sub 1ms latency times when the edge core cloud deployment happens in 2021 and after.
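The physics behind this point is simple: signals in optical fibre propagate at roughly two-thirds the speed of light, so distance alone puts a floor under round-trip latency before any processing happens. A rough sketch, using the common approximation of about 200 km per millisecond in fibre:

```python
# Approximate one-way propagation speed in optical fibre (~2/3 of c).
SPEED_IN_FIBRE_KM_PER_MS = 200  # km per millisecond (approximation)

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from propagation delay alone."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

# A regional data centre 100 km away already consumes the whole 1 ms budget.
print(min_round_trip_ms(100))
# An edge micro data centre 5 km away leaves ample headroom for processing.
print(min_round_trip_ms(5))
```

This is why, as the article argues, regional facilities alone cannot deliver sub-1ms latency: the budget is spent on the wire before the servers even see the request.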
This is where localized micro data centres will provide the vehicle for super fast latency, delivering high levels of connectivity and availability for both 5G providers and their customers.
A.I and liquid cooling
As artificial intelligence (A.I.) continues to gain prominence, springing from research labs into today’s business and consumer applications, it brings with it massive processing demands placed on data centres worldwide.
A.I. applications are often so compute-heavy that IT hardware architects have begun to use GPUs for core processing, or as supplemental processing. The heat profile for GPU-based servers can be double that of more traditional servers, with a TDP (thermal design power) of 300W vs 150W, which is one of the many drivers behind the renaissance of liquid cooling.
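A back-of-envelope calculation shows what that doubling does at rack level. The server count per rack below is illustrative; the 150W/300W TDP figures are the ones quoted above:

```python
# Illustrative rack heat load: server count is an assumption for the sketch.
servers_per_rack = 40

cpu_server_tdp_w = 150   # traditional server TDP, as quoted above
gpu_server_tdp_w = 300   # GPU-based server TDP, roughly double

# Nearly all electrical power drawn by the servers ends up as heat.
cpu_rack_kw = servers_per_rack * cpu_server_tdp_w / 1000
gpu_rack_kw = servers_per_rack * gpu_server_tdp_w / 1000

print(cpu_rack_kw, gpu_rack_kw)
```

A jump from a 6 kW to a 12 kW rack under these assumptions pushes towards the densities where air cooling struggles, which is the economic logic behind liquid cooling’s return.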
Liquid cooling has of course been in use within high-performance computing (HPC) applications for some time, but the new core application of A.I. is placing increased demands on data centres in a more intensive way, requiring a more advanced, efficient and reliable cooling mechanism. Liquid cooling is just one of the ways to provide an innovative solution as A.I. continues to gain momentum.
Cloud based data centre management
DCIM was originally deployed as an on-premises software system, designed to gather and monitor information from infrastructure solutions in a single data centre.
Recent cloud-based management applications, by contrast, are deployed within the cloud and enable the user to collect larger volumes of data from a broader range of IoT-enabled products. What’s more, the same software can be used across a greater number of large or smaller localised data centres deployed in thousands of geographically dispersed locations.
This new software, described by Schneider Electric as DMaaS or Data Centre Management as a Service, uses Big Data analytics that enable the user to make more informed, data driven decisions, mitigating unplanned events or downtime far more quickly than traditional DCIM solutions. Being cloud-based, the software leverages pools of data or “data lakes”, which store the collected information for future trend analysis, helping to plan operations at a more strategic level.
Cloud-based systems simplify the task of deploying new equipment within, or making upgrades to existing installations. This includes software updates for data centres in different regions or locations. In all cases, managing such upgrades on a site-by-site basis, especially at the edge, with only on-premise management software, leaves the user in a challenging, resource intensive and time-consuming position.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contains the second, and this February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 1.
2019 will be the year that legacy modernisation reaches five tipping points
Predictions from Tim Jones, Managing Director, Application Modernisation, Advanced.
Gartner predicts that every dollar invested in digital business innovation through to the end of 2020 will require enterprises to spend at least three times that to continuously modernise their legacy application portfolio. This is supported by research from analyst firm IDC, which says that 70% of IT executives view the burden of legacy application systems as one of their top problems.
So why will 2019 be the year of modernisation?
Read our views on the following five tipping points:
1. Digital transformation success
Despite the interest in digital transformation, many organisations are still using deeply entrenched, yet ageing legacy systems, on platforms such as OpenVMS and VME. This is a massive barrier to successful digital transformations. Most legacy systems can’t reliably support or integrate with modern applications, such as the Cloud. For those businesses looking to fully innovate and scale, a reliance on such ageing systems is going to result in legacy drag.
2. The demand for agile IT
Organisations will find themselves hamstrung by systems that aren’t able to deliver agile IT – until they migrate to a more modern environment. The adoption of IT methods – such as the Cloud, DevOps and mini/microservices – is being driven by a need for IT to deliver greater efficiencies, scalability and manageability. Organisations are demanding agile technology solutions so they can keep up with competitors and adapt to a rapidly changing business landscape.
3. Technical debt
Simple maths will drive investment in modernisation. We know that dependency on ageing legacy applications is becoming increasingly expensive and problematic. These systems increasingly lead to ‘technical debt’ – a concept in software development reflecting the implied cost of additional rework caused by choosing an easy solution now instead of a better approach that would take longer – and to a drain on financial, physical and human resources.
Gartner has put together an overview of how to build a modernisation business case to convince management to invest in application modernisation, which can help you reach that tipping point internally.
4. The skills crunch
There are still more than 200 billion lines of COBOL code in existence, and organisations are under increasing pressure to rapidly address ageing technologies due to a dramatic reduction in the availability of skills and a retiring workforce. Business leaders will soon have no choice but to move mission-critical applications – such as those built in COBOL/CICS – to an environment that has a wider pool of skilled people to support and maintain it. Again, Gartner has recognised this ‘skills crunch’ as a major tipping point that organisations face in modernising their legacy application portfolios.
5. Pressure will drive an ‘acting with pace’ mentality
Finally, the pressure on IT to deliver more value-added services, greater agility and faster deployment times will create a catalyst for modernising business critical, legacy applications. We know that many organisations want to embrace Artificial Intelligence, Machine Learning and Robotic Process Automation for example. This pressure to innovate and embrace change will only intensify, shifting the mentality to ensure 2019 is the year when the journey to legacy modernisation can no longer be delayed.
Top tech trends in 2019
Tim Hall, Chief Technology Officer at managed IT services provider Blue Logic discusses some of the tech trends that will drive digital transformation in the new year…
The new year will be one of evolution and not revolution when it comes to business tech.
The likes of artificial intelligence and cloud are no longer technologies of the future, they are technologies of the now.
As such, organisations should spend the next 12 months considering how they can be used to drive digital transformation.
Here are the key areas you should be looking at:
Artificial intelligence (AI):
AI is a comprehensive and complex technology that can help businesses work smarter, faster and better.
AI can be deployed across a range of functions to take over manual, repetitive and time-consuming tasks from employees. In addition, it can provide valuable insight into new data.
Instead of being a replacement for humans, AI will augment their existing roles. To succeed with AI, employees must be at the centre of it.
Analytics and AI:
AI can also be used to gather, segment, analyse and act upon unprecedented amounts of data. It can do this in real time, and spot trends and patterns that humans may miss.
AI and analytics will also be used to make highly accurate future predictions based on data that will allow organisations to streamline processes and spending.
Moving to the cloud:
Forward-thinking organisations have already started migrating to the cloud, and more will follow suit during the new year. Previous fears over security have been allayed.
Moving to the cloud shifts costs to an operational consumption model, allowing vast scope for business transformation. It enables businesses to scale usage up and down in line with demand, leading to significant cost savings.
Cyber security:
Cyber security remains one of the biggest challenges for organisations, and will remain a top priority in 2019.
Organisations may think that, because they have up-to-date firewalls and anti-virus software in place, they are sufficiently protected.
The reality is that human error accounts for the majority of data breaches, and hackers are now using sophisticated phishing scams as their preferred method of attack.
Employee training and education is key to mitigating this risk as much as possible.
Compliance remains a concern:
No business can claim to be fully compliant with GDPR because there is no official checklist for what compliance looks like.
Organisations will need to continue to check and monitor their processes and procedures, and place data protection governance and security high on their agendas.
The Internet of Things will continue to be a major driver of digital transformation in 2019 with the rise of “smart spaces”.
Technologies can now detect when an individual employee enters the office building; they can then track their movements and light and heat only the spaces they are in. If employees are in the office outside of standard hours, these technologies can also limit access to restricted areas, unlocking only the doors to the kitchen, toilets, exit and so on.
Governments will turn to advertising vehicles in 2019 to support future smart cities
By Richard Cross, CDTO, Clear Channel International.
Like many industries, the advertising sector has undergone a significant digital revolution over the last decade. From the rise of the computer and the internet, to handheld devices and social media platforms, the advertising landscape bears little resemblance to its pre-digital self.
However, one advertising medium is finding that its traditional qualities are just as advantageous in the digital age: Out of Home (OOH) advertising. While online media has struggled against increased advertising noise, audience fragmentation and the rise of ad blocking, OOH still provides a trusted platform which audiences can’t ignore. You can’t ad-block Times Square, nor can you fast-forward a bus shelter.
But crucially, the medium has also undergone its own tech-fuelled transformation. Just 10 years ago, a typical OOH billboard advert was a 3m x 5m poster-board, placed in a location with high footfall. It would remain there, with the same message, for weeks at a time.
Today, networks of connected digital screens have transformed the sector, providing advertisers with a creative and flexible medium that can deliver contextually relevant messages, in real time, at scale. This flexibility means that, for example, on a Monday morning, advertisers can promote messages about audiobooks for commuters, but on Friday, in the exact same location, can promote weekend activities, like cinemas or new bar openings nearby. Digital Out of Home (DOOH) is an extremely flexible platform; the only real limitation is how creative brands are willing to be with their content.
Looking to 2019 and beyond, we’re seeing OOH take on an increasing role beyond advertising, in today's smart cities. With two thirds of the world’s population expected to live in cities by 2050, networks of highly visible, pedestrian-accessible screens capable of delivering real-time content are proving a popular communications vehicle for governments to reach a city’s residents quickly and effectively.
In times of emergency, for example, OOH networks can be used as a vital communications channel. In the aftermath of the devastating Mexico City earthquake in 2017, Clear Channel immediately replaced all advertising content on its network of 29 digital billboards with emergency telephone numbers, live news updates and details of local spaces providing safe refuge, reaching over two million residents.
Moreover, DOOH screens incorporate interactive technologies, so they can be used as a mass, two-way communications channel. In France, the government used over 8,000 DOOH screens nationwide to gather residents’ opinions on the COP21 debates around climate change. We’re also expanding our assets to offer citizens additional services, connectivity and utility whilst on the go, such as free WiFi, mobile charging stations and live wayfinding information.
The OOH industry is also helping cities meet some of their environmental challenges. Twenty years ago, Clear Channel pioneered public bike-sharing schemes, launching the first of its kind in the City of Rennes, France. Today, we are installing electric car charging facilities and putting air quality sensors into our panels to provide cities with valuable live environmental health data - whilst continuing to declutter streets by removing unwanted street furniture.
We predict that 2019 will see more governments and municipalities adopting the new digital capabilities of out-of-home to communicate with and support their citizens, in an innovative and intelligent way. Harnessing the capability of OOH in future smart cities will largely be a question of how creative governments are willing to be, and we’re well-placed to support them on that journey.
The Year Ahead: CyberArk’s Top 2019 Cyber Security Predictions
Lavi Lazarovitz, Head of Security Research at CyberArk Labs.
Cyber security’s 2018 megatrends and emerging threats have created the perfect storm for a potentially tumultuous 2019. But it doesn’t have to be this way. Organisations must look to the threat horizon and collaborate to out-innovate and out-manoeuvre the attackers.
As we head into 2019, here are four security predictions for the new year:
1. Prediction: Emerging ‘Unique Human Identities’ Under Attack
We’ll see a new wave of attacks against emerging ‘unique human identities’ – or newly engineered biometric markers for digital and physical authentication. Biometric fingerprint, voice and face ID authentication controls have proven effective in consumer devices, and organisations will increasingly use authentication methods such as embedded human microchips. Attackers will increasingly target these identities to gather biometric data for future modelling purposes and other nefarious uses. Consumer genetic services and the biometric stores held within organisations will become key targets, further elevating privacy concerns.
2. Prediction: Government Social Media Becomes Regulated as Critical Infrastructure
Governments will start treating government-sanctioned social media accounts – both for elected officials and agencies – as critical infrastructure. Much as government text messages are regulated in numerous ways, social media will become regulated as well.
Social media has emerged as a critical tool for governments to communicate with citizens – whether it’s individual politicians and elected officials, or official accounts for government agencies and organisations.
Whilst social media allows for the rapid dissemination of critical information, it also has a dark side, illustrated in the past year by the false missile alerts that sent residents of Hawaii and Japan into a panic. This provides a glimpse of how attackers could use official social accounts to spread chaos.
3. Prediction: Trade Wars Trigger Commercial Espionage
Government policies designed to create ‘trade wars’ will trigger a new round of nation-state attacks aimed at stealing intellectual property and other trade secrets to gain competitive market advantages. Nation-state attackers will combine existing, unsophisticated yet proven tactics with new techniques to exfiltrate IP, as opposed to just targeting PII or other sensitive data.
While these attacks will predominantly be carried out by malicious external attackers, we’ll also see an uptick in insider attacks, especially in cutting-edge industries like autonomous cars (much as occurred at Apple in June 2018). We’ll see attacker dwell times extend as nation-states spend more time conducting reconnaissance and carrying out these trade-driven attacks. We’ll also see nation-state weapons becoming commercialised on the black market. The same phenomenon happened after Stuxnet, Petya and NotPetya, where cyber criminals took pieces of code from massive nation-state attacks and incorporated them into their own attacks.
4. Prediction: Supply Chain Meets Blockchain
Blockchain will transform the supply chain in 2019. Following allegations of nation-states targeting the supply chain at the chip level to embed backdoors into both B2B and consumer technologies, organisations will embrace blockchain to secure their supply chains. The distributed nature of blockchain makes it well suited to validate every step in the supply chain – including the authenticity of hardware and software. We’ll continue to see increased attacks early on in the supply chain, and there will be greater need for this level of validation.
It’s never too soon to start preparing for the year ahead. In an age of tighter regulations and increasing threats, companies should ensure they get ready for the new year and implement forward thinking strategies to guarantee compliance and security.
Malcolm Harkins, chief security and trust officer at Cylance.
Terrorist-Related Groups Will Attack Population Centers With Crimeware-as-a-Service
While terrorist-related groups have been tormenting organizations and individuals for years, we anticipate more potentially destructive attacks in 2019. Instead of breaking systems with ransomware, adversaries will leverage new tools to conduct harmful assaults on targeted subjects and organizations. From attacks on data integrity that essentially kill computers to the point of mandatory hardware replacements, to leveraging new technology for physical assaults such as the recent drone attack in Venezuela, attack surfaces are growing and enemies will take advantage. To combat this, organizations must take inventory of their attack landscape to identify and mitigate potential threats before they are exploited.
There Will Be a Revolt From Security Buyers on the Rising Cost of Controls
As the security industry grows, the cost of controls and the number of breaches grow with it. In fact, the 2018 Verizon DBIR report identified over 53,000 security incidents this year, including 2,216 confirmed data breaches. As the endless cycle of cyber attacks continues, the security industry will come under assault from its customers for perpetuating a growing burden of cost that’s not productive to the mission of an organization. Better technology should allow customers to better manage their costs, and organizations who do not understand this will face waves of backlash in the new year.
AI-Based Technology Will Distinguish Sensitive From Non-Sensitive Data
Currently, parsing through data to determine what is sensitive versus non-sensitive is a manual process. Users have to classify data themselves, but users are lazy. In 2019, AI-based technology will gain the ability to learn what’s sensitive and automatically classify it. This development will necessitate increased consideration of how to manage this data, and furthermore how to control it.
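As a rough illustration of what “learning what’s sensitive” means in practice, here is a toy Naive Bayes classifier (the training snippets, labels and class names are invented for the example; a production system would use far richer models and features). The point is that the model picks up the distinction from labelled examples, rather than relying on users to classify data by hand:

```python
import math
import re
from collections import Counter, defaultdict

# Invented toy training data: short snippets with a sensitivity label.
TRAIN = [
    ("employee salary and bank account details", "sensitive"),
    ("patient medical record and diagnosis", "sensitive"),
    ("customer passport number and home address", "sensitive"),
    ("payroll data with national insurance numbers", "sensitive"),
    ("meeting room booking for tuesday", "non-sensitive"),
    ("cafeteria menu for next week", "non-sensitive"),
    ("office closed for the bank holiday", "non-sensitive"),
    ("newsletter draft about the summer party", "non-sensitive"),
]

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

class NaiveBayes:
    def fit(self, examples):
        self.word_counts = defaultdict(Counter)  # per-class word frequencies
        self.class_counts = Counter()
        for text, label in examples:
            self.class_counts[label] += 1
            self.word_counts[label].update(tokens(text))
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        best, best_lp = None, float("-inf")
        total = sum(self.class_counts.values())
        for label in self.class_counts:
            # log prior + log likelihood, with add-one smoothing so unseen
            # words don't zero out a class
            lp = math.log(self.class_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in tokens(text):
                lp += math.log((self.word_counts[label][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

clf = NaiveBayes().fit(TRAIN)
print(clf.predict("quarterly payroll and salary report"))     # sensitive
print(clf.predict("booking the meeting room for the party"))  # non-sensitive
```

Once data is automatically labelled this way, the management and control questions the prediction raises (retention, access, encryption policy per class) can be driven from the classifier’s output.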
Companies are also beginning to automate penetration testing, allowing pen testers to work on more unique or advanced red team/pentests. Additionally, these automated processes allow for control validation, which lowers costs and provides researchers with a higher degree of assurance. In order to keep up with this rapid growth, traditional companies will need to accommodate automation by further developing their solutions or seeking integrations with new automation-focused industry vendors.
Current Biometric Methods Will Actually Increase Privacy Penalties and Risk
While some organizations are currently adopting end-user behavioural analytics in their networks, these technologies can be costly and increase privacy risks. Data is being collected then processed on the endpoint, leaving it susceptible to attack. In 2019, organizations must begin to adopt continuous authentication to protect crucial identifying information. With this technology, end users’ biometric footprints will be able to determine identity without incurring the privacy penalty, risks and costs that traditional biometrics or central-based behavioural analytics typically face.
Christian Nagele, Interim Head, EMEA, Datto and Ian van Reenen, VP, Engineering, Endpoint Products, Datto have made some predictions below on what they foresee for providers of IT infrastructure and end-user systems next year, including how things might change after Brexit, continuing security and privacy issues arising from GDPR, the need for IoT device visibility, and the ongoing skills shortage across the technology sector as a whole.
No rest from regulation
According to Ian Woolley, CRO of Ensighten.
Regulation was a hot topic in 2018, spurred on by GDPR coming into force, and it will continue to dominate conversation in 2019 as other global policies such as the California Consumer Privacy Act (CCPA) play out. The challenge we’ll see for global organisations is managing the nuances of regional data practices simultaneously. Technology will help companies navigate this, but as we’ve seen with GDPR, there are various interpretations of what regulation means. As such, many businesses may opt to employ the strictest data practices and processes companywide to avoid potential slip-ups and penalties.
Still searching for answers
Data breaches have saturated the media this year and business leaders are now starting to realise the true impact a website hack can have on an organisation. The financial and reputational risks, as well as possible job losses, will ensure that security is at the top of the priority list for 2019. As some businesses are having this revelation late, we’ll see more legacy hacks and leaks come to the fore. Despite the urgency to address data vulnerabilities, most companies are still in the education phase of data governance and of how and why breaches occur. Therefore, we will see more companies scramble to protect themselves as they identify the real threats lurking beneath their website supply chain. Once companies have a clear picture of where they are vulnerable, we’ll see more investment in thorough data governance.
Glory hunting hackers and advances in AI
Many businesses fear that hackers will leverage AI to unlock new ways to infiltrate websites and apps at scale. We may see video and audio manipulated to fool consumers, but AI will most commonly be used to probe and learn defence tools to inform future breaches, or to bypass more advanced security implementations altogether. While many industry commentators focus on how hackers will evolve, a great deal of criminals will still prey on businesses that don’t have the basics covered, for example by overlooking unauthorised third-party technologies running on their websites. This will be the main cause of breaches and leaks throughout 2019.
As we’ve seen with the rise of Magecart, there is also a growing trend of groups taking credit for their crimes. We will see more named attacks in 2019, as hackers look to carry out bigger and more damaging assaults on businesses, especially e-commerce brands.
The birth of the hybrid ‘marketing security’ team
As many website hacks highlighted in 2018, one of the core causes is problems with third-party technologies. Via chat boxes, form fills and unapproved third-party tags on a website, criminals can gain access to customer data, sometimes even without the organisation’s knowledge. The challenge is that marketers are generally in charge of this data but haven’t necessarily been accountable for its protection and security. In 2019, businesses will view security more holistically. To do this, companies will look to bring more senior security talent in house to navigate the new data landscape and regain control, rather than outsourcing security to multiple vendors. But this will squeeze an already limited pool of skilled professionals. With a lack of talent available, we will likely also see a shift in the role of the marketing team – businesses will put more onus and investment in upskilling marketers so that they have a marketing security remit. At a more senior level, we’ll see the CMO and CISO start to work more closely to mitigate security vulnerabilities.
2018 has been a learning curve. New data regulation has revealed issues that many companies were not even aware of. This, in the long term, is a good thing for data owners and also their customers. However, businesses are still in the process of addressing the security of their data and this will continue to trip up organisations in 2019. Constant, thorough data governance will be a core requirement next year – brands that neglect to put the right processes, technology and people in place will pay the price.
The corporate data centre has long been the dependable engine, quietly powering the work of the organisation, only paid attention to when there is an outage, or a large capital expenditure needed.
By Vicky Glynn, product manager, Brightsolid.
However, as technology develops and data increases exponentially, the need for ever-greater scale and power means that the days of the corporate data centre are coming to an end. Technology developments we see in the workplace today - from IoT and analytics to blockchain and machine learning - are beyond the capabilities of the traditional technology solutions that have dominated in years past.
Recent reports outline that 90% of the data available today was created within the last two years, and it’s likely that over the next two years we will see data grow by a further 40 times. To manage the data boom, flexibility, agility and scalability are becoming key priorities for IT leaders. No longer is it economically justifiable to run internal infrastructure when the need for an array of applications and data storage is constantly evolving and there is a plethora of credible alternative cloud solutions.
In parallel, there is pressure from vendors to go down the cloud route. Traditionally, the investment in a data centre could span 10 to 15 years; however, with shorter IT contracts becoming the norm in line with rapidly-evolving technology, it becomes difficult for businesses to forecast larger tech investments.
According to Gartner, by 2025, 80% of enterprises will have shut down their traditional data centre, versus 10% today. For these businesses, the crucial planning period will take place over the next three to six years. Establishing an enterprise cloud strategy can be challenging for IT leaders who are unsure how to begin the planning stage, what lines of business to include, how to produce a strategy document that details the initiative, and which partners to engage to help establish a cloud infrastructure.
Three considerations for IT leaders during the planning phase are:
A major part of planning is deciding which type of cloud solution is right for the business. Below are four common solutions, including the benefits and drawbacks of each:
To colocate, or not to colocate?
Colocation provides businesses with a secure environment to house their infrastructure off-premise. This allows the business to forgo the space and running costs of a data centre environment, including building maintenance, security, air conditioning, power and bandwidth, as well as the need for in-house expertise to run it. However, businesses should keep in mind that there is still an investment to be made in hardware and software. In colocating their infrastructure, there is also the risk of outages during the ‘lift and shift’ period.
Keeping things private…
Businesses can turn to a specialist provider to offer them private cloud. These providers typically offer best-of-breed equipment, which can be customised to the client’s needs. However, there are also constraints within this option; it is still finite in terms of scalability and businesses may be restricted by vendor lock-in. This can prevent the business from securing a competitive price, or tie it into using a vendor’s solution that isn’t quite right for it. The boundaries are also blurred in terms of who (provider or client) manages which part of the cloud package.
…or going public?
The benefits of selecting a hyperscale public cloud provider, such as Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform, include the high level of support offered with its services, its ability to innovate at a rapid pace, regularly introducing new services, and the cost savings it can offer your business. It can also provide ‘software as a service’ for smaller businesses that do not require the infrastructure.
However, with this rapid innovation come some disadvantages for the end user. It can be difficult to maintain the training staff require to keep up to date with changes in the solutions offered by the provider. Businesses may become reliant on a certain solution, and at any given time the provider may choose to discontinue that service. With a public cloud, it can also be difficult to tell where the responsibility for security lies, and it could work out costly for those looking for a bespoke programme.
The best of all worlds
Remember the trifecta for IT leaders? Flexibility, agility and scalability. To achieve this, I recommend adopting a more hybrid approach.
The Gartner IT glossary defines hybrid cloud as “policy-based and coordinated service provisioning, use and management across a mixture of internal and external cloud services”. Put simply, hybrid cloud is a modern infrastructure solution that uses a blend of on-premise, private cloud and public cloud services with orchestration between the platforms.
We find that hybrid cloud is growing in popularity in our customer base. As organisations progress their cloud journey, they often realise that a single cloud model may not realistically meet their needs, and that a blended approach to cloud is a more achievable goal. Organisations need to ensure that they have the infrastructure and technology that is ready to act and adapt when they are, and hybrid cloud can be easily scaled up or down to match the needs of the business. The flexibility offered by hybrid cloud will not only help organisations in how they can adapt to change but will in turn attract customers who expect organisations to offer ease of use/access when they choose to interact with them.
Furthermore, businesses not quite ready to ditch the traditional data centre can incorporate their legacy equipment into the hybrid solution and maximise their legacy infrastructure investment – at least until the time comes to go ‘cloud only’. The trifecta afforded by a hybrid cloud solution will undoubtedly support the future-proofing of an organisation’s technology and infrastructure.
So, will the corporate data centre survive another year?
Migration to the cloud doesn’t happen at the snap of a finger, but 2019 and the years immediately ahead are certainly going to be a period of serious planning for IT leaders looking to enable positive change within their business. Whether the corporate data centre will survive these changes in the long run is a question yet to be answered, but it seems unlikely. Thankfully, the aforementioned cloud solutions are modernising the way we store and manage data, and appear to be here to stay.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contained the second, and this February issue brings the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
“In 2019 biometrics will continue to contribute to a simplified customer experience when it comes to online authentication. The new version of 3D Secure, for example, will be completely adapted to mobile devices and enable the implementation of secure biometric identification technologies such as fingerprints, iris scanning, and facial recognition. As more consumers get comfortable with biometric authentication the limitations and drawbacks of passwords will become increasingly salient. While this development is welcomed, it also creates an imperative for the industry to create and rally behind technical standards and established best practices, which can also inform emerging government regulation around this technology.”
Enterprises will wake up to the full potential of security keys thanks to advances in authentication standards
“While the public at large might not be familiar with them, security keys are becoming increasingly common in corporate environments, and in 2019 we will see increased interest in the secure and convenient forms of authentication they can offer. This year Google opened many eyes by detailing how a multi-year case study on Google employees using FIDO Security Keys resulted in zero phishing attacks, while also greatly reducing support costs and increasing employee productivity. As such, we expect enterprises of all shapes and sizes to follow suit. Security keys also offer a decentralised approach to authentication, with data never leaving the device, which pairs very well with the new regulatory landscape around data protection, especially in regard to GDPR.”
Strong Customer Authentication (SCA) will become norm rather than novelty
More online platforms will be able to ensure SCA
“The ability of online platforms to leverage strong authentication will only accelerate in 2019. This year the FIDO2 strong authentication protocols have already made waves in replacing passwords with cryptographically secure logins using convenient alternatives like on-device biometrics and security keys. With leading browser support from Google Chrome, Microsoft Edge and Mozilla Firefox, as well as (forthcoming) platform support in Windows 10 and Android, we predict that the incorporation of stronger authentication capabilities will continue to proliferate to millions of new internet users next year as well.”
Banks will have to start prioritising SCA
“Given that banks within the EU have to fully comply with the new Strong Customer Authentication (SCA) requirements of the Payment Services Directive (PSD2) by September 2019, we will inevitably see an increased investment in robust authentication security, including technology such as biometrics. While European banks are behind Asia and the United States when it comes to deploying modern authentication, the presence of open and highly scalable standards for them to adopt will help them catch up in 2019 as they find new ways to efficiently authenticate users without compromising security.”
There will be a pick-up of SCA amongst Government agencies
“The public sector is often slow at adopting new technologies, but we are currently seeing a rapid pick-up of simpler, and stronger, authentication security as the ways in which employees and citizens access government services continue to digitalise. While often invisible to many consumers, and under-reported in the media, this will bolster the security of millions of day-to-day public service interactions across the globe in 2019.”
Rajen Sheth, Director of Product Management for Google Cloud AI and ML
“The Industrial Revolution magnified the human's physical ability and made it so that you can actually build more capable products and replicate them at a faster rate. I think AI is the same for people's mental abilities. If you're talking about 2039, every business will be transformed by AI the same way that every business over the last 25 years was transformed by the internet. It's this fundamental shift—similar to what we experienced with the internet, similar to what we experienced with PCs before that. We're just at the very beginning.”
“Start by thinking about how we do things today. From the moment we wake up in the morning all the way through to when we go to sleep, we make decisions, big and small. I'm always amazed at how many of these decisions are made using very little data and imperfect data. AI will supercharge every small or big decision we make. So if you think about an everyday decision—like what podcast to listen to, what should I buy, what route should my car take to work—we'll make decisions based on all the world's information, as opposed to just the information I happen to have in my head.”
Automation, Education, and AI's Effect on Society
“AI is going to change the types of jobs that are out there. It probably won't change [them] as drastically as people think, because in a lot of these situations, human intuition, empathy, and decision-making are still going to be needed. We need to be able to prepare the workforce for that next generation of jobs. I think there's an opportunity to rethink how we train people. The average person changes careers multiple times and learns things on the job that didn't exist [when they were] in college. This concept of continuous learning needs to be baked into how people work, so they can evolve as technology evolves. If you think about a classroom—a teacher standing in front of 30 students or a university professor standing in front of 300—we need to rethink education both in terms of personalizing it to the individual and making this kind of continuous education possible on a daily basis. AI is going to be a key to that. AI can make it so that education is built into everything that we do on the job, at home. It can scale in a way that the education system right now can't.
“One thing Google did recently is publish our AI Principles. It's almost like our constitution for how we believe AI should be used. [Often] a lot of uses of AI are in a grey area. A lot more work needs to happen, but we've turned [the guidelines] into an operating principle. We're learning as we build this technology, so it's important to set ethical AI principles at the outset; to think about these situations as they come up and adjust. As we've examined it, there are very few things that are just unquestionably good or unquestionably bad.”
User experience will continue to be the main driver for competitive differentiation
Alan Coad, UK&I MD, Google Cloud
“The ease with which we as consumers can access information, entertainment and shopping, make social connections and get what we want via personal devices continues to propel new expectations in the way we want to consume the products or services we love. No doubt we are reaching for the internet to discover even mundane items such as kitchen towels online; we expect instant gratification, with same-day delivery and same-day travel showing enormous growth, and we expect to be assisted at every touch point.
“These changes present huge opportunities, and the enterprises that will thrive in this rapidly changing landscape are those client-obsessed businesses that not only put an emphasis on a great user experience, but can also predict changing consumer habits and course-correct, fast. Agility, iterative software development practices and a culture of continuous innovation have become essential for keeping pace with rapid change - but getting ahead of it demands an ability to extract meaningful insights from large and distributed data sets, in real time. We are now entering an era of digital transformation where ‘going fast’ is giving way to an imperative to ‘go smart’ as a means of gaining that competitive edge and creating a truly intelligent enterprise.”
Established businesses can drive the next wave of disruption
“For the past decade we have come to expect that disruption will come in the form of a software startup from Silicon Valley. While it is true that incumbents are often disadvantaged by cultural inertia and legacy, many are well on their way to becoming truly agile, bringing software development back in-house, adopting the latest iterative techniques and utilising the tooling and modern cloud platforms that enable collaboration and rapid application development at scale. Now equipped with the ability to build software at ‘startup speed’, these enterprises are looking to operationalise their vast data stores to mine insights and drive their innovation pipeline with new sources of value. We predict that 2019 will be the year where enterprise digital transformation shifts from ‘going fast’ to ‘going smart’. With the power of Cloud ML/AI, we see the real possibility for many of our most admired brands to disrupt and redefine their industries. It is a great time to be a consumer.”
What does 2019 hold for Managed Service Providers?
Here’s what Edmund Cartwright, sales and marketing director at Highlight thinks…
With 2019 rapidly approaching, service providers are under more pressure than ever as customer expectations continue to rise. Service providers must go above and beyond to exceed these expectations and fulfil promises that have been made. Conventional contracts and SLAs alone are no longer sufficient and will not provide the assurance that customers require to operate their businesses smoothly and hassle-free.
Where many service providers are falling short is in their lack of service differentiation and corporate transparency. If a provider continues to operate in this way with their customers, they will be heading towards an undesirable outcome in 2019, struggling or failing to meet revenue and customer experience performance targets.
While large service providers have begun - and will continue - to acquire software companies whose technologies enable holistic integration of provisioning and support systems, underpinning and automating service and account management activities, it’s the smaller, newer players that will have to employ technology to ensure they can survive and thrive.
The adoption of cloud services and solutions is expected to continue its exponential rise as more businesses outsource IT needs in order to streamline operations and create cost efficiencies. Corporates will face the challenge of selecting the best technologies from thousands of SaaS, IaaS and PaaS vendors. 2019 will be an even tougher competitive landscape for service providers, given the saturation of the market. Small and medium service providers will have to “box clever” to win, retain and grow their business based on excellent technology and superior customer relationships.
Many providers will look to develop in-house applications to bring the differentiation they so desperately need. Others will proactively partner with technology providers that enable them to offer a superior customer experience while employing the fewest staff needed to achieve sustainable competitive advantage.
Whichever path these service providers decide upon, it will be critical to ensure that the decision makers of their customers’ businesses are at the forefront of their technology choices. Service providers will need to collaborate and communicate with their customers effectively in order to determine which services are essential to their customers and why.
In 2019, the winners will be the service providers who use their business intelligence (knowledge management) to deliver the right solutions at the right time, ensuring that business challenges are solved while maintaining necessary security measures to protect sensitive data.
Transparency at every level will be the priority, whether it’s network and application performance levels, financial reporting or account communication between service provider and customer. Service providers who are open and transparent with performance data will empower corporate decision makers to better analyse their needs and craft informed, data-driven strategies. This will improve both revenue and cost lines, driving improved company value.
We predict that 2019 will be yet another year of significant turbulence for MSPs small, medium and large. Prioritising transparency and collaboration, supported by the best technology, is what will enable an MSP to thrive in a tough market.
Low-code IT solutions set to empower staff
A new wave of low-code solutions can revolutionise corporate IT, allowing businesses to accelerate their move away from cumbersome legacy systems. Bob Dunn, associate vice president of EMEA and APAC with Hyland, highlights five ways low-code can be harnessed to great effect.
Low-code development strategies present a golden opportunity for entrepreneurial organisations to re-shape their entire IT functions around the needs of workers and customers. Whereas previously most departments had to request a new IT project from management, hope it was approved, and then wait for their system to be installed (which could take quite some time with custom development), now departments can get up and running much more quickly and make their own tweaks to the tools they already use.
With low-code technology platforms, front-line users are empowered to take charge and constantly evolve their software solutions. This is a significant step forward, as until now the disconnect between technology and its end users has been a major stumbling block to many projects.
Five key advantages of low-code solutions
Of course, there are pitfalls to allowing devolved responsibility for IT platforms. Suitable security controls must be put in place, alongside better visibility across the organisation. It is crucial that low-code solutions are implemented in an environment where information is shared between departments, and senior managers have a full overview of what is happening. Such a system - which can be built using an enterprise information platform with native content, process and case management capabilities - has many advantages beyond IT implementation, but is particularly important when encouraging a highly adaptive, entrepreneurial approach.
Matt Hooper, SVP Global Marketing, IMImobile
We will see a more intelligent approach to customer experience automation in 2019. By taking a Communications Platform as a Service (CPaaS) approach, customer experience teams can bring together various sources of customer data from different systems, allowing organisations to align and automate communications alongside customer journeys and processes. This will enable them to intelligently trigger and manage customer interactions in the future. Not only will this help improve customers’ digital experience, it will provide productivity gains and reduce the risk of companies investing in technologies and projects that don’t deliver.
Jason Chester, Director of Global Channel Programs, InfinityQS.
Around this time of the year, I am often asked about my predictions for the manufacturing sector over the next 12 months. While, personally, I do not like to make predictions over such a short and immediate timeframe, I do think there are some emerging ‘mega-trends’ that are often lost amongst the industry noise.
The biggest one that I predict will soon emerge as a major focus area across the manufacturing industry is process optimisation. The industry talks so much about topics like the factory of the future, smart factories, digital transformation, Industrial Internet of Things (IIoT) and Industry 4.0, yet we often address each in isolation from the others. I believe that all these concepts are beginning to coalesce around a single major strategic opportunity for the industry, defined as process optimisation.
Sounds simple right? Unfortunately, even relatively simple optimisation problems are often very complex (take the Travelling Salesman Problem in computer science as an example), and as a result, what people refer to as ‘optimisation’ today is often just a very broad approximation of this concept. To achieve and maintain true optimisation around critical dimensions of cost, value and risk across highly disaggregated supply chains and manufacturing processes is a very complex undertaking. It is one that has evaded our traditional manufacturing methods and management techniques - up until now.
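To make the combinatorial point concrete, here is a minimal, purely illustrative Python sketch (not from the article) of brute-forcing the Travelling Salesman Problem; the distance matrix and function name are invented for the example. With n cities there are (n-1)! distinct tours to check, which is why even 'simple' optimisation problems defeat naive methods at scale.

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exhaustively search every tour over n cities.

    dist is an n x n matrix of pairwise distances. The search space
    grows factorially: (n-1)! tours for n cities, which is why exact
    optimisation quickly becomes intractable as n grows.
    """
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    # Fix city 0 as the start so rotations of the same tour aren't recounted
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# A 4-city toy instance: only 3! = 6 tours to check
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
cost, tour = tsp_brute_force(dist)
```

At four cities the search is trivial; at twenty cities there are already roughly 1.2 × 10^17 tours, which is exactly the gap between "sounds simple" and "very complex" that the paragraph above describes.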
As we move into 2019, things are starting to look different. We are at the point where the maturing and convergence of a number of critical technologies are making genuine real-time industrial optimisation a tangible opportunity in the very near future. IIoT, combined with Big Data and the evolution of cloud software, platform and infrastructure ‘as-a-service’, will enable commercial organisations in all corners of the world - from startups to large multinational enterprises - to properly optimise their operational processes. This will be invaluable for manufacturers who need greater visibility across their production lines and supply chains, and will enable them to rapidly improve operational efficiencies. That is my big prediction for the new year and beyond…
The industry faces new demands every year and in 2018, data centre providers saw the call for more environmentally-responsible operations and increased data demands from the proliferation of AI, machine-learning and automation. So, what will 2019 bring?
By Jackson Lee, Vice President of Corporate Development at Colt DCS.
Data centres have traditionally been known to operate in silos. Owing to their large campus area requirements and heightened power needs, they tend to be located in the outskirts of cities, separate from network providers, cell towers and fibre network providers.
In 2019, this will change with an increase in the building of compute farms – where all these functions will converge to operate within a large campus. Sights such as cell towers at the base of data centres will become commonplace.
Operations will shift from being functionality-led to being customer-centric, to provide for every aspect of the data journey. Data centre providers are thinking more about the entire data delivery journey and will increasingly invest in other areas of the business – such as network functions – to ensure they can provide the necessary infrastructure so that the data journey can start and end within the same location.
Bye bye bitcoin?
If there is one buzzword that took 2018 by storm, it would be bitcoin. The immense popularity of cryptocurrency meant that many industries had to prepare for the incoming wave of data and power needs stemming from data mining. Warnings were abundant – the tail-end of 2017 saw data centre providers bracing for the impact that bitcoin popularity would have on the industry.
But then, nothing happened.
Owing to the fluctuating value and uncertainty surrounding the trend, it did not end up having much of an impact on data centres. As a form of currency, bitcoin definitely still has legs and it will continue to grow in 2019. However, mining as a process is not as power-hungry as anticipated and hence, data centre providers can collectively breathe a sigh of relief.
Where in the world will the data come from?
In 2019, markets such as India and South America will feature heavily in conversations around growing data consumption. The first signs have already appeared with the recent news of the acquisition of Brazilian data centre operator Ascent.
India and Brazil are big nations with even bigger populations, with cities such as Mumbai and Chennai at the heart of that growth. These emerging markets are also freshly riding the wave of mainstream smartphone adoption. As video streaming and social media usage pick up the pace, so does the demand for quicker, more reliable data processing. These markets will need hyperscale data centres to answer the sudden peak in demand for data storage and processing.
In contrast, markets at the forefront of tech innovation, such as the US and Europe, will turn towards the edge. As AI and IoT adoption take centre stage, the need for information processing to occur much closer to the user than to the centralised cloud becomes critical so as to minimise latency. These markets will see a growth in the number of edge and micro-edge data centres to allow for improved speed, reliability and efficiency in data processing.
2018 saw the implementation of the long-awaited GDPR regulation, the most important change in data privacy regulation in 20 years in the EU. The roll-out signals what’s to come in the coming months, with governments looking to introduce more stringent data protection and sovereignty laws.
In fact, India is already on the cusp of introducing a law that states that data generated in India is to be stored within the country.
Whether it be having to build operations closer to the user (e.g., with GDPR) or turning to other markets to avoid strict regulations, these laws will unavoidably have a significant impact on the data centre industry. It will also play a critical role in influencing the strategy for data centre providers looking to penetrate these markets.
With exciting tech trends, international developments and regulatory changes posing new challenges, 2019 will definitely be an exciting year for data centre providers around the world!
“Digital transformation” in the data centre has achieved buzzword status among media and marketers everywhere, but there are real benefits to be gained from embracing the migration from manual, analogue information systems to automated digital ones. Diverse companies like Boeing, Google, IBM and Uber have achieved notable successes by embracing digital processes. CEOs of other companies are catching on.
By: Olivier Alquier, Vice President of Enterprise for Europe, CommScope.
There will be more demand for digital transformation initiatives than ever before in 2019 because CEOs are finally realising that, for their companies to stay relevant, their businesses need to transform. As 5G networks and devices start to come online in 2019, they will drive demand for several kinds of supporting technology, and data centre managers simply need to be prepared. There are five technologies in particular that will continue to significantly impact data centres in 2019: 5G wireless, the Internet of Things (IoT), virtual/augmented reality (VR/AR), blockchain, and artificial intelligence (AI). In this article, we’ll look at how these technologies support the trend toward digital transformation in the data centre.
In a very real sense, we’re moving toward the Internet of Everything, and the final link between devices and the network is likely to be wireless. 5G wireless technology delivers significantly higher data rates and significantly more data in many more places. Mobile workforces, from sales teams to delivery, maintenance and repair organisations, will all use more sophisticated applications that rely on faster and more reliable data transfers. And even employees at the office will use 5G services to rely even more heavily on their phones and other portable devices. For example, sales organisations are giving their reps videoconferencing capabilities so prospects can view testimonials or product demonstrations, and repair personnel are accessing videos that show the steps in a maintenance procedure. Two bigger examples are driverless cars and entire “smart cities” that require huge bandwidth.
While widespread deployment of 5G won’t happen for at least a few years, data centres in 2019 have to prepare for it. Infrastructure must evolve to support higher wireless bandwidth and more ubiquitous data usage. Companies and building owners are looking beyond just Wi-Fi to enable strong and consistent in-building mobile wireless services with distributed antenna systems (DAS). In the outdoor environment, service providers are upgrading and expanding their fibre networks to carry wireless data back to the core of the network or, in many cases, to edge data centres for situations where local processing is required for low-latency applications like driverless cars or remote surgery. Technology like C-RAN (Cloud Radio Access Networks or Centralised Radio Access Networks) and edge computing will be implemented in 2019 to support 5G wireless services because there will be more data processed at the edges of the network.
One of the 5G use cases is specifically to support IoT applications. As companies deploy thousands or millions of wired and wireless sensors, the large amounts of raw data they produce must be turned into useful information that provides value for the user. The processing of this data is being placed closer to the source to reduce communication requirements. For example, beverage companies are providing dispensers with IoT sensors that can report on each machine’s usage and inventory and relay that information to an application that can schedule a re-order and delivery of new supplies. In this case the raw data triggers an action to replace the inventory, and that event is forwarded to the cloud, as opposed to routine status messages that often contain no new useful information. Autonomous truck fleets now leverage the IoT to report location, fuel levels, weight, temperature and other metrics to an application that determines estimated delivery times, fuel stops, or cargo tracking.
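The dispenser example amounts to simple filtering at the edge: handle routine readings locally and forward only actionable events to the cloud. A hypothetical Python sketch, in which the threshold, field names and callback are all invented for illustration:

```python
REORDER_THRESHOLD = 10  # units left before a re-order is triggered (assumed value)

def process_reading(reading, forward_to_cloud):
    """Edge-side filter for a dispenser's raw IoT readings.

    Routine status messages carry no new useful information, so they are
    dropped locally; only an actionable event (inventory falling below the
    re-order threshold) is forwarded to the cloud application.
    """
    if reading["inventory"] < REORDER_THRESHOLD:
        forward_to_cloud({
            "machine_id": reading["machine_id"],
            "event": "reorder",
            "inventory": reading["inventory"],
        })
        return True   # actionable event forwarded upstream
    return False      # routine reading, handled at the edge

sent = []  # stand-in for the uplink to the cloud application
process_reading({"machine_id": "d-17", "inventory": 42}, sent.append)  # routine
process_reading({"machine_id": "d-17", "inventory": 4}, sent.append)   # triggers
```

Only the second reading crosses the assumed threshold, so only one message travels upstream; the routine status never leaves the edge, which is the reduction in communication requirements the paragraph describes.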
For other emerging industrial IoT applications, where production processes are controlled in real-time, the network infrastructure needs to have very low predictable latency and very high reliability. In 2019, data centre operators will use augmented intelligence in autonomous systems to make use of expanded peer-to-peer communications at the edge, which is new to 5G networks. Fibre optic infrastructure will provide reliable low-cost transmission capacity for edge data centres while placing edge data centres close to the IoT sensors and actuators will reduce transmission latency and transmission costs.
Augmented reality (AR) is the use of a device such as a cell phone to display relevant data while the user is watching or doing something live. This year has seen an increase in AR as technicians hold phones while repairing a product to see a schematic of that product with instructions on how to make the repair. On the other hand, virtual reality (VR) is a complete immersion in a virtual world through the use of an audio-visual headset. Some remote training classes, for example, are using VR headsets to learn production procedures before ever setting foot in a manufacturing facility.
AR can be supported with today’s networks because the data is often downloaded to a handheld device. VR, however, will require real-time video over an Internet communications link. If the connection is unreliable, bandwidth too low, or latency too high, the experience is degraded and may become useless.
Whether VR, AR, or otherwise, enterprise communications are now commonly video-based, with video content originating from mobile devices. To ensure that this video content is of the highest possible quality, the new year will see higher-speed 5G networks designed to enable peer-to-peer traffic with a greatly enhanced capability to support data generated at the end-user device.
Blockchain is a decentralised database that is encoded, unchangeable, consensual, verifiable and permanent. When people hear “blockchain,” they think of Bitcoin, but there are myriad potential uses for the technology. It can be - and is being - used for anything that requires a permanent, secure, and verifiable record that can be accessed in a decentralised fashion.
In logistics, for example, blockchain is being used to establish trusted information such as where a product was made, when it was made, when it shipped, where it is located and arrived, and how it was used. It’s a shared public encryption system in which distributed users participate in the operation of the blockchain. Everyone holds the blockchain data with a reduced vulnerability to attack.
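The "unchangeable, verifiable" properties stem from each block's hash committing to both its own data and its predecessor's hash, so altering any earlier record invalidates everything after it. A deliberately simplified Python sketch of that structure (real blockchains add distribution and consensus on top):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash commits to its data and its predecessor."""
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(chain):
    """Recompute every hash; editing any earlier block breaks the chain."""
    prev = "0" * 64  # conventional all-zero hash for the first block
    for block in chain:
        payload = json.dumps({"data": block["data"], "prev": block["prev"]},
                             sort_keys=True)
        if block["prev"] != prev or \
           hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

# A toy provenance ledger like the logistics example above
chain = []
prev = "0" * 64
for record in ["made: factory A", "shipped: 2019-01-04", "arrived: depot 7"]:
    block = make_block(record, prev)
    chain.append(block)
    prev = block["hash"]
```

Because every participant can rerun `verify` over their own copy, no single holder can quietly rewrite where or when a product was made; that is the reduced vulnerability to attack the paragraph refers to.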
Anything that requires or can leverage a distributed ledger can benefit from blockchain. It can be applied whenever an exchange of information needs to be recorded and verified, so it has the potential to disrupt many organisational functions, from finance and procurement to manufacturing, IT and sales. Because of the benefits, this technology will be increasingly adopted in 2019, and this will impact the data centres housing the distributed ledgers.
Artificial intelligence (AI) has been slow to develop largely because many people start with the idea of a machine thinking like a human. What’s more, it has historically required massive computers to support it. This is now changing, however. The cost of computing has greatly decreased, AI algorithms have improved, and edge computing has enabled AI to be deployed in new ways. This change will continue and be more dramatic in 2019. Indeed, companies like Google and Facebook are already using AI to refine facility operations, increasing availability, reducing operating complexity and cost.
Developing the models on which AI runs is done on large central cloud resources. This model is then downloaded to the edge layer where the execution of AI provides benefits in local time-sensitive environments. The edge AI analyses and controls the local process, for example, and then feeds back information to the cloud, helping to improve the higher layer model.
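The division of labour described above - train centrally, execute at the edge, feed observations back - can be sketched as follows. The threshold "model" here is a trivial stand-in, invented purely to illustrate the loop:

```python
def train_central_model(history):
    """Central cloud: fit a toy 'model' on accumulated observations.

    The model is just the mean of past sensor readings, standing in for
    whatever large-scale training the central cloud actually performs.
    """
    return {"threshold": sum(history) / len(history)}

class EdgeNode:
    """Edge layer: runs the downloaded model locally, with no cloud round-trip."""

    def __init__(self, model):
        self.model = model
        self.feedback = []  # observations to upload later to refine the model

    def act(self, reading):
        # Time-sensitive decision made entirely at the edge
        decision = "alert" if reading > self.model["threshold"] else "ok"
        self.feedback.append(reading)
        return decision

cloud_model = train_central_model([10, 12, 11, 9])  # trained centrally
edge = EdgeNode(cloud_model)                        # model pushed to the edge
decisions = [edge.act(r) for r in [8, 15]]
# edge.feedback would periodically be sent back to improve the central model
```

The edge answers immediately from local state, while the accumulated feedback closes the loop back to the cloud, which is the layering between local and centralised resources described in the surrounding paragraphs.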
To prepare for a continued AI push in 2019, corporations will adopt high-speed, low-latency networks coupled with high-end performance edge compute. Layering AI between local and centralised resources will combine the power of central cloud with the agility and performance of edge-based AI.
It’s About the Infrastructure
All of the technologies discussed have been around for some time, but are in the process of maturing and showing benefits that will bring about more adoption in 2019 and will force a digital transformation in the data centre. To prepare for digital transformation, data centre managers will implement advanced network infrastructures to make their networks faster, more ubiquitous, more reliable and more secure. By migrating to higher-speed networks, expanding the reach of fibre and wireless links, and adopting edge computing strategies, companies can lay a firm foundation for digital transformation in 2019.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contained the second, and this February issue carries the final instalment, Part 3. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
A year of test and learn for 5G
Says Adrian Criddle, General Manager and Vice President at Intel.
“The rollout of 5G has been the subject of much hype throughout 2018, with the conversation mostly around which companies are winning the race to invest, develop, test and roll out in the UK. Following the release of the first 5G mobile devices in early 2019, we will start to see companies and government bodies experimenting in earnest with the new technology and, for the first time, making the potential realities more visible to consumers. With long-promised rollouts commencing next year, we can expect to see early teething problems and new applications as early adopters test and learn with the new technology over the course of 2019. It will be a year of preparation and experimentation in the run-up to widespread 5G rollout in 2020”.
A blueprint for smart cities
“With the first 5G handsets in consumers’ hands and an increasing number of autonomous driving trials on the horizon, governments and councils will join in by sharing their plans for the application of technology in towns and cities around the UK. London will always be an early adopter, but other major cities in the UK such as Manchester, Birmingham and Bristol have all made significant commitments to adopting smart city technology as early as 2020. Much of the investment and pledges we see in 2019 will relate to infrastructure, to prepare our cities’ networks for computing at the edge and the volumes of data generated by the systems we connect through 5G”.
Real time AI
“In 2019 we will see a rise in more customer-focused AI projects, with businesses investing more in AI that creates deeper understanding of and engagement with their customers. Retail, finance and entertainment are leading this charge, and are also demonstrating early success with AI applied to business efficiency in real time. In the past, AI was primarily used to crunch historic data to find patterns and insights, but in 2019 we’ll see AI-generated insights being applied in real time to make environments more efficient, or to create a better experience for customers. In the second half of 2019, new chips will enter the market that cram greater power and efficiency into less space, which will speed up AI and make it more powerful for businesses and consumers”.
The emergence of the data centric workplace
“The total addressable market for data centre technology is predicted to grow from $160 billion in 2021 to $200 billion in 2022, pointing to an overall trend of data centricity across businesses. In 2019, the first 10nm-powered PCs will enter the market, setting the bar for performance and letting manufacturers and businesses experiment with that power in ways they could not before. To compete and deliver against business objectives, we’ll see organisations rely on data to drive efficiency and identify new opportunities. Ambitious new projects and approaches to business will have data capture elements that will track the success of ventures, but also establish a means of using data in real time as a core business function. Speaking more broadly, the computing landscape has evolved dramatically. With the emergence of new 10nm technology, we’ll start to see faster, bigger and more powerful computers that will open doors to new possibilities in data-centric working”.
2019 predictions from Interxion: 5G’s impact not fully felt, the rise of multi-cloud and blockchain, eSIMs and the edge take centre stage
Andrew Fray, Managing Director at European data centre experts Interxion.
5G’s potential won’t be fully realised
The 5G tsunami is well on its way and it will hit our shores in 2019, with CCS Insight predicting that we could see 1 billion 5G users by 2023. Its rollout next year has the potential to completely transform every industry, from manufacturing and marketing to communications and entertainment. Fast data speeds, higher network bandwidth and lower latency mean smart cities, connected transport, smart healthcare and manufacturing are all moving closer to reality.
Despite the first deployments of 5G and the launch of the first 5G-compatible devices next year, we don’t expect the impact of widespread 5G implementation to be fully felt in 2019. Instead, for many businesses, 2019 will be a year of continued investment and focus on rearchitecting existing networks and infrastructure, ready to host 5G networks.
“Multi-cloud” will be the new buzzword
Conversations this year have been full of the continued development and challenges of cloud adoption, and next year will be no different. A major learning from this year for many companies is that putting all of their eggs, in this case workloads, in one basket – whether it’s private cloud, public cloud or data centre – isn’t the best strategy. Instead, businesses are increasingly turning to multi-cloud adoption, consuming and mixing multiple cloud environments from several different cloud providers at once. In fact, multi-cloud adoption and deployments have doubled in the last year, and this will remain a major topic for 2019. Major cloud providers are also showing increased interest in multi-cloud, with public cloud providers such as Amazon and Alibaba offering private cloud options, as well as a number of acquisitions and partnerships that will allow the marrying of cloud environments.
In 2019, multi-cloud will become the new norm, allowing businesses to realise the full potential of the cloud, giving them increased flexibility and control over workloads and data, whilst avoiding vendor lock-in.
eSIM will take centre stage
It’s been suggested that, despite the build-up around 5G, it’s actually eSIMs that will be a game changer in the technology and telecoms sectors. Up until fairly recently, uptake of eSIM has been slow, as operators have been concerned with how the technology will impact their businesses. However, the eSIM market is estimated to grow to $978 million by 2023, with demand being driven by adoption of internet-enabled technology which requires built-in cellular connectivity. This year, we’ve already seen a shift in mindset in relation to the technology. New guidelines introduced by the GSMA have also contributed to increased awareness of the capabilities.
In 2019, we’ll see a large number of operators, service providers and vendors trial and launch new eSIM-based solutions. The impact of this growing emergence will be broad and will pave the way for significant developments in consumer experiences, in everything from entertainment and ecommerce to automotive.
Living on the edge in 2019
Edge computing has been on the horizon for a number of years now. However, it’s yet to be fully understood. In 2019, 5G deployments and the increasing proliferation of the IoT will be key drivers behind ‘the edge’ gaining significant awareness and traction. Business Insider Intelligence estimates that 5.6 billion enterprise-owned devices will utilise edge computing for data collection and processing by 2020.
As we move into next year, the edge will continue to be at the epicentre of innovation within enterprises, with the technology exerting its influence on a number of industries. Businesses will look to data centre providers to lead the charge when it comes to developing intelligent edge-focused systems. In terms of technological developments, a simplified, smarter version of the edge will emerge. The integration of artificial intelligence and machine learning will provide greater computing and storage capabilities.
Adoption of blockchain will start to accelerate, especially in financial services
Up until fairly recently, blockchain has remained a confusing topic for many businesses, especially those operating in highly-regulated industries such as the financial services sector. As a result, many financial institutions have been slow to embrace the technology. However, next year, more use cases for blockchain will be uncovered and will make an impact. According to PwC, more than three-quarters (77%) of financial sector incumbents will adopt blockchain as part of their systems or processes by 2020.
In particular, we’ll see an increasing number of fintech partnerships built over the course of the coming year as more financial companies look to harness the technology’s potential.
Cloud gaming will require a new approach to networking
According to the head of Microsoft’s gaming division, the future of gaming is cloud gaming. This new form of gaming promises players new choices in when and where to play, frictionless experiences and direct playability, as well as new opportunities for game creators to reach new audiences. However, because these services are based on a subscription model, the highest quality of service is critical for subscriber retention. Delivering this superior user experience, often across multi-user networks and devices, depends on low latency. For gaming companies looking to keep up with demand for connectivity and bandwidth, it’s all about having the right infrastructure to deliver game content without delay or disruption.
Next year, we’ll see more gaming companies take a new approach to networking and harness the power of data centres to optimise performance, maintain low latency and provide the resiliency and scalability to cope with the volatile demands of today’s gamers.
Looking Forward: IT Security Trends for 2019
By Andy Samsonoff, CEO of Invinsec
Protecting an organisation from cyber crime is a relentless task, as both security solutions and means to attack continue to evolve. The repercussions of a security ‘incident’ can be costly, in terms of financial loss, data recovery and damage to reputation.
As we near the end of 2018, many of us will be looking ahead to 2019, identifying what developments may impact our personal and business security, and how we can best prepare for them. We have therefore drawn upon our extensive knowledge of IT security, to bring you our predictions for next year’s cyber security landscape.
Nature of attacks
Threats to security come in two distinct guises: deliberate and accidental. It may be tempting to only consider threats coming from ruthless cyber criminals with sophisticated software, but the reality is that much damage can be done via carelessness. A sloppy approach to securing hardware, or having predictable, shared passwords, makes organisations extremely vulnerable and easy to compromise.
That said, even the most robust security infrastructure can be subject to attack. The means and motives for committing cyber crime are evolving, as criminals find new ways to bypass or break into systems. Below are five key areas to consider:
#2: Phishing attacks
The practice of phishing, often carried out via email, will continue to be a problem. It is used to extract personal or company data, usernames or passwords to gain improper access, often through the insertion of malware via malicious links or documents.
#3: APTs (Advanced Persistent Threats)
These are complex, multi-stage attacks: they make an initial infection, download further malicious code, move through and around your network, collect data (often via social engineering) and extract it, with the ability to leave your network(s) silently.
#4: Cloud Application and Data Centre Attacks
Faster and more reliable internet connections have allowed cloud applications and cloud data centres to grow and expand. Every new application that moves to the cloud requires you to trust another vendor, their software and their security to protect your information.
The inherent risk is that users can access applications, as well as your data, from almost anywhere, as long as they hold a user’s credentials. That risk grows when users connect over free or public wi-fi.
#5: Shadow IT Applications
We are going to see an increase in the use of shadow IT applications, and over the next few years these applications are going to cause serious damage. Industry professionals sometimes refer to them as renegade applications: employees download non-corporate-approved (and potentially insecure) applications to the same devices used to access company data. Companies should consider whitelisting applications and restricting the ability to download new software.
And one for 2020:
Predicted security trends for 2019/20 show that AI is poised to help forecast, classify and potentially block or mitigate cyber threats and attacks.
One idea fundamental to AI is machine learning, which over the past few years has been incorporated into many security applications. Machines will battle machines in an automatic, continuous learning-and-response cycle, and this will continue to enhance security postures.
Don’t be scared, protect yourself
As the opportunities to commit cyber attack evolve, so must your security provision. Expert advice, round-the-clock monitoring and competent recovery software will offer the best chance of achieving ‘business as usual’ in the event of a cyber attack.
AR moves from fun to function
By Lenovo’s UK General Manager, Preben Fjeld.
In the year ahead, augmented reality (AR) technology will shift from consumer entertainment to a growing appetite within business. This will be enhanced by the advent of 5G, bringing to life the rich and meaningful capabilities of AR in driving efficiencies in training, maintenance and knowledge transfer through immersive environments.
For instance, using AR glasses as part of a larger technology system can give manufacturing and field workers real-time data to help reduce errors and improve accuracy, safety and quality. With AR remote assistance, a worker on an offshore oil rig could be assisted by an office-based worker who can see what’s happening live through the glasses. With object recognition, AR glasses worn by an airplane mechanic working on a tarmac could connect to a remote server to automatically identify the parts being worked on and pull up schematics and other critical materials. With new AR workflows and tools, a factory line worker could, on their first day, follow step-by-step guidance through their AR glasses on how to accomplish the task at hand with minimal training.
Moreover, there will be a rise in demand for more hardware- and software-agnostic solutions to ease users’ pain point of incompatibility between AR headsets and glasses on the one hand and AR content and platforms on the other. Content creators and hardware makers may converge to develop more holistic and seamless hardware and software solutions for businesses. In the near future, enterprises will deploy AR and VR together, creating even more benefits beyond this.
The future of security
With humans often marked as the weakest link in security, 2018’s challenges were exacerbated by the growth of mobility, BYOD, remote working and the gig economy. Looking at the next twelve months, AI is being touted as the pathway to protection; but as with many powerful tools, it can be used for both good and evil, and AI platforms are increasingly favoured by cyber criminals too. We expect to see much more focus on machine learning to address security vulnerabilities, as well as more of a focus on end-to-end security solutions versus a patchwork collection of discrete tools.
There are four spaces where companies and end users need to focus to protect themselves – data, identity, online and devices. The trend from two-factor to multi-factor authentication on personal devices, for example, will continue to grow as standards from security industry bodies like the FIDO Alliance are integrated with Windows Hello to enable safer authentication. The rise of interconnected smart devices in the home and office will also introduce security vulnerabilities that will need to be addressed. A crucial aspect will be to learn from users through heuristics and new learning models, addressing not just changes in technology but also changes in human behaviour. Companies will need to understand their multi-generational workforce, to better manage and protect devices, as well as develop strong security protocols and practices.
Offering a number of lifecycle and other benefits, DaaS (Device-as-a-Service) is a smart way to address security issues, particularly as they become increasingly complex and frequent due to the expanding mobile workforce. In response, companies will need to seek agile, customisable solutions and greater control of the device ecosystem as well as the security implemented with it. This is a growing trend; almost 30% of CIOs who responded to a Gartner study in 2018 are considering DaaS as part of their device strategy in the next five years, and a recent IDC study shows total market value tripling between now and 2020. In the interim, there are inherent challenges that will need to be solved. These include keeping up to date, customisation issues and concerns over safely managing the influx of BYOD.
Analytics at edge
Jack Norris, senior vice president, data and applications of MapR, talks about four major developments he sees in data analytics, DevOps, AI and GDPR Compliance in 2019.
Analytics at edge - 2019 will see the pendulum shift to a focus on performing analytics at the edge
Organisations will save time and money by processing and analysing data at the edge, rather than moving it back to a core, storing it and applying traditional analytics. Use cases include anomaly detection (fraud), pattern recognition (predicting failures/maintenance) and persistent streams. Autonomous vehicles, oil and gas platforms and medical devices are all early examples of this trend, which we will see expand in 2019. Cost drivers for the trend are bandwidth (semi-connected environments as well as expensive cellular connections) and storage (reducing the amount of data sent to the cloud).
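To make the anomaly-detection use case concrete, here is a minimal sketch of a detector simple enough to run on an edge device, flagging a reading only when it deviates sharply from recent history. The window size, threshold and sensor values are illustrative assumptions, not taken from any vendor’s product:

```python
from collections import deque

class EdgeAnomalyDetector:
    """Flags readings that deviate sharply from a rolling-window mean.

    Running this next to the sensor means only anomalies (a tiny
    fraction of readings) need to be sent back to the core, saving
    both bandwidth and storage.
    """

    def __init__(self, window_size=50, threshold=3.0):
        self.window = deque(maxlen=window_size)
        self.threshold = threshold  # in standard deviations

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        anomalous = False
        if len(self.window) >= 10:  # wait for a minimal baseline
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = var ** 0.5
            if std > 0 and abs(value - mean) > self.threshold * std:
                anomalous = True
        self.window.append(value)
        return anomalous

detector = EdgeAnomalyDetector()
readings = [10.0, 10.2, 9.9, 10.1] * 5 + [42.0]   # final reading is a spike
flags = [detector.observe(r) for r in readings]    # only the spike is flagged
```

Only the flag (and perhaps the offending reading) needs to cross the network; the steady-state stream never leaves the edge.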
2019 is the year containers and AI meet in the mainstream
NVIDIA announced the open source RAPIDS suite at the end of this year, a harbinger of how the focus on operationalising AI, better sharing across data scientists and distributing processing across locations will drive containerisation. Another rising technology behind this prediction is Kubeflow, which will complement containers and distributed analytics.
GDPR Compliance focus moves to operational focus
The initial phase of complying with GDPR saw organisations look at how they controlled data placement and privacy. Now, organisations will look to monetise that GDPR work in some way. The opportunity in 2019 is to aggregate the models, semantics and reporting built up around GDPR data and efforts, and develop them as a revenue source.
Serverless uptake is driven by an industry move to consumption-based pricing
Serverless and consumption-based pricing will save DevOps money in 2019 by optimising costs around idle functions, storage, and developer tools usage.
5G’s new consumption models will have IT leaders re-evaluating their network design
Ray Watson, VP of Global Technology at Masergy, comments:
4G was priced on a consumption-based model with per-gig pricing, capped with overage fees. But 5G will likely come with new monthly fixed pricing models based on bandwidth. These changes will open the door to new and novel use cases for fixed wireless that were not considered previously. It will trigger CIOs to re-evaluate their network design including active and backup connectivity.
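The difference between the two pricing models can be sketched in a few lines; all rates, caps and usage figures below are invented for illustration, not real carrier tariffs:

```python
# Sketch of the two pricing models described above. Rates, caps and
# usage figures are illustrative assumptions, not real tariffs.

def consumption_price(gb_used, rate_per_gb, cap_gb, overage_per_gb):
    """4G-style: per-gig pricing, capped, with overage fees beyond the cap."""
    if gb_used <= cap_gb:
        return gb_used * rate_per_gb
    return cap_gb * rate_per_gb + (gb_used - cap_gb) * overage_per_gb

def fixed_price(monthly_fee):
    """5G-style: flat monthly fee for a bandwidth tier, regardless of usage."""
    return monthly_fee

# A data-heavy fixed-wireless use case that was punishingly expensive
# under per-gig pricing becomes a predictable flat cost under 5G:
heavy_month = consumption_price(gb_used=500, rate_per_gb=2.0,
                                cap_gb=100, overage_per_gb=5.0)
flat_month = fixed_price(monthly_fee=300.0)
```

It is exactly this shift from variable to predictable cost that makes fixed wireless worth re-evaluating as primary or backup connectivity.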
Wireless aggregation hubs will become the next location for edge computing
As IoT explodes, it will increase the demand for more localised data processing. Enterprises will need to process that data at the edge of their network, rather than at the data centre, where it has the potential to overwhelm backend infrastructure. The most convenient place will become the wireless aggregation hub or tower, where micro data centres or even mobile edge computing “cloudlets” will best accommodate IoT’s processing demands. These solutions will be built under or near a cell tower location in order to leverage the tower’s multi-gigabit fibre connection.
Machine learning will begin to shift to artificial intelligence
2019 will be the year that machine learning begins the shift to artificial intelligence through the use of complex simulations of biological neurons instead of simple mathematical ones. Machine learning models currently use simplified mathematical models of neurons, but with specialised hardware, better neuron simulations will lead to the next generation of machine learning: the simulation of biological brains. We can see this in specialised hardware such as the SpiNNaker project, BrainChip’s Akida and Blue Brain’s neuromorphic cortical columns. While true AI is not a reality yet, we’re starting to see early evidence of the shift.
What Can Scientists and Engineers Expect from Artificial Intelligence in 2019?
asks Jos Martin, Senior Engineering Manager at MathWorks.
As technologies such as AI, deep learning, IoT and data analytics converge, their potential has grown exponentially, to the point where technology that was once deemed a pipe dream has become reality. The legwork behind this progression has mostly fallen to scientists and engineers, who are responsible for the research, development and launch of new applications. But as they carry out this work – often for the first time – there are the usual teething issues, for example determining new functionalities and design workflows, and developing new skills on the job.
Of course, for the real potential of AI to be felt in every sector, improved accessibility is crucial, as are appropriate use cases, so that scientists and engineers can reap the benefits in the sectors in which they work. AI is not solely the realm of the data scientist. Now, scientists and engineers play a considerable role in spearheading the trialling, testing and acceptance of deep learning in industrial applications. Change will be driven by solution providers, who will require tools that can cope with the expanding complexity of datasets. This will go hand in hand with embedded applications and growing development teams, generating greater collaboration, higher-productivity workflows, interoperability and a shift away from relying on the IT team for all technology requirements.
1. Interoperability is key
Interoperability is a keystone for establishing a comprehensive AI solution. Still in its early stages of development, AI has few best-practice use cases, and no defined standards or guidelines yet exist. At the moment, each deep learning framework serves a narrow spectrum of applications and production platforms, but to work effectively they must combine components from separate workflows. Unfortunately, this tends to cause friction and hurt productivity. Yet there are organisations coming to the rescue. Take ONNX, the open model-exchange format looking to remedy interoperability issues by empowering developers to use the most appropriate tool for the task in hand, collaborate easily on models and deploy solutions to a greater selection of production platforms.
2. AI is for everyone, actually
It’s no longer only data scientists driving uptake and experimentation in deep learning, but engineers and scientists too. Technical curiosity, automation tools and business imperatives all drive AI uptake, and will encourage more scientists and engineers to embrace AI. With the latest workflow tools, the technology will become far more user-friendly, making AI more straightforward to use. Consequently, deep learning tools for the time-series data many engineers work with – audio, signal and IoT data – as well as for image and computer vision tasks, will be applicable to far more use cases. On a practical level the impact will be wide-ranging, from early disease detection during cancer screenings – via improved pathology diagnosis – to unmanned aerial vehicles (UAVs) using AI for object detection in satellite imagery.
3. Domain specialisation desired
The use of AI is growing particularly fast in industrial applications, yet these applications bring new requirements for specialisation. As AI-driven technologies such as smart cities, predictive maintenance and Industry 4.0 become more practical and less of a dream, a new set of principles needs to be realised. It’s vital to turn our attention to mass-produced, low-power and moving machines. These demand new form factors, safety-critical tools that necessitate a great deal of reliability and verifiability, and forward-thinking mechatronics design approaches that combine electrical and mechanical components. Another roadblock is that service and decentralised development teams, rather than the company’s IT department, tend to be the ones accountable for developing the specialised applications in question.
4. Giving AI the edge
Edge computing will provide a more suitable processing method for running AI applications where local processing has failed to deliver. Advances in sensors and low-power computing architectures will enable edge computing, which demands real-time, high-performance and increasingly complex AI solutions. Furthermore, edge computing plays a vital role in safety for autonomous vehicles, which need to understand the environment in which they operate and evaluate driving options in real time. There is also huge potential to drive down costs in particularly remote locations, such as deep-sea oil platforms, which usually have expensive or restricted internet access.
5. Complex systems spur collaboration
As the use of machine learning and deep learning in complex systems grows, additional collaboration and participants will be required.
As the complexity and scope of deep learning projects increase – a consequence of data gathering, synthesis and labelling – larger, decentralised teams must be created to cope. To deploy inference models to data centres, cloud platforms and embedded architectures such as FPGAs, ASICs and microcontrollers, embedded and systems engineers will require additional flexibility, and their knowledge and experience must encompass component reuse, power management and optimisation. With the volume of training data growing exponentially, the latest tools will be vital for the engineers responsible for developing deep learning models, and lifecycle management of the inference models must be handed over to systems engineers.
Predicting the future isn’t the easiest of tasks, especially when it comes to the tech scene. Every year, experts with incredible industry insights and experience predict what they anticipate in the coming months, and every year a good number of their predictions don’t happen as envisaged – if they even happen at all.
By Stefano Sordi, CMO Aruba S.p.A.
When it comes to data centres in particular, the growing data deluge and rising sensitivity around how data is used and stored present many challenges for anyone trying to predict what the future holds. The rise, and complexity, of various emerging technologies only adds to this complicated mix. This makes it even more difficult to predict what will happen in 2020, let alone 2025.
With this in mind, perhaps a safer approach to exploring what the future holds would be to take a more long-term and holistic view. Rather than focussing on the next step in the evolution, why don’t we discuss the overall direction of things? Instead of arguing over what may or may not happen in the short term, why not have a discussion about what our interactions with customers and prospects are telling us about the bigger picture and what it is likely to look like in the future?
More outsourcing to cope with data deluge
If the amount of data generated and captured grows as foreseen by most experts and analysts, organisations will certainly need to look at expanding their existing data centres. But, for any organisation that provides services not directly linked to the building of data centres, building and maintaining a data centre isn’t the most straightforward scenario. Indeed, it’s one that could present an unnecessary drain on time and other resources.
Currently, most companies choose to move to external data centres and outsource when they are starting new projects. When starting from scratch, they choose an external data centre because it’s faster and more efficient, and it spares them having to invest in their own on-site facility. As companies cannot predict their workloads when they move to a new IT infrastructure, they choose the cloud so they can increase capacity as workloads grow, without committing further investment. If you choose a pay-as-you-use option, outsourcing also makes scaling up easier; if not, you risk investing heavily and being left with costly unused equipment.
It is also important to mention that another major factor pushing many organisations to outsource is safety, even more than price. Most organisations don’t want the risks that come with on-site data storage. These risks carry high financial costs and pose a significant threat to corporate reputation, so even though outsourcing is sometimes (but not always) more expensive, it’s still less risky.
Focus on flexibility over strict preferences
As cloud-based services become a staple of business processes, the focus on which solutions to adopt will shift from strict preferences to flexibility: an agnostic approach rather than a fixed choice of platform (private or public). Fewer people are focussing on the exact type of cloud, enabling organisations to transition more easily to new, complex, modern infrastructure.
In the last few years, we’ve seen a growing number of requests for ‘hybrid cloud solutions’. While customers still choose colocation for their core workloads, they tend to place their new projects in the cloud. This is backed up by research from the Politecnico di Milano’s Cloud Observatory, which found that hybrid and public cloud use grew by 18% and 28% respectively compared to 2017.
Focus on green energy
As issues surrounding global warming and resource preservation become more widely discussed, the conversation around data centre energy consumption will become more and more prominent. With the ongoing digitisation of business processes, as well as the creation of ever more data despite increasing data regulations, it will come as no surprise that data centres consume enormous amounts of energy. By 2025, it is estimated that data centres will consume one-fifth of all the electricity in the world.
But going green isn’t exactly easy. Green energy is more expensive to produce than standard energy, and however good green energy makes them look, organisations tend not to want to pay more. So a sustainable alternative must not only be green, but also consume less energy (so that organisations are not out of pocket).
The cooling system of our data centre is designed to cool only parts of the facility, not the entire building. It cools around 30% of the building, whereas most cooling systems cool 100% of it. As a result, far less energy is needed to cool the data centre.
Cost reduction can also be achieved by choosing the right form of power generation. On-site power generation is emerging as a viable option for managing the way businesses generate and use power. It’s not only more environmentally friendly, but it also yields significant cost savings. In addition, it’s a major asset in terms of business continuity, one of tomorrow’s major priorities for businesses. Data centres simply cannot afford to go down: if the information system is unavailable, operations may be impaired or stopped completely. With on-site power generation, data centres can add an extra layer of security to their power supply.
There are so many other issues we could explore, but what should be clear by now is that there is still work to be done. With the right planning and processes in place, however, organisations can not only make sure they are prepared for what is to come, but also set themselves up for continued success beyond 2025.
Rittal has issued predictions for the data centre sector as the industry moves towards greater AI-based monitoring capabilities and the processing of data in real time with edge computing.
Trend 1: Data Centres will Acquire Greater AI-based Monitoring Capabilities
IT data centre specialists will require assistance systems featuring artificial intelligence (AI), or they will soon find it impossible to operate large and complex IT systems in a fail-safe way.
According to IDC, by 2022, half the components within large data centres will include integrated AI functions and therefore operate autonomously. Essentially, this means administrators will rely on predictive analytics and machine learning – designed to provide predictive fault forecasts and support optimised load balancing – to ensure maximum reliability of their data centres.
Trend 2: Processing the Flood of Data in Real Time with Edge Computing
We’re on the cusp of the roll-out of the 5G mobile communications network. The many transmission masts this requires means that the mobile communications infrastructure will have to be expanded through edge data centres.
It will also increase the amount of data that network operators and other companies have to process. CB Insights forecasts that every user will generate an average of 1.5 GB of data per day with an internet-enabled device by 2020.
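Only the 1.5 GB-per-user figure below comes from the forecast above; the subscriber count is an assumed round number, used purely to show why processing at source matters:

```python
# Back-of-the-envelope arithmetic for edge data volumes.
GB_PER_USER_PER_DAY = 1.5          # CB Insights forecast cited above
users_on_one_edge_site = 20_000    # illustrative assumption

daily_gb_at_edge = GB_PER_USER_PER_DAY * users_on_one_edge_site  # 30,000 GB/day

# If even 90% of that can be processed and discarded at source, only
# a tenth needs to traverse the backhaul to a central data centre:
backhaul_gb = daily_gb_at_edge * 0.1
```

Even modest filtering at the edge thus changes the economics of the network connection back to the core.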
Decentralising IT infrastructure through edge data centres means data can be processed at source, leading to low latency and enabling real-time applications for the control of industrial robots or autonomous vehicle systems. Edge data centres are connected to the cloud to deliver additional data analysis.
Essentially, businesses now need to examine how to expand their IT capacities flexibly over the next couple of years and how to evaluate edge concepts with this in mind.
The general trend towards standardisation is another key factor in achieving the fast deployment time and scalability that the market is demanding from data centres.
Trend 3: The cloud market will benefit from hyperscale data centres
Acceptance of the cloud continues to grow, and is particularly prevalent across mechanical and plant engineering.
At the same time, investments in hyperscale data centres are increasing globally, an indication of the further spread of the cloud as an operating model.
Researchers at Synergy Research Group expect that there will be more than 600 hyperscale data centres worldwide by 2020 – currently, the number stands at around 450.
It’s why Rittal recommends that IT managers now consider how to balance their on-site edge (or core) data centre and cloud resources, to optimally support application hosting and high availability in line with their corporate strategy.
Trend 4: Optimised technologies will increase energy efficiency
Alongside high availability, energy efficiency is seen as the second most important management issue when it comes to operating a data centre.
The energy efficiency of new data centres has improved by roughly 60 percent over the last decade according to the Borderstep Institute. At the same time, however, energy requirements have continued to rise as IT capacities have grown.
For data centre managers, optimising the energy usage of their entire data centre should be the number one priority in the coming year. Hybrid cooling units that integrate free cooling with refrigerant-based cooling are one example of new approaches to cost optimisation.
Trend 5: The Nordic countries’ locations will help to cut costs
The Nordic region has become an attractive location for cloud and co-location providers.
Countries such as Denmark, Finland, Iceland, Norway and Sweden offer renewable energy sources, a climate favourable to data centres, very good internet connections and a high level of political and economic stability. Analysts expect the turnover of data centres in the region to grow by eight percent per year until 2023. One famous example is Norway’s Lefdal Mine Datacenter (LMD) with whom Rittal is a strategic and technological partner.
LMD sources its electrical power entirely from renewables while the cooling system uses local sea water. As a result, the facility achieves a power usage effectiveness (PUE) of 1.15 and operating costs for customers are low.
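Power usage effectiveness is simply the ratio of total facility energy to the energy delivered to IT equipment, so a PUE of 1.15 means only 15% overhead for cooling, power distribution and everything else. The MWh figures below are invented to illustrate the ratio:

```python
def pue(total_facility_energy, it_equipment_energy):
    """Power Usage Effectiveness: 1.0 is the theoretical ideal,
    where every unit of energy drawn reaches the IT equipment."""
    return total_facility_energy / it_equipment_energy

# At LMD's reported PUE of 1.15, a facility drawing 1,150 MWh delivers
# 1,000 MWh to IT loads; the remaining 150 MWh covers cooling,
# distribution losses and other overhead.
print(pue(1150, 1000))   # 1.15
```

By comparison, a legacy facility with a PUE of 2.0 would burn as much energy on overhead as on computing itself, which is why sea-water cooling and renewable supply translate directly into lower customer costs.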
“Alongside the trend toward greater standardisation, we will see companies expanding their IT infrastructure in a more decentralised way in 2019. This will support the digital initiatives that now form an integral part of a successful corporate strategy.
“One way of doing this is through edge data centres, which can be put into operation very quickly and on the company’s premises as IT containers for instance, and which thus support the digital transformation in all branches of industry.
“Rittal is offering an array of solutions for edge infrastructures, ranging from rack solutions to turnkey IT containers with cloud connections,” says Andreas Keiger, Executive Vice President of the Rittal Global Business Unit IT.
100 TB of transaction history: one of the largest databases in Europe, with 1.5 billion messages per day; from 200 person-days to one click for tracing data; ten times more data generated for the same budget. How Euronext deals with its data.
Following its split from the New York Stock Exchange in 2014, Euronext became the first pan-European exchange in the eurozone, fusing together the stock markets of Amsterdam, Brussels, Dublin, Lisbon, and Paris. Euronext comprises close to 1,300 issuers, reporting a total market capitalization of 3,700 billion euros at the end of March 2018.
In 2016, Euronext began the typical process of migrating its data to the cloud. Except that this migration had nothing typical about it at all. First off, the Euronext database contained 100 TB of data – one of the biggest in Europe. Then there was the fact that this was not just a simple transfer of a database to a hosted platform. The idea was to create a governed data lake with self-service access for business units and clients in an effort to monetize new services and generate additional revenues.
Migrating to a governed cloud
“We use Optiq, an incredible trading platform, with systems that practically work in nanoseconds,” explains Abderrahmane Belarfaoui, Chief Data Officer (CDO) at Euronext. The huge Euronext database is the active memory of transactions handled directly by the stock exchange operator (1.5 billion messages per day). “The database is compressed (at a rate of 400%) but some information was not being archived for lack of space,” says Belarfaoui about the problems of the old system.
Before 2016, Euronext stored its data on site, on hardware from one of the big names in the industry. But Euronext’s storage needs continued to grow, especially following several acquisitions, such as the Dublin Stock Exchange and Fast Match in the US.
“Our IT infrastructure had reached the end of its lifecycle in our European operations, where regulators were expecting that Euronext store more and more data,” Belarfaoui recalls.
“Moreover, sometimes we had to wait six to twelve hours after market close on days with important events, such as the UK Brexit vote, before we could send the data to business units and clients.”
The situation prompted the CDO to look at moving to a hybrid cloud model. “We still keep trading platform information on an on-site server because the latencies we need are not yet achievable in the cloud,” explains Belarfaoui. “We also use AWS Managed Services in serverless mode together with Amazon S3 to have access to a data warehouse with unlimited storage capacity. For analysis, we use Amazon Redshift. And taking advantage of the cloud’s great scalability, we can run the whole system while anticipating events that cause high volumes on the markets.”
Still, the transition to a Platform as a Service (PaaS) does require one key condition: remaining independent of the cloud provider.
Euronext chose Talend Big Data to absorb real-time data into the data lake: internal data from its own trading platform, and external data such as feeds from Reuters and Bloomberg.
“The core of the data lake is managed by Talend. It was very important for us to keep this ‘independence’ from the layers below Talend. So, if tomorrow Euronext wants to change clouds, they can,” says the CDO, happy about the greater flexibility.
In an ultra-regulated world, Talend has also proven to be highly adept at meeting the challenges of data lake governance and regulatory compliance. Being able to safely open data involves knowing it inside out, keeping track of changes and the history of data feeds, and knowing how to classify them in a granular structure.
“We have an Amazon S3 storage that is shared by everyone. So I have to know who owns data from the start (the data owner), who has access to what, whom to ask, who can use it, and who has priority over whom. Our data stewards protect the organization of our data,” adds Belarfaoui.
This governance strategy is applied in very specific tools, such as the Talend Data Catalog. A dictionary is created together with each technical project for each individual market. These dictionaries are used to find the history of end-to-end data, from the sources to the reporting. “Now I can see when S3 is the data source, I can add value to the data, combine it with other data, and convert it into other data in Redshift,” says the CDO, who is very satisfied with the new process. “I can also add tags. Typically, we add the storage duration. For example, whether data has to be kept for ten years, or five years (per MIFID II), or if it should be archived.”
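The retention tagging Belarfaoui describes can be sketched in a few lines. This is an illustrative sketch only: the dataset names, retention values and bucket/key names are hypothetical, not Euronext's actual setup, and the live S3 call is shown in a comment.

```python
# Hypothetical sketch: build an S3 object tag set that encodes a retention
# policy per dataset (e.g. ten years, five years per MiFID II, or archive).
# Dataset names and durations below are illustrative, not Euronext's.

RETENTION_POLICIES = {
    "trade_reports": "10y",      # long-term regulatory retention
    "order_events": "5y",        # record-keeping per MiFID II
    "market_snapshots": "archive",
}

def retention_tag_set(dataset: str) -> list:
    """Build the TagSet structure used by S3 object tagging."""
    duration = RETENTION_POLICIES.get(dataset, "archive")
    return [
        {"Key": "dataset", "Value": dataset},
        {"Key": "retention", "Value": duration},
    ]

# Applying it against real objects would look roughly like this
# (requires boto3 and real bucket/key names):
#   import boto3
#   boto3.client("s3").put_object_tagging(
#       Bucket="example-data-lake",
#       Key="order_events/2019/01/part-0.parquet",
#       Tagging={"TagSet": retention_tag_set("order_events")},
#   )

print(retention_tag_set("order_events"))
```

Keeping the policy mapping in one place means a lineage or catalog tool can later query the tags to answer "what must be kept, and for how long" without inspecting the data itself.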
At the same time, data lineage with Talend drastically reduces impact analysis costs. “One simple example comes to mind: we plan to change the value of an index on the British stock market. Once we integrate it into our systems, it propagates itself pretty much everywhere. Currently we have to figure 200 person-days just to find the index in our different systems. But with the dictionary, we are able to run this data lineage with just one click.”
Monetizing stock market data
Two years after its launch, the governed lake project with Talend and AWS is a success. “The initial returns are more than positive,” says Belarfaoui. “On the technical side, we can manage ten times more data on the same budget.”
Beyond the improved architecture, the migration is also positioning Euronext to become a “data trader.” The stock market operator wanted to be able to refine and add to its wealth of data in order to monetize it. In fact, the sale of data already brings in 20% of Euronext’s revenues.
“Traders actually sell, buy, and make their investment decisions in milliseconds. They have a huge appetite for aggregated data in real time. Who sells which stock, to whom, at what price and when. We are in the best position to track performance of the CAC 40 or other indexes and sell that information to investors through our Datashop platform,” says Belarfaoui.
In addition to clients, this project also involves giving data scientists and business units self-service access to this data, which they can analyze in data sandboxes for tasks such as market monitoring. Belarfaoui explains: “We can set up an environment for a data scientist in less than one day, compared to the 40 days it used to take, and we have moved from D+1 analytics to real-time analytics. This is fundamental to understanding markets, clients, competitors, and how they interact.”
This is a real turning point for Euronext. “In 2016, we identified the need, but we didn’t have the capacity to do it. At the time, we could only relay the volumes of market activity to market regulators (MiFID II). Today, we can dig deeper. Under the General Data Protection Regulation (GDPR), I have to know where personal data is stored. If I receive requests for modification or deletion, I can find the data, thanks to the dictionary,” elaborates Belarfaoui. “Similarly, a user who searches a transaction can instantly see if it is confidential. Once data is identified as being critical, the data steward can deny user access.”
Euronext is just at the beginning of its digital transformation. A study is currently underway on the deployment of Talend’s Master Data Management (MDM) solution. “We are working on ‘golden sources’ within all of our systems (CRM, trading, billing, finance, various departments, subsidiaries, etc.). The goal is to make all Euronext data even cleaner and of higher quality, such as by being sure that a client is consistently represented across all systems. Such standards will make our data even more usable,” predicts Belarfaoui.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contained the second, and this February issue brings the final instalment, Part 4. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
2019 will get even more personal - smart technology and the future of UK retail
By Anshuman Singh, European Head of the Digital Business, Mindtree.
We live in the age of rapidly evolving customer demand. Never before have UK shoppers wanted so much for less, and at faster delivery speeds. Couple this with the apparent ‘demise’ of the UK high street and it would seem that retailers have never had such a pressing challenge on their hands: to survive, thrive and ensure continued customer loyalty, whether in store or online.
The use of e-commerce sites is on the rise, but despite the headlines, 90% of sales today still take place in physical stores. In 2019, the proliferation of technology that makes the customer experience more personalised is unlikely to slow down.
Data and algorithms will become more real-time
The online shopping experience is becoming increasingly preferable to consumers because of its many benefits, such as the ease of access and the proliferation of online-only discounts. The data created and left by consumers as they search for products and move between websites to seek the best deal is gold dust for retailers, who are forever talking about getting ‘stickier’ with the customer. This collected data can then be harnessed for a multichannel customer experience.
As retailers come to learn that the data they store is invaluable, they will also come to believe that rapid, real-time data proliferation and processing can enable them to make better decisions and drive more revenue. In 2019, real-time insights will come to the fore, and algorithms and automation will begin to play a bigger part too.
The physical store will be reimagined
Next year retailers could also look to drive innovation and investment around the format of their stores. Seasonal footfall changes dynamically and retailers could be looking to innovate planning cycles around that. More contextual and targeted sales, marketing and operations can revolve around this approach.
We could also see the growth of more sophisticated technology used in physical stores. For instance, M&S recently launched its “Mobile, Pay, Go” service, which allows customers to seamlessly shop and pay on mobile. Indeed, next year retailers will learn from online leaders and even start-ups, integrating online elements into physical stores.
Personalisation and logistics will be revamped
Amazon has had yet another fantastic year financially. Not only does the organisation hold huge market share, it’s even encouraging other businesses to innovate: recently the company announced a new initiative that encourages people to start their own small package delivery businesses.
The way in which Amazon has developed its logistics and delivery could be harnessed by the retail sector. It’s arguable that retail has so far taken too narrow a view on personalisation, and delivery could be optimised further. For instance, more active delivery areas could be created to meet the demand of the time-poor customers of today who are out at work instead of at home, available to collect parcels. In 2019, the focus will be on making deliveries ever more efficient and effective, with retailers taking their cue from Amazon.
The retailers who will thrive in 2019 will be those who use technology to make their stores more personalised and experiential for consumers, so traditional bricks and mortar stores provide a comparable service to online retail.
Joe Drumgoole, Director of Developer Advocacy, EMEA - MongoDB
“Technical recruitment based on keyword searching, skills matching and LinkedIn profiles has long been flawed, but the scale of the deficiencies in this process grows more apparent with each passing year.
“Brexit will exacerbate the problem as it puts pressure on traditional tech hiring markets overseas or, at the least, increases the complexity of accessing the talent pool. As large, well financed tech companies continue to throw money at their well paid tech employees, the rest of the world is going to need to do something different. The watchword for 2019 will be “make” not “buy”.
“In the 1950s and 60s, a computer science degree didn’t exist. Programmers in those days came from more diverse backgrounds and had to work with more intractable systems. They needed deep knowledge of the hardware and software and they programmed in assembly language (imagine building an IKEA table but first you have to grow a forest and harvest the trees). They had to learn as they went. Take Tony Hoare, the inventor of Quick Sort, as an example. He studied Classics in college and invented his famous algorithm in 1959.
“We’re coming full circle. This is not to say that everyone can be a programmer any more than everyone can fly a plane or play a piano well. Technology companies need to be more willing to make programmers from whole cloth. That requires a process to quickly discover who has the aptitude for programming. Conversion training is not a new thing but in 2019 we will start to see industrialisation of this process.”
“Most businesses have been operating at Terabyte scale (the level at which single node databases are most effective), but we are now operating in a world where modern data-centric businesses have reached petabyte scale.
“It means that there will be many challenges, but many more opportunities. The data is still fragmented and each of the fragments remains in a terabyte silo. However, the most advanced organisations have understood that centralising the core subset of data required to run their businesses can lead to deeper insight, greater efficiency and more effective innovation.
“We are about to enter a revolutionary period as the cloud enters the mainstream. One of our most innovative customers, HM Revenue and Customs, has been using MongoDB on-premise for many years and is now moving to a cloud-based solution. However, it’s not a revolution without losers. The losers in this next phase are those organisations who continue to cling to technologies and approaches that predate the internet.
“Next year is when we get serious about the integrated petabyte. A single store with total, real-time insight into a business organisation.”
The Late Majority Embrace The Cloud
“When you realise that the cost of moving a petabyte over an Ethernet cable may run to $50,000, the cloud starts to look less like an option and more like an endgame. We have already seen some very conservative organisations go all in on the cloud in 2018.
“There is a late majority of organisations who have long resisted the cloud. They are characterised by a number of key attributes. First, they tend to be long-lived - usually at least 50 years old - which means that they have a long tail of live technology from mainframes to mobile apps. They will tend to have large user populations, which increases their risk when exploring major infrastructure changes. They have often recently embraced mobile and tend to have many different, legacy relational databases supporting their services. This year they will finally migrate.
“The digital transformation process involved in moving to the cloud will require moving most applications in situ. There is a huge issue for these late majority organisations in that they have already struggled to connect their user populations to their overstressed relational databases. The death notice will be posted when they discover the traditional SQL vendors cannot accommodate the scale up/scale down dynamics of the cloud.”
“When the internet rolled out, the obsession was with perimeter defence. Build a secure network with a set of firewalls as gatekeepers. The most recent breaches at Quora and Marriott have taught us that privacy and security of software must be fundamental to the design, and happen as one of the first steps in development. That means usernames and passwords. We must start as we mean to finish and infrastructure vendors have as much responsibility as any to ensure that building secure, privacy enabled systems is as natural in the future as building insecure systems was in the past.”
Tech 2019 Predictions
By Steven Noels, CTO and co-founder of NGDATA.
Forget Mandarin and Spanish. It’s all about Python!
In 2019, we won’t see a significant increase in the number of data scientists. Instead, we’ll see people who don’t strictly work with data incorporate elements of data science into their existing job roles.
Recently, JPMorgan Chase announced that it is putting hundreds of new investment bankers and asset managers through mandatory coding lessons. Not only that, but current employees will also be upskilled to meet the requirements of an evolving technology landscape. Data scientists have generally been viewed as ‘those nerdy guys in the corner who work with math or data’. That’s no longer the case. Increasingly, we’ll see job descriptions requiring some degree of familiarity with, and proficiency in, data science techniques; indeed, in the future we expect a wide variety of jobs to require employees to write or at least understand code in order to find valuable data and actionable insights. So forget learning Mandarin or Spanish; learn a digital language instead.
AI is the new email
AI is definitely here to stay, but the hype train hasn’t reached its destination yet. It’s very clear that AI is having a profound impact on the world today. Whether it’s being used to improve healthcare, security, or pour better beers, it’s changing the way we do business. However, we’re still very much within the ‘hype cycle’. The c-suite know it’s relevant, but they have yet to make AI a strategic priority – instead, they are adopting it cautiously, on a case-by-case basis. For example, AI’s main uses right now are chatbots, cognitive servicing and complex calculations at extreme scale.
What’s so fascinating about this is that people are more likely to engage with the results of AI (and become familiar and comfortable with such tools) in their personal rather than their professional lives. In this way, it’s very similar to how email gained in popularity, first as a consumer tool and only later as the main communications platform for business.
In 3-5 years’ time, we’ll start to see AI taking over business models and being incorporated into organisations’ overarching business strategies, but this will only come once the general public have accepted the technology – for example, in customer service chatbots or consumer products like Google Home. Once this acceptance becomes widespread, we’ll see the real beginnings of a business AI boom.
Data regulation upheaval
There’s no doubt GDPR has had a substantial impact. Within the first few days of the legislation formally coming into effect, some of the biggest blue-chips were coming under fire for being non-compliant, most recently over the Marriott hotels data breach. Companies are struggling to get to grips with the new rules, which leave a lot of grey areas. It’s like going to school and sitting an exam with no correct answer and no guidance on how to prepare or mitigate sanctions. One of the most significant unintended consequences of GDPR is that companies are now extremely fearful about using personal data – even though this caution threatens to stymie data-led initiatives that are so important to developing new products, services and business strategies.
This fearfulness is a real shame, as it’s delaying progress that would ultimately benefit businesses and their customers. In 2019, we hope that organisations will become braver, and understand that the GDPR need not limit their ambitions to use consumer data to make better decisions. Being brave and showing explicit respect for customer data privacy will generate trust between brands and customers, and will positively impact customer loyalty and brand value.
The cloud goes enterprise size
In recent years we’ve come to associate the use of cloud applications with trendy start-ups working out of coworking spaces and coffee shops. Now the capabilities of cloud working are being truly leveraged by large global corporations.
But cloud is not the end goal, just the means to achieving an acceleration of business strategy and agility. In 2019, we will see enterprises reach for the cloud as they adapt to a digital-first world that demands real-time action and interaction with data.
The race is on for companies to dominate the cloud scene. Could we see a race to own the B2B data transaction, in the same way as Amazon has taken over the consumer ecommerce landscape? In the past few years the likes of Google have worked hard to be front of mind for businesses’ cloud needs and time will tell if they can convert their big consumer base into enterprise size accounts.
General IT (2019: another step in the move towards the edge)
In general IT, there has been much talk in the past years about the reversal of the pendulum: If in the past decade IT complexity has moved to the core of the network and the cloud, with users interacting through relatively thin clients (web browsers, mobile devices), in the next decade we will see more data being processed at the edge. Consumers will be interacting with IT using either more sophisticated devices (be it self-driving cars, drones, AR/VR headsets, AI, rich interactive interfaces) or more dispersed and numerous devices (IoT sensors and actuators, smart tags, etc.). In 2019 we will take a step further in the realisation that “edge” is no longer just an analyst vision, but an increasingly concrete reality for most IT managers.
This is a long-term and gradual trend, but a clear shift in awareness is hitting now. The move towards the edge is obviously a positive trend for Opengear’s OOB business.
Network Management, Infrastructure Management, Data Center (2019: the year of Virtualization and Automation in Enterprise Networking Management)
In Network Management, Infrastructure Management, Data Center the big themes will continue to be Virtualization and Automation, with a jump in Enterprise awareness in 2019.
In the past decade, most areas in IT went through a major transformation that enabled large service providers to operate at “web scale” and the enterprise to drastically improve their levels of efficiency. The technology involved included virtualization (which decoupled layers of infrastructure), orchestration (which allowed vendor-neutral approaches to management) and DevOps (a transformation of culture and operational processes). If in the 2000s a system administrator could take care of 30 servers, in 2019 that number is larger than 3,000, a two-order-of-magnitude jump.
A persistent incongruence has been: if every other field in IT has been transformed by virtualization, orchestration and DevOps, why is it that networking has remained “old school”? While other IT silos seem to be collapsing into a “converged infrastructure”, networking remains separate. Is it cultural? Are networking people more conservative? Is it caused by the tight grip one or two major vendors hold on the market?
If I am managing an application infrastructure, I can rely on always-on network connectivity and apply automation techniques to reduce the need for human intervention. If I am managing a server or storage infrastructure, I can rely on always-on network infrastructure and apply automation techniques to increase scale and efficiency.
Now, if I am managing the network infrastructure, I cannot rely on always-on networking connectivity. Networking is different in the sense that it manages the infrastructure that supports the automation efforts of all other IT silos. While virtualization and orchestration technologies are now being applied to networking, it won’t change the fact that, different from other IT silos, monitoring, managing and repairing the physical layer of networking will remain relevant in the long-term.
In IT infrastructure management, if I am managing applications, the network is always there to save me. If I am managing servers or storage, the network is always there to save me. If I am managing databases, the network is always there to save me. If I am managing the network infrastructure, who do I rely on? That’s where Opengear comes in. We are the last-resort infrastructure that supports network infrastructure management.
Software-defined Wide-Area Networking (SD-WAN) (2019: the year of SD-WAN – but everyone knows that)
There is no better demonstration of the intrinsic dependency of the physical layer of networking and geography than Wide-Area Networking (WAN).
In the past couple of years, we have witnessed a remarkable convergence of previously disjointed technologies to address a general problem. Traffic management and Quality-of-Service, data encryption and compression, VPN end-to-end security, network redundancy and failover, remote provisioning techniques were combined and packaged together with new cloud-based provisioning and configured to take advantage of network-function virtualization (NFV) to create a very compelling solution for WAN connectivity and management.
It has become moot to predict explosive growth in SD-WAN. 2019 is the year where every single enterprise managing remote branch connectivity will be looking at SD-WAN deployments. The cost and resiliency differences between old-school WAN networking using MPLS circuits and access routers and SD-WAN are clear and compelling.
While cloud-based provisioning has alleviated the cost and pain of deploying a WAN solution, reducing the need to send a team of engineers to each location to set up and configure the router and network uplink, it is important to remember that the laws of physics still apply: any tool or technology that relies on an infrastructure to manage the infrastructure gets into a deadlock when there is a disruption. Provisioning of remote devices relying on existing in-band network connectivity is still subject to deadlock (whether it is done from a NOC or the cloud).
With more sophisticated, fast-evolving, multi-vendor software stacks being deployed at the edge, the opportunities for something to go wrong multiply. While a traditional access router can be deployed and go untouched for months or even years, in SD-WAN, the software components are updated continuously from the cloud. Compared to traditional networking, this is convenient and secure, but is still dependent on the stability of WAN links and remote physical infrastructure.
Opengear is leading the way in extending OOB solutions not only to provide last-resort connectivity to remote sites and minimise the need of truck rolls as it has been doing for traditional WAN deployments, but also to extend the reach of other monitoring and management systems so that they can manage the edge devices even before the WAN connectivity is established or when there is a disruption. We see this as one of the key areas of growth for OOB and Opengear in the next years.
Specific technologies that will be hot in Network Infrastructure Management in 2019
If virtualization transformed the big blocks of IT in the past decade, containerised applications are transforming networking and networking management in 2019. Containers allow the flexible deployment of applications anywhere in the infrastructure, while avoiding the weight and complexity of traditional virtualization. When most of your nodes or devices are not full-blown data center servers, size and simplicity matters. Though there are many challengers, Docker is the most popular container format and is likely to remain so in 2019.
Traditional network management protocols (such as SNMP, IPMI and NETCONF), used for both monitoring and configuration of devices, have been the mainstay of network management for as long as we can remember. The approach has been plagued by vendors making sure their implementation of the standards was always “unique”, but it was better than nothing. Large-scale service providers have abandoned it in favour of less structured methods that remove the dependency on hardware vendors. Instead of getting structured monitoring data from SNMP gets, they apply big data and AI techniques to parse less structured log and event information. Instead of using an SNMP set to configure a particular parameter on a device, they use an orchestration tool like Ansible or Puppet to replace the entire configuration file.
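The "less structured" half of that approach can be sketched simply: rather than polling a device for a specific SNMP object, parse the event lines the device already emits into structured records that downstream analytics can consume. The line format, hostname and message below are illustrative, not any vendor's actual syntax.

```python
import re

# Minimal sketch: turn a syslog-style line into a structured record,
# avoiding any dependency on a vendor-specific SNMP MIB. The format
# here (timestamp, host, severity, message) is illustrative only.
LINE = re.compile(
    r"(?P<ts>\S+ \S+) (?P<host>\S+) (?P<severity>\w+): (?P<msg>.*)"
)

def parse_event(line):
    """Return a dict of named fields, or None if the line doesn't match."""
    m = LINE.match(line)
    return m.groupdict() if m else None

event = parse_event(
    "2019-02-01 09:15:02 edge-rtr-01 WARN: BGP neighbor 10.0.0.2 down"
)
print(event["host"], event["severity"])
```

In practice this parsing layer sits in front of a log pipeline or analytics store; the point is that the schema lives in the parser, not in the device vendor's management protocol.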
In the enterprise, there are good reasons to apply both methods and the debate will continue. On the more structured side of the market, OpenConfig and streaming telemetry are gaining traction as a user-driven set of standards. At the same time, the attempts to “converge” networking and apply the same techniques used in other IT silos will continue. This debate will be central to both technology providers and IT users in 2019.
As the leading provider of OOB Infrastructure to the market, Opengear supports both parts of that debate by offering capabilities in both camps. But in its own NetOps Automation Solutions, Opengear is favouring the vendor-neutral, protocol independent approach by leveraging open technologies such as Git, Docker, Ansible, ZTP to enable enterprises to operate their networks with scalability and efficiency without precedent in Network Management.
The Evolution of OOB
While always-on network connectivity is the saviour of managers of any other IT silo, Out-of-Band (OOB) is the infrastructure of last resort that saves Network Managers when the production infrastructure is disrupted.
An Out-of-Band Management System provides a secure alternate path so that a network engineer can reach the console port of any network element in the infrastructure even if the production network is disrupted. This has been the established definition of OOB for the past two decades.
As network management automates, virtualizes and goes “NetOps”, OOB also has to evolve and become less about connecting people to ports.
Artificial Intelligence: What’s in store for 2019?
Ben Lorica, Chief Data Scientist, O’Reilly Media.
IoT will become more engrained into the running of everything
says Martin Hodgson, Head of UK & Ireland at Paessler.
“Over the past year, we’ve seen more of our critical infrastructures focusing their efforts on implementing IoT projects.
“As we enter 2019 the number of connected devices will only increase as more organisations begin to realise the benefits of IoT technologies. Consequently, next year will see the birth of a smarter IoT – whereby fully connected businesses will begin to pull data for more predictive use.
“Imagine a world in which electricity providers can predict, and prevent potential outages, or healthcare institutions can predict, and stop, machines from failing. Industries that are proactive in connecting more of their devices will benefit from increased insights into their critical infrastructures’ performance. The benefits really are a no brainer. With the ability to implement predictive maintenance solutions, improve production on the factory floor and reduce downtime, in sometimes life-threatening situations – we can see why IoT will become further engrained over the coming year”.
Human awareness of IoT security risks will come under the spotlight to ensure we are keeping pace with technological change
“As businesses strive to embrace full connectivity, the concern around employees’ technology expertise in the field only increases. With companies progressively adopting IoT, business leaders are fast realising that those employees involved in the installation, maintenance and control of IoT systems are generally not all IT experts. Nor are training courses helping them to keep pace with the new capabilities of machines in order to properly assess risks.
“Consequently, business leaders are recognising potential gaps in their cybersecurity measures. After all, it only takes one compromised device to hack the entire chain. And now, with critical infrastructures such as healthcare, electricity, and water suppliers implementing IoT solutions, the need for comprehensive training is imperative.
“Next year we will see the vast deployment of human resource departments across multiple industries as organisations strive to ensure that employees are prepared for both hardware downtime as well as external cybersecurity threats. This means human operations and fast moving IT will no longer be siloed. As machines are increasingly embedded in the workforce, humans will need the correct IT training to spot and deal with potential cyberattacks and malfunctions”.
By Nicolas Fischbach, Global CTO at Forcepoint.
Hybrid IT management is complex. With the advent of flexible working, bring-your-own-device policies and the increasing usage of cloud-based services, traditional network perimeters are eroding. This brings substantial new security challenges: the attack surface is widening, with more devices offering poorly defended entry points into private networks and information stores.
Modern working practices often allow for ‘anytime, anywhere’ access to company data by employees, meaning data is scattered across public and private clouds, removable media and can even find itself mixed with personal information on mobile devices. What’s more, the reduced friction demanded by employees has driven more relaxed IT security policies, changing an organisation’s risk posture.
It all adds up to a dynamically and continuously shifting threat landscape, one that requires an equally transformative view when it comes to security. Cybersecurity products need to move past the ‘one size fits all’ approach of old, which gives undue prominence to legacy and basic threats and isn’t much use until after the damage has been done. Solutions assembled from point products also often create so much noise that they drown out the more important things, failing to surface important events. What is lacking is context, and given the additional risk of insider threats to businesses, there is a crucial need for any modern cybersecurity solution to incorporate it.
By examining how people interact with critical business data and IP, and understanding how and why these interactions occur, security teams can determine the normal pattern of data usage. Separating out the unusual behaviour that may be the start of something much more serious from the everyday is the only scalable defence mechanism.
Addressing risks ranging from the small user error that turns an email lure into a ransomware debacle, to sporadic, anomalous activities that, once presented in context, could be the early signs of a malicious insider threat, requires a behaviour-centric approach.
Surfacing what requires further investigation, as distinct from routine user activity on the network, requires categorising risk along a ‘continuum of intent’: accidental, compromised or malicious users. By classifying activity in this way, frictionless protection can be developed that doesn’t get in the way of everyday user needs and access to data.
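A toy classifier gives the flavour of the continuum-of-intent idea. This is an illustrative sketch, not Forcepoint's actual model: the signals and thresholds are invented for the example, and a real system would score many behavioural features over time.

```python
# Illustrative sketch of a "continuum of intent" triage rule: map a few
# behavioural signals onto accidental / compromised / malicious buckets.
# Signals and thresholds here are hypothetical, chosen only for the demo.

def classify_intent(failed_logins, off_hours, bulk_export):
    """Return a coarse intent label for one user's recent activity."""
    if bulk_export and off_hours:
        return "malicious"      # deliberate-looking exfiltration pattern
    if failed_logins >= 5:
        return "compromised"    # credential-stuffing style activity
    return "accidental"         # routine activity or one-off user error

print(classify_intent(failed_logins=0, off_hours=True, bulk_export=True))
```

Only the "malicious" and "compromised" buckets need a human in the loop, which is what keeps the protection frictionless for everyone else.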
Being able to monitor and react accordingly in real time is also critical to meeting global compliance standards such as GDPR; avoiding the financial penalties and the hits to reputation and trust that come with a public data leak depends on the ability to stop escalating issues in their tracks.
Examining human interactions with data over time does offer a certain level of context, as a history can be established and help derive intent. This approach is the best way to clearly understand where on the continuum of intent a suspicious activity may lie.
Activity monitoring can cover a wide variety of behaviours across a network, from access to cloud-based apps to device use and internet browsing. This tracking can happen with all data pseudonymised, so that staff only ever uncover a user’s identity when there is a strong need to, under planned and controlled circumstances.
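A minimal sketch of how such pseudonymisation might work, assuming a keyed hash whose key is held only by security governance; the key handling and token format here are illustrative assumptions, not a description of any specific product.

```python
import hashlib
import hmac

# Hypothetical secret held outside the monitoring system, e.g. with
# security governance, so identities can only be re-derived deliberately.
SECRET_KEY = b"held-by-security-governance-only"

def pseudonymise(user_id: str) -> str:
    """Deterministic pseudonym: the same user always maps to the same token
    (so baselines can be built per user), but the token alone does not
    reveal who the user is without the key."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]

event = {
    "user": pseudonymise("alice@example.com"),   # opaque token, not an identity
    "action": "copied 500 files to removable media",
}
print(event["user"])
```

Because the mapping is deterministic, unusual behaviour can still be tracked per pseudonym over time; revealing the real identity behind a token remains a separate, controlled step.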
Implementing this kind of workplace or hybrid IT monitoring, however, is not without its challenges. There remains much misplaced fear and doubt amongst staff, particularly when it comes to privacy. If communicated in the wrong way, or not communicated at all, staff may incorrectly assume their every move is under scrutiny to monitor their productivity rather than ensure their and the company’s security. What must be stressed from an internal perspective is that such tools and programs are in place to automatically prevent cybersecurity disasters. Care must be taken to remain transparent and explain that monitoring is mutually beneficial.
For instance, because of the context provided, scenarios where employees’ computers are hacked and data is accessed in their name can be quickly identified as being out of line with their usual behaviour. This helps to avoid damage to that employee’s reputation which would likely come if protective monitoring was not in place.
Finally, today’s dynamic threat landscape requires an equally dynamic and automated approach to security. It calls for one that places human cadence and behaviour at its centre, underpinned by context, rather than rules and static enforcement policies. Organisations that put this in place will be able to stay one step ahead, giving them the time and capability to spot and respond to the next attack, rather than struggling to recover from the last one.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contains the second, and this February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 5.
Blockchain predictions 2019
By Fady Abdel-Nour, Global Head of M&A and Investments, PayU
2018 saw hype around blockchain technology reach a crescendo. The potential for the technology is huge and it promises to unlock a wave of innovation. That said, it’s certainly fraught with complications, from regulation to adoption. Looking ahead to what 2019 has in store for blockchain, I’ve summarised a few predicted trends for the year:
● A rise in traditional institutional investors adopting cryptocurrency. The recent surge in the capitalisation of cryptocurrency alone, from less than $20bn in 2013 to over $540bn in 2017, means cryptocurrency is becoming almost impossible for traditional institutional investors to ignore. Crypto assets offer a unique combination of returns and volatility that is extremely attractive to investors looking to diversify a portfolio of stocks and bonds. Indeed, research shows that a 2% exposure to crypto assets in a portfolio could, on average, boost returns by up to 200 bps.
● We’ll see more governments looking to adopt blockchain technology. It’s clear that blockchain has a wide variety of use cases and is capable of hugely speeding up processes. In addition to this, it can hold large volumes of data and isn’t confined by geographic borders, making it an attractive option to authorities. It won’t be long before we see governments starting to capitalise on these benefits, and we may see many take their first steps in 2019.
● Banks will build their own private blockchains. We will see more banks fully embracing blockchain and integrating the technology into their current system infrastructure. However, instead of using public blockchain, banks will likely start to build their own private systems. This is a positive step for banks looking to embrace the technology and innovate existing systems with the privacy and scalability offered by private networks, but it will lack the level of interoperability offered by public networks.
● Increased demand for blockchain experts. As blockchain technology gets a better reputation and becomes increasingly mainstream, more companies will look to apply it to their business offerings. In the current business landscape, this technology is largely unused by most companies and, as a result, specialised blockchain experts are few and far between. As we see blockchain technology find its place in the world of business, there will be increased demand for blockchain experts to architect, oversee and advise on its application.
● Regulators will become more comfortable with cryptocurrency as an investment asset class. While in the past, crypto based investments may have been rejected by regulators owing to questions of control, in 2019 it is likely that we will see regulators becoming more comfortable with these sorts of investments. This will allow for the creation of associated security tokens and exchange-traded funds.
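The portfolio arithmetic behind the first prediction can be sketched as follows. The asset weights and returns below are invented assumptions, used only to show mechanically how a 2% crypto sleeve changes a blended portfolio return; they are not market data, a forecast, or the cited research.

```python
# Hypothetical one-year returns for a stock/bond portfolio with and
# without a 2% crypto allocation carved out of the equity sleeve.
weights = {"stocks": 0.58, "bonds": 0.40, "crypto": 0.02}
returns = {"stocks": 0.06, "bonds": 0.02, "crypto": 1.00}  # assumed, annual

blended = sum(weights[a] * returns[a] for a in weights)

# Baseline 60/40 portfolio with no crypto exposure.
no_crypto = 0.60 * returns["stocks"] + 0.40 * returns["bonds"]

print(f"uplift: {(blended - no_crypto) * 10000:.0f} bps")  # → uplift: 188 bps
```

Under these made-up numbers the small allocation adds roughly 190 basis points; the point is simply that a tiny weight on a high-return, high-volatility asset moves the blended return meaningfully.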
The future of payments partnerships
Matthias Setzer, Chief Commercial Officer, PayU
Throughout 2018, the explosive force of the global payments industry has remained relentless. In fact, McKinsey estimates that, growing at 7% annually, payments will be a $2 trillion industry by 2020.
If the year gone by stands for anything, it is as the moment when traditional banks stopped viewing payments companies and fintechs as competition. The industry is now firmly entering 2019 in collaboration mode.
Below is a flavour of what we can expect in the coming year.
Increasing harmony between retail and payments
Towards the end of 2018, we saw new partnerships across the retail and payments industries, with e-commerce giants like Walmart partnering with payments titans like PayPal and also Swedish payments start up Klarna raising $20 million from global fashion retailer H&M.
United in the goal of fighting for a bigger share of the wallet, both sides benefit from strong retail and payments partnerships: retailers can meet customer expectations by delivering a seamless payment process, while the payment service provider increases transaction volumes and gains better access to a global network of engaged consumers.
The battle for the hearts and minds of consumers will intensify
The payments industry will have to run fast to keep pace with expectations of a new breed of consumer who expects payments to be invisible and instant. It is likely we will see an increase in partnerships founded on putting customer-centricity, experience and value-added services at the core of what they do.
Specifically, and with the value of the cross-border market expected to reach $994 billion within the next two years, payments companies will need to partner carefully to capitalise on the huge market opportunity. Ultimately, the 2019 consumer will force payments companies to rethink how they can best support ambitious merchants as well as hungry international consumers. But where they look to partner will be as important as with whom.
More strategic overseas partnerships
It is likely we will see a number of new overseas partnerships in the coming year. In 2018, we saw the likes of Tencent partnering with online Japanese mobile payments company Line, as well as Brazilian fintech Nubank. We also saw different markets starting to collaborate, with the UK and India joining together on fintech and infrastructure finance and most recently, the UK and Turkey building a partnership on Islamic fintech.
Specific regions are also gearing up to make themselves more attractive. Markets like South East Asia continue making themselves appealing to investors with an enticing mix of booming middle-class populations and rocketing smartphone adoption, while regions like Latin America are mobilising for innovation, for example through the new Mexican fintech law or fintech-friendly licenses in Brazil.
Throughout 2019 we will continue to see partnerships accelerate the industry’s transformation, connecting the old and new worlds and creating a perfect storm for payments. Collaboration will be the new mantra as the whole payments ecosystem continues to recognise the need to partner in order to embrace change. There is space for these partnerships, and I am excited to see how they can help drive the industry.
2019 Predictions for Data Storage and Cloud
By Florian Malecki, International Product Marketing Senior Director.
Petabyte era: Petabyte-size data management used to be a challenge only large enterprises would face. With data growing ten-fold, according to IDC, the petabyte era will start barreling down on mid-sized organizations too. What used to be an anomaly will start to become the norm for SMBs and mid-size organizations. Mid-sized organizations in particular will find their IT architectures simply can’t scale with their data growth. Unlike large enterprises, they won’t have the skills or budget to cope either. The demand to bring data management, protection and cost-effective scale-out storage into a single frictionless environment will rise. The benefits will be far-reaching: operational complexity will be eliminated, costly over-provisioning of storage will become a thing of the past, and intelligent automation, with features such as ‘set and forget’ SLAs, will dramatically improve RTOs and RPOs.
Silos will collapse and converge: With the explosive rate of data growth, the current approach to fragmented data storage, management and protection will be untenable. IT fragmentation with a multitude of point-products has created silos, complexity, vulnerability and out-of-control storage costs. Data silos will have to collapse. If not, IT infrastructures will buckle under the weight of their own data. While the market is already moving toward converged data management and protection solutions, the convergence criteria will shift. Unlike legacy approaches that simply converge secondary and back-up environments, convergence will encompass the entire data environment including primary, secondary, on-prem, off-prem, private and public cloud as well as data management and protection. With multiple technology stacks collapsed into a single, frictionless and converged infrastructure, organizations will be able to achieve immense business value. Data compliance, digital transformation and business agility will be far easier to attain. Plus, overlaying intelligent analytics in a fully converged data environment will make predicting vulnerabilities and the means to eliminate business downtime a reality.
Bifurcated Cloudification: IDC estimates 40% of organizations have moved some workloads and data back from the cloud to on-prem because of cost, performance and continuity concerns. Rather than press pause on the cloudification of data infrastructures, organizations will instead recognize the benefit of a bifurcated or hybrid cloud strategy. Advanced data management will intelligently push and pull workloads to and from the cloud. The blended and intelligent deployment of public and private cloud services will deliver on the promises of unparalleled cost, scale, data management and data protection benefits - and total business continuity. Public cloud will deliver the cost and scale businesses need as they operate in increasingly virtual environments and adoption of Microsoft Office 365 becomes commonplace in businesses. Simultaneously, private cloud services will intelligently recognize and then protect business critical data and applications to provide total business continuity in the event of failure.
A $15b channel charter: Customers will look to their IT partners to help them build converged data management and protection infrastructures that scale with ease, are fluid between cloud, on-prem and off-prem, promise to protect against vulnerability and provide instant recovery from downtime. With this evolving customer charter, we'll see a continuing move away from the traditional channel labels of MSP and VAR. The $15+ billion business continuity category represents huge upside potential for channel organizations in 2019. It opens up new business opportunities in terms of entering new markets, expanding their offering within their existing customer bases, and the ability to grow important recurring revenue streams through cloud services.
By Robert Cowham, Senior Consultant, Perforce
In a world where IoT and AI are increasingly prevalent, software becomes more complex than ever before, with more components and greater pressure to get releases out fast. Against that is the need to create digital artefacts that are safe, secure and – in many industries – also compliant. This is why some big themes have emerged in the world of software development that are going to take centre stage in 2019. Here are a few.
Greater emphasis on software development security, particularly in IoT – software development is where flaws and vulnerabilities can be created, so it is vital that this part of the product lifecycle is secure, especially in safety-critical markets like automotive and medical devices. Companies also need to be able to demonstrate to external auditors and compliance bodies the processes behind the creation of code and other digital assets. This is why companies worldwide are adopting techniques such as testing earlier in the development process, continuous code inspection, and the use of coding standards to ensure code quality and compliance. Application lifecycle management (ALM) also has a role to play in ensuring that quality is maintained in future iterations or releases. This is important given that many products will have multiple updates in their lifespan.
DevOps and Agile at scale – two popular development methodologies that have already experienced success at team or departmental level are increasingly being adopted on an enterprise-wide scale. Extrapolating the benefits of Agile and DevOps to this level brings its own challenges, for instance, giving the business the control and visibility it needs over large-scale projects, but without limiting individual motivation and flexibility. Ways that organisations are tackling this include Agile planning management; adoption of hybrid approaches (such as Agile Kanban and Agile Waterfall); software tools that enable multiple contributors across different systems, file types, locations and jobs to collaborate seamlessly.
Shift left – everyone is a tester these days – shift left has popped up as a recurring theme in recent months. It describes how testing is happening earlier in the software development lifecycle (and indeed throughout it, in other words continuous testing), and how more tests are being carried out by the software developers themselves, rather than developers running only unit tests and QA and test managers testing everything else at a later stage. The benefits of shift left include earlier discovery of problems, which are easier and less costly to resolve at that stage than when the product is about to hit production. It represents a cultural shift for software development teams, who are not usually trained to carry out more than unit tests. The way to solve this is to get testers involved early and to give developers tools that automate the process and carry out as much as possible in background mode, for instance static code analysis.
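As an illustration of the kind of background check a shift-left pipeline might run, here is a minimal sketch using Python’s standard-library ast module to flag bare `except:` clauses before code ever reaches QA. The rule and the sample source are illustrative assumptions, not a reference to any particular analysis tool.

```python
import ast

# Sample source under inspection; the bare except silently swallows
# every error, including KeyboardInterrupt - a classic early-catchable bug.
SOURCE = """
def load(path):
    try:
        return open(path).read()
    except:
        return None
"""

def find_bare_excepts(source: str):
    """Return line numbers of bare `except:` clauses in the source."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

print(find_bare_excepts(SOURCE))  # → [5]
```

A check like this costs milliseconds, so it can run on every save or commit; the same static-analysis idea scales up to full coding-standard enforcement in commercial and open-source tools.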
As businesses and consumers we are dependent on efficient software more than ever before and likewise, software development – and the tools that support it – are at an unprecedented fast-changing stage of their evolution.
What we learned about Artificial Intelligence in 2018
By Parry Malm – CEO/Co-Founder Phrasee.
Artificial intelligence is easily the most often discussed and seldom understood buzzword in common use today.
Debates about the nature of “intelligence” and AI’s implications for humanity date back decades. The fact is, such debates and concerns are largely academic at this point. AI technology has already progressed well beyond the theoretical and is now impacting our daily lives. 2018 was a seminal year for AI, not least because we learned what AI can and, more importantly, can’t do.
Things AI can do
Things AI can’t do
Ruban Phukan, VP Product Cognitive First at Progress:
1) Applications of AI and Machine Learning will start playing a leading role in the digital transformation of manufacturing. Data science will move from research labs to production.
2) More manufacturers will move from condition based maintenance to predictive maintenance with IIoT. This will significantly minimize unplanned downtime, quality issues, costs of maintenance and risks.
3) More OEMs, especially manufacturers of critical and expensive equipment, will offer uptime-based services to their customers. This will also require them to provide asset management as a service for their equipment.
4) With the digital transformation of manufacturing there will be a growing need for industrial apps. High-productivity application Platform as a Service (hpaPaaS) will play a crucial role in helping manufacturers rapidly build applications with great UI/UX, both for internal use and for their customers.
5) IIoT will not just transform maintenance and field services but will also play an important role in transforming other areas such as inventory management, supply chain optimization and bottleneck management.
6) We will start seeing early signs of convergence of technologies like AI/ML, AR/VR and Blockchain with smart solutions.
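Point 2’s move from condition-based to predictive maintenance can be sketched as a simple baseline-versus-recent comparison on sensor data; the vibration readings and the three-sigma threshold below are invented assumptions for illustration, not a real IIoT deployment.

```python
from statistics import mean, stdev

# Hypothetical vibration readings from one machine: a learned baseline
# and a recent window of telemetry that is trending upward.
baseline_vibration = [0.41, 0.39, 0.42, 0.40, 0.38, 0.41, 0.40, 0.39]
recent_readings    = [0.44, 0.51, 0.58, 0.66]

def needs_maintenance(history, recent, sigmas=3.0):
    """Flag the machine when recent readings drift well above its baseline,
    so a work order can be raised before failure rather than on a fixed
    calendar schedule."""
    mu, sd = mean(history), stdev(history)
    return mean(recent) > mu + sigmas * sd

print(needs_maintenance(baseline_vibration, recent_readings))  # → True
```

Real systems would use richer models per failure mode, but even this toy shows the shift: the maintenance trigger comes from the equipment’s own data rather than elapsed time.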
From Patrick Smith, Field CTO EMEA at Pure Storage
Containers and the Hybrid Cloud:
Container technologies have proven to be very attractive, particularly in DevOps environments, but the initial lack of persistent containerised storage made them a poor fit for many enterprise production applications. This held back adoption.
Automated, intelligent and scalable storage provisioning makes deploying large-scale container environments to an enterprise data centre possible.
As a result of the development of container storage-as-a-service, in 2019 we believe that the new normal will be running production applications in containers, irrespective of whether they are stateless or data-rich. Container hosting environments ease development, accelerate deployment and simplify scaling as part of the maturing adoption of DevOps and Site Reliability Engineering methodologies. This means that businesses can drive fast innovation and delivery of new features to their customers.
Container adoption will increasingly be driven by the demand for cost effective deployments into hybrid cloud environments with the ability to flexibly run applications either on-prem or in the public cloud based on their requirements or characteristics.
NVMe over Fabrics: Next-generation media
We introduced NVMe into our arrays some time ago, before making it standard across the FlashArray line in May 2018. Cost-effective all-NVMe for every workload can be a crucial advantage for all businesses. NVMe makes everything faster – databases, virtualised and containerised environments, test/dev initiatives and web-scale applications.
We expect NVMe over Fabrics to move from niche deployments and take a step towards the mainstream next year. This is the next logical evolution. With price competitive NVMe-based storage providing consistent low latency performance, the final piece of the puzzle will be the delivery of an end-to-end capability through the addition of NVMe-oF for front-end connectivity.
This will be particularly applicable to environments seeking better performance, even lower latency and less compute overhead; all enabled by NVMe-oF. When you can deliver on these promises to the business using the same underlying infrastructure, it just makes sense that NVMe-oF will become a popular choice.
No industry will escape the influence of the tidal wave of change precipitated by artificial intelligence (AI) and the reality is that it will fundamentally change human processes within the next decade.
By Eli Fathi, CEO, MindBridge Ai.
The financial services industry is no exception, with larger firms feeling the pressure of new entrants tackling the “low-hanging fruit” of AI to quickly gain a competitive advantage, while they struggle to define how AI will fit into existing infrastructure, processes, and industry regulations.
This struggle boils down to a set of barriers to adoption, such as data inertia, the perceived lack of technical skills, and how AI fits into existing industry regulations and compliance. These issues are driving some firms to adopt limited forms of technology, such as robotic process automation (RPA), and claim the benefits of AI when they don’t exist.
We must understand these challenges in order to break them down and foster a better path towards unbiased AI adoption for the industry.
Data, data everywhere
Data is the backbone of many business processes, so it’s no surprise that firms turn to AI to help sift through and understand what the data is saying. Financial services organisations encounter additional challenges associated with the vast amount, complexity, and number of their data sources and if these aren’t addressed during the AI adoption process, the problems will only grow.
Rather than treat data as an intangible by-product of business processes, it must be a first-class citizen, respected as a key enterprise asset, and embraced from the CxO level down throughout the organisation. As more data is created and structured, there is also an increased likelihood for privacy, security, and functional risks. Organisations must move away from thinking that data functions are owned by a single department. In some financial institutions, marketing is the custodian of the data, mining it to offer additional services to the customer base, resulting in a limited organisational view of the potential opportunities.
The key to smoothing the big data problem is to treat all data sources as the crown jewels of the organisation by exercising good data hygiene and clearly defining relationships and ownership throughout the organisation. For successful adoption, firms must ensure that all data sources are accessible, understandable, and secure. In addition, firms must build data literacy at the senior levels so that decision makers have enough information to execute clear and realistic strategies.
Many firms have adopted technologies such as RPA, intelligent automation, or intelligence process automation (IPA) and claim benefits to clients that are like those provided by AI-based solutions. RPA and AI are not interchangeable solutions but rather complementary. RPA is best suited for automating existing rules-based processes that are repetitive, often time-consuming, and based on well-structured data. AI is far more powerful, using the data to help make decisions and predictions based on reinforced learning, providing insights that go beyond the rules. It is AI that will truly revolutionise the way we think, act, and talk about financial services.
Communication and collaboration
As with any new technology, AI is met with a degree of scepticism. There is a huge amount of investment going into the AI space now; however, the outcome of this investment is not often communicated to a wider audience. In order for AI to be seen as truly collaborative, these technologies need to be understood by everyday people, particularly those who are hesitant to engage with them. One of the biggest barriers to adoption of AI is people not having a full understanding of how products work, so innovators need to think outside the algorithm box if AI is going to be adopted on a wider scale.
In AI we trust
The most disruptive element of AI is the ability to codify human intelligence and apply it on a vast scale, but some companies are still hesitant to adopt this revolutionary technology. For AI to be accepted and adopted on a wider scale, organisations must work to build their trust in technology as a genuinely useful tool for collaboration. Taking the world of audit as a use case, the deployment of AI-based solutions not only increases the efficiency of the task but also gives suggestions as to why a certain transaction has been flagged as unusual. AI cannot be seen as just ‘a black box,’ it must be a collaborative tool that humans can understand and use to make decisions based on the provided insights.
AI will breathe much-needed innovation into traditional industries, such as financial institutions, and will make significant impact if firms adopt the technology and incorporate it into their operations effectively, generating new services and delighting clients in new ways.
Personalised interactions build lasting relationships between brand and customer.
By Stuart Robb, CEO and founder of Equiniti Data.
In a world driven by data, we share a little bit about ourselves, either consciously or subconsciously with virtually every interaction we have with a brand and savvy businesses must use this insight to convert information into opportunity. Identifying what customers want can present a challenge but it is a worthy process in helping brands to communicate with consumers in a way that is personal yet respectful. Making these communications meaningful is only possible when a personal connection is made between the brand and the individual. By compliantly extracting details about consumers from every interaction, brands have the power to deliver a truly personalised experience that makes their business stand out from the crowd and ahead of the competition.
In the business world, customer experience (CX) has become a key component of any forward-thinking business strategy. Around two thirds of marketers say their organisation already competes mainly on CX and this is a trend expected only to grow with Adobe’s Digital Trends report identifying CX as a top priority for marketing and technology professionals in 2018. As businesses across the globe continue to invest in the experience of their customers, they look to new and innovative ways to stand out from the competition and build customer loyalty. Delivering a personalised service is certainly a key component in the overall customer experience. Consumer choice and expectation has grown exponentially in recent years and customers now not only appreciate but often expect a personalised approach from their favoured brands.
The power of emotion
The first challenge for brands is to get to know who their customers are and what drives their buying choices. We know that when it comes to customer loyalty, emotion counts. When people can relate to a business on an emotional level, it inspires a connection which ties them to the brand because of the way it makes them feel. Clever businesses know that loyalty means customers will choose your brand time after time often regardless of price. Exceeding customer expectations drives loyalty that will forge longer lasting and ultimately more profitable connections.
Data analysis and the profiling of customers forms an essential part of personalised marketing. For marketers, data is everywhere: there is certainly no shortage of this valuable commodity however it is how we access and use it that can transform how businesses communicate and relate to their customers. In order to unleash the full potential of customer interactions, brands must build strong data portfolios and use the information in an intelligent way, providing the insights needed to create relevant and unique experiences that make people take notice.
EasyJet CEO Johan Lundgren announced earlier this year that the airline would work hard to optimise its use of passenger data for the specific purpose of enhancing customer experience and driving loyalty stating he would make EasyJet “the most data-driven airline in the world”. Having appointed the company’s first ever Chief Data Officer, Lundgren identified that by getting to know customers through data analysis and using that knowledge to communicate with them in a personal way, the business could transform more customers from occasional users to loyal followers.
A personalised approach
Personalised brand interactions not only enhance CX but also have the power to influence consumer behaviour and buying habits. Research conducted by Accenture found that 56% of people are more likely to purchase from a brand that uses their name when making contact and 65% are more likely to purchase from brands that market to them in a relevant way based on their preferences and purchase history. It certainly seems that personalised communication that is tailored to a customer’s specific interests motivates them to buy. In a competitive marketplace, expectations are high and consumers have developed an expectation to engage with businesses in more ways than ever. If your business doesn’t connect with its customers in a way that appeals to them personally, there are plenty of competitors out there who can.
All in the detail
The more detail a business has about its customers, the better it knows the individuals behind the transactions and can relate to them on a personal level. The real challenge is to consolidate vast amounts of consumer data from various disjointed touch points and transform partial identities into complete profiles to create a ‘360-degree view of the customer’. By developing a complete picture of the individuals behind transactional data, businesses can tailor future approaches to suit their customers’ exact personal requirements.
Robust consumer insights come from demographic, socio-economic, behavioural and transactional profiling unified to identify not only historical and current preferences but also to predict future activity. This predictive data can be used to proactively market to individuals in a relevant way. Amazon is just one great example of consumer data being used to add personal touches and drive future purchases through recommendation based on known individual data. Each time a customer visits the site, they are welcomed with a personalised message while relevant previous viewing history and sales messages automatically load for appealing products. Not only are customers presented with content that interests them; they also avoid being bombarded with frustrating irrelevant content that doesn’t.
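As a toy illustration of the kind of history-driven recommendation described above, the sketch below suggests items that most often co-occur with a customer’s past purchases in other customers’ baskets. The data and the co-occurrence approach are deliberately simplified assumptions, not how any particular retailer’s engine works.

```python
from collections import Counter

# Invented purchase histories: each basket is the set of items
# one customer bought together.
baskets = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card", "camera_bag"},
    {"camera", "tripod"},
    {"kettle", "teapot"},
]

def recommend(purchased, baskets, top_n=2):
    """Recommend items that co-occur most often with what this
    customer has already bought, excluding items they already own."""
    co = Counter()
    for basket in baskets:
        if purchased & basket:          # basket shares an item with the customer
            co.update(basket - purchased)
    return [item for item, _ in co.most_common(top_n)]

print(recommend({"camera"}, baskets))
```

For a customer who bought a camera, the accessories other camera buyers chose surface first, while unrelated items (the kettle and teapot) never appear; production systems layer demographic and behavioural profiling on top of the same idea.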
Mass marketing communications no longer meet heightened consumer demands. In a world where everyone is empowered to shop around for the best experience, brands really need to up their game. Consumers can be fickle, shifting their loyalty between brands based on the last experience they had, so it is imperative that businesses work hard to meet heightened expectations with every interaction. Used effectively, data collection and analysis can provide businesses with a valuable opportunity to gain a deep understanding of customers and use this knowledge to enhance the overall experience. By getting to know consumers on a personal level, brands can expect to forge more profitable and long-lasting relationships while also attracting the attention and commitment of new customers.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contains the second, and this February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 6.
Analytics 2019 – The Year of ‘Analytics Everywhere’
As 2019 approaches, technological changes are happening faster than most people can comprehend. These changes affect everyone, from the places we work to the societies we live in today, as data quickly turns into one of the main sources of power.
So what developments can we expect from the world of data and analytics next year? Here are 5 trends that are shaping tomorrow:
1. Kubernetes Comes of Age, Microservices Become the Norm
One of the biggest megatrends of tomorrow is microservices and Kubernetes. Microservices are a new approach to application development in which a large application is built as a suite of modular components or services. Kubernetes is a hugely impactful technology innovation that can orchestrate and distribute containerised applications and workloads.
What used to be monolithic is now distributed, enabling a new way to scale. It’s a way to access and process data locally and reach places that BI platforms of the past couldn’t. Company software development teams are now rapidly adopting this approach. In the span of a year, it’s gone from emerging to becoming a hygiene factor, where app dev teams at enterprises are now orchestrating container-based applications, and demanding production Kubernetes environments.
2. The Multi-Cloud, Hybrid and Edge Continuum
The shift to multi-cloud is happening. IT leaders are increasingly migrating not just their born-in-the-cloud data, but also the mission-critical data that runs their business. The promise of on-demand capacity, low-cost storage, and a rich ecosystem of tools is compelling. However, organisations are not centralising all their data in one place, for fear of lock-in and of losing the flexibility needed for regulations like GDPR.
Beyond data protection, managing data in the cloud has its own set of rules. The shift from on-premise and legacy data centres should therefore be done at a pace organisations feel comfortable with. The ability to centrally calibrate and distribute to multiple clouds, as well as hybrid on-premise and cloud continuum, are good ways to hedge bets. Edge computing delivers the decentralised complement to today’s hyperscale cloud and legacy data centres, and is often preferred for latency, privacy and security reasons. A post-modern platform should be able to handle distributed data, workloads and usage across multi-cloud, hybrid and edge as a continuum.
3. “Analytics Everywhere” Reshapes Processes in Real-Time
Embedding analytics into business processes isn’t new, but it’s now becoming mainstream. Users want analytics in their workflows as it helps make data more actionable and increasingly also real-time. All of this is being fuelled by machine learning and AI, which provides contextualised insights and suggested actions. It’s the foundation of “continuous analytics” in which real-time analytics will be gradually integrated within a business operation or IoT device, processing data to prescribe actions in response to business moments and at the edge.
In the next five years "intelligent" applications will be ubiquitous. Furthermore, analytics is starting to re-shape the process with new technologies like robotic process automation and process mining.
4. Analytics Reaches Consumer-Tech Performance Levels
Performance is undervalued when it comes to tool selection. Where query performance is good and latency is low, analytic workloads run smoothly. If a query takes longer than a few hundred milliseconds, users may not leverage it in a business process or an augmented reality experience. When the self-service trend was in its nascence, performance was perhaps overlooked by many, because building visualisations on a flat file doesn't take that much horsepower.
But many self-service BI solutions fail when it comes time to scale to more data, workloads and people across the enterprise. Performance has also been a bottleneck for distributed big data at scale, and the reason why many Hadoop projects failed to become much more than cheap storage. Breakthroughs have recently been achieved through indexing, caching and pre-preparing very large and distributed datasets. Now, as companies of all sizes increase their adoption of hyperscale data centres, performance will rise in the selection criteria. Some organisations have moved their data back through “repatriation” because they haven’t been seeing strong enough performance. This becomes even more important in an IoT application. More and more workloads will run locally or at the edge to avoid latency. In short, efficient performance will be a deciding factor in how architectures will look – centralised or distributed.
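The pre-preparation idea behind those breakthroughs can be sketched in a few lines: aggregate the raw data once, then answer repeated dashboard queries from the summary instead of rescanning every row. The region names and revenue figures below are randomly generated stand-ins, not real data:

```python
import random
from collections import defaultdict

# Hypothetical raw event data: (region, revenue) rows.
random.seed(1)
rows = [(random.choice(["EMEA", "APAC", "AMER"]), random.random())
        for _ in range(100_000)]

# Naive approach: every dashboard query scans all rows.
def revenue_by_region_scan(region):
    return sum(v for r, v in rows if r == region)

# Pre-prepared approach: aggregate once, answer queries from the summary.
summary = defaultdict(float)
for r, v in rows:
    summary[r] += v

def revenue_by_region_cached(region):
    return summary[region]

# Both give the same answer; the cached version is O(1) per query
# instead of O(n), which is what keeps latency in the sub-second range
# users expect.
assert abs(revenue_by_region_scan("EMEA") - revenue_by_region_cached("EMEA")) < 1e-6
```

Real analytic engines do the same thing at vastly larger scale with indexes, materialised aggregates and distributed caches.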
5. Analytics Platforms Transform into Data Ecosystems
BI and analytics are most effective for organisations when viewed as “a system” and not simply as a series of artifacts and tools. An important difference is that individuals use tools, but people participate in systems. A “post-modern” system contains a whole host of people with differing roles, skills and intentions. And they aren't the only participants. Digital services, bots, intelligent agents, extensions, algorithms and so on also participate. The diversity and sophistication of these non-human participants is set to grow astronomically in the coming years. And it is the exchanges and learning between all these participants that increase the value of the system, augmenting both the human and machine intelligence within it. An open, self-learning system, containing the trends above and improving with further participation, will define what future platforms look like, and will help enable data democracy and analytic empowerment.
2019 is set to be an important year in the world of data and analytics. As new technologies help drive greater efficiencies within businesses, organisations must have the systems in place to adopt and embrace them.
Lee James, EMEA CTO at Rackspace
The number and range of different cloud services that organisations are using is increasing rapidly. Just think about all the data gathered silently during our Christmas shopping. Data is gathered from our behaviour as we walk through the store, use the app, or search online for vouchers and price comparisons, while stock levels are monitored and sales registered when we get to the tills. All this data feeds into a central point to help the store decide what stock replenishment is needed in real time, so there’s no chance of scuffles breaking out over the last bag of Yorkshire puddings – or they can at least notify security in advance!
Data can inform decisions across every function in every business, with different processes requiring different cloud platforms and services bespoke to those particular demands. Multi-cloud edge management will rise in 2019, becoming a critical asset as businesses integrate and manage the multiple endpoints that generate this data, as well as processing it and turning it into actionable insights.
Making better use of your security experts
Mike Bursell, Chief Security Architect, Red Hat.
Over the next 12 months, I'm hoping we're going to see a de-insularisation of security within the business and a move to automated processes. As everyone comes to terms with the growing customer expectation of speed to market and the requirement for near-instantaneous innovation in services, expecting security to perform a “sign-off” function before any product or service goes live has become completely unsustainable. You have three options:
Come up with “set-in-stone” business-wide security rules and hope that you can manage the inevitable exceptions that almost every project will generate;
Acknowledge that you can't move fast enough with existing processes and try to fix problems as you notice them;
Find a way to move your security expertise into automated processes that are both fast and scalable.
The first of these is never going to allow you to move quickly enough to cope with the increasing speed of deployment. Whether you adopt agile methodologies such as DevOps for all of your development immediately or gradually across your teams, a fundamentally reactive approach will see you losing ground to your competitors and new movers in the market. The second option will, sooner or later, lead to a breach or service disruption that you are unable to manage and is the sort of strategy that will get you called into your CISO's or CFO's office for a very brief and even more uncomfortable conversation. The last option, then, is the one that you need to embrace, but what does it mean and how do you implement it?
The first thing to do is move your security experts out of their “ivory tower”. To be fair, most security groups or departments are much less insular than this in practice, but perception is everything. In 2019, make it a priority to encourage greater mixing of your security expertise through the various departments with whom they work. Not in a “look, here’s a security person, call on him/her if there’s an issue” way, but by getting them involved and invested in the work of their colleagues in different departments and functions, so that both “sides” see the benefit, and stop thinking of each other as the opposition, but as colleagues.
On its own, this isn’t enough: however many security people you have, they still won’t scale. You need to encourage skills transfer. We can never expect every person in your organisation to become a security expert, but if they know the basics, and know who to turn to when they realise they are moving out of their comfort zone, then you’re already scaling your security capability into the wider business. This should also help your security experts realise that their expertise is still relevant and useful: there is a careful balance between transferring knowledge and diluting it, and it needs to be monitored.
Once you’ve managed to get security into the hearts and minds - well, minds, at least - of the rest of your organisation, it’s time to think about how you start moving the expertise that your security group brings into the processes that your development, testing, operations, audit and governance teams run day-to-day. This is where you can really start to scale out. If your security experts can identify what points in the development process are security-critical - choice of base container images, for instance, or maybe where monitoring of your operations can actually aid your auditing process - there are opportunities for you to automate increasing parts of your security function into the processes themselves. This is absolutely not about making the work of your security experts redundant, but about releasing them from humdrum daily “unblocking” tasks and allowing them to concentrate more on interesting, valuable tasks where they can come up with innovative processes themselves.
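One automatable check of the kind described above – verifying that every service builds from an approved base container image – can be sketched in a few lines. The registry paths, image tags and approved list below are invented for illustration; a real pipeline would run something like this as a CI gate maintained jointly by security and development:

```python
# A hypothetical allowlist that the security team maintains.
APPROVED_BASE_IMAGES = {
    "registry.example.com/base/ubuntu:18.04",
    "registry.example.com/base/alpine:3.9",
}

def check_dockerfile(text):
    """Return the FROM images in a Dockerfile that are not approved."""
    violations = []
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.upper().startswith("FROM "):
            image = stripped.split()[1]  # works for "FROM img" and "FROM img AS x"
            if image not in APPROVED_BASE_IMAGES:
                violations.append(image)
    return violations

dockerfile = """\
FROM registry.example.com/base/alpine:3.9
RUN apk add --no-cache curl
"""
assert check_dockerfile(dockerfile) == []            # approved base image
assert check_dockerfile("FROM ubuntu:latest\n") == ["ubuntu:latest"]  # flagged
```

Encoding a rule like this frees the security expert who wrote the allowlist from reviewing every build by hand, which is precisely the scaling effect described above.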
More Magecart but with a different goal
“I’m expecting new variants in web skimming attacks, especially as we observe the different Magecart actors staying active longer and becoming broader and more expansive. While payment data is currently in focus, because web skimming can skim any information entered into a website, Magecart groups will expand to skimming more than just credit card data to login credentials and other sensitive information.”
Yonathan Klijnsma, head threat researcher at RiskIQ
Attackers will continue to discover and target organizations’ blind spots outside the firewall
“Hackers will continue to capitalize on the weakening of the corporate perimeter caused by customer and partner interactions moving online. An organization’s attack surface—everything it needs to worry about defending—begins inside the corporate network and extends all the way to the outer reaches of the internet. As a result, hackers are becoming increasingly sophisticated at collecting external data about their targets, and are using it to discover and exploit assets online that security teams are unaware of, or lack the resources to protect.”
Lou Manousos, CEO at RiskIQ
New trends will introduce more places for threat actors to hide
Brandon Dixon, VP Product at RiskIQ
Governments will continue to call out state-sponsored hacking
“The ‘attribution game,’ with governments now confirming private industry research and outing state-sponsored operations from other countries with indictments is only going to expand. Espionage operations have always been treated very publicly over the years, but recent indictments of Russian, Chinese, and Iranian actors have brought it to the next level. With heightened tension in Eastern Europe and Asia, expect state-sponsored attacks to increase in intensity and governments to become more aggressive in their response.”
Yonathan Klijnsma, head threat researcher at RiskIQ
Threat actors will be using machine learning, so businesses need to be continuously improving theirs
“Threat actors will increase their adoption of adversarial machine learning to evade detection by infrequently trained machine learning models. The good guys’ machine learning models will need to evolve quickly to keep up with these threats by incorporating instance-based approaches, which use models that can learn incrementally from data scientists providing frequent feedback. The world changes all the time, and it’s important that your model changes with it. If you need your model to keep up with current trends, selecting an instance-based model or a model that can learn incrementally is critical. Just as providing frequent feedback helps an employee learn and grow, your model needs the same kind of feedback.”
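The incremental-learning idea in the quote above can be illustrated with a toy online perceptron, whose weights are nudged one labelled example at a time so that fresh analyst feedback is folded in without retraining from scratch. The features, labels and alert-triage framing are invented for illustration; production systems would use far richer instance-based models:

```python
def predict(weights, bias, x):
    """Classify a feature vector: 1 (e.g. malicious) or 0 (benign)."""
    score = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if score > 0 else 0

def update(weights, bias, x, label, lr=0.1):
    """One incremental step: nudge the model only when it is wrong."""
    error = label - predict(weights, bias, x)
    weights = [w + lr * error * xi for w, xi in zip(weights, x)]
    bias = bias + lr * error
    return weights, bias

# Hypothetical stream of (features, label) feedback arriving over time,
# e.g. as analysts review alerts and confirm or reject them.
stream = [([1.0, 0.0], 1), ([0.0, 1.0], 0),
          ([1.0, 1.0], 1), ([0.0, 0.5], 0)] * 20

weights, bias = [0.0, 0.0], 0.0
for x, label in stream:
    weights, bias = update(weights, bias, x, label)

# After consuming the stream, the model separates the two classes.
assert predict(weights, bias, [1.0, 0.0]) == 1
assert predict(weights, bias, [0.0, 1.0]) == 0
```

Because every `update` call is cheap and local, the model keeps pace with a changing world, which is exactly the property the quote argues defenders will need.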
Adam Hunt, CTO at RiskIQ
PII will be a primary target for threat actors
“Adversaries will continue to evolve their tactics to steal personally identifiable information (PII) from individuals and intellectual property (IP) from organizations. During the last 12 months, we’ve seen compromised java scripts skimming credit card data from payment forms. During 2019 we expect the depth and breadth of this approach to expand to target PII and IP data as well.”
Fabian Libeau, VP EMEA at RiskIQ
Those who ignore their internet-facing attack surface will continue to falter
“The investments in securing corporate infrastructure have not worked, and companies will continue to be overwhelmed by the scale and tenacity of modern digital threats originating outside the firewall. As these organizations struggle to manage their digital presence, adversaries will grow more sophisticated and leverage data stolen from breaches in precise, finely-targeted attacks. They will also leverage machine-learning and artificial intelligence to drive high-powered attacks against businesses and to penetrate critical infrastructure.”
Dan Schoenbaum, president & COO at RiskIQ
Open source will eliminate barriers between platforms and Blockchain will finally find its place.
· Open source to encourage collaboration and close the skill gap
· Blockchain hype to die down and true strengths revealed
· Companies to stop spending money on maintaining legacy systems and start investing in modernisation
· Analytics to continue to move closer to data, enabling real-time reconciliation and security
· IoT - Smart cities to expect real breakthroughs – and cyber-attacks
· Quantum computers take a big step towards mainstream use
Rocket Software Inc. forecasts significant growth in numbers of developers using open source on a mainframe, the move towards modernisation presenting a game-changer for legacy systems, and the increase of Blockchain popularity in the supply chain – along with three additional predictions for 2019.
The trend towards collaborative working will continue in 2019, with open source eliminating the barriers between platforms.
Almost 19 years ago, IBM became the first major computing power to embrace Linux; today, a staggering 90% of mainframe customers leverage Linux on their mainframes. With the recent launch of open-source frameworks, the divide between modern applications and the mainframe will be reduced through increased accessibility. As it becomes less about what platform you’re using and more about what can be achieved with it, we’ll see a shift in how the mainframe is perceived.
Languages and tools like Python, PHP, Java and Git, can all be used, allowing developers, especially students fresh from university, the chance to code on a platform which they might not be familiar with, or even perceive as ‘bygone’. Open data and open source will be the driving force for future innovation, encouraging the next generation not to shy away from the mainframe, thereby creating an opportunity to bridge the skills gap.
Once hailed as the saviour of the Irish border issue and secret weapon in the fight against world hunger, Blockchain has had a tough couple of months. ‘Too slow’ the banks said, ‘no use for a distributed ledger’ the nay-sayers criticised. But the new year will see Blockchain finally come into its own with the manufacturer crowd. While its supporters might have to accept that it may never be adopted across the board, Blockchain will certainly make its mark in the supply chain – processes like B2B transactions, ordering, invoicing, payments, stocking, etcetera, will benefit hugely from implementing the technology.
Modernisation for Innovation
Legacy system owners will come to realise that the ‘why fix what isn’t broken’ mentality is not sustainable. The capital tied up in mainframe maintenance is better used to modernise it and take advantage of new technologies. The mainframe provides power and security and is not going away. Therefore, the motto for 2019 is modernisation rather than replacement. Legacy systems will see true innovation – there’s life in the old dog yet.
Having been ‘the future’ for years, the Internet of Things is continuously making leaps and bounds - GSMA Intelligence forecasts that there will be more than 25 billion "internet of things" connections by 2025. Guy Tweedale, regional VP at Rocket Software would go a step further: “We expect the number of connected devices to surpass 35 billion in the next six years. With the increasing popularity of smart devices, especially in areas like transportation, the figure might even be higher.”
Thanks to the introduction of 5G, devices like parking meters and traffic lights finally have enough power to bring about real breakthroughs in the near future. Moving away from pure data collection to performing actual functions based on external data inputs, congestion will be eased and road accidents reduced – cities will get smarter. However, the more devices get connected, the more we have to prepare for cyber-attacks, which are expected to increase in number and severity.
While it might not be available to the general public in 2019, or possibly anytime soon, quantum computing deserves a spot on this list as it is one of the most exciting future developments of our time. Using quantum mechanical phenomena such as superposition and entanglement, quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time, allowing it to solve problems that are impossible for classical computers to tackle.
Currently running only on a small 20-qubit quantum computer via the IBM Quantum Experience project, should quantum computing become widely available, the effects are going to be revolutionary. The security, pharmaceutical and financial industries will be changed dramatically, and lives could be saved through, for example, real-time atmospheric mapping to track hurricanes. Tragedies caused by late evacuation, as was the case during Hurricane Katrina in 2005, could then be averted: thanks to the processing power of quantum computers, the two-to-three-day hurricane forecast will be as accurate as today's generally spot-on 24-hour forecast. The future might be tumultuous, but we’ll be equipped to weather the storm.
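Superposition itself can be illustrated with a toy single-qubit statevector simulation – a drastic simplification of real quantum hardware, intended only to show how amplitudes, measurement probabilities and interference relate:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)            # the qubit starts in |0>
superposed = hadamard(zero)  # equal superposition of |0> and |1>

# Measurement probabilities are the squared amplitudes: both outcomes
# are now equally likely.
probs = [amp ** 2 for amp in superposed]
assert all(abs(p - 0.5) < 1e-9 for p in probs)

# Applying Hadamard again makes the amplitudes interfere back to |0> -
# the "more than one state at a time" behaviour classical bits lack.
assert hadamard(superposed)[0] > 0.999
```

A classical simulation like this needs memory exponential in the number of qubits, which is exactly why genuinely quantum hardware is expected to be so disruptive.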
From Paul Trulove at SailPoint
2019: Living in the age of data
Mark Billige, managing partner, and Shiv Pabari, manager, at Simon-Kucher (www.simon-kucher.com), the world’s largest firm of pricing consultants.
As the amount of information captured about customers continues to increase, in 2019 we feel that a key technological development is that companies are going to get a lot cleverer in the way they use this data.
· More personalised experiences for customers
· Increased data literacy within organisations
· More focus on monetisation
· Greater consciousness of how data is used
With 2019 well underway, the hype of 5G and the growth of IoT are filling my thoughts, with both – especially the fifth generation of mobile networks – in a relatively early stage. So, how will 5G and IoT develop in the New Year?
By Ronald Sens, EMEA Director at A10 Networks.
First Operational 5G Networks
This year is expected to see the first operational 5G networks become available in selected areas. However, this iteration of mobile network technology won’t fully replace existing 3G and 4G networks, leaving many of us questioning what 5G’s real benefits will be, and how much extra it will cost.
As previously, when new generations of networks have been introduced, 5G will need to coexist with existing networks to support all subscribers and the broad diversity of the installed base. This presents an opportunity for service providers to review and adjust their strategy, based on lessons learned during the earlier deployments of what worked best for their customers.
Radio Access Networks Continue to Evolve
The Radio Access Network (RAN) continues to evolve, driving increasingly flexible deployment options. Associated methods of support will be based on Software-Defined Networking (SDN) – or, as some call it, ‘self-driven networks’ – using a mix of Virtualised Network Functions (VNFs). This evolution of legacy hardware into software-defined functions continues the development of architectural change.
In many instances the architectural focus is in the balance of two things:
Network Operations Continue to Evolve
More hardware will be needed to handle the adoption of SDN and VNF. This will drive even more network advancements that will then require even more software. The continued cyclic evolution and shift to a more rapid deployment will drive an increasing need for DevOps. Expect to hear more about Continuous Integration and Continuous Delivery, concepts that have gained attention in the development of cloud.
The development and integration of complexities will increase operational overhead before simplifications of networks can start to be realised. The operational complexities, if not managed through self-healing and the use of intelligence, could delay uptake as the evolution of network operations is just as important as the network itself.
Virtualisation Continues to Evolve
Last year saw many of the service providers focus more effort on investing in resources as the shift from “old iron” to the more flexible virtualised environments hastened. The shift is part of the bigger movement associated with decentralisation, consisting of deploying smaller but more capable systems to handle the increasing traffic. This is not a new trend as network tonnage continues to grow no matter what kind of carrier is supporting it. The tonnage increase continues as more and more devices become available and this becomes the basis for Internet of Things (IoT).
5G Network Continues to Scale Out
As 2019 progresses, subscribers will see 5G progress from conversation to actually supporting a limited quantity of customers. Mobile subscribers will need to obtain new devices. The work done by RAN and RF manufacturers will move from prototypes into commercial products at an increasing pace.
Debate of Hardware vs. Software
5G drives increasing scale throughout all parts of a network, at the same time changing to decentralise many components of the infrastructure. The decentralisation is forcing trade-offs between high-performance purpose-built hardware and the use of software defined solutions.
The trade-offs and costs will remain hot topics for years to come as custom hardware remains a leader in many instances. Virtualisation will move further ahead into containers and orchestrations to knit cooperative suppliers into a cohesive scaled solution.
Radio Spectrum and Government Regulations
Talks regarding radio spectrum, who has what frequencies, and how they should be used, will keep governments around the world occupied. Wi-Fi offload and interconnects between public and private networks will continue to improve coverage.
The story of wireless mobility coverage will be a very interesting topic to watch unfold. 5G deployment topologies will fight to cover every inch of the major populations. Rural areas will have the most to gain.
Subscriber Growth Slows, Connected Devices Increase
Subscriber counts will not go up as much as in years past: the saturation of connected individuals is reaching an all-time high, although the number of connected devices will increase at a more rapid pace as the IoT expands.
DDoS Attacks Grow in Frequency and Size
With the increase of IoT, DDoS attacks will grow in frequency and scale. More devices going on-line places even more pressure on service providers to increase their ability to combat DDoS attacks.
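One building block of that DDoS-mitigation capacity can be sketched as a per-source token bucket rate limiter. This is a simplification – the class, rates and capacities below are invented for illustration, and real scrubbing platforms combine many such techniques at line rate:

```python
class TokenBucket:
    """Caps a source's request rate while still permitting short bursts."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now):
        """Return True if a request arriving at time `now` may pass."""
        # Refill tokens for the time elapsed since the last request.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=10, capacity=5)  # 10 req/s sustained, bursts of 5
# A flood of 100 requests in the same instant: only the burst gets through.
passed = sum(bucket.allow(now=0.0) for _ in range(100))
assert passed == 5
# A well-behaved client one second later is served again.
assert bucket.allow(now=1.0)
```

Applied per source address (or per subscriber) at the network edge, the same mechanism absorbs volumetric floods from compromised IoT devices while leaving legitimate traffic untouched.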
Mobile Micro-Clouds in Containers
Look for the development of small cloud datacentres located at the edge of the internet, also known as cloudlets or mobile micro-clouds. This ‘edge’ or ‘fog’ computing model is an extension of existing cloud infrastructure, and these cloudlets will improve the latency of services for mobile devices.
The adoption of 5G creates the demand to offload resource-intensive services to the network’s edge. Cloudlet deployments will have to be very agile, so they can be quickly provisioned to adapt to a rapidly changing market. Because of this, they’ll be based on a container architecture.
As the network model evolves, so too do the opportunities for innovation and threats from disruptors. 2019 promises to be yet another fascinating year in the network world.
SDS, cloud, NVMe, flash, IoT, GPUs, and startups: Excelero VP of Corporate Marketing Tom Leyden on the data storage trends for 2019 and what happened to his industry predictions for 2018.
Reflections on 2018 forecast
SDS will replace traditional storage arrays
This has not materialised – yet. Storage arrays are not dead. But neither is tape, though its demise was foretold to happen years ago. Flash arrays are likely to stick around for the foreseeable future. Software-defined storage, however, continues to gain market share, with adoption across a range of different industries. One particular example is how the M&E industry is switching to SDS to solve the network bottlenecks caused by array-based architectures. Conclusion: prediction in progress.
Cloud for secondary and on-premises SDS for primary storage
2018 did indeed see growth in the adoption of SDS. Though this prediction took a surprising turn: it was expected that we would see the secondary storage market make a complete move to the cloud, however, customer insights have recently revealed that organisations are having second thoughts on their cloud strategies. It will be interesting to watch this evolve throughout 2019. Conclusion: prediction unsure.
NVMe-based flash will become a commodity
2018 witnessed the cost of NVMe drop significantly and customers from verticals such as M&E, HPC and others, have been deploying NVMe at scale. Conclusion: prediction true.
Network vs. Storage
The prediction that the feud between storage and network teams will come to an end may have been a little optimistic. With increased networking options for distributed high-performance storage architectures such as NVMe over TCP/IP, things are however moving in the right direction. Conclusion: prediction in progress.
12.5 storage startups in 2018
And finally, Tom's predictions for last year concluded with the thought that ‘2018 will see the emergence of exactly twelve and a half new storage startups’. Did anyone count? Was it close? Conclusion: probably close enough.
Looking forward to 2019
SDS will give integrators a leading role in 2019
In the coming year, integrators will play an increasingly important role in bringing to market specialised storage solutions for cutting-edge applications like artificial intelligence and machine learning, as well as virtual and augmented reality. This is because integrators have access to scalable, high-performance software-defined solutions, a wide selection of hardware components and top-notch applications such as data analytics.
The growth of flash and NVMe
As discussed above, NVMe became a commodity in 2018. It can be expected that enterprise NVMe and enterprise SATA SSD will achieve price parity on a GBP/GB basis in 2019. Further, while not dropping in cost to match spinning drives, advancements in density such as 128TB SSDs vs. spinning 3.5” drives at 14-16TB will cause end-users to consider SSDs not just for performance, but also for scale-out use-cases. The advantages in density, power (and hence cooling) and random access ability of these SSDs will lead users to start to doubt the viability of spinning media vs. the convenience, speed of access and reliability of solid state media.
Latency is the final frontier, which NVMe will overcome in 2019
IOPS are becoming less of a focus, with single off-the-shelf flash drives capable of delivering 4-5 million IOPS per drive – 80% of the performance offered by a traditional system, at a fraction of the price. Latency for scale-out architectures has become the final storage frontier: these same flash drives offer under 200 microseconds of latency, but the real challenge is to deploy NVMe at scale while preserving those low-latency characteristics. Expect this challenge to be solved in 2019.
2019 is the year of NVMe over Fabrics via TCP/IP
A rich server ecosystem, with all server vendors offering NVMe platforms, combined with the ubiquitous nature of Ethernet and TCP/IP, will lower the hurdle to trial and adoption – driving a revolution in next-generation SANs.
2019 will see IoT applications drive storage demand
While there has been much talk that AI would become the driver of storage requirements in 2018, it is not clear that this has come to pass. It seems that AI has a long way to go before it becomes what the marketing folks claim it is (i.e. machines actually learning things autonomously). Some claim IoT will need AI to make any sense of all the data connected devices gather, though this could be just another way of feeding the AI hype.
That said, we cannot ignore the Internet of Things: billions of connected devices are capturing unprecedented volumes of data capable of delivering valuable new insights. This data must be stored and the rise of IoT will define new storage requirements: higher levels of scale and predictable performance (latency!).
GPU will require scale-out tier-0 storage in 2019
Whether it is for high-performance analytics or artificial intelligence and machine learning, the adoption of graphics processing units (GPUs) will be very much on the rise in 2019 as the use of GPU computing expands from graphics processing to a proliferation of other use cases demanding high performance and very high compute power. Deploying GPUs at scale for analytics-based applications requires super-fast scale-out tier-0 storage to feed the processors with larger data sets (versus using local flash only).
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contains the second, and this February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 7.
Who is saying what about 2019?
There is much concern around the cost of data protection and the results of breach events. Not specifically fines, but reputational damage. The cost in compensation and business damage far outweighs regulator fines, even with the possible new GDPR fine levels. No one in technology is complacent about this, and no one is happy with a competitor’s hack as it could clearly happen to anyone. The increase in regulation is also bringing compliance and risk skills into the mainstream, meaning this moves further away from a tick box activity to a core part of business operations.
Which sectors will win?
Fintech will continue to gain on banks - with the changes introduced by PSD2/Open Banking, the stranglehold of traditional banks will continue to weaken. It is clear from the level of investment and acquisition in the market that banks see fintechs as worrying competition: they have the agility to innovate without the cost of the legacy-system millstone holding them back. Cloud providers will also continue to push out data centres and hardware manufacturers as cloud usage keeps increasing. Those players have realised they can’t compete with cloud directly, even in more security-conscious sectors; instead they are looking to create complementary solutions, such as edge systems, that require hardware but work alongside cloud.
In the UK specifically, with increasing pressures on public spending, healthcare is likely to be an important sector as the government pushes for more private sector solutions in an attempt to innovate out of constraints.
What skills are needed?
So many, and the skills gap is widening. Flexibility and speed of learning are always important, but specific areas such as data security and data law awareness need expertise at every level. This can be a real differentiator for a company in a sector that takes the issue seriously.
How fast is the pace of change?
The pace of change is the same as ever, which in technology is much faster than in other industries. Application time-to-market is quicker because of the services now available from cloud providers, which can make a seemingly polished product quick to create - but we always have to be aware that this can hide a multitude of sins, such as a lack of testing, cosmetic polish prioritised over the product, and poor code quality. In parallel, the marketing hype cycle seems to be accelerating, leaving technology behind in its ability to deliver what the marketing industry promises. We see this in machine learning, the slow development of artificial assistants, limited take-up of predictive expert systems in finance and engineering, issues with autonomous vehicles and the lack of delivery on the power of big data. It is natural and common for delivery to lag the hype curve, but the gap seems to be getting wider - ironically enabled by tech marketing and advertising tools that have been genuine technical achievements.
How do we get our resources lined up in preparation?
This question is very much industry or sector specific as skills requirements vary, but a good starting point is to hire flexible staff who have an ability to learn and want to learn. We also need to ensure that everyone understands data protection and has a greater awareness of data security. These are areas that can break a business now, and lack of knowledge is not an acceptable excuse.
Yemi Olagbaiye, Head of Client Services, Softwire.
AI Driven Development
Thanks to the increase in cloud computing and big data, the capabilities and uses of AI are growing fast. Multiple industries are starting to take advantage of the opportunities this technology brings and in 2019 it will start to become an everyday norm. No longer will AI be viewed as an inaccessible and complicated technology, instead its techniques will be found working hand in hand with data scientists and specialists in order to achieve better customer outcomes. Software developers in particular will feel the impacts of AI, as with the more widespread adoption of AI-driven code generation they will be freed up to focus on more complex, high-value work.
The consumer demand for constantly evolving products and services is showing no sign of slowing as expectations continue to rise. AI will be critical to meeting this demand as it increasingly enables and enhances the DevOps culture of continuous delivery of new products and services, fuelling the momentum needed to exceed expectations and stay ahead of the competition.
Expansion of Digital Ethics
2018 saw some of the biggest household names announcing data breaches. While some of these were historical and predated GDPR, others were all too recent. Keeping data secure and private will always be a top priority and is something businesses should take seriously; however, rather than focusing entirely on the technical and regulatory requirements behind data storage, there will be a shift towards approaching data in a more humanised manner.
Consumers need to feel that they can trust organisations not only to store their data effectively but to use it ethically. Rather than simply stating that they are compliant with security or privacy regulations, organisations will need to be more transparent and clearly show how they use this data, citing who it will be shared with, how and why. As part of this, data should only ever be used and accessed when necessary and when it’s of true benefit to the individual.
The driver for more ethical uses of data is currently the consumer, as well as increasing regulatory demands such as GDPR. This coming year will mark the first anniversary of GDPR and, as a result, it could be the year that we see the first big fines. The teething problems should now be ironed out and organisations should be compliant. Consumers are aware of the implications of the new rules and more observant than ever about where their data should and shouldn't be. This means that the businesses they feel can be trusted with their data will be rewarded through public recognition and increased customer loyalty and trust.
Although 5G may not experience a widespread roll out until 2020, EE recently announced the sixteen cities they had selected for 5G roll out in 2019. The enhancements and abilities of 5G will significantly improve the performance of technologies such as AI by facilitating quicker information exchanges and providing deeper context to increase understanding capabilities.
5G will also enable the delivery of new experiences for consumers. Downloads will be faster and networks will have greater capacity, enabling more advanced virtual reality and augmented reality applications. As a result, it’s likely there will be an increase in the number of businesses using these more immersive technologies to engage their consumers.
Adam Binks, CEO of SysGroup
IoT Influence will grow
New technologies that make up the fourth industrial revolution – such as IoT and AI – are growing in influence and transforming most sectors. As a result, 2019 promises a year filled with more businesses tapping into these technologies and reaping the benefits of improved efficiencies and reduced operational costs.
We’ll specifically see a rise in the number of technology companies turning to digital chatbots and intelligence systems. In fact, Spiceworks estimates that in 2019, 40 per cent of large businesses will implement these technologies to improve their customer service and take the load off staff capacity.
CLOUD ADOPTION WILL RISE
2019 will see a stark increase in the number of businesses deploying cloud services and solutions into their IT infrastructure to safely store their company data. According to KPMG, investment in platform-as-a-service (PaaS) will grow from 32 per cent in 2016 to 56 per cent in 2019, making it the fastest-growing sector of cloud platforms.
Cloud computing has driven this ‘platformitisation’ trend, and so companies will take a similar approach with emerging technologies such as ‘blockchain-as-a-service’ and ‘IoT-as-a-service’. We’ll also see businesses turning to multi-cloud for their data storage and preparing their architectures for running applications across all three models: public, private and hybrid cloud.
CYBER THREATS WILL INCREASE
As the physical and digital worlds continue to merge, the volume of data we have access to will rocket. But with this comes risk, and in 2019 we’ll see more sophisticated phishing, spear phishing, malware and password attacks on a myriad of connected devices.
We’ll see more technology companies launching smart devices that are equipped with biometric login methods such as fingerprint scanners and facial recognition software. As SysGroup partner WatchGuard recognises, this might provide a fast and secure way of protecting personal data, but it is still only a single factor of authentication. We’ll therefore see more hackers switching their attention to working out methods to bypass this type of security.
To counter the rise in cyber-attacks, we need more education around career paths in cyber security. The number of jobs in the field is growing at a rate three times faster than any other technology job, and if this continues we’ll see a global shortage of two million cyber security professionals by 2019.
ROLES WILL EVOLVE
We expect the role of the chief information officer (CIO) to shift drastically, too. Going forward, the position will focus on business, data strategy and sourcing external technology products rather than the company’s internal IT security, which will now sit largely in the hands of the IT department.
In the last few years we have seen the widespread adoption of technologies that support Cloud Computing, Internet of Things and Mobility solutions.
Our people are more mobile than ever. We are collecting increasing amounts of data and have the ability to provision ‘compute and storage’ resources for our key business applications faster and more cost effectively than before.
Security has become a greater concern, especially where the organisation has very little control over the 3rd party platform adopted to deploy and access key business applications. Managing reputational risk is critical. This will not change in 2019 but the risk will become more complex to manage.
In the last 18 months we have seen the network following the trends of ‘compute and storage.’ The network has tentatively moved towards software virtualisation of key network and security components, with software-defined networking (SDN) becoming a popular request.
SDN addresses the need for agility, scalability and visibility by transforming hardware-intensive networks into fully programmable and virtualized software-driven networks. In our experience, SD-WANs can help reduce costs by up to 30% as more traffic is moved over secure internet-based services from traditional leased line/MPLS services.
In 2019 SDN, in particular SD-WAN and SD-DC will become the prevalent request because of these benefits.
When we consider cloud, IoT, mobile and security for 2019, the trend is arguably ‘more of the same.’ However, the real trend will be how we shift to technology, integration methods and services that allow us to gain the full benefits of an end-to-end Software Defined Environment (SDE) - in other words, compute, storage and network working together in unison across the IT landscape.
The SDE is necessary because hybrid cloud environments mandate an enterprise service model, which allows organisations to consume IT resources as a suite of services anywhere, anytime, with usage-based pricing. Such an “IT as a Service” model, encompassing traditional IT, private and public clouds, is supported by a “Service Catalogue”.
The emphasis will therefore be on supporting the standardisation, simplification and integration of such services. Many of the traditional skills needed to sustain the technologies that support the “plumbing” or underlay will still be required. Consolidation of multiple vendor platforms onto open-source, Linux-based operating systems will help make this switch.
However, once the underlay can scale, a software-driven control layer (the overlay) will arrange the on-demand, automatic and dynamic provisioning of the necessary building blocks on function-agnostic hardware. Business applications may further self-provision, using APIs to request functions from the underlying infrastructure.
To support this trend new management tools or “over the top” Services Integrators will enter the market to provide the controls over such dynamic and rapid adaptable IT landscapes.
In summary, the overall 2019 picture is multifaceted and multi-layered, and the technology is about supporting integration with providers or in-house IT departments that have broad experience in designing and running Software Defined Environments.
By Marcus Harvey, Sales Director EMEA at Targus
Prediction 1: Mobile will be the new norm
The world as we know it already revolves around mobile devices like tablets, smartphones and laptops. From booking holiday destinations to catching up with friends halfway around the world, mobile devices have become a one-stop solution for all our modern day needs. In a bid to boost productivity while staying mobile, most of us carry at least three devices to stay connected.
In the coming months and years, we will see this changing. Forget cumbersome desktop PCs and fixed desks: the only device you will need for the future of work - be it in the office or otherwise - will be your handy smartphone. 2019 will see businesses begin to embrace smartphones as the de facto workplace device.
By 2021, 60% of enterprise organisations will be testing smartphones as their company’s singular IT-supported 3-in-1 device.
Source: Adapted from IDC Futurescape: Worldwide Connected Devices and AR/VR 2018 Predictions
Much like the laptop OEM market, the smartphone business is competitive, with many brands fighting for market share. In 2019, universal connectivity compatibility will become ever more important as businesses look to support a variety of devices and brands while keeping their workforce productive wherever they choose to work from. Universal docking systems at the office and at home will allow people to work comfortably and productively, seamlessly connecting small-screen devices to larger, multiple screens, a keyboard and even automatically connecting to the Wi-Fi. Feel like working from a nearby café for a bit? No problem - packing up will be as simple as undocking and finishing up your work on the smartphone - armed with a coffee in one hand of course!
2019 will be an exciting time for the workforce, truly embracing the “work from anywhere” culture.
Prediction 2: Tech as a temporary solution
Digital natives are increasingly embracing the rental economy, even preferring it to ownership. The global popularity of Netflix, music-streaming and ride-sharing highlight how owning assets has become an almost archaic concept today.
This trend is already being reflected in the business landscape. Companies will choose not to invest their valuable cash in depreciating assets, and 2019 will see more companies moving towards having device-as-a-service agreements in place, small businesses and large corporations alike.
The benefits of this approach are plentiful: not being tied down to multi-year contracts grants businesses greater control over their digital strategy, lowers costs, allows quicker adoption of the latest technology and provides greater flexibility.
It’s only a matter of time before the inclination towards monthly subscriptions over long-term contractual commitments becomes the strategy of choice for businesses and 2019 will make significant progress on this front.
Providers of these subscription services require that the devices and kits are returned undamaged and in re-sellable condition. This will result in a growing need for bags, cases and accessories to carry and protect the technology in question. These accessories act as an important precautionary measure, helping companies offset insurance premiums.
By 2019, 35% of Fortune 1000 Companies will have a Device-as-a-Service Agreement in place.
Source: Adapted from IDC Futurescape: Worldwide Connected Devices and AR/VR 2018 Predictions
Prediction 3: Retail will be an experience, not a service
In 2019, the retail industry will move away from being a service-led industry to an experiential one.
Retailers must think about how best to extend their customers’ dwell time - the length of time that visitors spend on-site.
Traditionally, brands in the retail sector have always focused their efforts on providing the best possible service and quality products to their customers. Changing consumer needs, however, call for shopping locations to take this further and exist as destinations to hang out, connect with friends, interact with surroundings, catch up on social media or even get stuck in with some work. In the coming months, we will see retailers experimenting with artificial intelligence (AI) and VR technologies to provide their customers with an immersive experience while in-store.
Retailers must also think about how their offerings can elevate the experience further by building in more customer interaction. Countries such as Japan and China are achieving this by ensuring that the vast majority of retail products carry QR codes. Placing these on the external packaging allows retailers and customers alike to easily access a world of information by simply scanning the code via QR readers - which most smartphones today double up as. This allows customers to make well-informed decisions and choose the right option.
Retailers in the UK will be forced to follow suit and redefine their offerings, or they will pay the price by losing out on market share or, worse still, their place on the high street entirely.
Joseph Carson, Chief Security Scientist, Thycotic
Governments around the world have been developing cyber weapons and using them clandestinely against other countries for many years. With the recent high profile of nation state attacks however, I believe we will begin to see more overt use of cyber capabilities.
The threat of Mutually Assured Destruction from nuclear arms is no longer proving to be an effective deterrent to armed conflict. In 2019 we will likely see governments revealing their cyber weapon capabilities to create a new deterrent, showing adversaries that they will retaliate if they continue to use their own cyber techniques to covertly cause social and political harm.
Tougher regulations around the world
Alongside the GDPR, other countries around the world are also seeking to ramp up their data protection laws to cope with a modern world where data has become one of the most valuable assets. For example, the California Consumer Privacy Act was passed into law earlier in 2018 and will come into force in 2020. I anticipate multiple governments moving forwards with their own stricter laws to approximate the power of the GDPR, to punish companies that fail to protect the consumer data they are profiting from.
Likewise, we will also see continued efforts to bring some order to rapidly advancing technology fields such as IoT, which are currently suffering from a lack of security standards.
Carolyn Crandall, Chief Deception Officer, Attivo Networks
IoT Security and Regulation
IoT will continue its rapid expansion, with over 50% of businesses incorporating IoT into their operations in 2019 for economic advantages, market competitiveness and differentiation. IoT-enabled device innovation will continue to outpace the security built into those devices, and federal government regulation will continue to fall short in defining the laws and fines required to effect change. State-level regulations will be enacted to improve the situation, but will likely fall short in impact and, in many cases, only result in a false sense of consumer confidence with respect to the security of these devices.
Cloud & Shared Security Models
Cloud will become an increased target in 2019 as adoption grows and attackers increasingly exploit weaknesses in shared security models. Cloud providers will protect the infrastructure platform with an increased awareness of hardware-based attacks; however, the lack of understanding about how best to secure data in, and access to, the cloud will leave room for errors and misconfigurations. Adoption of technologies like cloud access security brokers (CASB) and deception will grow significantly as organizations seek new security controls designed to address these challenges.
The global storage market is expected to be valued at USD 144.76 billion by 2022, growing at a CAGR of 16.76% between 2016 and 2022, and is among the fastest-growing technology segments worldwide. Most of this spend will go to cloud and storage software solutions. Here, StorPool summarises the most important trends and changes seen in the data storage market in 2018 and lists predictions for 2019.
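As a quick sanity check on those figures (the 2022 forecast and the CAGR are the article's; the arithmetic below is ours), a 16.76% CAGR compounding over the six years from 2016 to 2022 implies a 2016 base of roughly USD 57 billion:

```python
# Implied 2016 market size from the 2022 forecast and the stated CAGR
value_2022 = 144.76    # USD billion (forecast)
cagr = 0.1676          # 16.76% compound annual growth rate
years = 2022 - 2016    # six years of compounding

value_2016 = value_2022 / (1 + cagr) ** years
print(round(value_2016, 1))   # 57.1
```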
The widespread adoption of “the Cloud” is obvious. There is a lot of noise around “100% cloud” strategies, yet enterprise IT spending on cloud is just 19% in 2018 and forecast to grow to 28% in 2022, according to Gartner. This includes SaaS offerings too, so even in three to four years, less than a third of money spent will be in the Cloud.
Now, that being said, hybrid cloud architectures will pick up the pace in 2019. AWS and Azure have strong hybrid strategies, and there are a number of third-party vendors providing solutions to manage multi-cloud and hybrid cloud infrastructure and to provide underlying services such as storage and networking for hybrid cloud.
With Amazon's release of Outposts (now in preview), IT organizations, system integrators, and solution providers will be forced to consider their public and hybrid cloud strategy.
It has to be noted, though, that this is predominantly unstructured data, like videos, photos and new-age cloud native applications and SaaS. The majority of core data storage services are still kept locally, with burst or backup/disaster recovery being the usual off-load to the cloud.
For more demanding workloads and sensitive data, on-premises is still king. In other words, the future is hybrid: on-premises takes the lead in traditional workloads and cloud storage is the backup option; for new-age workloads, cloud is the natural first choice and on-prem is added when performance, scale or regulatory demands kick in.
From legacy SANs to best-of-breed software-defined storage
Five years ago, the adoption of modern “software-defined storage” solutions, capable of replacing a high-end SAN or all-flash array, was in its early days. Now the SDS ecosystem has grown and matured.
2018 was the year when we saw mainstream enterprises finally initiate projects to replace traditional SAN solutions. The most common drivers are the need for agility, cost optimization and performance improvements, which are needed to meet changing and increasing business demands. We expect SDS adoption to have a spillover effect and gain majority market share over the next three to five years.
Infrastructure refresh cycles and performance complaints from customers/users are the top 2 triggers of this process. Investments in new-generation infrastructure software solutions are aimed at reducing vendor lock-in, achieving significant cost optimizations and accelerating application performance.
Use of SDS, especially at the high end of the performance spectrum (NVMe, NVMeOF) and when it comes to automation through APIs and integrations, is the only meaningful way to differentiate public and private cloud services, especially at scale.
FC is dead
At this point Fibre Channel (FC) is becoming an obsolete technology, and we see neither a financial nor a performance justification for deploying FC in your IT infrastructure stack. Additionally, FC adds complexity to an already complex environment, being a separate storage-only component.
In 2019, it makes sense to deploy a parallel 25G standard Ethernet network instead of upgrading an existing Fibre Channel network. At scale, the cost of the Ethernet network is 3-5% of the whole project and a fraction of the cost of a Fibre Channel alternative.
100G is becoming the typical network connectivity for demanding environments.
NVMeoF and NVMe/TCP will have a gradual increase in adoption. At the low latency end of the spectrum, they will still be considered the second-best option, after proprietary access protocols (with storage driver in the initiator host).
Next-gen storage media
Persistent memory in the form of DRAM-based NVDIMMs finally became widely available on the market in 2018. We expect next-gen storage media to gain wider adoption in 2019. Its primary use-case will still be as cache in software-defined storage systems and database servers.
On a parallel track, Intel will release large capacity Optane-based NVDIMM devices, which they are promoting as a way to extend RAM to huge capacities, at low cost, through a process similar to swapping. The software stack to take full advantage of this new hardware capability will slowly come together in 2019.
There will be a tiny amount of proper niche usage of persistent memory, where it is used for more than a very fast SSD.
In the same way as happened with SSDs and then flash-based NVMe drives, storage solutions will struggle to expose the high throughput and low latency of the new (persistent memory) media to applications. So, as usual, be wary of marketing-defined storage slideware stressing hyped buzzwords, void of reasonable application.
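A minimal sketch of what "more than a very fast SSD" looks like in practice: applications memory-map a file and access it with ordinary loads and stores, bypassing read()/write() entirely. On real persistent-memory hardware the file would live on a DAX-mounted pmem filesystem (e.g. /mnt/pmem0 - a hypothetical path); any ordinary file illustrates the same API:

```python
import mmap

# Hypothetical path; on real hardware this would sit on a DAX-mounted
# pmem filesystem. An ordinary file demonstrates the programming model.
path = "/tmp/pmem_demo.bin"
size = 4096

with open(path, "wb") as f:
    f.truncate(size)              # reserve one page

with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), size) as buf:
        buf[0:5] = b"hello"       # byte-addressable store, no write() syscall
        buf.flush()               # on real pmem, persistence is a cache flush

with open(path, "rb") as f:
    print(f.read(5))              # b'hello'
```

The point of the niche use-cases mentioned above is exactly this load/store model: no block layer, no page cache round-trip, persistence at cache-flush granularity.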
ARM in the datacenter
ARM is (finally) being recognized as a serious potential alternative to the x86 server architecture, for particular workloads. The main drivers here are cost optimization and breaking vendor lock-in.
Arm is still not fast enough to compete for general purpose workloads, but in 2018 we saw the first CPUs which were fast enough to be a serious contender for owning a solid piece of the server CPU market. The recently announced AWS instances, powered by Amazon’s custom Arm based CPU, which claim up to 45% cost savings, will definitely pave the way to a wider Arm adoption.
The prime use case for Arm servers in 2018 was “Arm test/dev”, which is self-explanatory. In 2019 we'll see rising demand for Arm; however, this will still be a slow pick-up, as wider adoption requires the proliferation of a wider ecosystem.
Throughput-driven, batch processing workloads in the datacenter and small compute clusters on "the edge" are the two prime use-cases for ARM-based servers for 2019.
We’ve recently written a dedicated piece on Arm here: Is it the right time for Arm in your Software-Defined Data Center?
The multi-core race
Intel and AMD are in a race to provide high core-count CPUs for servers in the datacenter and in HPC. AMD announced its 64-core EPYC 2 CPU with an overhauled architecture (9 dies per socket versus EPYC's 4 dies per socket). At the same time, Intel announced its Cascade Lake AP CPUs, which are essentially two Xeon Scalable dies on a single (rather large) chip, scaling up to 48 cores per socket. Both products represent a new level of per-socket compute density and will hit the market in 2019.
While good for the user, this is “business as usual” and not that exciting.
Global IT, data storage and infrastructure market changes
In last year’s predictions, we wrote that we expected a wave of consolidations in 2018. While consolidation is a natural process in the world of business, it also reflects the tectonic shifts happening in IT infrastructure.
There were several high-profile acquisitions. Most of them were not directly storage related, yet they signaled the massive transformation of the IT infrastructure landscape. The more notable acquisitions were:
The rumored bid of Microsoft for Mellanox is definitely worth mentioning here, as Azure is the only real contender to the undisputed cloud leader, Amazon AWS.
On the storage market, Tintri filed for bankruptcy in the US under Chapter 11 in the summer and was then acquired by DataDirect Networks (DDN).
Somewhat overlooked, as they were not as shiny, were a myriad of regional deals between second- and third-tier cloud providers. Data centers and cloud providers shifted workloads to newly opened or acquired local facilities in bigger markets, so they can store and serve data locally.
So what will 2019 bring us? Stay tuned to see.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, the January issue contains the second, and this February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 8.
Mobile edge computing
Thoughts from Nick Offin, Head of Sales, Marketing and Operations, Toshiba Northern Europe.
In the coming year, companies will look to better come to terms with the data efficiency and security issues generated by widespread mobile working and the arrival of IoT within the enterprise. As a result, mobile edge computing is showing signs of exerting real influence across a number of sectors. Such solutions not only reduce strain on the cloud by processing data on the edge, but also play an integral role in perimeter security by ensuring data communication is locally translated to a communication protocol before being sent to the organisation’s network core. With organisations looking to integrate this edge-focused element to their mobile infrastructure, BI Intelligence estimates that 5.6 billion business-owned devices will use edge computing for data collection and processing by 2020.
As edge computing develops, so too will the solutions used to collect and manage this data. This year has seen the arrival of business-targeted IoT solutions such as Assisted Reality (AR) smart glasses, which can have a significant impact in delivering mobile hands-free working across a number of sectors, from manufacturing and engineering to transport and logistics. This combination of 5G, IoT and mobile edge computing will undoubtedly drive further innovation in this space.
Benjamin Ellis, Head of Go-to-Market Strategy for Trunomi.
Heavier GDPR fines
There’s a feeling that, at least so far, GDPR has been more about the carrot than the stick. The ICO has supported businesses on their journey to compliance, rather than punishing those who are not. While there have been some high profile fines, they have been the exception not the rule. I expect this to change in 2019, particularly in the lead up to Brexit when the ICO will have to demonstrate that it is enforcing EU rules in order to protect the UK’s ability to import and export data.
Too often, businesses default to using consent and legitimate interest as a catch-all basis for data processing. Until now, the ICO has afforded Controllers sufficient time to implement appropriate processes, but it will no longer accept incorrect legal bases. This applies to the processing of both external and internal data, e.g. customers and employees. HR is a good example: consent is not always an appropriate basis for data processing, as data subjects can revoke their consent. In the future, businesses must apply, record and be able to evidence appropriate legal bases in order to avoid unwanted attention from the ICO.
A shift in how we evidence compliance
In 2019, I expect to see a big shift in how companies evidence compliance. Previously, businesses were required merely to be compliant with rules and regulations. Now, companies will be required to evidence their compliance to the ICO. This represents a fundamental shift in mindset and operations, requiring a systematic approach across the company rather than a one-off tick-box exercise.
This will result in headaches for marketers, who are accustomed to dealing with vast quantities of data enabling them to segment audiences by age, location, interests and hobbies. Customer data can also become outdated very quickly - according to Gartner, data quality deteriorates at a rate of 2% a month if not governed correctly. Marketers must make evidencing compliance part of every new activity, and ensure it’s an ongoing process.
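Compounding that rate (the 2%-per-month figure is Gartner's; the arithmetic below is ours) shows why governance has to be ongoing: over a fifth of an ungoverned customer database degrades within a single year:

```python
# Fraction of customer records still accurate after n months,
# assuming data quality decays 2% per month when not governed.
MONTHLY_DECAY = 0.02

def fraction_accurate(months: int) -> float:
    return (1 - MONTHLY_DECAY) ** months

print(round(fraction_accurate(12), 3))   # 0.785 -> ~21.5% degraded in a year
```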
A more cautious approach to sharing data
The Cambridge Analytica scandal was defined by Facebook’s sharing of user data. Shortly after the news broke, the company protested its innocence, claiming in a statement that ‘people are expressly asked if they want to give permission to upload their contacts from their phone’, highlighting the fact that this feature ‘has always been opt-in only’. This resulted in one of the biggest public backlashes of the year with almost irrevocable damage done to Facebook’s reputation. It has also re-emphasised the need for transparency in data use.
In 2019, I expect customers to question more and more how their data is being shared, and with whom. Many companies will no doubt take note of the lessons learnt from Facebook, Google and others, and become explicitly clear about exactly how and where data will be used and shared. Data management platforms that use auditable and immutable ledgers can provide legal evidence of permission and can therefore help businesses avoid breaches of trust. The resulting increase in transparency can build consumer trust, improving the overall relationship between customer and business and increasing the likelihood of repeat custom.
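One way such an auditable, immutable ledger can work is a hash-chained, append-only consent log, where each entry commits to the hash of its predecessor. The sketch below is purely illustrative; the field names and structure are assumptions, not any particular platform's API:

```python
import hashlib
import json
import time

class ConsentLedger:
    """Append-only log where each entry commits to the hash of the previous
    entry, so any later tampering breaks the chain and is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, subject_id, purpose, granted):
        """Append a consent grant or revocation, chained to the prior entry."""
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"subject": subject_id, "purpose": purpose,
                "granted": granted, "ts": time.time(), "prev": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        """Recompute every hash and link; False means the log was altered."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or digest != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

ledger = ConsentLedger()
ledger.record("user-42", "marketing-email", granted=True)
ledger.record("user-42", "marketing-email", granted=False)  # consent revoked
print(ledger.verify())  # True; flipping any past entry would make this False
```

Because each entry's hash covers the previous entry's hash, retroactively editing any record invalidates every subsequent link, which is what makes such a log usable as evidence of permission.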
By Dave Russell, Vice President for Product Strategy at Veeam
The world of today has changed drastically due to data. Every process, whether an external client interaction or an internal employee task, leaves a trail of data. Human- and machine-generated data is growing ten times faster than traditional business data, and machine data alone is growing at 50 times the rate of traditional business data. With the way we consume and interact with data changing daily, innovations to enhance business agility and operational efficiency are also plentiful. In this environment, it is vital for enterprises to understand the demand for Intelligent Data Management in order to stay one step ahead and deliver enhanced services to their customers.
I’ve highlighted five hot trends for 2019 that decision-makers need to know. Keeping the EMEA market in mind, here are my views:
1. Multi-Cloud usage and exploitation will rise.
A report from McKinsey &amp; Company revealed that data flow to Asia has increased by at least 45 times since 2005. Data from key regions such as North America and Europe has risen drastically to 5,000-20,000 Gbps and 1,000-5,000 Gbps respectively, from 100-500 Gbps and less than 50 Gbps in 2005. With companies operating across borders and the reliance on technology growing more prominent than ever, an expansion in multi-cloud usage is almost inevitable. IDC estimates that customers will spend US$554 billion on cloud computing and related services in 2021, more than double the level of 2016. On-premises data and applications will not become obsolete, but the deployment models for your data will expand, with an increasing mix of on-prem, SaaS, IaaS, managed clouds and private clouds.
Over time, we expect more of the workload to shift off-premises, but this transition will take place over years, and we believe that it is important to be ready to meet this new reality today.
2. Flash memory supply shortages will ease, and prices will improve, in 2019.
According to a report by Gartner from October 2018, flash memory supply is expected to revert to a modest shortage in mid-2019, with prices expected to stabilize, largely due to the ramping up of Chinese memory production. Greater supply and improved pricing will result in greater use of flash in the operational recovery tier, which typically hosts the most recent 14 days of backup and replica data. We see this greater flash capacity leading to broader use of instant mounting of backed-up machine images (or Copy Data Management).
Systems that offer Copy Data Management capability will be able to deliver value beyond availability, along with better business outcomes. Example use cases for leveraging backup and replica data include DevOps, DevSecOps and DevTest, Patch Testing, Analytics and Reporting.
3. Predictive Analytics will become mainstream and ubiquitous.
The Predictive Analytics market is forecast to reach $12.41 billion by 2022, roughly 2.7 times its 2017 size, at a CAGR of 22.1%. APAC, in particular, is projected to grow at the highest CAGR during this forecast period.
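The growth implied by those figures can be sanity-checked with the standard compound-growth formula; this back-of-the-envelope sketch is ours, not taken from the forecast itself:

```python
def cagr_growth(start_value, rate, years):
    """Compound a starting value at a fixed annual growth rate."""
    return start_value * (1 + rate) ** years

# Working backwards from the $12.41bn 2022 forecast at a 22.1% CAGR
implied_2017 = 12.41 / (1 + 0.221) ** 5
print(round(implied_2017, 2))           # implied 2017 market size, ~4.57 ($bn)
print(round(12.41 / implied_2017, 2))   # growth multiple over the period, ~2.71
```

Five years of 22.1% compound growth multiplies the market roughly 2.7 times, consistent with the forecast figures quoted above.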
Predictive Analytics based on telemetry data, essentially Machine Learning (ML) driven guidance and recommendations, is the category most likely to become mainstream and ubiquitous.
Machine Learning predictions are not new, but we will begin to see them utilizing signatures and fingerprints containing best-practice configurations and policies, allowing businesses to get more value out of the infrastructure they have deployed and are responsible for.
Predictive Analytics, or Diagnostics, will assist us in ensuring continuous operations, while reducing the administrative burden of keeping systems optimized. This capability becomes vitally important as IT organizations are required to manage an increasingly diverse environment, with more data, and with more stringent service level objectives.
As Predictive Analytics becomes more mainstream, SLAs (Service Level Agreements) and SLOs (Service Level Objectives) are rising, and businesses’ SLEs (Service Level Expectations) are even higher. This means we need more assistance, and more intelligence, in order to deliver on what the business expects from us.
4. The “versatilist” (or generalist) role will increasingly become the new operating model for the majority of IT organizations.
While the preceding trends were technology-focused, the future of digital is still analogue: it’s people. Talent shortages, combined with the collapsing together of on-premises infrastructure, public cloud and SaaS, are producing broader technicians with backgrounds in a wide variety of disciplines, and increasingly with greater business awareness as well. For example, the Information Technology (IT) job market in Singapore continues to see high levels of recruitment.
Standardization, orchestration and automation are contributing factors that will accelerate this, as more capable systems allow for administrators to take a more horizontal view rather than a deep specialization. Specialization will of course remain important, but as IT becomes more and more fundamental to business outcomes, it stands to reason that IT talent will likewise need to understand the wider business and add value across many IT domains.
Yet, while we see these trends challenging the status quo next year, some things will not change. There are always constants in the world, and we see two major factors that will remain top-of-mind for companies everywhere.
a. Frustration with legacy backup approaches & solutions.
The top 3 vendors in the market continue to lose market share in 2019. In fact, the largest provider in the market has been losing share for 10 years. Companies are moving away from legacy providers and embracing more agile, dynamic, disruptive vendors, such as Veeam, to offer the capabilities that are needed to thrive in the data-driven age. A report by Cognizant highlighted that 82% of APAC business leaders believe the future of work is in intelligent machines.
b. The pain points of the 3 C’s: Cost, Complexity and Capability
These three C’s continue to be the reasons why data centre teams are unhappy with solutions from other vendors. Broadly speaking, these are excessive cost, unnecessary complexity and a lack of capability, which manifests in the speed of backup, the speed of restoration or instant mounting of a virtual machine image. These three criteria will continue to dominate the reasons why organizations augment or fully replace their backup solutions.
5. The arrival of the first 5G networks will create new opportunities for resellers and CSPs to help collect, manage, store and process the higher volumes of data.
In early 2019, we will witness the first 5G-enabled handsets hitting the market, at CES in the US and MWC in Barcelona. I believe 5G will likely be adopted most quickly by businesses for machine-to-machine communication and Internet of Things (IoT) technology. With 4G, consumer mobile network speeds have probably already reached a point where they are as fast as most of us need.
2019 will be more about the technology becoming fully standardised and tested, and future-proofing devices to ensure they can work with the technology when it becomes more widely available, and Europe becomes a truly Gigabit Society.
For resellers and Cloud Service Providers, excitement will centre on new revenue opportunities that leverage 5G or the infrastructure needed to support it. Processing these higher volumes of data in real time at faster speeds, new hardware and device requirements, and new applications for managing data will all present opportunities, and will help facilitate conversations around edge computing.
“Going into 2019, we’re going to see the technology market continue to transform and adapt to new customer demands. In particular, IT and data companies will be collecting, analysing and providing insights about vast volumes of data – more so than ever before. Businesses will need to start thinking about the future of how they perform these tasks, and how to take advantage of new solutions that can make the jobs and lives of the people responsible for these tasks easier. New solutions can also guarantee more security and reliability, enabling better relationships with customers.
“In particular, data and IT staff will tend to incorporate more blockchain technology into their workflows for data security and protection. And with the consistent stream of business, leveraging technologies that enable predictive insights will give IT staff the tools to know how or when to upgrade technology to ensure there are no business disruptions.”
Blockchain’s Effect on Data Backup and Recovery
“The benefits that blockchain will bring to an organisation’s security protocols are unparalleled. However, other areas will also be affected as blockchain becomes more prominent in 2019. Traditional backup will give way to hyperconverged solutions. Convergence will occur between compliance, protection and security as businesses continue to address risks and exploit opportunities with the data they have access to. In the coming year, organisations that can effectively leverage blockchain will be the clear winners as technologies continue to converge.
“This makes blockchain ripe for the backup and recovery market because it can touch all pieces of data, stored in any location. As long as data exists, the need to tap into that data will also exist – but who has access to this data will be the real determinant of blockchain’s power. As we look to 2019, we’ll need to keep three groups in mind: the user, the organisation and any third parties. Individuals will need the opportunity to delete their data and access it when they like. Organisations will want to use insights from the data to explore new opportunities. And both parties should be concerned with any threats from third parties.
“Blockchain can provide a solution that enables all of the above. But before it can be widely adopted, factors such as people, legality, business, culture and more will need to be in agreement. In 2019, we will see more innovators experimenting with blockchain use cases that demonstrate many of the blockchain data protection benefits.”
AI Enables Predictability
“As we move into 2019, organisations will deploy more technology that enables predictive insights into IT infrastructure. Right now, most IT managers are taking a rearview mirror approach when reacting to unplanned downtime caused by interruptions related to software or hardware error, component failure or something even more catastrophic in the data centre. Incorporating predictive technologies will enable proactive monitoring for downtime and faults so IT managers can take preventative action before a disruption ever occurs. Being more prescriptive can lead to fewer disruptions and less downtime in operations.
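The difference between the “rearview mirror” and the predictive approach can be shown with a toy example: rather than alerting after a disk fills, fit a trend to recent utilisation samples and estimate when that trend will breach capacity. This simple linear extrapolation is purely illustrative, not a description of any vendor’s product:

```python
def months_until_full(history, capacity=100.0):
    """Fit a least-squares line to utilisation samples (one per period) and
    extrapolate when the trend crosses capacity. Returns None if flat or
    shrinking, i.e. no breach is forecast."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
            sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    return (capacity - intercept) / slope - (n - 1)  # periods past the last sample

# Disk usage (%) over the last six months, growing roughly 5 points a month
print(months_until_full([60, 65, 70, 75, 80, 85]))  # -> 3.0
```

A reactive monitor would only fire once the disk hit 100%; the predictive version gives the operator an estimated three months’ warning to act before any disruption occurs.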
“More frequently, AI is enabling predictability and will play a key role in data protection in 2019 and in the future. As businesses are continuing to adopt more complex IT environments, such as hyperconverged infrastructures and other modern workloads, data protection will also need to adapt. AI consistently learns from the system as these dynamic IT environments adapt and change.
“Data protection stands to benefit the most from AI-enabled predictive insights by reducing the risk to data in a power disruption. And with regulations such as GDPR guaranteeing data protection for users at a business’s expense, it is becoming increasingly important to keep data under lock and key. Proactive strategies to avoid the repercussions of even a moment of downtime will be critical in 2019 for businesses that need to provide round-the-clock data support.”
Three VetTech innovations that will transform the veterinary industry
By Dr. Mark Boddy, CEO, PawSquad.
As the investor frenzy over Pet Tech continues unabated, new gadgets and innovative pet care products seem to be emerging almost daily. Two things appear to be driving these innovations: a growing consumer demand for always-on convenience, and the desire of pet owners to take more control over their pets’ health and wellbeing.
In the UK, almost 90% of people own or have access to a smartphone. This has had a huge impact on buying behaviour and driven an ‘always on’ culture in which consumers expect instantaneous availability and convenience. At the same time, the accessibility of the internet means pet owners can become instant ‘experts’ at the click of a button, and they are becoming more proactive and engaged with their pets’ healthcare.
There is clearly enormous potential for the veterinary profession to leverage these emerging technologies to explore new ways of delivering enhanced services in line with this new set of consumer expectations.
2019 will be a watershed year, in which we will see technology driving a digital transformation across the industry. Here are my predictions for the trends that will take the veterinary profession by storm in 2019.
Telemedicine and the online prescription model
With providers like Babylon Health and Push Doctor already disrupting human healthcare, veterinary telemedicine will come of age in 2019. Unlike their counterparts in human healthcare, veterinarians are not currently permitted to make a diagnosis or prescribe prescription-only medicines via virtual consultation. However, this is changing. Canada and several US jurisdictions (Washington D.C., Alaska and Connecticut) have already relaxed this restriction, and with the Royal College of Veterinary Surgeons now debating the case, the UK is likely to follow suit in 2019.
This will have huge implications for the profession, effectively opening up the prescription medicines market to online retailers and telemedicine providers as never before. It will allow telemedicine services to reach their full potential as instead of being limited to a purely advisory service, pet telehealth will become a fully-fledged consulting and treatment service.
IoT and remote healthcare programmes
Recently, there has been mass adoption of health-tracking technology for humans (e.g. Fitbit), and the Internet of Things (IoT) is catching on among pet owners too. We’re seeing a rise in tech gadgets for pets: for example, wearable devices that monitor daily exercise and other behaviours (e.g. Felcana), litter trays that monitor weight (e.g. Tailio), and remote video technology that catches pets in the act of behavioural misdemeanours (e.g. Furbo).
In 2019, we’ll see such devices integrate with telemedicine consultations to support a variety of remote health and wellness programmes, from virtual weight management to video-based behaviour modification. We’ll also see a range of clinical applications emerge, enhanced by real-time data, delivering in-home monitoring of recovery or response to treatment, and early-warning systems that alert to an impending problem.
Artificial Intelligence and the age of Vet Bots
Babylon Health recently made headlines in the UK when it announced that its AI-powered ‘virtual doctor’ had outperformed human GPs in a range of triage and diagnostic scenarios. This technology will begin to disrupt the veterinary profession too. Early players like Vet-AI already have products in development, and we can expect to see them start to influence the veterinary landscape in 2019.
In 2019, open composability will come of age and start to go mainstream. Data is not rigid, and the infrastructure in which it lives cannot be rigid either. Although “composability” is not a new term, in 2019 we will see open composability, as opposed to today’s inflexible, proprietary solutions, gain real traction as organizations look to build composable infrastructures on open standards, allowing specialized configurations that are specific to their workloads and diverse data. – Martin Fink, CTO at Western Digital Corporation
In 2019, we will see the proliferation of RISC-V-based silicon. As the IoT grows, the longevity and lifecycle of all connected “things” come into question. After all, workloads are constantly changing, and processing demands are in flux. Demand will rise from organizations looking to tailor (and adapt) their embedded IoT devices to specific workloads, while reducing the costs and security risks associated with silicon that is not open source.
In 2019, the first step toward a fabric-based infrastructure will be complete. The industry aspires to build a fabric infrastructure – one built on “a set of compute, storage, memory and I/O components joined through a fabric interconnect and the software to configure and manage them” – but we are several years away from organizations shifting their IT to it. That said, in 2019 we will take the first step on the trajectory toward fabric infrastructure, including fabric-attached memory, with the widespread adoption of fabric-attached storage (FAS). This may seem like a small step, but the commitment to FAS means we are taking the necessary steps, as an industry, to ensure all components are connected with one another, allowing compute to move closer to where the data is stored rather than data being relegated to several steps away from compute.
In 2019, storage will finally begin the transition from PMR (perpendicular magnetic recording) to energy-assisted recording, as customers seek higher capacities, at lower costs, for their data centers. Organizations will need more storage space, but they will be unwilling to sacrifice performance for capacity; as a result, they will look to new energy-assist technologies to deliver a cost-effective solution.
Automotive / life sciences
In 2019, devices will come alive at the edge. The conversation about autonomous cars and the power of the IoT will shift in 2019 as both enterprises and consumers watch the devices powered at the edge come alive. Regardless of how many IoT devices are unleashed, their success hinges on one critical component – the speed of compute. In 2019, the compute power will get closer to the data produced by the devices, allowing it to be processed in real-time, and devices to awaken and realize their full potential.
For the automotive industry, this means we are one step closer to realizing the possibilities of fully autonomous vehicles populating our roadways. With data getting closer to the edge, cars will be able to tap into machine learning to make the instantaneous decisions needed to maneuver on roads and avoid accidents.
For life sciences, we will be one step closer to accelerating precision medicine diagnoses, which means faster time to both diagnose and treat life-threatening diseases.
Machine learning / edge
Narayan Venkat, Vice President, Data Center Systems at WDC, and Stefaan Vervaet, Senior Director of Global Strategic Alliances for Western Digital's Datacenter business unit
In 2019, it will become common for organizations to build machine learning into the business revenue stream. Up until now, machine learning has been a concept for most organizations, but in 2019 we will see real production installations. As a result, organizations will adopt machine learning – at scale – and it will have a direct impact on the business revenue stream.
In 2019, smaller clouds at the edge will start to sprout. With the proliferation of connected “things,” we have an explosion of data repositories. As a result, in 2019, we will see smaller clouds at the edge – or zones of micro clouds – sprout across devices in order to effectively process and consolidate the data being produced by not only the “thing,” but all of the applications running on the “thing.”
New talent and new hires in software engineering
In the next three years, 4 out of 10 new software engineering hires will be data scientists. As noted in the 451 Research report, Addressing the Changing Role of Unstructured Data With Object Storage, “the interest in and availability of analytics is rapidly becoming universal across all vertical markets and companies of nearly every size, creating the need for a new generation of data specialists with new capabilities for understanding the nature of data and translating business needs into actionable insight.” As a result of this demand to turn data into action, organizations will prioritize hiring data scientists.
Digital Transformation in 2019
Alex Teteris, principal technology evangelist
“As organisations look at digital transformation in 2019, they do so in an era of great disruption, in which business models need to evolve quickly to keep pace with the digital evolution. While IT teams are working hard to keep up, the rapid rate of change makes this an ongoing battle. IT was previously in the driver’s seat when it came to strategic decisions on which applications the business should be using. But this too has changed—people are now more tech-savvy, and IT departments can be perceived as a roadblock rather than a facilitator, leading to the continuation of shadow IT.
For IT to thrive in the year ahead, mindsets will need to be reset, with a new focus on becoming a true business enabler and trusted partner for the organisation. This means we will see CIOs stepping up and getting into the driver’s seat, with a sharper focus on overall business performance than ever before, alongside the development of a more user-centric IT infrastructure shaped by user experience requirements. After all, IT has a direct impact on the performance of both the business and its employees.”
“When it comes to network traffic—as an example of a trend IT will need to react to in 2019—I expect to see a further shift from internal to external internet-bound traffic, something I started to monitor as Head of Global IT Infrastructure at a global enterprise. Whereas seven years ago the split was roughly 10 percent internet/90 percent internal traffic, today it’s 65 percent/35 percent, and it is set to move to 90 percent/10 percent within the next two to three years. The remaining roughly 10 percent, typically legacy and mainframe applications, are usually too costly to migrate, which may cause a cascade effect on the internal data centre infrastructure. IT must therefore challenge itself, asking what needs to remain internal and what moves to the cloud—the data centre-hosted application accessible only via a virtual private network (VPN) is no longer fit for purpose in a cloud-first world.
To become a trusted business enabler, CIOs would be wise to plan a strategic roadmap around the business’s key challenges and requirements, taking network traffic and bandwidth control into account, and ensuring that the business can reap the benefits of the era of digital disruption without compromising on performance or security.”
Prediction #2: “5G becomes the next big game-changer, creating additional challenges and opportunities for IT teams”
“The way we connect to enterprise networks is changing, and the internet is fast becoming the new corporate network. Offices no longer need to operate via VPN connectivity, removing complexity and commitment to telcos. I expect changes in network and routing systems, as well as exponential growth in demand in the mobile market, to continue in 2019. With development well underway, 5G networks will accelerate these changes even further as more Internet of Things (IoT) and mobile devices come into use. In fact, 5G is likely to help power a huge increase in IoT technology, providing the infrastructure required to transmit vast amounts of data.
The resulting smarter and more connected world will introduce both challenges and opportunities for security and IT teams and the broader enterprise. Visibility across the entire enterprise network, as well as connected device traffic, will be critical in defending corporate assets from hackers. CIOs will therefore be instrumental in ensuring businesses are taking the necessary precautions and investing in the right tools to protect themselves when embarking on their digital transformation journeys in a rapidly evolving market.”