Ahead of 2019, DW requested predictions from organisations working in the IT industry. We received over 150 of these, some of which you will have read in December, some of which you’ll read in this issue of DW, and the rest of which will appear in February. They make for fascinating reading. However, not one of them sought to make any social or moral judgement as to whether the technologies discussed are a force for good, for bad or neither.
For example, the rise of the digital personal assistant seems to get everyone very excited but, in the UK at least, obesity is a growing issue (if you’ll excuse the pun) and any technology that allows people to spend even more time sat on their couch is not necessarily a good thing. Indeed, thanks to the rise of technology, it’s more than possible to live one’s life entirely without leaving one’s home. We can work from home, all the sustenance we need can be delivered to home and all our leisure hours can be spent at home. Of course, we can exercise at home as well.
And there’s the crux of the issue. Technology itself is definitely neutral, it’s how we choose to use it (or not) that gives it a good or bad association. Many cases are clear cut – technology used by cybercriminals to mess up individuals, companies and even governments is not a good idea; technology used to save people’s lives is.
Ah, but is it that simple? We live on a planet with finite resources and, the more people we keep alive for longer and longer, the more pressure will be put on these resources. So, is keeping people alive until they are, say, 120, just so they can sit in their armchairs and watch television and sleep all day, a good thing? Especially when the resources required to do this could be better used to help individuals at the other end of the life cycle?
There are no easy answers, but there seems to be an urgent need for governments across the globe to come to grips with the digital world and what it means for their citizens moving forward. For example, there is growing unease that some of the tech giants appear to be able to play fast and loose with the tax rules as they currently exist in most, if not all, countries. So, how does one redress the balance, so that what most would deem a fair share of tax is paid by these organisations?
At a time when the UK seems hell-bent on withdrawing from an organisation whose primary aim (whatever else has followed) is to ensure cooperation between many European countries, it seems that more and more of such cooperation is required, rather than less. A pipe dream maybe, but imagine if countries across the planet had unified policies in many areas of life – not least technology. Right now, it’s all too easy for any organisation or individual to justify a policy or an action along the lines of: “Well, if we don’t do it, then someone else, somewhere else, will take advantage of the opportunity” (selling arms to various nations, for example). Imagine if there was a global policy for everything!
And in case anyone thinks that the DW editor has completely lost the plot, I would ask this question: “Why is so much time and effort being spent on developing autonomous vehicles, and, apparently, so little effort being spent on developing some kind of Robocop, when crime and anti-social behaviour levels seem to be rising on a daily basis?”
Governments have a chance to shape the digital future. Right now, they appear to be passengers.
In order to stay ahead of the competition, 50.4% of businesses reported having a proactive ‘opportunity-minded’ approach to new and emerging technologies.
Technology is often thought of in terms of physical devices that are electrical or digital. In fact, technology encompasses far more than tangible objects. New and emerging technologies often undermine the value of existing models and services, resulting in digital disruption, which leads many companies to re-evaluate and transform.
Technology disruption is defined as ‘technology that displaces an established technology and shakes up the industry or creates a completely new industry’.
There is currently a high-stakes global game of digital disruption under way, fuelled by the latest wave of technological advances in AI and data analytics. As a result, business models within industry sectors are inevitably changing. Although 19.3% of companies feel that the pace of technological change has made them significantly more competitive in the past three years, a large majority are still struggling to keep up with that change.
As a result, SavoyStewart.co.uk sought to identify whether businesses view technology disruption as an opportunity or a threat, through an analysis of the latest research conducted by Futurum*.
Interestingly, it was discovered that 1 in 4 businesses still struggle to keep pace with digital disruption and benefit from it. Nevertheless, when weighing up the opportunity versus the threat of technological disruption, 39.6% of businesses feel that it provides them with new opportunities to improve and grow as a company.
Savoy Stewart determined this was down to each company’s approach to technology adoption, with a surprising 24.4% admitting to having no approach at all. On the positive side, 50.4% of businesses reported having a proactive ‘opportunity-minded’ approach, ensuring they remain competitive and up to date.
With 25.1% of businesses seemingly adopting a passive ‘wait and see’ approach, it is unsurprising that 30.7% of companies felt the technological change of the past three years has made them less competitive.
The window of opportunity to gain competitive advantage generally lasts around three years. It is, therefore, critical for business leaders to understand the value of technologically proactive leadership and operational agility. The faster a company can turn technology disruption to its advantage, the more likely it is to surge ahead of its competitors.
Surprisingly, whilst 29.5% of companies stated they feel very excited about their ability to adapt over the next three years, only 18.3% rated themselves as ‘Digital Leaders’. These individuals are highly proactive and agile business leaders who are ahead in their strategic and operational anticipation of the technological change facing them and their organisation.
Thereafter, 35% of businesses feel somewhat optimistic about their ability to adapt, which is not far off the 36.3% of companies that rated themselves as ‘Digital Adopters’: easily adaptable and proactive in their approach to evolving with technology disruption.
Following suit, 23.4% are a little concerned about their ability to adapt over the next three years, indicating their company is adaptable but passive in its approach. This is once again close to the proportion of businesses that rated themselves as ‘Digital Followers’, at 22%.
Lastly, 12% of businesses stated they are very worried about their ability to adapt to technological change, which is interesting considering that almost double that figure (23.4%) rated themselves as ‘Digital Laggards’.
Research from the Cloud Industry Forum and BT finds over half of enterprises expect their business models to be disrupted within two years.
New research, launched by the Cloud Industry Forum (CIF) and BT, has revealed that while large enterprise organisations are investing in new technologies in pursuit of digital transformation, skills shortages and migration and integration challenges are inhibiting the pace of change.
The research, which was conducted by Vanson Bourne and commissioned by CIF in association with BT, sought to understand the technology decisions being made by large enterprises (those with more than 1,000 employees) in the face of heightening levels of digital disruption. It found that 52% of enterprises expect their business models to be moderately or significantly disrupted by 2020 and that 74% either have a digital transformation strategy in place or are in the process of implementing one.
However, just 14% believe that they are significantly ahead of their competitors in terms of the adoption of next generation technologies, indicating that many are struggling to adapt to the digital revolution. The research suggests that skills shortages sit at the heart of this issue, with enterprises significantly more likely to report facing skills shortages than their smaller counterparts. 59% stated that they lack staff with integration and migration skills (compared to just 28% of SMEs), 64% needed more security expertise, and 54% require more strategic digital transformation skills.
Commenting on the findings, David Simpkins, General Manager, Managed Services and Public Cloud at BT, said: “The research confirms that most enterprises have well developed strategies aimed at minimising digital disruption and enhancing competitiveness. Cloud is clearly an enabler, but many organisations are finding challenges in achieving the agility and flexibility they seek. Unlike small organisations, which can easily be more agile and nimble in the face of market conditions, change within large enterprises, whose IT estates are infinitely more complex, is much more difficult to achieve.
“Increasingly we’re seeing enterprises managing a wide range of workloads, combining public and private cloud deployments with data centre infrastructure, while at the same time addressing a range of new security threats. This is changing the skillsets that enterprise IT departments need, and it is clear that many will need greater support to safely transition to the digital age,” he continued.
Alex Hilton, CEO of CIF added: “Of all the parts of our economy, it is large enterprises that are the most vulnerable to digital disruption and that is clearly something that our respondents recognise. Many have invested heavily in their company assets and carry with them a significant amount of tech debt, often making change difficult, slow and expensive. This makes the customer and IT supplier relationship critical, and enterprise organisations must consider their choice of partner carefully to help them navigate this complexity and integrate existing legacy investments with newer cloud-based technologies. Those that do not will find themselves at a digital disadvantage.”
The internet is made up of thousands of public and private networks around the world. And since it came to life in 1984, more than 4.7 zettabytes of IP traffic have flowed across it. That’s the same as all the movies ever made crossing global IP networks in less than a minute. Yet the new Visual Networking Index (VNI) by Cisco predicts that this is just the beginning.
By 2022, more IP traffic will cross global networks than in all prior ‘internet years’ combined up to the end of 2016. In other words, more traffic will be created in 2022 than in the 32 years since the internet started. Where will that traffic come from? All of us, our machines and the way we use the internet. By 2022, 60 percent of the global population will be internet users. More than 28 billion devices and connections will be online. And video will make up 82 percent of all IP traffic.
“The size and complexity of the internet continues to grow in ways that many could not have imagined. Since we first started the VNI Forecast in 2005, traffic has increased 56-fold, amassing a 36 percent CAGR with more people, devices and applications accessing IP networks,” said Jonathan Davidson, senior vice president and general manager, Service Provider Business, Cisco. “Global service providers are focused on transforming their networks to better manage and route traffic, while delivering premium experiences. Our ongoing research helps us gain and share valuable insights into technology and architectural transitions our customers must make to succeed.”
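The CAGR quoted above can be sanity-checked with a few lines of Python. The 13-year span is an assumption (the forecast began in 2005 and the report is dated 2018; the article does not state the exact end year), but a 56-fold increase over that period does imply a compound annual growth rate of roughly 36 percent:

```python
# Sanity-check the quoted figure: 56-fold traffic growth since 2005.
growth_factor = 56   # total increase since the first VNI Forecast
years = 13           # assumed span, 2005-2018 (not stated in the article)

# CAGR is the constant yearly rate that compounds to the total growth.
cagr = growth_factor ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 36%, matching the quote
```

The same one-liner works for any of the growth multiples cited elsewhere in the forecast.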
Key predictions for 2022
Cisco’s VNI looks at the impact that users, devices and other trends will have on global IP networks over a five-year period. From 2017 to 2022, Cisco predicts:
1. Global IP traffic will more than triple
2. Global internet users will make up 60 percent of the world’s population
3. Global networked devices and connections will reach 28.5 billion
4. Global broadband, Wi-Fi and mobile speeds will double or more
5. Video, gaming and multimedia will make up more than 85 percent of all traffic
Regional IP traffic growth details (2017 – 2022)
More than 87 per cent of organisations are classified as having low business intelligence (BI) and analytics maturity, according to a survey by Gartner, Inc. This creates a big obstacle for organisations wanting to increase the value of their data assets and exploit emerging analytics technologies such as machine learning.
Organisations with low maturity fall into “basic” or “opportunistic” levels on Gartner’s IT Score for Data and Analytics. Organisations at the basic level have BI capabilities that are largely spreadsheet-based analyses and personal data extracts. Those at the opportunistic level find that individual business units pursue their own data and analytics initiatives as stand-alone projects, lacking leadership and central guidance.
“Low BI maturity severely constrains analytics leaders who are attempting to modernize BI,” said Melody Chien, senior director analyst at Gartner. “It also negatively affects every part of the analytics workflow. As a result, analytics leaders can struggle to accelerate and expand the use of modern BI capabilities and new technologies.”
According to Ms Chien, organisations with low maturity exhibit specific characteristics that slow down the spread of BI capabilities. These include primitive or aging IT infrastructure; limited collaboration between IT and business users; data rarely linked to a clearly improved business outcome; BI functionality mainly based on reporting; and bottlenecks caused by the central IT team handling content authoring and data model preparation.
“Low maturity organisations can learn from the success of more mature organisations,” said Ms Chien. “Without reinventing the wheel and making the same mistakes, analytics leaders in low BI maturity organisations can make the most of their current resources to speed up modern BI deployment and start the journey toward higher maturity.”
Gartner said there are four steps that data and analytics leaders can follow in the areas of strategy, people, governance and technology, to evolve their organisations’ capabilities for greater business impact.
1. Develop holistic data and analytics strategies with a clear vision
Organisations with low BI maturity often exhibit a lack of enterprisewide data and analytics strategies with clear vision. Business units undertake data or analytics projects individually, which results in data silos and inconsistent processes.
Data and analytics leaders should coordinate with IT and business leaders to develop a holistic BI strategy. They should also view the strategy as a continuous and dynamic process, so that any future business or environmental changes can be taken into account.
2. Create a flexible organisational structure, exploit analytics resources and implement ongoing analytics training
Enterprises must have people, skills and key structures in place to foster and secure skills and develop capabilities. They must anticipate upcoming needs and ensure the proper skills, roles and organisations exist, are developed, or can be sourced to support the work identified in the data and analytics strategy.
With limited analytics capabilities in-house, data and analytics leaders should strive for a flexible working model by building “virtual BI teams” that include business unit leaders and users.
3. Implement a data governance programme
Most organisations with low BI maturity do not have a formal data governance programme in place. They may have thought about it and understand the importance of it, but do not know where to start.
Analytics leaders can consider governance as the “rules of the game.” Those rules can support business objectives and also enable the organisation to balance out the opportunities and risks in the digital environment. Governance is also a framework that describes the decision rights and authority models that must be imposed on data and analytics.
4. Create integrated analytics platforms that can support a broad range of uses
Low-maturity organisations often have primitive IT infrastructures. Their BI platforms are more traditional and reporting-centric, embedded in enterprise resource planning systems, or simple disparate reporting tools that support limited uses.
To improve their analytics maturity, data and analytics leaders should consider integrated analytics platforms that extend their current infrastructure to include modern analytics technologies.
Sumo Logic has released new research at DockerCon Europe in Barcelona that reveals 63 percent of European organisations use machine data analytics for security, but lag in broader implementation across the business. The report, titled ‘Using Machine Data Analytics to Gain Advantage in the Analytics Economy,’ takes a comparative look at the adoption and usage of machine data across Europe and the U.S., further finding that only 40 percent of European companies had a “software-centric mindset” compared to 64 percent of US organisations.
The research, conducted by 451 Research and commissioned by Sumo Logic, surveyed 250 executives across the UK, Sweden, the Netherlands and Germany. This was also compared with data based on a previous survey of US respondents that were asked the same questions. The results show that companies in the U.S. are currently more likely to use and understand the value of machine data analytics than their European counterparts.
Key findings include:
Docker adoption continues to expand rapidly in AWS
Looking at its own research, which is derived from active and anonymised data from more than 1,600 customers and 50,000 users, Sumo Logic found in its 2018 “State of Modern Applications and DevSecOps in the Cloud” report that Docker adoption has grown rapidly for companies deploying on AWS. In the past year, the number of companies running Docker containers on AWS has grown from 24 percent of respondents to 28 percent, according to the findings. This represents year-on-year growth of 16 percent in companies running Docker containers on AWS. Both Docker Engine, which provides a standardised packaging format for diverse applications, and Docker Enterprise, an enterprise-ready container platform for managing and securing applications, are available on the AWS Marketplace. Additionally, the Sumo Logic Logging Plugin for Docker Enterprise is available for download on the Docker Store.
“The move to microservices and container-based architectures from Docker Enterprise makes it easier to deploy at scale, but it can also make it harder to effectively monitor activities over time without the right approach to logs and metrics in place,” said Colin Fernandes, director of EMEA product marketing, Sumo Logic. “Conversely, getting effective oversight across systems and users with machine data makes delivering better services easier alongside improving security and operations. It’s gratifying to see that European organisations already understand the value in using machine data analytics for security purposes.”
International gaming and casino operations company Paf has deployed its critical applications infrastructure on AWS as part of a move to modernise its IT. The deployment included using Sumo Logic to get detailed machine data analytics and insight into performance levels.
"We chose Sumo Logic to provide us with insight into our new application deployments, which are far more complex than our previous applications, and to support our move to container based deployments," said Lars-Goran Hakamo, Security Architect at Paf. “Sumo Logic gives us the ability to identify and resolve issues faster and keep those applications performing at scale. The rich visibility and data insights we are able to glean from Sumo Logic from across our container estate, across our applications and our AWS infrastructure is simply unbelievable.”
Business barriers to data deployments
The 451 Research report also provided insight into the barriers preventing wider usage of machine data analytics:
“European organisations are adopting modern tools and technologies at a slower rate than their U.S. counterparts, and fewer companies currently have that ‘software-led’ mindset in place. However, the desire for more continuous insights derived from machine data is there. What the data shows is that once European organisations start using machine data analytics to gain visibility into their security operations, they begin to see the value for other use cases across operations, development and the business,” said Fernandes. “It’s our goal to democratise machine data and make it easier to deploy this as part of modern application deployments running on the Docker Enterprise container platform.”
Findings demonstrate pressing need to build the right foundation for consolidating and managing cloud deployments.
New research commissioned by SoftwareONE, a global leader in software and cloud portfolio management, has revealed that more than six in ten organisations (62 per cent) believe that the actual costs of maintaining cloud technologies are higher than they expected. This indicates that, despite the evident benefits that the cloud brings to an organisation’s digital transformation efforts, businesses can run into difficulties if they do not have the technology, licensing agreements and ongoing monitoring procedures in place to effectively manage and optimise cloud-based software and applications within their wider IT estate.
The survey, conducted by market research firm Vanson Bourne, also found that 36 per cent of businesses feel that cloud-based “as-a-service” offerings – such as SaaS, PaaS or IaaS – have increased in complexity over the past two years, with an additional 33 per cent believing that the level of complexity has remained unchanged in this time. When viewed alongside the unexpected management costs associated with the cloud, it becomes clear that cloud deployments are fraught with challenges if businesses do not have the means to administer the various elements of their implementations effectively.
Zak Virdi, UK Managing Director at SoftwareONE said: “The cloud has been instrumental in helping to bring greater agility and efficiency to IT, but it’s also important to recognise that making it a success takes time, as well as a commitment to proper, long-term integration and management. This means selecting the right technology, meeting the right licensing agreements, and putting frameworks in place to consistently and closely monitor the entire implementation.
“From our research, it’s clear that the rapid growth in cloud services and options – while providing an exceptional level of choice to businesses – is also leading to organisations struggling to fully maximise cloud investments while keeping expenditure as low as possible,” continued Virdi.
To illustrate this point further, the research also found that more than four in ten respondents (44 per cent) claimed that budget restraints often push their organisation towards choosing second or third-choice technology options, rather than the option that they felt was ideal for them. Moreover, 38 per cent said that the management of licences and subscriptions for both cloud deployments and on-premise software poses a significant degree of complexity.
For Virdi, this provides additional evidence of a need for technology that can ease the management of so many different software-based requirements, whether they are based in the cloud or on-premise.
He added: “With so many plates to keep spinning and so many different responsibilities to fulfil – on-premise and cloud deployments, licensing requirements, keeping costs down and so on – it’s absolutely vital that businesses are able to cut out or automate much of the administrative burden, enabling companies to maximise ROI and make their digital transformation efforts a resounding success.”
Virdi concluded: “Amid the urge to adopt cloud, it’s about being able to step back and take stock of what needs to be done from a management perspective. If organisations embrace a platform that allows them to manage their entire software estate and cloud portfolio from a single location, they will be in a much better position to monitor, analyse and optimise the resources at their disposal.”
Businesses across the world still struggle to understand, optimise, and protect their rapidly expanding application environments, according to new research from F5 Labs.
The 2018 Application Protection Report reveals that as many as 38% of respondents have “no confidence” that they have an awareness of all their organisation’s applications in use.
Based on regional analysis conducted for the report by the Ponemon Institute, UK businesses know the least about their application situation – only 32% are “confident” or “very confident” that they have full oversight – whereas German businesses are the most confident, with 45% claiming to know the full story.
The Application Protection Report, which is the most extensive study of its kind yet, also identified grossly inadequate web application security practices, with 60% of businesses stating they don’t test for web application vulnerabilities, have no pre-set schedule for tests, are unsure if tests happen, or only test annually.
Furthermore, 46% of surveyed respondents disagreed or strongly disagreed that their organisation had adequate resources to detect application vulnerabilities. 49% said the same about their remediation capabilities.
“Many businesses fail to keep pace with technological developments and make unwitting and dangerous security compromises as they have a worrying lack of insight into their application environments. This is a big problem. The pressure has never been higher to deliver applications with unprecedented speed, adaptive functionality, and robust security — particularly against the backdrop of increasing European data protection legislation,” said David Warburton, Senior EMEA Threat Research Evangelist, F5 Networks.
Counting the cost
According to the Ponemon Institute’s regional review, the global average number of web app frameworks and environments in use is 9.77. The US has the most (12.09), with Germany (10.37) also above average and the UK (9.72) just below it.
On average, global businesses consider 33.85% of all apps to be “mission critical”. In EMEA, the figures are 35% for the UK and 33% for Germany. All regions identified the same top three critical apps: document management and collaboration; communication apps (such as email and texting); and Microsoft Office suites.
Global respondents were also unanimous that the three most devastating threats facing businesses today are credential theft, DDoS attacks, and web fraud.
In EMEA, 76% of German respondents are most concerned about credential theft, a level second only to Canada (81%). DDoS attacks (64%) and web fraud (49%) are German businesses’ next biggest concerns.
Interestingly, the UK is more threatened by web fraud than anywhere else (57% of respondents). Nevertheless, its biggest worries are credential theft (69%) and DDoS attacks (59%).
Unsurprisingly, web app attacks are a major operational blight in all countries. 90% of respondents in the US and Germany said it would be “very painful” if an attack resulted in the denial of access to data or apps. The UK is the next most potentially vulnerable country with 87% concurring.
The global average incident cost for app denial of service is $6.86m. The US endures the costliest range of attacks with losses of $10.64m on average, closely followed by Germany’s $9.17 million. The UK is slightly below the global average with an average of $6.57m per incident.
Regional differences are also apparent when estimating the incident cost of confidential or sensitive information leaks, such as intellectual property or trade secrets. Globally, the average cost stands at $8.63m. The US pays out the most, having to foot an average bill of $16.91m. Germany is second with typical losses of $11.30m. The UK fares better with average losses of $8.10m, which is almost half the US estimate.
Meanwhile, the global average estimated incident cost for leakage of personally identifiable information (customer, consumers or employees) stands at $6.29m. The US is once again hardest hit at an average of $9.37m, ahead of Germany ($8.48m), India ($6.63m), and the UK ($5.63m).
Tools and tactics
According to surveyed businesses, the three main tools for keeping apps safe are Web Application Firewalls (WAF), application scanning, and penetration testing.
WAF takes the top spot in the US (30%), Brazil (30%), UK (29%), Germany (29%), Canada (26%) and India (26%). Penetration testing is most prominent in India (24%), followed by China (20%), Germany (20%), Canada (20%), Brazil (19%), the UK (18%) and the US (18%). India is again in the lead for app scanning (24%), trailed by China (22%), Brazil (21%), Canada (19%), the US (18%), Germany (16%), and the UK (13%).
The business community’s growing appetite for WAF is further echoed in F5’s 2018 State of Application Delivery report, which revealed that 61% of surveyed global businesses currently use WAFs to protect applications – a trend largely driven by soaring multi-cloud usage.
The Ponemon Institute also reported that DDoS mitigation and backup technologies are the most widely used technologies to achieve high web application availability. German and Brazilian respondents were the strongest DDoS mitigation advocates (both 64%), edging out the US (62%), the UK (60%) and China (60%). Backup technologies are most popular in Canada (76%), the UK (74%), and Germany (73%).
Storage encryption is also seen as a critical defensive tool. Germany leads the way in this respect, with 50% of businesses using the technology “most of the time”, ahead of Canada (44%), the US (40%) and the UK (39%).
Safeguarding the future
“A company’s reputation depends on a comprehensive security architecture. Firms across the globe can no longer rely on traditional IT infrastructures. Technologies such as bot protection, application-layer encryption, API security, and behavior analytics, as we see in advanced WAFs, are now essential to defend against attacks. Thanks to automated tools with enhanced machine learning, businesses can start to detect and mitigate cybercrime with the highest level of accuracy yet,” said Warburton.
TIBCO Software Inc. has released the initial results of the 2018 TIBCO CXO Innovation Survey examining the leading trends in innovation strategy within the enterprise.
The research provides a deep dive into how and why companies are innovating, as well as the tactics and technologies needed to drive initiatives. The research results also examine the role of digital transformation in innovation across industries, which teams within an organisation are driving innovation the most aggressively, and the obstacles they are facing.
The 2018 TIBCO CXO Innovation Survey polled more than 600 respondents around the world including CXOs, senior vice presidents, vice presidents, senior directors, and directors from business and IT functional areas. Respondents represented industries that include retail/wholesale, manufacturing, financial services, healthcare, transportation, aerospace, energy, oil and gas, professional services, technology and software, and more.
Key findings include:
“Unlocking innovation potential is key for organisations that want to be at the cutting edge of their markets,” said Thomas Been, chief marketing officer, TIBCO. “People, data, and technology form the nucleus of innovation and executives across all industries need all three working together for success. This survey dives deep on these relationships and shows how more digitally mature companies are innovating faster and more often. We’re excited to bring forward compelling insights on the factors critical to innovation. These elements are what every executive should consider when determining strategies to stay competitive.”
Large organizations are running an average of 29 different pre-deployment initiatives to digitize the supply chain, but 86% have failed to scale any of them.
A new study from the Capgemini Research Institute, “The digital supply chain’s missing link: focus”, has identified a clear gap between expectations of what supply chain digitization can deliver and the reality of what companies are currently achieving. While exactly half of the organizations surveyed consider supply chain digitization to be one of their top three corporate priorities, most (86%) are still struggling to get projects beyond the testing stage.
Cost savings and new revenue opportunities are the top goals for supply chain digitization
Over three quarters (77%) of companies said their supply chain investments were driven by the desire for cost savings, with increasing revenues (56%) and supporting new business models (53%) also frequently cited. Organizations, especially in the UK (58%), Italy (56%), The Netherlands (54%) and Germany (53%) have supply chain digitization as one of their top priorities.
The broad enthusiasm for focusing on digital supply chain initiatives may be explained by the prospect of the return on investment (ROI) they offer. The research finds that ROI on automation in supply chain and procurement averaged 18%, compared to 15% for initiatives in Human Resources, 14% in Information Technology, 13% in Customer Service, and 12% in both Finance and Accounting and R&D. According to the report, the average payback period for supply chain automation is just twelve months.
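For readers who want to relate the two figures, the standard definitions are straightforward. The sketch below is purely illustrative; the function names and sample numbers are ours, not from the Capgemini report:

```python
def roi(total_return: float, investment: float) -> float:
    """Simple ROI: net gain expressed as a fraction of the amount invested."""
    return (total_return - investment) / investment

def payback_months(investment: float, monthly_savings: float) -> float:
    """Months until cumulative savings cover the initial investment,
    assuming savings accrue at a constant monthly rate."""
    return investment / monthly_savings

# Hypothetical project: invest 120,000 and realise 141,600 of value in a year.
print(roi(141_600, 120_000))            # 0.18, i.e. an 18% ROI
print(payback_months(120_000, 10_000))  # 12.0 months to break even
```

Note that the two metrics answer different questions: ROI measures how large the gain is relative to the outlay, while payback measures how quickly the outlay is recovered.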
Most organizations have spread their investments too thinly and are struggling to scale pilot initiatives
The organizations surveyed have an average of 29 digital supply chain projects at the ideation, proof-of-concept or pilot stage. Just 14% have succeeded in scaling even one of their initiatives to multi-site or full-scale deployment. However, for those that have achieved scale, 94% report that these efforts have led directly to an uplift in revenue.
The evidence from those who have moved to implementation suggests that companies are taking on too much, and not focusing enough on strategic priorities. The organizations who successfully scaled initiatives had an average of 6 projects at proof-of-concept stage while those who failed to scale averaged 11 projects.
There was also a clear gap in procedure and methodology between organizations that had and had not implemented digital supply chain initiatives at scale. The vast majority of companies to have successfully scaled said they had a clear procedure in place to evaluate the success of pilot projects (87% vs. 24%) and had clear guidelines for prioritizing those projects that needed investment (75% vs. 36%).
Dharmendra Patwardhan, Head of the Digital Supply Chain Practice for Business Services at Capgemini, added: “While most large organizations clearly grasp the importance of supply chain digitization, few appear to have implemented the necessary mechanisms and procedures to turn it into a reality. Companies are typically running too many projects, without enough infrastructure in place, and lack the kind of focused, long-term approach that has delivered success for market leaders in this area. Digitization of the supply chain will only be achieved by rationalizing current investments, progressing on those that can be shown to drive returns, and involving suppliers and distributors in the process of change.”
Steps to unlock the value in supply chain transformation
As well as learning from organizations that have successfully scaled supply chain initiatives, the report recommends that companies looking to make progress should focus on three key areas:
·Advocate and Align: Ensure transformation efforts are driven by C-suite leadership and senior management. Supply chain digitization is a complex process that spans planning, procurement, IT and HR and as such it cannot be led by any one business unit and must be driven from the top to succeed. Leadership needs to advocate for this transformation, and to provide strategic focus on objectives and what to prioritize. Supply chain digitization is integral to achieving business objectives and must also be aligned with wider efforts – for example to increase transparency and improve customer satisfaction – so it is not considered solely as a cost-cutting exercise.
·Build: For supply chain digitization to be successful, both upstream and downstream partners (suppliers and distributors/logistics providers) need to be onboarded and made part of the digitization efforts. Breaking the silos among the various supply chain functions as well as the technology teams is also critical to the success of supply chain initiatives.
·Enable: While the steps above help in starting digitization, sustaining it requires organizations to invest in building a customer-centric mindset and developing a talent base. They need to devise approaches to attract, retain and upskill their employees.
Commenting on this approach, Rob Burnett, CIO of Global Supply Chain & Engineering at GE Transportation said: “Management buy-in is a huge part of identifying and investing in the digital supply chain projects that can really drive improvement. Rather than a cost center, the supply chain can be a source of innovation and efficiency for the whole organization, but it’s important to maintain a sharp focus on priority projects to get the ball rolling. There should be a wider appreciation that less is more.”
Robotic Process Automation (RPA) and the Internet of Things (IoT) account for several viable use cases
In terms of specific use cases, Capgemini’s report reviews the 25 most popular use cases for supply chain digitization today, analyzing each in terms of ease of implementation and benefits realized, to produce a shortlist of recommended use cases that can become strategic wins. Of these, RPA and IoT feature most often, in use cases such as order processing, smart sensors that monitor product conditions, and updating and maintaining connected products. Based on working examples from across today’s supply chain, these use cases have been shown to save time and money on supply chain processes.
New research reveals 75 percent of customers still favour live agent support for customer service vs 25 percent self-service and chatbots.
New research from NewVoiceMedia, a Vonage Company and leading global provider of cloud contact centre and inside sales solutions, reveals that three-quarters of consumers prefer to have their customer service inquiries handled by a live agent over self-service options or a chat bot.
Chat bots can provide customers with quick answers to frequently asked questions or issues, and the survey notes the benefit of chat bots for certain interactions, such as 24/7 service. When it comes to handling sensitive financial and personal information, however, most customers are more comfortable with a live agent, and just 13 percent say they’d be happy if all service interactions are replaced by bots in the future.
According to the survey¹, top concerns for using chat bots for service include: lack of understanding of the issue (65 percent); inability to solve complex issues (63 percent) or get answers to simple questions (49 percent); and lack of a personal service experience (45 percent). Though 48 percent of respondents indicated they would be willing to use chat bots for service – versus the 38 percent who wouldn’t – 46 percent also felt that bots keep them from reaching a live person.
When asked about transactions for which they would not feel comfortable using a chat bot, a significant majority of respondents said large banking (82 percent), medical inquiries (75 percent) and small banking (60 percent). For frequently asked questions or common issues, however, chat bots can add efficiencies to the live agent’s day, freeing them to provide the extra care and time to more complex issues and to the customers who really need it.
“When a situation becomes emotional or complex, people want to engage with people”, says Dennis Fois, President of NewVoiceMedia. “As businesses add more customer service channels, conversations are becoming more complex and higher value, and personal, emotive customer interactions play a critical role in bridging the gap for what digital innovation alone cannot solve. For this reason, companies must find the right balance between automation and human support to deliver the service that customers demand. Frontline contact centre teams will continue to be the difference-makers on the battlefield to win the hearts and minds of customers, and organisations deploying self-service solutions should ensure that there is always an option to reach a live agent”.
A cloud contact centre solution presents businesses with the opportunity to take advantage of emerging technologies such as bots and AI technology, without losing that personal touch, creating a better customer experience. The key is providing customers with the right balance of personalised, white glove service by a live agent when they have the need for deeper, more complex customer care, while also giving them the ability to get quick answers to basic questions provided by chat bots through a variety of communications channels – chat, voice, SMS, or social messaging.
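The balance the survey points to can be pictured as a simple triage rule: bots for quick, low-sensitivity questions, people for everything else. The sketch below is purely illustrative; the topic categories and routing logic are our assumptions, not any vendor’s product behaviour:

```python
# Areas the survey flags as uncomfortable territory for bots.
SENSITIVE_TOPICS = {"banking", "medical"}

def route(topic: str, is_faq: bool, agent_requested: bool) -> str:
    """Decide whether a service enquiry goes to a chatbot or a live agent."""
    if agent_requested:
        return "live_agent"   # always keep a path to a person
    if topic in SENSITIVE_TOPICS:
        return "live_agent"   # sensitive financial/personal matters
    if is_faq:
        return "chatbot"      # quick answers, available 24/7
    return "live_agent"       # complex or emotional issues

print(route("delivery", is_faq=True, agent_requested=False))  # chatbot
print(route("banking", is_faq=True, agent_requested=False))   # live_agent
```

The first check reflects the article’s closing advice: whatever the automation strategy, there should always be an option to reach a live agent.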
Customers prefer live agents for technical support (91 percent); getting a quick response in an emergency (89 percent); making a complaint (86 percent); buying an expensive item (82 percent); purchase inquiries (79 percent); returns and cancellations (73 percent); booking appointments and reservations (59 percent); and paying a bill (54 percent). However, when asked about buying a basic item, 56 percent would choose a chat bot over a live interaction. The top benefit cited for dealing with chat bots was 24-hour service.
Younger respondents (aged 18-44) were more open to using chat bots overall and across the individual scenarios compared to older consumers (aged 45-60+). In fact, 52 percent of those aged 60 and older would be unwilling to use chat bots for service at all.
This research follows NewVoiceMedia’s Serial Switchers Swayed by Sentiment study, in which nearly half of respondents (48 percent) considered calls to be the quickest way of resolving an issue. The survey also found that the top reasons customers leave a business due to poor service are feeling unappreciated (36 percent) and not being able to speak to a person (26 percent), but that 63 percent would be more likely to return to a business if they felt they’d made a positive emotional connection with a customer service agent.
Angel Business Communications has announced the categories and entry criteria for the 2019 Datacentre Solutions Awards (DCS Awards). The DCS Awards are designed to reward the product designers, manufacturers, suppliers and providers operating in the data centre arena and are updated each year to reflect this fast-moving industry. The Awards recognise the achievements of vendors and their business partners alike and this year encompass a wider range of project, facilities and information technology award categories, together with two individual categories, and are designed to address all the main areas of the datacentre market in Europe.
The DCS Awards categories offer a comprehensive range of options for organisations involved in the IT industry to participate. You are therefore encouraged to make your nominations as soon as possible for the categories where you think you have achieved something outstanding, or where you have a product that stands out from the rest, to be in with a chance of winning one of the coveted crystal trophies.
The editorial staff at Angel Business Communications will validate entries and announce the final short list to be forwarded for voting by the readership of the Digitalisation World stable of publications during April. The winners will be announced at a gala evening on 16th May at London’s Grange St Paul’s Hotel.
The 2019 DCS Awards feature 26 categories across four groups. The Project Awards categories are open to end use implementations and services that have been available before 31st December 2018. The Innovation Awards categories are open to products and solutions that have been available and shipping in EMEA between 1st January and 31st December 2018. The Company nominees must have been present in the EMEA market prior to 1st June 2018. Individuals must have been employed in the EMEA region prior to 31st December 2018.
Nomination is free of charge and each entry may include up to four supporting documents to enhance the submission. The deadline for entries is 1st March 2019.
Please visit www.dcsawards.com for rules and entry criteria for each of the following categories:
DCS PROJECT AWARDS
Data Centre Energy Efficiency Project of the Year
New Design/Build Data Centre Project of the Year
Data Centre Consolidation/Upgrade/Refresh Project of the Year
Cloud Project of the Year
Managed Services Project of the Year
GDPR compliance Project of the Year
DCS INNOVATION AWARDS
Data Centre Facilities Innovation Awards
Data Centre Power Innovation of the Year
Data Centre PDU Innovation of the Year
Data Centre Cooling Innovation of the Year
Data Centre Intelligent Automation and Management Innovation of the Year
Data Centre Safety, Security & Fire Suppression Innovation of the Year
Data Centre Physical Connectivity Innovation of the Year
Data Centre ICT Innovation Awards
Data Centre ICT Storage Product of the Year
Data Centre ICT Security Product of the Year
Data Centre ICT Management Product of the Year
Data Centre ICT Networking Product of the Year
Data Centre ICT Automation Innovation of the Year
Open Source Innovation of the Year
Data Centre Managed Services Innovation of the Year
DCS Company Awards
Data Centre Hosting/co-location Supplier of the Year
Data Centre Cloud Vendor of the Year
Data Centre Facilities Vendor of the Year
Data Centre ICT Systems Vendor of the Year
Excellence in Data Centre Services Award
DCS Individual Awards
Data Centre Manager of the Year
Data Centre Engineer of the Year
Nomination Deadline: 1st March 2019
As we look forward to another year of potentially great change and uncertainty, the managed services industry can take some comfort from its apparent resilience as a model. All the current thinking and forecasting around 2019 in the IT industry suggests a continuing pressure on customers who know they need to become more productive, but are not sure where to spend their IT investment and when. Managed services is uniquely placed to deliver IT solutions in an effective way.
Yet, if anything, the pressures on customers are rising. In a rising market, many stay profitable enough not to have to think about changing infrastructure, and IT mistakes and wrong decisions matter less; a year of great change, however, may finally expose those poor deals and false savings. Hence the hesitant IT customer confidence, especially in areas with less history behind them, such as AI and IoT. The thinking is that it is better to hold on to existing technologies, which are at least better understood through use.
At the same time, however, senior management and those with a strategic view are pressing IT to do more with less; squeezed resources will not deliver change, and IT departments themselves may have become more siloed in the last year.
Working with scarce resources should be something that the managed services providers well understand; their industry is all about scale and efficient delivery; they should be able to talk to senior management in customers about the inherent effectiveness of their IT model and how this can be shared with the customer and the customer’s customer.
Yet the promise of technological delivery of riches has not always lived up to the expectations, and research has shown that a half-hearted or lightweight approach to innovation does not yield such good results. The recent Capgemini Research Institute’s “Upskilling your people for the age of the machine” study found that automation has not improved productivity because it has not been a part of the full digital transformation of business processes.
In many cases, companies still need to do their homework on change management. Adopting technology that links to and supports the implementation of a strategy should be a no-brainer. But this is not about new technology; it is about the basics of change management and communication from leaders, which have been discussed for decades.
The managed services industry may need to encourage new thinking along these lines even more in 2019. It should be pushing at an open door: Equinix research shows that almost three quarters (71%) of organisations are likely to move more of their business functions to the cloud in coming years, although they may still need answers that address their security fears. And they need help managing that change process in their businesses.
It will be those managed services companies with expertise in particular vertical markets, or with particularly strong customer relationships, that do best here. They are able to talk in meaningful ways about real solutions and engage with realistic customer expectations. They will also, hopefully, not be afraid to say when a transformation is not going far enough and what real gains could be made by further moves.
For this to work properly they need a clear understanding of the nature of change management in customer organisations, coupled with the required changes in working practices that deliver a safe and secure environment.
The agenda for the 2019 European Managed Services and Hosting Summit, in Amsterdam on 23 May, aims to reflect these new pressures and build the skills of the managed services industry in addressing the wider issues of engagement with customers at a strategic level. Experts from all parts of the industry, plus thought leaders with ideas from other businesses and organisations will share experiences and help identify the trends in a rapidly-changing market.
This month’s journal theme is focused on ‘Significant Moments which helped shape 2018 together with some Predictions for 2019’.
With so much going on in 2018 I thought the best thing to do was list all the awards won and significant reports published over the year by supporting members. CLICK HERE to find out more.
You may also wish to visit the DCA news page where you can look at all the news items from 2018 - https://dca-global.org/news
In addition to supporting and representing the best interests of its members, the DCA has also had a busy year working behind the scenes with the launch of new branding and new website/members portal. 2018 saw the DCA support over forty industry events, initiatives and projects all designed to benefit both members and the sector as a whole. One such project was the 3-year EU Horizon 2020 project called EURECA which received outstanding feedback and praise from the EU commission.
The DCA is extremely proud to have been a contributing partner and the EURECA project has subsequently been nominated for several awards, culminating in the EURECA project winning the DCD ‘Data Centre Initiative of the Year’ Award in December in London.
There is much work to do in 2019, which I am confident will once again be busy and (without fear of mentioning the “B” word) I am sure it will be both an interesting and challenging year as well! One thing is for sure: it won’t be boring! On that very subject, the DCA will be working with collaborative partners in the coming months to see what can be done to increase the data centre sector’s appeal as a career destination of choice for school and college leavers keen to join the job market.
The trade association already has strong influence both nationally and regionally when it comes to regulation, policy and standards and this valuable work will continue in the year ahead, with new opportunities for members to contribute and participate in helping shape our sector for the benefit of all.
As global event partners for Data Centre World, we are again working with Closer Still Media on the next event at Excel in March, and we are already busy planning our flagship Data Centre Transformation Conference for this summer in Manchester (more on this to follow). All upcoming events can be found in the DCA Events Calendar, and the annual printed DCA wall calendar, showing all the DC-related events throughout the year, will be available soon.
Over the past 12 months the DCA has published over 180 articles in publications that are received and read by a total of over 250,000 subscribers globally. Thank you again to all the members who contributed articles over the last year.
It is generally recognised that the DCA Trade Association has built a strong reputation as a trusted source of approved thought-leadership content, and as one of its founders this is an achievement I am particularly proud of. 2019 will see an increased focus on how this content is disseminated, to further maximise exposure and impact.
The next publication theme is ‘Insight’, with a focus on new industry trends and innovations such as Open Compute, Edge DCs and Liquid Cooling, to name but a few. The call for papers is now open and the deadline for submitting articles is 25 January. To submit, email firstname.lastname@example.org
We’d like to wish all our members, collaborative partners and those working in the Data Centre sector a very happy and prosperous 2019.
2018 Celebrated Another First For the Data Centre Sector
And another first for CNet Training… with the very first graduation ceremony for the Masters Degree in Data Centre Leadership and Management program.
After an intense three years of study, learners from the class of 2018 graduated at a ceremony in the famous university city of Cambridge, UK, including one of CNet’s technical Instructors, Pat Drew.
In addition to being the global leader in technical education for the data centre sector, CNet is an Associate College of Anglia Ruskin University in Cambridge, UK and is therefore approved to design and deliver the content of this prestigious Masters Degree program.
The graduates joined the Masters Degree program from some of the world’s most respected organizations, including Unilever, Capital One, IBM, Irish Life and Wirewerks. Each committed to the three-year distance learning program to unite their existing knowledge and skills with new learning centered around leadership and business management within a data centre environment.
For more information see www.cnet-training.com/masters-degree
Designed and written by the global leaders in data centre technical education, CNet Training, in close collaboration with leading data centre professionals from across the globe, the Certified Data Centre Sustainability Professional (CDCSP®) program has been officially launched.
The program has been created as a result of demand from data centre professionals for a sector focused, comprehensive program to provide knowledge and innovative approaches to planning, designing, implementing and monitoring a sustainability strategy for data centre facilities and operational capability.
A one-year distance learning program, the Certified Data Centre Sustainability Professional (CDCSP®) provides flexibility and convenience of learning for busy data centre professionals as they log in from across the globe. The learning, managed and supported by CNet’s dedicated in-house team, provides in-depth knowledge of all the stages of data centre sustainability from strategic vision and business drivers, operational analysis of power, cooling and IT hardware, operational processes and procedures, risk evaluation and mitigation, to design innovation and implementing initiatives, whilst appreciating the business and operational challenges that can be encountered. Maintenance strategies, continuous planning and critical analysis against identified targets and demonstrating ROI are also explored.
2018 has been an exciting year for EcoCooling. The Nordic market has seen some of the largest growth as operators exploit the cool climate and renewable electricity. Working within the fast-paced, innovative HPC sector has allowed us to continue developing our CloudCooler range into one of the lowest total cost of ownership free cooling systems available. These innovative Plug & Play solutions have proved extremely popular, with over 2,000 installed worldwide. Engineered to be rapidly deployed to remote locations with minimal requirements for qualified labour, the principles behind the range underpin our latest product offering: a mobile, high performance computing containerised data centre. It has been developed to support the growing low-latency data processing infrastructure required for applications such as 5G, AI and smart cities.
What’s happening for 2019? We are excited to have just moved into spacious new offices in Bury St Edmunds, and we are looking to build on the principles of low total cost of ownership to simplify our product offerings and achieve optimum efficiencies for our end users. Watch this space for new products, exciting case studies, and a whole lot of energy saving!
Andrew Fray, Managing Director at European data centre experts Interxion
5G’s potential won’t be fully realised
The 5G tsunami is well on its way and it will hit our shores in 2019, with CCS Insight predicting that we could see 1 billion 5G users by 2023. Its rollout next year has the potential to completely transform every industry, from manufacturing and marketing to communications and entertainment. Fast data speeds, higher network bandwidth and lower latency mean smart cities, connected transport, smart healthcare and manufacturing are all becoming closer to a reality.
Despite the first deployments of 5G and the launch of the first 5G-compatible devices next year, we don’t expect the impact of widespread 5G implementation to be fully felt in 2019. Instead, for many businesses, 2019 will be a year of continued investment in, and focus on, rearchitecting existing networks and infrastructure ready to host 5G networks.
“Multi-cloud” will be the new buzzword
Conversations this year have been full of the continued development and challenges of cloud adoption, and next year will be no different. A major learning from this year for many companies is that putting all of their eggs, in this case workloads, in one basket – whether it’s private cloud, public cloud or data centre – isn’t the best strategy. Instead, businesses are increasingly turning to multi-cloud adoption, consuming and mixing multiple cloud environments from several different cloud providers at once. In fact, multi-cloud adoption and deployments have doubled in the last year, and this will remain a major topic for 2019. Major cloud providers are also showing increased interest in multi-cloud, with public cloud providers such as Amazon and Alibaba offering private cloud options, as well as a number of acquisitions and partnerships that will allow the marrying of cloud environments.
In 2019, multi-cloud will become the new norm, allowing businesses to realise the full potential of the cloud, giving them increased flexibility and control over workloads and data, whilst avoiding vendor lock-in.
eSIM will take centre stage
It’s been suggested that, despite the build-up around 5G, it’s actually eSIMs that will be a game changer in the technology and telecoms sectors. Up until fairly recently, uptake of eSIM has been slow, as operators have been concerned with how the technology will impact their businesses. However, the eSIM market is estimated to grow to $978 million by 2023, with demand being driven by adoption of internet-enabled technology which requires built-in cellular connectivity. This year, we’ve already seen a shift in mindset in relation to the technology. New guidelines introduced by the GSMA have also contributed to increased awareness of the capabilities.
In 2019, we’ll see a large number of operators, service providers and vendors trial and launch new eSIM-based solutions. The impact of this growing emergence will be broad and will pave the way for significant developments in consumer experiences, in everything from entertainment and ecommerce to automotive.
Living on the edge in 2019
Edge computing has been on the horizon for a number of years now. However, it’s yet to be fully understood. In 2019, 5G deployments and the increasing proliferation of the IoT will be key drivers behind ‘the edge’ gaining significant awareness and traction. Business Insider Intelligence estimates that 5.6 billion enterprise-owned devices will utilise edge computing for data collection and processing by 2020.
As we move into next year, the edge will continue to be at the epicentre of innovation within enterprises, with the technology exerting its influence on a number of industries. Businesses will look to data centre providers to lead the charge when it comes to developing intelligent edge-focused systems. In terms of technological developments, a simplified, smarter version of the edge will emerge. The integration of artificial intelligence and machine learning will provide greater computing and storage capabilities.
Adoption of blockchain will start to accelerate, especially in financial services
Up until fairly recently, blockchain has remained a confusing topic for many businesses, especially those operating in highly regulated industries such as the financial services sector. As a result, many financial institutions have been slow to embrace the technology. However, next year, more use cases for blockchain will be uncovered and will make an impact. According to PwC, more than three-quarters (77%) of financial sector incumbents will adopt blockchain as part of their systems or processes by 2020.
In particular, we’ll see an increasing number of fintech partnerships built over the course of the coming year as more financial companies look to harness the technology’s potential.
Cloud gaming will require a new approach to networking
According to Microsoft’s head of gaming division, the future of gaming is cloud gaming. This new form of gaming promises new choices for players in when and where to play, frictionless experiences and direct playability, as well as new opportunities for game creators to reach new audiences. However, based on a subscription model, the highest quality of service is critical for subscriber retention. Delivering this superior user experience, often across multi-user networks and devices, is dependent on low-latency. For gaming companies looking to keep up with demand for connectivity and bandwidth, it’s all about having the right infrastructure to deliver game content without delay or disruption.
Next year, we’ll see more gaming companies take a new approach to networking and harness the power of data centres to optimise performance, maintain low-latency and provide the resiliency and scalability to cope with the volatile demands of today’s gamers.
University of East London was named on the UK Universities Best Breakthrough List
The Enterprise Computing Research Lab (EC Lab) at the University of East London (https://www.eclab.uel.ac.uk), which played a key role in driving energy efficiency in the design and operation of data centres and computer servers, has been named on the UK Universities top 100 Best Breakthrough list.
The Breakthrough list demonstrates how UK universities are at the forefront of some of the world’s most important discoveries, innovations and social initiatives.
To find out more about the MadeAtUni campaign, please visit: https://madeatuni.org.uk
EURECA wins the DCD Global Awards 2018 for the Industry Initiative of the Year
The EURECA project ( https://www.dceureca.eu ) has won the 'Industry Initiative of the Year' category at the 2018 Data Centre Dynamics (DCD) awards. EURECA beat off strong competition from US state government body Maryland Energy Administration; Host in Scotland; and Host in Ireland.
Speaking after collecting the award, EURECA project coordinator, Dr Bashroush, said: “It is such an honour to win this global award, but I have to admit it was such a close call given the three other great initiatives shortlisted.”
By Kevin Towers, Managing Director
Looking back over the last 12 months, it is hard to believe that so much has happened. Here is what we have celebrated in 2018 and hope to build on in 2019.
Techbuyer was honoured to be invited to contribute to the Parliamentary Review for the second year in a row this January. We had a lot to report in terms of business growth and workforce increases (28% this year according to the latest figures). It was also wonderful to report on the circular economy and show how data centre hardware can lead the way with sustainable solutions for servers, storage and networking to go on to second, third and more lives after initial use.
Techbuyer went live on the Telegraph Business website in February with a documentary spotlighting the benefits of refurbished IT equipment. The film was part of the Great British Business Campaign that featured on various television and media outlets throughout the year. The campaign focused on SMEs, which have traditionally been under-represented in the media despite making up over 99% of the private sector and being responsible for 60% of private sector employment in the UK. Natasha Kaplinsky was outstanding in getting the message out about refurbished IT.
March was the month Techbuyer first joined the Data Centre Alliance, a move that has led to a large number of opportunities. These include listing on the award-winning DC EURECA project’s directory, an introduction to working with the UK government’s cross party think-tank Policy Connect, research that led to a KTP with the University of East London and insights into the issues facing this fascinating sector. We also had the opportunity to publicise sustainable hardware solutions in a number of industry magazines.
Appearing on stage alongside industry giants like Rittal and Schneider, an academic from the Borderstep Institute and a representative from the German Federal Environment Ministry was a special moment this June. Gathered together to discuss how data centres can become more sustainable, the panel shared a host of ideas from across the industry. From ways of dealing with energy spikes and legislative changes to ideas for how to handle refreshed hardware, the feeling was that cooperation is key to a more sustainable future.
Techbuyer allied itself with the likes of Barclays, Deloitte, Vodafone, PwC and the University of Liverpool when it joined the Northern Powerhouse partnership programme in June. With a strong belief in the economic potential of the North to develop skills, science and innovation, it is a fantastic opportunity to be part of something that strengthens our home base and has the potential to develop the data centre industry in new areas.
Innovate UK gave us the go-ahead this September for a ground-breaking joint project with the University of East London. The Knowledge Transfer Partnership (KTP) will embed the research expertise of Rabih Bashroush, Director of the Enterprise Computing Research Lab, and other academics to create a tool that optimises IT hardware refresh in data centres. The focus will be to drive the circular economy and green agenda.
October saw the opening of our first Asia Pacific office in Auckland, New Zealand, adding a third continent to our existing offices in the UK, Germany and the USA. The move is designed to better serve our clients in Asia Pacific and enables us to offer a round-the-clock service to our customers. Since 2018 saw us deliver to over 100 countries, our global offices are set to be increasingly important. A technical facility is due to open in Australia in early 2019 to partner the operation.
Following the work the DCA has done on the EN 50600 standard, the Cyber and Physical Security Special Interest Group is in search of new challenges. We are delighted to be part of the process. Rich Kenny, our IT Security Manager, joined the group in October to offer insight into the cyber protections most suitable for the data centre environment. The group is currently working on a white paper describing best practice. Work will progress in the new year.
In addition to our ISO 9001 and ISO 14001 accreditations for quality and environmental standards, Techbuyer became officially ISO 27001 accredited in November for information security management. This is obviously a great milestone for us given our position as IT asset disposal specialists. We wiped over 120,000 hard drives in 2018 as part of our mission of diverting usable infrastructure from landfill. For us, though, it marks the continuation of a much longer journey of constant improvement.
Techbuyer joined the cross-party think-tank Policy Connect in December following an introductory talk given at the DCA Annual General Meeting. 2019 will see us discuss ways that other industries can learn from the server, storage and networking secondary market when it comes to economically viable solutions for reuse and product life extension. Techbuyer is also hoping to learn about developments in skills training and recycling technologies as we gear up to make our contribution to the UN Sustainable Development Goals in the new year.
Since the release of Host in Ireland’s report ‘Ireland’s Data Hosting Industry 2017’ there has been a 120 MW increase in total connected data centre power capacity and seven new operational data centres in Ireland.
Most of Ireland’s 48 data centres, with a combined 540 MW of grid-connected power capacity, are located in the Dublin Metro area. The concentration of data centres combined with Dublin’s offerings of public, private and hybrid data clouds in close proximity to leading colocation, managed services and hyperscale facilities has helped the city earn the nickname “Home of the Hybrid Cloud”.
If predictions of continued investment in data centre construction are realised, numbers will continue to increase in 2019, with an estimated €1.5 billion of investment; by 2021, €9.3 billion will have been invested in Ireland in total.
A round-up of the latest Gartner research, covering IT infrastructure and 5G.
As more organizations embrace digital business, infrastructure and operations (I&O) leaders will need to evolve their strategies and skills to provide an agile infrastructure for their business. In fact, Gartner, Inc. said that 75 percent of I&O leaders are not prepared with the skills, behaviors or cultural presence needed over the next two to three years. These leaders will need to embrace emerging trends in edge computing, artificial intelligence (AI) and the ever-changing cloud marketplace, which will enable global reach, solve business issues and ensure the flexibility to enter new markets quickly, anywhere, anytime.
IT departments no longer just keep the lights on; they are also strategic deliverers of services, whether sourced internally or externally. They must position specific workloads based on business, regulatory and geopolitical impacts. As organizations’ customers and suppliers grow to span the globe, I&O leaders must deliver on the idea that “infrastructure is everywhere” and consider the following four factors:
Agility Thrives on Diversity
Bob Gill, vice president at Gartner, said the days of IT controlling everything are over. As options in technologies, partners and solutions rise, I&O leaders lose visibility, surety and control over their operations. “The cyclical, dictated factory approach in traditional IT cannot provide the agility required by today’s business,” Mr. Gill said. “The need for agility evolved faster than our ability to deliver.”
Despite agility placing among their top three priorities for 2019, I&O leaders are faced with a conundrum, given the diverse range of products available to them. “The ideal situation for I&O leaders would be to coordinate the unmanageable collection of options we face today — colocation, multicloud, platform as a service (PaaS) — and get ahead of the business needs tomorrow. We must reach into our digital toolbox of possibilities and apply it to customer intimacy, product leadership and operational excellence to establish guardrails around managing the diversity of options in the long term,” Mr. Gill said.
The need for agility will only increase, so the two key tasks of I&O leaders will be to manage the sprawl of diversity in the short term and become the product manager of services needed to build business driven, agile solutions in the long term. “I&O has the governance, security and experience to lead this new charge for the business,” Mr. Gill said.
Applications Enable Change
Infrastructure can save money and enable applications, but by itself it does not drive direct business value — applications do. Dennis Smith, vice president at Gartner, said that there is no better time to be an application developer. “Application development offers an opportunity to jump on the express train of change to satisfy customer needs and build solutions composed of a tapestry of software components enabled through APIs,” Mr. Smith said.
Gartner research found that by 2025, 70 percent of organizations not adopting a service/product orientation will be unable to support their business, so I&O engineers must engage with consumers and software developers; integrate people, processes and technology; and deliver services and products all to support a solid infrastructure on which applications reside.
Boundaries Are Shifting
Digital business blurs the lines between the physical and the digital, leveraging new interactions and more real-time business moments. As more things become connected, the data center will no longer be the center of data. “Digital business, IoT and immersive experiences will push more and more processing to the edge,” said Tom Bittman, distinguished vice president at Gartner.
By 2022, more than half of enterprise-generated data will be created and processed outside of data centers and outside of the cloud. Immersive technologies will help to light a fire of cultural and generational shift. People will expect more of their interactions to be immersive and real time, with fewer artificial boundaries between people and the digital world.
The need for low latency, the cost of bandwidth, privacy and regulatory changes as data becomes more intimate, and the requirement for autonomy when the internet connection goes down, are factors that will expand the boundary of enterprise infrastructures all the way to the edge.
People Are the Cornerstone
I&O leaders are struggling to deliver value faster in a complex, evolving environment, hindering the ability of organizations to learn quickly and share knowledge.
“The new I&O worker profile will embrace versatile skills and experiences rather than reward a narrow focus on one technical specialty,” said Kris van Riper, managing vice president at Gartner. “Leading companies are changing the way that they reward and develop employees to move away from rigid siloed career ladders toward more dynamic career diamonds. These new career paths may involve experiences and rotations across multiple technology domains and business units over time. Acquiring a broader understanding of the IT portfolio and business context will bring collective intelligence and thought diversity to prepare teams for the demands of digital business.”
Ultimately, preparing for I&O in the digital age comes down to encouraging different behaviors. Building competencies such as adaptability, business acumen, fusion collaboration and stakeholder partnership will allow I&O teams to better prepare for upcoming change and disruption.
Gartner, Inc. has highlighted the key technologies and trends that infrastructure and operations (I&O) leaders must start preparing for to support digital infrastructure in 2019.
“More than ever, I&O is becoming increasingly involved in unprecedented areas of the modern-day enterprise. The focus of I&O leaders is no longer solely to deliver engineering and operations, but instead to deliver products and services that support and enable an organization’s business strategy,” said Ross Winser, senior research director at Gartner. “The question is already becoming ‘How can we use capabilities like artificial intelligence (AI), network automation or edge computing to support rapidly growing infrastructures and accomplish business needs?’”
During his presentation, Mr. Winser encouraged I&O leaders to prepare for the impacts of 10 key technologies and trends to support digital infrastructure in 2019. They are:
Serverless computing is an emerging software architecture pattern that promises to eliminate the need for infrastructure provisioning and management. I&O leaders need to adopt an application-centric approach to serverless computing, managing APIs and SLAs, rather than physical infrastructures. “The phrase ‘serverless’ is somewhat of a misnomer,” noted Mr. Winser. “The truth is that servers still exist, but the service provider is responsible for all the underlying resources involved in provisioning and scaling a runtime environment, resulting in appealing agility.”
Serverless does not replace containers or virtual machines, so it’s critical to learn how and where best to use the technology. “Developing support and management capabilities within I&O teams must be a focus as more than 20 percent of global enterprises will have deployed serverless computing technologies by 2020, which is an increase from less than 5 percent today,” added Mr. Winser.
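To make the pattern concrete, here is a minimal sketch of a function-as-a-service handler in Python. The function name and the event shape are hypothetical illustrations, not any vendor’s actual contract; the point is that the team manages only the function code, its API and its SLA, while the provider owns provisioning and scaling:

```python
import json

def resize_request_handler(event, context=None):
    """Hypothetical serverless handler: validates a JSON request and replies.

    'event' follows an API-gateway-style shape ({"body": "..."}) — an
    assumption for illustration only. The provider, not the I&O team,
    handles the servers, scaling, patching and availability underneath.
    """
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400,
                "body": json.dumps({"error": "invalid JSON"})}

    width = payload.get("width")
    if not isinstance(width, int) or isinstance(width, bool) or width <= 0:
        return {"statusCode": 400,
                "body": json.dumps({"error": "width must be a positive integer"})}

    # Business logic would go here; only the API contract is visible to callers.
    return {"statusCode": 200,
            "body": json.dumps({"width": width, "status": "accepted"})}
```

Because such handlers are plain functions, they can be unit-tested locally by calling them with a dict, which is one way teams keep the application-centric, API-and-SLA view of serverless that Gartner describes.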
AI is climbing up the ranks in terms of the value it offers I&O leaders, who need to manage growing infrastructures without being able to grow their staff. AI has the potential to be organizationally transformational and is at the core of digital business, the impacts of which are already being felt within organizations. According to Gartner, global AI-derived business value will reach nearly $3.9 trillion by 2022.
Network Agility (or Lack of?)
The network underpins everything IT does — cloud services, Internet of Things (IoT), edge services — and will continue to do so moving forward.
“Teams have been under pressure to ensure network availability is high and as such the team culture is often to limit change, yet all around the network team the demands for agility have increased,” said Mr. Winser. The focus for 2019 and beyond must move to how I&O leaders can help their teams increase the pace of network operations to meet demand. “Part of the answer is building network agility that relies on automation and analytics, and addressing the real skills shift needed to succeed,” said Mr. Winser.
The demands on the network are set to grow with the advent of 5G, increasing cloud maturity, and the explosion in numbers of IoT devices. “These are just a few of the pressures leaders should anticipate — so the critical time frame to deal with this challenge is now,” said Mr. Winser.
Death of the Data Center
Gartner predicts that by 2025, 80 percent of enterprises will migrate entirely away from on-premises data centers, with the current trend of moving workloads to colocation, hosting and the cloud leading them to shut down their traditional data centers.
“I&O leaders must prepare to place workloads based on business needs, not constrained by physical location. From colocation through to public cloud — plenty of alternatives to on-premises data centers exist. Leaders must identify whether there are truly strategic reasons to persist with on-premises needs, especially when they consider the significant amount of investment involved is often amortized over many years,” said Mr. Winser. Preparations must begin now, as the critical time frame for this is 2021 to 2025.
IoT and immersive technologies will drive more information processing to the edge, redefining and reshaping what I&O leaders will need to deploy and manage. The edge is the physical location where things and people connect with the networked digital world, and infrastructure will increasingly reach out to the edge. Edge computing is a part of a distributed computing topology where information processing is located close to the edge, which is where things and people produce or consume that information. It touches on the laws of physics, economy and land, all of which are contributing factors to how and when to use edge.
“This is another trend that does not replace the cloud, but augments it,” said Mr. Winser. “The critical time frame for organizations to adopt this trend is between 2020 and 2023.”
Digital Diversity Management
Digital diversity management is not about people, but rather about the discovery and maintenance of assets that are “out there” in any given modern digital enterprise. “There has been huge growth in the range and quantity of ‘things’ that I&O is expected to know about, be supporting and be managing,” said Mr. Winser. “Traditional asset management is still important, but we’re moving into the realms of involvement with new assets that might have direct effects on the finances, health and welfare of the organization’s customers.” Preparing I&O is vital now, before the critical time frame of 2020 to 2025.
New Roles Within I&O
I&O leaders find that staffing justifications require resolving complex relationships between costs, activities and customer quality expectations. Explaining I&O staffing requirements to IT and business leaders in terms of business value by connecting staffing levels to business performance and strategic objectives is a must in today’s modern digital enterprise.
“For instance, IT is increasingly taking on the role of supporting cloud services in terms of aggregation, customization, integration and governance. A big challenge with cloud services is keeping costs under control, and the business expects I&O to be doing just that. Rather than focusing solely on engineering and operations, I&O must develop the capabilities needed to broker services; these will require different roles to the I&O of old,” said Mr. Winser. The critical time frame for this trend starts immediately in 2019.
Software as a Service (SaaS) Denial
SaaS is software that is owned, delivered and managed remotely by one or more providers. The provider delivers software based on one set of common code and data definitions that is consumed in a one-to-many model by all contracted customers at any time on a pay-for-use basis or as a subscription based on use metrics.
In 2019 and beyond, SaaS will have a big impact on how organizations look at infrastructure delivery strategies moving forward. However, most I&O leaders are still focused on infrastructure as a service (IaaS) and platform as a service (PaaS) solutions. “SaaS itself is becoming a level of complexity that IT shops aren’t yet coping with as they should. The shift to SaaS must be accompanied with I&O support, all the way from ensuring visibility is maintained of what is in use, through to supporting compliance requirements and enterprise integration needs. Leaders must start this now as the pressure will be on through 2021 and beyond,” said Mr. Winser.
Talent Management Becomes Critical
Historically, IT staff have been vertically organized based around the technology stack they managed. As infrastructures go digital, there becomes a need for people to work horizontally across stacks in order to identify and remediate technology work stoppages in their business. Expanding I&O skill sets, practices and procedures to accommodate hybrid operations is of the utmost importance in 2019 and beyond. “In short, talent is the critical ingredient for a modern, high-performing technology organization, and great talent is in high demand. People that show versatility and adaptability are quickly becoming must-haves, particularly in hybrid environments,” said Mr. Winser.
Global Infrastructure Enablement
Despite few infrastructures being truly “global” in nature, organizations must still prepare for the notion of “infrastructure everywhere.” While doing so, I&O leaders must work within the constraints of tight budgets and cost pressures.
One way to tackle this challenge is to wisely choose the network of partners needed for global success. “I&O leaders must look hard at their existing partners and raise the bar of expectation. Can they clearly identify the value the partner will bring to them in the context of global infrastructure? Are they unlocking all the value from recent investments their partners have been making?” said Mr. Winser. “There will be no time for ‘B-team’ partners in 2019 and beyond — I&O leaders must be on top of this trend between 2020 and 2023.”
Two-thirds of organisations intend to deploy 5G by 2020
Sixty-six percent of organizations have plans to deploy 5G by 2020, according to a new 5G use case and adoption survey by Gartner, Inc. Organizations expect 5G networks to be mainly used for Internet of Things (IoT) communications and video, with operational efficiency being the key driver.
“In terms of 5G adoption, end-user organizations have clear demands and expectations for 5G use cases,” said Sylvain Fabre, senior research director at Gartner. “However, one major issue that 5G users face is the lack of readiness of communications service providers (CSPs). Their 5G networks are not available or capable enough for the needs of organizations.”
To fully exploit 5G, a new network topology is required, including new network elements such as edge computing, core network slicing and radio network densification. “In the short to medium term, organizations wanting to leverage 5G for use cases such as IoT communications, video, control and automation, fixed wireless access and high-performance edge analytics cannot fully rely on 5G public infrastructure for delivery,” added Mr. Fabre.
Top Use Cases for 5G
IoT communications remains the most popular target use case for 5G, with 59 percent of the organizations surveyed expecting 5G-capable networks to be widely used for this purpose. The next most popular use case is video, which was chosen by 53 percent of the respondents.
“The figure for IoT communications is surprising, given that other proven and cost-effective alternatives, such as Narrowband IoT over 4G and low-power wide-area solutions, already exist for wireless IoT connectivity,” said Mr. Fabre. “However, 5G is uniquely positioned to deliver a high density of connected endpoints — up to 1 million sensors per square kilometer.”
“Additionally, 5G will potentially suit other subcategories of IoT that require very low latency. With regard to video, the use cases will be varied. From video analytics to collaboration, 5G’s speed and low latency will be well suited to supporting 4K and 8K HD video content,” added Mr. Fabre.
Status of 5G Deployment
Gartner predicts that, by 2022, half of the CSPs that have completed commercial 5G deployments will fail to monetize their back-end technology infrastructure investments, due to systems not fully meeting 5G use case requirements. “Most CSPs will only achieve a complete end-to-end 5G infrastructure on their public networks during the 2025-to-2030 time frame — as they focus on 5G radio first, then core slicing and edge computing,” said Mr. Fabre.
Mr. Fabre added that this is because CSPs’ 5G public network plans vary significantly in timing and scope. CSPs will initially focus on consumer broadband services, potentially delaying investments in edge computing and core slicing, which are much more relevant and valuable to 5G projects.
Gartner advises that, to meet the demands of businesses, technology product managers planning 5G infrastructure solutions should focus on 5G networks that offer not only 5G radio but also core slicing and edge computing infrastructure and services for private networks. CSPs alone may not fully satisfy the short-to-midterm demands of organizations that are keen to deploy 5G quickly.
“Private networks for enterprises will be the most direct option for businesses that want to benefit from 5G capabilities early on,” said Mr. Fabre. “These networks may be offered not only by CSPs but also directly by infrastructure vendors — and not just by the traditional large vendors of infrastructure, but also by suppliers with cloud and software backgrounds.”
The latest research news from analyst firm IDC, covering digitalisation, robotics, AR and VR, servers, storage and cloud.
European spending on technologies and services that enable the digital transformation (DX) of business practices, products, and organizations is forecast to reach $378.2 billion in 2022, according to the latest Worldwide Semiannual Digital Transformation Spending Guide published by International Data Corporation (IDC). DX spending is expected to steadily expand throughout the 2017-2022 forecast period, achieving a five-year compound annual growth rate (CAGR) of 15.1%.
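As a back-of-the-envelope check of these figures (not IDC’s methodology), the standard compound-growth formula lets us derive the implied 2017 baseline from the 2022 total and the stated CAGR:

```python
# Sanity check: given $378.2bn in 2022 and a 15.1% CAGR over the
# five compounding years 2017-2022, the implied 2017 baseline is
# end_value / (1 + rate) ** years.

end_value_2022 = 378.2   # $bn, from the IDC forecast
cagr = 0.151             # 15.1% compound annual growth rate
years = 5                # 2017 through 2022

implied_2017 = end_value_2022 / (1 + cagr) ** years
print(f"Implied 2017 European DX spending: ${implied_2017:.1f}bn")
```

This comes out at roughly $187 billion for 2017, which is consistent with the mid-teens growth trajectory IDC describes.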
Europe is the third largest geography for DX spending, after the United States and China. Four industries will be responsible for nearly 44% of the $256 billion in European DX spending in 2019: discrete manufacturing ($39 billion), process manufacturing ($25 billion), retail ($26 billion), and utilities ($23 billion). For European manufacturers, the top DX spending priority is smart manufacturing. IDC expects the industry to invest more than $27.6 billion in smart manufacturing next year along with significant investments in digital innovation ($8.8 billion) and digital supply chain optimization ($5.5 billion). In the retail space, the leading strategic priority is omni-channel commerce, which translates to nearly $5.0 billion in spending for related platforms and order orchestration and fulfillment. Meanwhile, the top priority for the utility industry is digital grid, which will drive investments of more than $13.6 billion in intelligent and predictive grid management and digital grid simulation.
IDC predicts the largest investments in DX use cases across all industries in 2019 will be freight management ($11 billion), autonomic operations ($7 billion), robotic manufacturing ($8 billion), and intelligent and predictive grid management ($12 billion).
"European manufacturing companies are increasingly adopting innovation accelerator technologies," says Neli Vacheva, senior analyst with IDC's Customer Insights and Analysis Group. "The sector is introducing innovation-enabled production processes, advanced asset and inventory management, and new sales models based on IoT, robotization, artificial intelligence, machine learning, and 3D printing. IoT data utilization efforts have repositioned manufacturers in the value creation chain and transformed entire industrial ecosystems."
"European retailers are also running fast in the DX race, with the aim of gaining a competitive advantage, while the non-DX players are confined to a shrinking addressable market," says Angela Vacca, senior research manager with IDC's Customer Insights and Analysis Group. "European retailers will increasingly leverage technology to renovate their business models, deliver innovative services, and enhance customer experience (CX)."
From a technology perspective, hardware and services will account for more than 78% of all DX spending in 2019. Services spending will be led by IT services ($43 billion) and connectivity services ($25 billion), while business services will post the highest growth (19.8% CAGR) over the five-year forecast period. Hardware spending will be spread across several categories, including enterprise hardware, personal devices, and IaaS infrastructure. DX-related software spending will total $55 billion in 2019 and will be the fastest-growing technology category with a CAGR of 18.1%.
The Worldwide Semiannual Digital Transformation Spending Guide quantifies enterprise spending for 181 DX use cases and 12 technology categories across 19 industries and nine geographies. The guide provides spending data for 31 DX strategic priorities and 68 programs, as well as technology spending by delivery type (cloud, non-cloud, and other). Unlike any other research in the industry, the DX Spending Guide was designed to help business and IT decision makers to better understand the scope and direction of investments in digital transformation over the next five years.
As technology develops and requirements change, it is not uncommon for businesses to find that they have a diverse collection of legacy technology platforms. While these systems may have played key roles in business operations at one time, like any software application they do not age well if not looked after, and as such they can quickly become a burden, becoming costly to maintain and unable to scale.
By Martijn Groot, VP of Product Management, Asset Control.
Whether the result of previous mergers, a failure to keep up with technological advances or a reluctance to budget for development, legacy platforms are certainly not uncommon. Yet, while most businesses recognise the need to update their data management systems, the cost of change and perceived difficulties of integrating their systems with new solutions can act as a deterrent. However, continued use of legacy platforms holds businesses back from reaching their true potential and can severely impact business operations. With this in mind, we look at the true costs businesses are likely to face if they fail to bring their data management capabilities in line with 21st century requirements.
Why failing to update will be costly
Despite the difficulties associated with implementing and integrating new data management platforms, businesses are more likely to be negatively impacted by inertia than by taking action and updating their data management solutions. According to our research, more than a third of financial institutions say that legacy data platforms are the biggest obstacle to improving their data management and analytics capabilities. For financial institutions this is a worrying statistic, as they are heavily reliant on easy access to quality data and analytics to perform their role effectively.
However, this is just one of the issues preventing businesses from taking action, with a further 31% of financial institutions citing the cost of change as the factor holding them back from updating their legacy systems. For these organisations, there is a mistaken idea that it is cheaper, and therefore more cost-effective, to stick with legacy systems than to update them. Yet this doesn’t take into account the indirect costs of legacy platforms and their effects on a business, which can be felt across a number of other areas, including information security and a lack of business user enablement and productivity.
It is crucial that organisations consider how much their current data management systems are holding them back operationally as they typically take longer to carry out processes, delaying data delivery to end-user applications and reports, and lowering productivity. Legacy systems can slow down organisations as they are costly to maintain, miss audit or lineage information, often cannot scale to new volume requirements, and do not quickly and easily provide business users with the data they require. Especially with frequent new regulatory requirements on data quality and due process, the cost of change can be formidable.
Data discrepancies, arising because a business lacks a clear and comprehensive view of its sourcing and validation process, can be another costly by-product of legacy platforms. Similarly, legacy systems present a higher security risk, as they tend to no longer be supported or patched. This leaves business data vulnerable, which can be costly in several ways, including potentially breaching the GDPR or other data regulations.
Moving to insight-driven data management
With investment in the right tools, it is possible for businesses to overcome the challenges posed by legacy platforms. Although the initial cost of implementing a new solution may be daunting, continuing to use outdated solutions is likely to be the riskier move and result in a growing number of indirect costs.
While replacing systems may be met with some reluctance, our research shows that many businesses have a good understanding of what they require from new data management solutions. According to our recent study, when considering new data management and analytics capabilities, firms remain focused on the fundamentals with more than a third citing ease of use and flexible deployment as their top business consideration, while 41% consider ROI to be the biggest factor. Therefore, organisations must deploy data management platforms that deliver this and can be easily integrated with other systems within their business.
Undoubtedly, changing data management strategy and tactics can be a difficult task as it is not only expensive to start again in terms of systems spend, but also with regards to time and resources to undo a solution which is typically deeply embedded. However, complacency is likely to involve much higher risks and costs, with a potentially devastating impact on business continuity, security and reliability. We are finding ourselves in an ever more data intensive environment and this data is under more scrutiny than ever before. Businesses, therefore, must ensure they have the most efficient and effective solutions in place to avoid discrepancies and non-compliance and optimally support the business.
DW talks to Sridhar Iyengar, head of Europe, Zoho, about the three As – AI, automation and analytics.
Please can you give us a brief refresher on the company’s origins and progress to date?
Zoho Corporation was established in 1996 and comprises three operating divisions: ManageEngine, which offers enterprise IT management products; WebNMS, which provides an Internet of Things (IoT) framework and solutions; and Zoho.com, which offers more than 40 cloud business applications serving over 40 million users worldwide. These cloud applications cover the entire spectrum of business software, ranging from sales and marketing software to HR, finance, IT, office, communication and collaboration suites. We are bootstrapped and profitable, with over 6,500 employees in eight office locations worldwide. Our business model does not rely on monetising our customers' data, and we take users' data privacy very seriously.
1. And who are the key personnel?
Zoho is privately held and was founded by Sridhar Vembu, Shailesh Kumar, Tony Thomas and Kumar Vembu. The founders are still active within the organisation. This is supported by a senior management team that has been with the company for close to two decades and which is instrumental in setting up a scalable organisation.
2. Can you summarise Zoho’s SaaS/technology vision/philosophy?
Zoho’s philosophy is to offer compelling solutions with rich functionality that can increase productivity for businesses. This is achieved by using superior technology in a very easy-to-use way and by providing software that supports rich integrations out of the box. This provides tremendous value to businesses that were traditionally used to cobbled-together software components that were either not interoperable, or complex and expensive to integrate.
3. In more detail, can you tell us something about the recent upgrade to Zoho Desk, with the addition of AI and process automation?
Zoho Desk was the industry’s first contextually aware customer support application. Recently, we added a process automation feature called Blueprint. With Blueprint, customer success managers can easily create workflows with drag and drop and connect various important processes. Zoho’s AI-powered engine, Zia (Zoho’s Intelligent Assistant), is now available within multiple products, including Zoho Desk. Zia in Desk is all about increasing the productivity of teams to make them better partners in customers’ success. Zia can now undertake sentiment analysis of incoming tickets and crunch data to detect anomalies in trends. These, along with other AI features, make Zoho Desk one of the most advanced customer support products for businesses.
4. And you’ve added ‘deeper’ analytics and AI to your CRM Plus product?
Yes, CRM Plus makes a perfect platform for analysing data, as it creates a massive data set across the three main front-facing departments of any organisation: sales, marketing and customer support. Zoho Analytics is a module within CRM Plus that’s also available as a separate product. The analytics module helps blend data across sales, marketing and support functions in a way that enables leaders to get visibility into customer journeys across these departmental contexts. This is visually available via dashboards for better decision making.
5. Zoho Analytics – a new departure for the company?
Not exactly. Zoho Analytics is the new face of Zoho Reports. This product has been around for over a decade and was one of the first self-service SaaS BI products on the market. Over the last couple of years, we have made significant improvements that deepen the product’s focus on analytical capabilities, hence the change in name. This is our biggest release for the product in several years. In this release, we have worked on the following:
a. Introduced machine learning and Natural Language Processing (NLP). A business user can now simply ask a question via the ‘Ask Zia’ module, such as “Show me the top 5 selling products in the UK”, and get a visual chart as a response that can be drilled down for analysis. This makes reporting easier for non-technical users, who no longer have to write complex database queries to create reports.
b. For enterprises, we have also released an on-premise version of the Business Intelligence software that can be deployed in the private data centres of mid- to large-sized enterprises.
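To make the ‘Ask Zia’ idea concrete, the sketch below shows, in miniature, how a natural-language question might be reduced to a structured query over tabular sales data. This is not Zoho’s implementation — the data set, the keyword parsing and every function name here are hypothetical, illustrating only the general shape of NLP-to-query translation.

```python
# Hypothetical sketch of NLP-to-query translation: parse a question
# like "show me the top 5 selling products in the UK" into an
# aggregation over rows of sales data. Illustrative only.
import re
from collections import defaultdict

SALES = [
    {"product": "Widget",   "region": "UK", "units": 120},
    {"product": "Gadget",   "region": "UK", "units": 340},
    {"product": "Widget",   "region": "US", "units": 500},
    {"product": "Sprocket", "region": "UK", "units": 60},
]

def answer(question, rows):
    """Parse a 'top N ... in the REGION' question and rank products
    by total units sold in that region."""
    q = question.lower()
    n_match = re.search(r"top\s+(\d+)", q)
    n = int(n_match.group(1)) if n_match else 5
    region_match = re.search(r"in the (\w+)", q)
    region = region_match.group(1).upper() if region_match else None

    totals = defaultdict(int)
    for row in rows:
        if region is None or row["region"].upper() == region:
            totals[row["product"]] += row["units"]
    # Sort by units sold, descending, and keep the top N.
    return sorted(totals.items(), key=lambda kv: -kv[1])[:n]

print(answer("Show me the top 5 selling products in the UK", SALES))
# [('Gadget', 340), ('Widget', 120), ('Sprocket', 60)]
```

A production system would of course parse far richer grammar and then render the result as a chart rather than a list, but the pipeline — parse intent, build a query, aggregate, rank — is the same.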
6. And what does it offer end users?
Previously I mentioned the main features that were added in the new release. We built those features for end users with the framework below in mind:
7. Zoho Social – richer analytics?
Social media is vast and the data flow is largely unstructured. For a business with a keen interest in tapping the potential of social media, it’s imperative to have a tool that connects the dots and helps it make quick decisions. The analytics in Zoho Social help you understand your audience better and give clear visibility into every single post that your team has made on the popular social media platforms. Zoho Social helps firms manage Facebook, Twitter, LinkedIn, Google Plus and Instagram. Apart from this, we have made the lives of marketers easier by integrating Zoho Social with Facebook lead ads and Zoho CRM. Social media teams can now run Facebook ads from Zoho Social and push the leads directly to Zoho CRM with just a click of a button.
8. Analytics seem to be a big focus for Zoho right now – why is that?
Analytics has always been present in individual products, but as our suite becomes more mature and connects more data points, the relevance of analytics grows and we are able to surface information that was once hidden in siloed applications. This is also a response to a growing need from businesses, which have an abundance of information residing in different systems. They need analytics to make decisions. We are simply making sure they have a simpler and smarter way to get things done.
9. And do you think end users are wary of analytics as, thus far, the promise/hype has somewhat exceeded the reality?
Analytics was previously accessible only to a select few power users in companies and, in some cases, only to companies or teams that could afford it. Our aim with Zoho Analytics, and with any analytics module in our products, was to break down the entry barrier by keeping it simple enough that anyone can build dashboards and connect data, while keeping prices reasonable.
Building dashboards that blend data from custom applications and legacy systems is not easy; it’s a complex technical use case to solve. Because every application is built on a different platform and has a different database schema, it’s perhaps this complexity that requires vendors to stay in this field for a long time in order to understand it and develop platforms that help businesses.
10. Can you give us a bit of background on the rest of the Zoho portfolio?
We have briefly touched upon Zoho’s sales and analytics products, and a little of the marketing suite. There are five more products that are part of marketing, covering automation, live chat, event management, website optimisation and a website builder. Apart from these front-office tools, Zoho has an entire range of products to manage the back office. The finance suite has accounting, invoicing, subscriptions, checkout and expenses solutions, and we have an HCM suite, Zoho People, for HR operations. Zoho also has a low-code developer platform called Zoho Creator that has been around for 12 years. Over 2 million businesses are hosted on the Zoho Creator platform, and more than 100,000 developers across the world are building apps on Zoho Creator using Zoho’s own scripting language, called Deluge. Overall, Zoho has over 40 applications in the cloud, with 40 million active users worldwide.
11. More generally, how do you see the SaaS market right now – buoyant, steady or slow?
We see the SaaS market evolving and growing at a rapid pace as more and more businesses feel that it is the right model to deploy and use. This has been demonstrated by the inflow of venture capital into the SaaS business and the emergence and growth of SaaS companies.
12. And what will it take for SaaS to gain major traction – or is it more a question of a steady evolution?
While most small and mid-sized companies readily embrace SaaS, larger companies, or those that operate in heavily regulated industries, are still not ‘fully SaaS’. For these companies, it is a matter of time, as they are gradually moving either to SaaS software or to deploying their in-house applications on publicly or privately available web-scale architectures.
13. How does Zoho compete and/or work alongside the hyperscalers who seem to want to do and own everything it-related?
Zoho’s focus is primarily on providing business applications, or development platforms on which developers can build applications, whereas the hyperscalers are focused on the infrastructure layer.
14. Presumably, customer service is an area of potential major differentiation – the speed and agility with which customers are helped?
Customer support is one area in which Zoho takes pride. We offer very high-quality technical support through various channels – email, phone, chat, social, in-app, web, etc. Our online forums and communities, with educational content, keep customers engaged with the latest product developments and our vision for the product’s direction.
15. Any other thoughts on the SaaS market?
The SaaS market may also witness consolidation in some overcrowded categories that contain many players. This may result in larger players acquiring smaller niche ones, but integrating the acquired products may be a challenge and may not work well for customers. Zoho is ideally positioned, as we offer a very diverse and broad portfolio of products that are well integrated not only with each other but also with third-party software.
16. Before finishing – can you give us some background on the company’s footprint to date – Channel and/or direct sales?
Zoho operates in a dual mode with both direct sales and channel-driven sales. As we are a privately held company, we are not in a position to disclose financial figures.
17. And how do you see this developing over time?
We are actively recruiting and nurturing our network of channel partners, who resell our software and offer services to customise it or build on top of it.
18. Finally, what can we expect from Zoho during 2019?
A lot more focus on AI, automation and analytics, which can increase productivity for businesses.
There is much talk about the potential of Artificial Intelligence (AI) and Machine Learning (ML), especially in the Business Intelligence (BI) and Analytics arena.
By Shawn Deegan, General Manager EMEA, Yellowfin.
The most powerful examples revolve around the ability of machines to spot patterns and trends within huge volumes of data. CIO Magazine spotted the trend towards better data signals last year, saying: “Many BI tools are bringing in more, and better, data signals to produce more accurate, insightful reports that blur the distinctions that traditionally separated BI from more advanced analytics. As such, BI vendors will need to advance, or risk losing out in the market.”
If you visit any event in the BI space, every leading vendor is talking about their approach to AI and ML. At Yellowfin, we’ve made it possible to cut through the deluge of data by using well-understood algorithms along with newer machine learning elements to surface insights and deliver practical benefits. Whatever tool you look at, you need to consider how users across areas such as sales, marketing, and human resources can maximise their BI to uncover valuable business information without the need for deep analytical skills.
For many organisations, the dashboard is what most people associate with their Business Intelligence tool. Charts and graphics alongside a traditional ‘traffic light’ approach offer a variety of methods for visualising your data specifically for monitoring a known set of metrics over time. However, business users increasingly need to discover new insights, rather than just monitor existing ones, and this is where dashboards tend to fail.
Dashboards were originally designed as a monitoring tool and not as a tool for data discovery. And with the increasing amount of data that sits in silos within every business, and the ever increasing number of applications available now, it is almost impossible to condense it all on to one page. As a result, the data is aggregated. For the business user, the dashboard offers essential top line numbers but gives little visibility into what's happening in the underlying business. In order to get into the detail, users need to feel confident at using filters and drill-down. So dashboards put the onus of discovery back onto the user by making them slice and dice the data themselves to find insights. However, this gives rise to several problems.
The discovery of insights becomes less predictable, and requires the user both to master the skill of manually drilling down into data and to have the time to do so. As the data may not change every day, the user has to repeat the data discovery process to check for changes. Not only is this a waste of time, it can lead to ‘analysis fatigue’, with the result that users may miss a change when something significant actually happens.
To counter this, businesses often start to use workbook solutions where users load data onto the platform, and then automation helps them build a better dashboard. Often the data is loaded manually so it’s not live and if a business user doesn’t have access or knowledge of the backend, then they must rely on data analysts to keep the underlying information updated. Once the data is loaded, the platform then runs basic algorithms that make layout suggestions based on the data. However, the platform only looks at the data at that particular point in time which means that analysts must constantly refresh the data.
More advanced solutions have started to allow users to ask a question and the application then analyses the data and tries to find an answer. However, this assumes that the user knows what questions to ask the platform – if a user has seen an outlier or a spike it may trigger them to question it. The platform doesn’t do the heavy lifting of finding outliers for the user. Instead, the platform just automates the existing data discovery process, but the onus of understanding what questions to ask still sits with the user. That’s why this approach is used mostly by data analysts rather than business users.
Automate to innovate
Yellowfin takes automated analytics much further. Yellowfin Signals applies machine learning along with more advanced algorithms to discover critical variances in business data. It runs continually in the background and surfaces statistically significant changes, alerting people of the changes relevant to their role. When it finds something, it analyses other data sources to find correlated information, then sends an alert highlighting the pattern so all recipients can determine if the data is actionable.
Signals runs on live data, so analysts don’t have to manually load data into a workbook before analysis can be performed. The underlying algorithms also tailor the notifications of changes to users’ roles, using information from each user’s interactions, and the interactions of those who hold similar roles, with previous signals. Something that may be insignificant to the CEO could be very important to someone further down the organisation. This level of nuance is effectively a learning process that gets more powerful as the system runs within an organisation. As it’s automated, it can never suffer from analysis fatigue or human bias, which are common issues within traditional dashboard uses.
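The core idea of surfacing statistically significant changes can be sketched in a few lines. The following is not Yellowfin’s actual algorithm — it is a minimal, assumed illustration using a rolling baseline and a z-score threshold, which is one common way to flag deviations in a metric stream.

```python
# Minimal sketch of automated change detection on a metric stream:
# flag any value that deviates from its trailing window's mean by
# more than `threshold` standard deviations. Illustrative only --
# real products use richer statistics and per-user relevance models.
from statistics import mean, stdev

def detect_signals(series, window=7, threshold=3.0):
    """Return the indices of statistically significant changes."""
    alerts = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # sigma > 0 guards against a perfectly flat baseline.
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

daily_orders = [100, 98, 103, 101, 99, 102, 100, 97, 250, 101]
print(detect_signals(daily_orders))  # [8] -- the 250 spike is flagged
```

Because the check runs on every new data point automatically, no analyst has to remember to re-slice a dashboard: the anomaly finds the user rather than the other way around, which is the property the article attributes to Signals.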
Automation technology is not just making it easier for users to gain more insights, faster; it is ultimately helping to alleviate the skills shortage in the industry. Data analysts and scientists command premium salaries and there simply aren’t enough people to fill demand.
The adoption of new technologies like Signals is at the start of the curve but growing rapidly. For organisations that have reached the limits of the usefulness of dashboards or that are struggling with hiring and retaining a growing army of data analysts, a change is coming.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 1.
Shrinking electronics trump AI, 5G and the IoT as top design trend for 2019
New research from 6SigmaET reveals the top electronics design trends for 2019 and their thermal implications.
Making devices smaller and more compact will be the number one design priority for electronics engineers in 2019. That’s according to a recent poll by thermal simulation software provider Future Facilities, conducted at this year’s Engineering Design Show.
The research, which surveyed over 100 electronics professionals and engineers, found that 33% believe that shrinking device sizes will have more of an impact on the electronics industry than new technology developments such as 5G, artificial intelligence (AI) and the internet of things (IoT).
Future Facilities’ research identified the top trends defining electronic design in 2019 as:
Other trends, such as the addition of 5G capabilities and making devices more computationally powerful didn’t rank highly on the survey. Making devices greener and more sustainable ranked lowest in the poll, with a mere 2% agreeing it was their top priority for 2019.
As devices get smaller and new functions are added, Future Facilities predicts a host of new thermal challenges for design engineers.
As Tom Gregory, Product Manager at 6SigmaET explains: “Across all five of the top trends shaping electronic design in 2019, thermal management represents a major challenge. As devices become smaller and components are pushed together, the potential for thermal complications, and the requirement for innovative solutions, grows significantly. At the same time, new technologies such as AI and the IoT are also bringing their own unique challenges and a host of new hardware requirements.
“Despite these concerns, previous research shows that many designers still aren’t taking thermal considerations seriously enough, often failing to run simulations and not identifying thermal complications until after a design is complete. If engineers are going to successfully develop the complex designs needed and fully capitalise on the new technology trends, this approach needs to change.”
How the Unified Communications industry will develop in 2019
Comment from Bryan Martin Chairman and CTO at 8x8.
First off, I expect the migration of enterprise and mid-market customers from legacy, on-premise solutions to pure cloud solutions to accelerate. I expect you will continue to see service providers in our space follow 8x8’s pioneering lead in combining all business communications, including UC voice, video, conferencing, collaboration, chat, contact centre, call recording and quality management, onto a single platform. We have already seen several contact centre acquisitions in the UC space as other players try to catch up to our vision that UC and contact centre should never have developed as separate products. They belong together on a single platform, not only so that every interaction and conversation anywhere in the enterprise is captured globally for analytics and business improvement, but also because a single platform drives simplicity and ease of use for both the business administrator and the end user. As we continue to roll out 8x8’s X Series services to thousands of businesses, we are seeing the simplicity angle resonate as strongly as the global analytics use case, especially with mid-market customers.
The impact of AI on the industry by the end of 2019
ML and AI are helping 8x8 derive context to the tons of data that flow from our global cloud services every day. We are in the business of making our customers’ data available to them in a form that can help them improve how their own business operates and engages their customers. We are doing this by enhancing the customer experience, providing recommendations on possible anticipated needs and recommending intelligent actions and next steps in the customer experience. This extends beyond the contact centre, because customers do not always communicate with a business by interacting with the contact centre. Which is why a single platform of engagement, that can capture any and all interactions across any site or channel is so important as the input to these AI/ML datasets. Businesses need to understand that having an incomplete communications dataset will yield incomplete answers from their AI/ML platforms. All interactions anywhere in the enterprise (even between employees) need to be part of the input to the AI/ML platform.
The major changes in the customer service industry specifically
Beyond a continued migration from on-prem to the cloud, and a tearing down of the walls that separate the traditional contact centre from the rest of the business, we see a whole class of data analytics technologies coming. These will allow a contact centre manager or agent to enrich (beyond basic caller or chat ID) their view of who exactly is on the other end of the interaction and what that person’s relative value is, to anticipate why this person is contacting the contact centre, and then both to route or escalate the interaction optimally and to provide the receiving agent with the right context and breadth of longitudinal data to serve that customer better. These capabilities will move from the lab to the real world, creating a solid business case for contact centres to move to cloud-based platforms. Staying on-premise will no longer be an option; it will be contact centre malpractice to stay with legacy, on-premise systems.
Network evolution in 2019 – a year of challenges and opportunities for service providers
Thoughts from Anthony Webb, Vice-President EMEA, A10 Networks.
Observing the evolution of global networks over the course of my career has always been fascinating. Our fast-paced, innovative industry has transformed the world, ushering in the age of mass communication that touches every part of our lives. Most people don’t give a second thought to how we keep networks in operation and evolve them to meet new demands, and that’s exactly how it should be. But for those of us at the coalface, the coming year is looking very exciting for the industry. Here’s my take on the key topics we’ll see ruling the agenda for 2019.
5G reality strikes
No surprise here that high consumer expectations and voracious demand for new services will drive the rollout of 5G throughout 2019. This is the shape of the future, and we’ll see all stakeholders in the industry sharing roadmaps and innovations that will whet our appetites for things to come. The pilot projects that A10 Networks has been closely involved in have shed light on some of the new challenges that this major advance in infrastructure is going to create for service providers. We’ve learned that 5G demands are inherently different from those of 4G, with smaller packet sizes, more throughput and higher concurrent session counts observed. That means service providers will need the ability to scale to billions of concurrent sessions with high throughput capacity.
As might be expected, core infrastructure security for 5G is a whole new and much bigger ballgame. The approach will need to be more advanced, automated and scalable to guarantee uptime and meet new requirements for control plane and user plane security. Alongside innovations in firewall and DNS protection, we’ll see a new generation of AI and machine learning-based security solutions brought into play to mitigate the increasing DDoS risk created by the proliferation of IoT devices and the resulting expanded attack surface.
Low latency, low TCO and high reliability will all be crucial factors for 5G success. Competition in this market will be fierce, so there’ll be focused investment in technology that delivers robust, secure and cost-effective networks.
Historical investment protection
With all the buzz around 5G we shouldn’t forget that it won’t happen overnight. Network evolution takes time and service providers still have a considerable investment in legacy infrastructure that needs to be maintained, and boosted, to cope with growing demand through the transition period. We’ve been seeing this over several years, as demand has skyrocketed and new entrants have started running advanced services over existing networks. Full network upgrades are cost-prohibitive, so service providers are having to invest in products that can make what they’ve already got work more effectively.
Take IP addresses as a case in point. The pool of IPv4 addresses is being exhausted by the explosion in volumes of IoT devices. The expectation was that everyone would move to IPv6 to solve the problem, but that’s effectively an upgrade and we haven’t seen service providers going full throttle down that route. Instead they’re adopting technology like our Carrier Grade Networking (CGN) solutions that helps them preserve their investment, and get the best out of the technology they have today by extending the lifecycle of IPv4, and carefully managing an effective and affordable transition to IPv6 over the longer term to underpin business continuity.
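The mechanism that lets CGN stretch the IPv4 pool is address-and-port sharing: many subscribers sit behind one public address, distinguished by the translated source port. The toy class below is an assumed illustration of that bookkeeping, not A10’s product; real carrier-grade NAT (e.g. NAT444 deployments) adds port blocks, timeouts, logging and much more.

```python
# Rough sketch of the core idea behind Carrier-Grade NAT: many
# private (subscriber) addresses share one public IPv4 address,
# with each flow distinguished by a unique translated source port.
# Illustrative only -- production CGN is vastly more involved.

class CarrierGradeNAT:
    def __init__(self, public_ip, port_range=(1024, 65535)):
        self.public_ip = public_ip
        self.next_port = port_range[0]
        self.max_port = port_range[1]
        self.mappings = {}  # (private_ip, private_port) -> public_port

    def translate(self, private_ip, private_port):
        """Map a subscriber's source endpoint to a shared public one."""
        key = (private_ip, private_port)
        if key not in self.mappings:
            if self.next_port > self.max_port:
                raise RuntimeError("public port pool exhausted")
            self.mappings[key] = self.next_port
            self.next_port += 1
        return (self.public_ip, self.mappings[key])

nat = CarrierGradeNAT("203.0.113.5")
print(nat.translate("10.0.0.1", 51000))  # ('203.0.113.5', 1024)
print(nat.translate("10.0.0.2", 51000))  # ('203.0.113.5', 1025)
```

Even this toy version shows both the appeal and the limit of the approach: one public address serves tens of thousands of flows, but the port pool is finite, which is why CGN is a bridge to IPv6 rather than a replacement for it.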
Overall, I expect to see a measured migration and network evolution. Rip and replace is simply too costly an option and service providers need to think carefully about how best to beachhead into profitable 5G services while maintaining existing infrastructure.
Security gets smarter and more pragmatic
Cybersecurity will continue to be a defining challenge as we move into 2019. There’s a noticeable shift, though, as defence techniques and philosophies mature. We’re moving away from the fortress mentality of building a wall and defending it at all costs, to acceptance that the battle for the network perimeter has pretty much been lost and a more pragmatic approach is necessary. Cybercrime is big business and we know that resourceful adversaries are succeeding in penetrating our networks, so now the focus is on building resilient networks that are capable of rapidly identifying and eliminating malicious activity when it occurs.
The threat of DDoS attacks is still ever-present, and attacks are now so easy and cheap to launch that they will undoubtedly continue to intensify. Protecting against DDoS is now simply a cost of doing business, so organisations want to ensure that they’re getting the best protection for their investment. This means looking for protection that is automated, intelligent and scalable, but which also achieves this at a low TCO.
A shift in the business model for network investment
As service providers maintain existing networks and build out new capabilities, I believe we’ll see a reversing of the trend for deferring capital expenditure in preference for operational expenditure that has dominated in recent years. Organisations used to be keen to reduce CapEx and accept higher OpEx but, when you know you’re in a major upgrade cycle, that approach ties you into existing technology and simply pushes problems further out into the future, building up a technology debt that will eventually need to be repaid. With the speed of innovation in today’s market, that’s a risk that many service providers are starting to question.
Service providers are now thinking further ahead, aiming to futureproof their networks by spending a bit more upfront in a bid to reduce that long tail of operational expenditure further down the line. Solution providers who can offer licensing and contract flexibility should reap the benefits of this change in approach.
The coming year promises to be incredibly exciting for all of us involved in the network industry. Overall, I predict that it will be a balancing act between looking forward to 5G and all the opportunities it offers, while handling the challenges of supporting existing networks and developing security strategies to defend against increasing threats. One thing’s for sure, it won’t be boring!
Deep Learning developments
Stathis Vafeias, Head of Machine Learning at Biometric security experts AimBrain comments:
“2018 has seen AI and Machine Learning come into sharp focus. However, organisations have focused thus far on creating new features based on machine learning; what we are now beginning to see is the application of this technology to old processes such as network load balancing, and just how much AI-powered abilities can enhance the current methods and protocols. This is a move that shows the technology is really embedding itself, as it begins to be treated more like a regular software tool rather than an exotic destination or elusive panacea.
“That said, we have not seen the potential of deep learning come anywhere near exhaustion. Deep learning networks are getting bigger every year, as industry and academia continue to push the boundaries of the amount of data and computation that networks can consume, without adverse effect on the success of current algorithms.
“Researchers have long realised that annotated data (labelled data – for example, onboarding records known to be fraudulent) cannot be scaled. Because of this, we expect to see a lot more effort put into unsupervised learning (where algorithms draw inferences from datasets comprising input data without labels), self-supervised learning (a form of unsupervised learning in which part of the data is withheld and models are trained to predict it) and transfer learning (an approach in machine learning where models trained on one task are reused as a starting point for a different task – usually one that is harder to learn, or has less data available).
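The self-supervised idea described here can be sketched in a few lines: withhold part of some unlabelled data and train a model to predict it. Everything below (the synthetic data, the linear model) is an illustrative assumption, not a description of AimBrain’s methods.

```python
import numpy as np

# Toy self-supervised setup: withhold one column of unlabelled data and
# train a model to predict it from the remaining columns.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
X[:, 3] = 2.0 * X[:, 0] - 1.0 * X[:, 1]   # hidden structure to recover

inputs, target = X[:, :3], X[:, 3]        # the withheld part becomes the label
w, *_ = np.linalg.lstsq(inputs, target, rcond=None)

mse = np.mean((inputs @ w - target) ** 2)
print(round(mse, 6))  # 0.0 - the model recovered the hidden structure
```

The same withhold-and-predict pattern underlies masked-language-model pretraining, just at vastly larger scale.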
“We are now seeing a truly symbiotic relationship between banks and fintechs or other startups that combine security with AI and machine learning. It is a relationship that complements; fintechs and startups can focus on continued research and development into AI tools, and banks benefit from the work that these specialist companies undertake - both with other financial services clients and beyond into wider verticals. This expedites results, as startups keep reinventing ways in which fraud is detected, isolated and immobilised across multiple industries, and can apply these to financial services clients.”
Bots try to break the internet, and other trends for 2019
Jay Coley, Senior Director of Security Planning and Strategy at Akamai Technologies, offers some observations.
Staying one step ahead of cybercriminals is crucial when it comes to protecting company and customer data. But this is only possible if you have a good hold on short and long-term cybersecurity trends.
So, what does 2019 have in store? Smarter bots, complex clouds, IoT risks and data regulations will all dominate boardroom conversations. Here’s a summary of the trends that I think will make the year ahead as turbulent as the one just passed:
1. Cyber-attacks will grow - and go slow
Organisations will see an increase in cyberattacks but these will be “low and slow”, rather than “noisy” incidents such as DDoS attacks. Launched by botnets, “low and slow” attacks aim to remain under the radar for as long as possible, to steal as much data as they can. Often these take the form of credential stuffing attacks, where stolen credentials are used to access associated accounts and steal further personal data such as addresses and payment details. To protect themselves, businesses will need to adopt bot management solutions, which identify, categorise and respond to different bot types. The technology uses behaviour-based bot detection and continuous threat analysis to distinguish people from bots.
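A minimal sketch of the kind of signal a bot management tool might use against credential stuffing: rather than flagging high request rates (which “low and slow” attacks avoid), it counts how many distinct accounts a single source attempts. The detector, threshold and event format are hypothetical, not Akamai’s implementation.

```python
from collections import defaultdict

# Hypothetical "low and slow" detector: flag source IPs that attempt logins
# against many DISTINCT accounts - a typical credential-stuffing signature
# even when the request volume per account is tiny.
def flag_credential_stuffing(events, account_threshold=3):
    accounts_per_ip = defaultdict(set)
    for ip, account, _succeeded in events:
        accounts_per_ip[ip].add(account)
    return {ip for ip, accounts in accounts_per_ip.items()
            if len(accounts) >= account_threshold}

events = [
    ("10.0.0.1", "alice", True),       # normal user retrying one account
    ("10.0.0.1", "alice", True),
    ("203.0.113.9", "alice", False),   # one IP quietly probing many accounts
    ("203.0.113.9", "bob", False),
    ("203.0.113.9", "carol", False),
]
print(flag_credential_stuffing(events))  # → {'203.0.113.9'}
```

Real products combine many such behavioural signals with continuous threat intelligence rather than a single threshold.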
2. Bots will overtake human web traffic
As bots become more sophisticated, more than 50% of web traffic will come from bots. Already, Akamai has found that 43% of all login attempts come from malicious botnets – and this is set to increase as credential stuffing and “low and slow” attacks grow in popularity. More sophisticated bots will become capable of accurately mimicking human behaviour online – making it harder for bot solutions to detect and block their activities. Effective bot management tools are crucial for addressing this threat. They are able to use contextual information, such as IP addresses and past user behaviour data (neuromuscular interaction), to determine whether a visitor is a bot or human and respond accordingly.
3. Multi-cloud strategies will complicate security
Businesses adopting multi-cloud strategies will face increasingly complex challenges to ensure that security is consistently, and effectively, deployed across them all. With Gartner predicting that multi-cloud will be the most common cloud strategy next year, organisations that have successfully secured one cloud will need to replicate this across all their cloud portfolio to ensure that vulnerabilities are patched and nothing slips through the cracks. With many businesses already experiencing ‘leaks’ or breaches of their single-vendor solutions, we expect companies to seek out cloud-agnostic security solutions to simplify deployment and management across the enterprise.
4. Consumers will continue to put convenience ahead of security
Even though awareness of the insecurity of IoT devices is growing, millions of consumers will continue to ignore the risks, purchasing and using devices that lack comprehensive security solutions – from fitness trackers to smart-home appliances. This could swell the armies of bots, which are already being used to target enterprises. It’s predicted that by 2020 more than 25% of identified enterprise attacks will involve the Internet of Things (IoT), though IoT will account for only 10% of IT security budgets. While some governments have begun to introduce security standards for connected devices, the industry is still a long way from adequately securing its devices.
5. Asian markets will follow cybersecurity suit
Following the launch of GDPR last May, as well as PSD2 (revised Payment Services Directive) and wider security reform, the European Union has been a leader in advocating for stronger cyber regulations and this is likely to continue. Some Asian countries have already started to follow suit, implementing their own regulations, and we expect their number to grow in 2019. As countries such as China flex their muscles as digital rivals to the West, issues around data regulation and protection are climbing government agendas. Notably, some Asian countries have resisted data regulations in the past, but high-profile breaches are encouraging a more proactive approach to data regulations.
6. Cybersecurity will be replaced by cyber resilience
In 2019, smart organisations will stop thinking of cyber security as a separate function of the IT department, and instead adopt it as a posture throughout the entire business. Known as “cyber resilience”, this concept brings the areas of information security, business continuity and resilience together and intends to make systems secure by design, rather than as an afterthought. This helps organisations focus on their ability to continuously deliver business operations in spite of any cyber-attacks or incidents.
7. Zero Trust will march towards killing off corporate VPNs
For years, virtual private networks (VPNs) have been the mainstay of remote, authenticated access. However, as applications move to the cloud, threat landscapes expand and access requirements diversify, the all-or-nothing approach to security needs to change. Zero Trust, where each application is containerised and requires separate authentication, is stepping in to provide security fit for the 21st Century. In 2019, companies will increasingly turn to a cloud framework for adaptive application access based on identity, and cloud-based protection against phishing, malware and ransomware, helping to improve the user experience and sounding the death knell for VPNs.
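The per-application authentication idea at the heart of Zero Trust can be illustrated with a toy gateway check. The token and policy structures here are invented for illustration; real deployments use signed identity tokens and a central policy engine.

```python
# Zero Trust sketch: every request to every application must present a valid
# identity and pass a per-application policy check - there is no network-level
# trust. Tokens, users and policies below are illustrative assumptions.
POLICIES = {"payroll": {"alice"}, "wiki": {"alice", "bob"}}
TOKENS = {"t-alice": "alice", "t-bob": "bob"}

def authorize(app, token):
    user = TOKENS.get(token)                 # authenticate on every request
    if user is None:
        return False
    return user in POLICIES.get(app, set())  # per-application authorisation

print(authorize("wiki", "t-bob"))     # True
print(authorize("payroll", "t-bob"))  # False: bob isn't entitled to payroll
```

Contrast this with a VPN, where passing one perimeter check grants reachability to every application on the network.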
8. Blockchain technology will move from cryptocurrencies to mainstream payments
Today, most people associate blockchain with cryptocurrencies and the less-legitimate end of online payments. However, in 2019, blockchain-based payment networks will properly make it into the mainstream as they enable next-generation payment transactions to evolve rapidly. The inherent security built into blockchain can streamline the online payments process, reducing friction, increasing speed and improving the user experience. In the coming year, we expect to see more and more blockchain-powered payment platforms, with high scalability and speed, being adopted by brand-name banks and consumer finance companies.
“First-generation AI solutions were simple – data in, answer out. Solutions were designed to protect the average end user from confusion and distraction. While black box solutions serve their purpose, they also limit the value organisations can extract by hiding the AI’s logic – logic which, in theory, could be used to teach humans what the system learned in arriving at its recommendations.
“In 2019, we’ll see more organisations move to glass box AI, which exposes the connections that the technology makes between various data points. For instance, glass box AI not only tells you there is a new retail opportunity, it also uncovers how that opportunity was identified in the data. It also provides retailers with an opportunity to check their data – and any public or aggregate data they pull in – to ensure AI isn’t making bad assumptions under the adage “garbage in, garbage out.”
“This may sound more complex, because it is. Even with next-gen UX that simplifies the integration of AI into processes and workflows, retailers must invest in educating employees to make the most of these tools. But if we’ve learned anything in the last decade, it’s that data-driven insights aren’t a passing fad.”
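The glass box idea lends itself to a small sketch: a linear scorer that returns not just a recommendation score but the per-feature contributions behind it. The feature names and weights are invented for illustration only.

```python
# "Glass box" sketch: a linear scorer that exposes per-feature contributions
# alongside its recommendation, instead of a bare data-in, answer-out result.
features = {"footfall": 0.8, "basket_size": 0.3, "returns_rate": -0.5}
weights  = {"footfall": 2.0, "basket_size": 1.5, "returns_rate": 3.0}

contributions = {name: weights[name] * value for name, value in features.items()}
score = sum(contributions.values())

print(round(score, 2))                            # 0.55 - the recommendation score
print(max(contributions, key=contributions.get))  # footfall - what drove the call
```

Exposing the contributions is also what lets a retailer spot a bad input: a wildly dominant contribution from one feature is a prompt to go and check that data.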
“Since consumers leave a virtual crumb trail, it’s relatively easy to develop a good understanding of ecommerce shoppers’ preferences and make smart recommendations based on gathered data. As a result, online personalisation has paid off. However, this success is much harder to replicate in physical settings where in-store technologies such as facial recognition and phone sniffing have a creepy factor.
“All of the personalisation that retailers have implemented to-date has been focused on generating insights based on observed behaviour, and delivering those insights in an “impersonal manner,” by relying on technology to make product recommendations or order search results. Store personalisation has to be delivered in the context of the physical location and the employees at that location. Retailers have to figure out how to turn personalisation insights into actions that store employees can deliver in a way that is not creepy – 2019 will see both retail #fails and some new successes, as retailers get better at translating the technology of personalisation into the store environment.”
1) Major public cloud providers will differentiate
In some ways 2019 will be business as usual for the major public cloud providers. Their IaaS model is well-entrenched, and any new technology which drives CPU and RAM consumption on their platforms will be more than welcome.
However, with multi-cloud becoming the norm, the major players will have to prove their worth more than ever before in an enterprise’s IT strategy. Price wars between the leaders have proven to result in a race to the bottom, so instead they’re looking to differentiate. Expect to see Google focus on its AI credentials, Microsoft on its workload migration capabilities, and Amazon to continue pushing AWS hard in the public sector space.
This differentiation will be important for the incumbents as it could be the year they come under increasing pressure from other public cloud players. IBM’s acquisition of Red Hat is a signal of potential competition, while eastern players such as Alibaba and Tencent continue to improve their capabilities locally, undoubtedly with an eye on international expansion.
2) Kubernetes will be used as the industry standard to manage containers at scale
The value of containerisation has been almost universally realised, and adoption of Kubernetes has almost hit a peak rate, making it essentially an industry standard. The challenge in 2019 will be to progress beyond the early adoption of containers towards using them to manage deployments at larger scales.
The ubiquity of Kubernetes will likely drive consolidation within the container market. While open source development is beneficial overall, the sheer number of Kubernetes distributions currently on the market will be hard to sustain. As larger vendors look to improve and differentiate their own distributions with specific capabilities, expect smaller niche players to be snapped up for their expertise.
In general, we are likely to see computing continue to break down into smaller units, adding to the competitive landscape between cloud providers. Technologies such as serverless computing will likely play an increasingly important role over the next twelve months. We expect to see the beginnings of open source serverless solutions start to compete for broad acceptance in the developer community, which will shape the future of that technology ecosystem for years to come.
3) The continued importance of open source
Challenges still remain for businesses when attempting to deploy cloud computing effectively, especially as complexity increases due to demands for multi-cloud strategies. It seems likely that the trend of creating Lego-like building blocks to help standardise and bring more order to cloud stacks (such as Kubernetes, OpenStack etc.) will continue and even increase.
This is true as businesses realise the benefits of the CI/CD (continuous integration/continuous delivery) approach which the current fast-moving pace of cloud innovation demands. Using open source technology means less disruption with each new implementation, and allows dev teams to fine tune new software more effectively. The commercial benefits of this approach are also clear, as it allows teams to more readily pinpoint changes in productivity as a result of software implementations.
How Artificial Intelligence (AI)-enabled technology will enhance our lives and the way we work is a subject that constantly captures our imaginations. Since 2000, there’s been a 14-fold increase in the number of AI startups, according to a Stanford University study, while in the UK, AI developers saw a venture capital funding increase of more than 200 per cent in the past year. Its potential to change business is seemingly limitless.
By Carmine Rimi, AI Product Manager at Canonical - the company behind Ubuntu.
However, creating these applications to improve our lives and businesses is no mean feat. In fact, AI-enabled applications are extremely hard to build. Large AI programs are difficult to design and write as they involve many types of data. Porting them from one platform to another tends to be troublesome.
What’s more, several steps are required at each stage to begin building even the most rudimentary AI application, each requiring different skills. Feature extraction, data collection, verification and analysis, and machine resource management make up the majority of the codebase necessary to support a comparatively small subset of actual ML code.
Keeping it contained
There’s a lot of work to do to get to the starting line, as well as a large amount of ongoing effort required to keep the applications current. Fortunately, Kubernetes -- the open-source platform that automates the deployment and management of containerised applications, including complicated workloads like AI and machine learning -- can be a facilitator.
Containers provide a compact environment for processes to run. They’re easy to scale, they’re portable across a range of environments – from development to test to production – and they therefore enable large, monolithic applications to be broken into targeted, easier-to-maintain microservices.
Kubernetes has had a meteoric rise as a container orchestration platform. The vast majority of respondents to a Cloud Native Computing Foundation survey say they are increasing their use of Kubernetes across a variety of development stages. Forrester stated recently that “Kubernetes has won the war for container orchestration dominance and should be at the heart of your microservices plans.”
The go-to for AI
Kubernetes and AI represent converging trends. Most companies are running (or plan to move to) Kubernetes as a platform for their workloads, and AI is an increasingly important workload.
Kubernetes is ideal for the job because AI algorithms must be able to scale to be optimally effective. Some deep learning algorithms and data sets require a large amount of compute. Kubernetes helps because it is all about scaling based on demand. It also provides a way to deploy AI-enabled workloads over multiple commodity servers across the software pipeline while abstracting away the management overhead. Once the models are trained, serving them in various deployment scenarios, from edge compute to central datacenters, poses challenges to non-containerised forms of applications. Again, Kubernetes can provide the necessary flexibility for a distributed rollout of inference agents on a variety of substrates.
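The “scaling based on demand” behaviour mentioned above follows the formula Kubernetes’ Horizontal Pod Autoscaler documents: desired replicas = ceil(current replicas × observed metric / target metric). A minimal sketch, with made-up utilisation figures:

```python
import math

# Toy version of the scaling decision the Kubernetes Horizontal Pod
# Autoscaler documents: desiredReplicas = ceil(current * observed / target).
# The real HPA also applies a tolerance band; the figures here are made up.
def desired_replicas(current, observed_util, target_util, max_replicas=10):
    want = math.ceil(current * observed_util / target_util)
    return max(1, min(want, max_replicas))   # clamp to the allowed range

print(desired_replicas(4, observed_util=0.75, target_util=0.5))  # 6
print(desired_replicas(4, observed_util=0.1, target_util=0.5))   # 1
```

For a training job, the observed metric might be GPU utilisation or queue depth rather than CPU; the shape of the decision is the same.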
As organisations increasingly shift their attention to AI to decrease operating costs, improve decision-making and serve customers in new ways, Kubernetes-based containers are becoming the go-to technology to help enterprises adopt AI and machine learning. In December of last year, the Kubernetes project introduced Kubeflow, “dedicated to making deployments of machine learning workflows on Kubernetes simple, portable and scalable.” While Kubernetes started with just stateless services, the project said, “customers have begun to move complex workloads to the platform, taking advantage of rich APIs, reliability and performance provided by Kubernetes. One of the fastest growing use cases is to use Kubernetes as the deployment platform of choice for machine learning.”
At the beginning of 2017, among the three major public cloud vendors, only the Google Cloud Platform supported Kubernetes, with its Google Kubernetes Engine. By the end of the year, they were all on board, after Microsoft added Kubernetes support to the Azure Container Service and Amazon debuted the Amazon Elastic Container Service for Kubernetes. Now, the uses for Kubernetes and ways it’s being deployed by companies is seemingly limitless.
That’s a lot of activity in a relatively short time span, and it shows the extent to which tech vendors and their customers are rallying around the notion that containers offer huge benefits in developing and managing AI components of applications.
The rise of AI is helping fuel the growing interest in containers as an excellent way to bring repeatability and fault tolerance to these complex workloads. And Kubernetes is becoming a de facto standard for managing these containerised AI applications. It’s a wonderful match that should dramatically benefit enterprises for some time to come.
In the race to remain competitive, many organisations rush to procure and bolt on new technology systems to legacy IT architectures, making the infrastructure much more complex, inflexible and costly to operate. Short of the cost-prohibitive ‘rip and replace’ approach, this is the way the vast majority of enterprise data centres have been evolving. The result is a rigid environment of disparate, layered systems, lacking in synergy and effective cohesive functioning. This is hardly an ideal scenario for today’s enterprise, keen to evolve into the efficient and agile ecosystem that digital transformation promises.
By Sazzala Reddy, CTO and Co-founder, Datrium.
In a bid to simplify such progress-impeding complexity, convergence – in the form of Hyperconverged Infrastructure (HCI) and Converged Infrastructure (CI) – brings assets such as storage, networking, processing and other functional appliances together, to be managed under one unified system.
When bringing in any new technology, it is vital to consider its impact on the revenue-related, mission-critical systems upon which the business relies. So how does the convergence solution impact mission-critical application availability, scalability and performance? To better understand this challenge, it is helpful to delve deeper into the basis for HCI - what it addresses, where, and having been tried and tested, whether it may be falling short of its promise. From there, it will be possible to pinpoint exactly what can be done about it.
Challenges and drivers for HCI
As mentioned, the pain points of today’s disparate IT environment are a result of how the IT infrastructure has evolved, and of the persistent behaviours surrounding it. Typically, there is no long-term plan for deployments. Installations tend to be fragmentary, adding more layers to the infrastructure, initiated as a direct result of individual departments’ or business units’ unique requirements. Each layer commonly has its own data management, storage, processing and compute resources incorporated, utilising networking at different tiers within the data centre. In this way, silo functions – for example, for support, development and application execution – came about.
When procurements are made, or whenever any hardware, system, software, network, storage or development tool is upgraded, it comes with a raft of side effects that are notoriously difficult to detect, while diagnosis and repair require greater expertise in each specific area. For example, the resources used by applications may reside within many different computing tiers, and if there is an issue, locating it can be akin to finding a white glove in a snowstorm.
Some systems will suffer from overutilisation of workloads, while others will be relatively underutilised -- which is extremely inefficient and wasteful considering capacity cost. And because of the inflexibility of the silo design, there is no easy way to migrate workloads between systems to ensure they are more evenly balanced.
In the struggle to overcome these challenges and increase performance, IT managers are facing on-going pressure to embrace new trends and more efficient technologies such as flash and the cloud, adding yet more layers of complexity, while being expected to do more with fewer resources.
For organisations still relying on this silo design -- and there are many -- getting IT functions aligned with the business is a major priority. Which is precisely why HCI is so compelling, though even this option is not free of limitations, as the decision of whether workloads should be migrated to a platform such as HCI will entirely depend on the workloads themselves.
HCI scaling restrictions
The drawback to HCI usually becomes apparent some time after deployment, when the IT team realises that the organisation needs more resources to manage its workloads, and therefore want to ‘scale out’. Scaling out can be described as the addition of extra nodes to a cluster (increasing the node count, or quantity), as opposed to the replacement of nodes with more powerful ones (thereby increasing size/capacity, processing power), as is the case with ‘scaling up’. To allow for growth, enterprises really need to be able to carry out both types: scale out and scale up, as workloads require.
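The scale-out versus scale-up distinction defined above can be made concrete with some toy capacity arithmetic; the node counts and per-node capacities are illustrative, not vendor figures.

```python
# Toy capacity arithmetic contrasting the two growth modes defined above.
# Node counts and per-node capacities (in TB) are illustrative.
def scale_out(node_count, node_capacity_tb):
    """Add more nodes of the same size: capacity grows with node count."""
    return node_count * node_capacity_tb

def scale_up(node_count, node_capacity_tb, growth_factor):
    """Replace nodes with more powerful ones: capacity grows with node size."""
    return node_count * node_capacity_tb * growth_factor

print(scale_out(4 + 2, 10))  # 60 - six 10 TB nodes after adding two
print(scale_up(4, 10, 2))    # 80 - four nodes, each doubled to 20 TB
```

The point of the paragraph above is that workloads rarely need only one of these: an enterprise platform has to permit both, independently, per resource.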
Bolt-on systems for HCI that service a particular workload are generally not an issue; however, each organisation’s workload requirements will be unique, and issues can arise where scale-out of a single resource is needed, as is often the case. For example, there may be adequate network bandwidth, but not enough storage. In this instance, organisations would have to pay for a full range of features, including those which are unnecessary – which is uneconomical in terms of cost, space and power consumption.
Innovation: the advantages of ‘self-protecting HCI’
This challenge is tackled with the next generation of HCI, which is based on creating two distinct tiers for processing: a performance tier and a protection tier, as opposed to traditional HCI’s single tier.
As the organisation grows, self-protecting HCI is conducive to applications scaling easily, with high levels of performance, low latency with high-level security, and full resiliency during any system failures.
High levels of performance are achieved through the utilisation of flash, with the convergence of VM (virtual machine) or container and IO processing into industry-standard x86-based servers. Self-protecting HCI intelligently harnesses existing flash storage, as well as processing nodes or servers within the data centre, avoiding the need to abandon existing hardware and software resources, as has previously been the case.
The protection tier converges data processing, primary storage, backup and archive into data nodes which have been optimised for long-term resiliency, efficiency, and security. Storage can be based on spinning disk or flash, as per requirements. As the data resides locally in this tier, the delays, so characteristic of traditional backup environments, are avoided.
Scale up or scale out
Previously, when it came to scaling, organisations would have had to trade off one approach against the other. The good news is that with this new generation of HCI comes the innovative capability to implement both scale-up and scale-out types of growth, while also enabling application availability and performance with HCI, utilising the organisation’s existing legacy server hardware.
This is made possible through the creation of a tier 1 application platform. So, depending on which particular constraint the workload is facing within a given architecture, enterprises can now scale up or out as needed. If more storage is needed, more servers can be added to accommodate increasing workload requirements. And if an application requires more processing power, units with greater processing capability can be easily added. Systems designed to enable ‘scale up’, allow increases in storage processing power, networking capability and more, growing to support petabyte-scale workloads, with the ability to maintain low latency at that scale, at impressive uptime rates.
With the effective use of commodity server-based flash storage, performance can scale linearly in a way that is fast and cost-effective, even supporting converged data backup both on-prem and in the cloud. Flash stores all of the active data required by a particular application on the server executing it. So subject to workload requirements, flash can be configured granularly and used as a server-side cache to reduce data access latency, and within the storage nodes for improved overall data centre efficiency. With flash’s provision of microsecond latency, enterprise scale tasks and tier-1 applications can be supported with ease.
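As a rough illustration of the server-side caching described above, here is a toy read path in which hot data is served from “flash” and misses fall back to a slower tier. The latency figures are invented for illustration, not measurements of any product.

```python
# Toy read path for a server-side flash cache: hot data is served from local
# flash, misses fall back to the slower backing tier and populate the cache.
# Latency figures (microseconds) are illustrative assumptions.
FLASH_US, BACKING_US = 100, 10_000

cache = {}

def read(key, backing):
    if key in cache:
        return cache[key], FLASH_US       # cache hit: flash latency
    value = backing[key]
    cache[key] = value                    # populate the cache on a miss
    return value, BACKING_US

backing = {"block-1": b"data"}
print(read("block-1", backing)[1])  # 10000 - first read goes to the backing tier
print(read("block-1", backing)[1])  # 100 - subsequent reads are served from flash
```

Configuring the cache granularly per workload, as the text describes, amounts to choosing which applications’ active data gets pinned to the server-side tier.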
So, in answer to the question of whether hyperconverged systems can handle critical enterprise applications: yes, they can, through the innovative partitioning of workload components into individual performance and protection tiers, supported by fast-performing server-based flash and resilient scale-out storage. Enterprises can now confidently embrace all the benefits of next-generation convergence with the flexibility of scaling they require, bringing organisational growth, agility and, most importantly, the assurance of mission-critical workload performance.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
Part 2
1) The open source movement will transform how telcos operate
As a result of the monetisation challenges faced by telecoms companies when it comes to 5G and IoT, 2019 could be the year in which many collaborative telecoms groups and communities grow as they try to pivot towards becoming software companies.
The challenge will be in the competition and development of differing standards between these groups. As new technologies arrive, each company will be vying to define the standard. This “too many cooks” approach can create significant confusion and wipe out the benefits of collaboration.
Open source ought to play a pivotal role in this. Cloud computing companies have benefited from collaboration on open source - one only needs to look at the number of different enterprises centred around creating products based on Linux or Kubernetes to see that open source can prove to be a viable business model. Telecoms providers ought to be looking at doing the same in 2019 by making infrastructure work for everyone, improving efficiency and opening up networks for all apps on their infrastructure.
2) There will be a lot of providers deploying 5G, but monetisation will prove a challenge
There will be a race to see who can market 5G the quickest and who will have it as standard first. We’re already seeing tests from multiple providers across the world in isolated areas, and the speed and size of rollouts will only increase as providers look to gain the upper hand.
However, this race could be a costly one. Consumer need for 5G isn’t as great as it was for previous generations. 4G can handle most consumer use cases (such as streaming, gaming, browsing etc.) fairly comfortably with reasonable speed.
5G’s main benefit is providing increased capacity, not speed and latency, making it more of a technical development. Being the first 5G standard network will be a marketing coup, but may not come with the consumer kudos and demand it once did.
3) However, it might not be the year of edge computing innovation that some might hope for
The question that stems from all this is: who is going to make use of the fresh potential for edge computing? Mobile devices themselves are fairly powerful nowadays, and many are able to run some AR processes effectively, meaning you don’t need edge capacity sitting close to the device to enhance its compute power. The same goes for self-driving cars, another innovation people are watching keenly to see how 5G and edge will impact. There’s certainly an interest in putting data closer to the user in this case, but again, a lot of compute power can be delivered directly within cars themselves.
There are use cases in which telecoms providers can demonstrate the importance of using 5G and edge together, but the challenge moving forwards, once again, is monetisation. Some current examples include location-specific use cases, such as museums tapping into AI or local caching of media, and phone-related tools where Wi-Fi capacity won’t be enough. However, it’s quite challenging to do these things on a commercially viable level, and it doesn’t seem that the industry will revolutionise this in 2019.
Five predictions for Enterprise Service Management IT in 2019
Courtesy of Ryan Pellet, Chief Customer Officer at Cherwell Software.
“The horse is here to stay but the automobile is only a novelty — a fad.”
— President of the Michigan Savings Bank advising Henry Ford’s lawyer, Horace Rackham, not to invest in the Ford Motor Company, 1903.
As the quote highlights, business predictions can be tricky to get right. Humor aside, successful companies need to make educated guesses about which services, tools, and technology to budget for to stay competitive over the next one to five years. In our constantly changing technological landscape, this is not an easy task.
As Chief Customer Officer, I spend a lot of time deep in the areas of customer experience and data mining/data analytics. From this perspective, I created this short list of what's to come in enterprise service management (ESM), which I hope will help your company to effectively plan for a successful future. And, while I can’t promise that all my prophecies will hit the bullseye, I think I’ll avoid the same fate as the gentleman who advised against investing in Ford.
#1 Less brainpower wasted on work processes and more focus on customer centricity.
How? The rise of Enterprise Service Management (ESM) technology is starting to do the thinking for organizations – saving valuable brainpower from repetitive problem solving and putting it towards thinking more about the customers companies serve. That trend will only increase as ESM practices expand, allowing companies to focus their energy, attention, and critical thinking on the customer experience instead of on workflows and processes.
#2 Companies will design the workflow from the outside in.
Many companies buy technology and build processes catered to the people and functions within their own organizations, rather than to the people the solution is aimed at. Increasingly, processes and workflows will be designed around the user or recipient of the process. This means thinking about service management differently than we do today: instead of starting from how a company sells, companies will look at the process from the perspective of why the customer buys.
#3 No more purchasing technology for technology’s sake.
Systems, hardware, and software that don’t add value to the customer simply won’t be accepted. With the help of ESM, companies will avoid this waste by understanding which services and applications add value for each customer and which don’t. For example, current packages that many cloud or cell phone providers offer require customers to pay for a number of services they don’t use. Streamlined services and applications will become the expectation for customers.
#4 Integration will be 100% expected.
The IT function’s value will continue to expand outside of IT and one of the main drivers will be the use of service management outside of IT. Essentially, anywhere a process can be better streamlined, understood, or assessed, ESM can add value connecting previously disparate, standalone technologies and services. Silos no more, integration will be the status quo!
#5 Younger workers will set the tech agenda for employers.
Millennials, who are poised to become the largest living generation by 2019, are used to having integrated processes and services at their fingertips. They don’t want to work for a company that’s behind the technology curve. They will make decisions about where they work and who they do business with based on how companies operate and how they train their employees.
I’d like to also predict that I’ll lose weight, win money, and grow more distinguished with age, but alas, I can only make predictions based on data and previous experience.
The edge and automation take centre stage
says James Nesfield, CEO of Chirp
"The hype around the edge has certainly intensified as companies look for new ways to efficiently process ever-growing pools of data. Now, we are seeing a general shift towards more compute - specifically AI - being enabled at the edge device, so more processing is being decentralised and pushed out to edge devices. In part this is in order to cut down on data throughput demands, and reduce latency. As this shift happens, and edge devices have more processing available to them, the need for them to be well-connected increases. So we are seeing new emerging connectivity standards coming out to support the increasing capabilities of edge devices, such as LiFi, Sigfox, NB-IoT, and data-over-sound.
"Security, especially from a physical, geographically managed, end-to-end perspective - edge, fog, cloud - remains unsolved in a comprehensive, consistent manner.
"As businesses kickstart their digital transformations, we will see the current trends towards more AI and data aggregation at the edge, and increasing demand on existing connectivity networks, continue. When you’ve got more data and more processing happening at the edge, the requirement for easy and reliable connectivity between those devices - no matter what technology they are using to connect and share data - becomes paramount.
"In particular, ongoing advances in both hardware and machine learning technology now enable voice to be processed at the edge, increasing the possibilities for instant human-machine interactions. Modern voice systems are a great example of this - consider the ability for speakers and microphones to listen and respond to environments. For example, Audio Analytic has developed audio recognition software that could allow an Amazon Alexa to detect the sound of breaking glass or a break-in in the home. Similarly, Rainforest Connection uses mobile phones in the Amazon rainforest to listen for logging activity and send a text alert to authorities who can determine if it’s illegal and then stop it. Audio is a really underutilised medium at the edge today, and there are a whole range of fascinating use cases for it."
A three-horse race
According to Jim Oulton, Associate Partner and Finance Services Cloud SME, Citihub Consulting.
Over the next 5 to 10 years, we see cloud service provision remaining a three-horse race between AWS, Microsoft, and Google. These firms will continue to heavily re-invest profits in new data centres and regions, driving further economies of scale. However, we predict that this growth will start to flatline over the longer horizon and cost will become less of a deciding factor when choosing a provider. In parallel, Kubernetes will be firmly cemented as the de facto application platform of choice, improving portability between providers. These factors will reinforce existing trends which see cloud providers pushing further up the technology stack to differentiate themselves and to increase their ability to lock customers in for the long term. As a result, Citihub Consulting envisage customers will run a "polycloud" model, selecting best-of-breed Data, Analytics, Artificial Intelligence, Machine Learning, IoT and Serverless technologies based on fit to individual application requirements. Competition will also see niche SaaS and PaaS providers enticed onto the main providers' platforms with heavily discounted IaaS pricing, further consolidating the big three's position.
Despite this, Citihub Consulting still believe that many large regulated enterprises will retain a significant legacy estate within their own data centres due to the prohibitive cost of modernising their applications even over a 10-year timeframe. For example, one large Financial Services customer of Citihub Consulting's estimated the cost to refactor a single business unit's applications for Cloud to be in the region of $100M. So for most large firms, we will probably see adoption of CaaS/PaaS/FaaS targeted on areas that are already planning significant development or transformation – the tip of the iceberg.
Big Data Analytics in 2019 – less big, more fast
“While Big Data Analytics has dominated the headlines, Operational Analytics is now emerging as a route to faster ROI, where the requirement isn’t for huge volumes of data, but for speed. Until recently, real-time analysis of data to support business decisions has been something of a pipe dream: held back by the constraints of Big Data, with long ETL times, slow relational databases and the need for data warehouses and data lakes. This has made analytics very infrastructure- and process-heavy and often hard to justify.
In 2019, we will see a big increase in Operational Analytics solutions being designed and deployed. IoT and manufacturing will drive much of this demand, but industries such as retail, banking and telecommunications will also start to turn to Operational Analytics as they realise there’s a huge opportunity to use the insight to fine-tune offerings and enhance customer experience. The Cincinnati Reds baseball team is leading the charge with this kind of analytics capability – providing fans with live performance stats and up-to-the-second betting odds. In 2019, we can expect more to follow in the Reds’ footsteps, thanks to advancements in database technology that finally allow data to be analysed at both speed and scale – rather than having to pick one or the other. Memory-first NoSQL databases will enable real-time analytics with no ETL, no data warehouse and no data lake.”
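The "memory-first, no ETL" idea above can be sketched in a few lines: hold only a sliding window of recent readings in memory and answer aggregate queries directly, with no warehouse or load step in between. The class name and API below are purely illustrative, not any vendor's product.

```python
import time
from collections import deque


class RollingStats:
    """Keep only the last `window_s` seconds of readings in memory and
    answer aggregate queries instantly - no ETL, no warehouse."""

    def __init__(self, window_s, clock=time.monotonic):
        self.window_s = window_s
        self.clock = clock          # injectable clock, handy for testing
        self.samples = deque()      # (timestamp, value) pairs, oldest first

    def add(self, value):
        now = self.clock()
        self.samples.append((now, value))
        self._evict(now)

    def _evict(self, now):
        # Drop readings that have fallen out of the time window.
        while self.samples and now - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def mean(self):
        self._evict(self.clock())
        if not self.samples:
            return None
        return sum(v for _, v in self.samples) / len(self.samples)
```

A live sports or betting feed, for instance, could keep one such window per statistic and serve "up-to-the-second" figures straight from memory.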
A better year for Bitcoin
says Tomislav Matic, CEO of Crypto Future, a Blockchain based IT Solutions provider.
While 2018 saw the peak - and resulting fall - of Bitcoin, 2019 will see the mass take-up of the very platform that facilitated Bitcoin’s revolutionary success - blockchain. So far we’ve seen millions of pounds being poured into developing the technology, and while many companies are yet to reap the rewards of their investments, we’re gradually edging closer to success.
Don’t be mistaken: we’re still a long way from mass adoption of the technology, but steps are being taken, and it could well happen in 2019. When Bitcoin’s blockchain first arrived on the scene, the most the technology could muster was 7 transactions per second, whereas Ripple can now manage 1,500. We’ve seen a lot of progress, and the turn of the year brings fresh optimism; after overcoming a few more hurdles, blockchain could become massive in 2019.
Once this scalability problem is solved, the next roadblock will be cross-chain interoperability and user friendliness across various applications, and beyond that, the issue of organisations handling and analysing the Big Data provided by Blockchain.
Nevertheless, the opportunities are limitless. So many companies are invested in the technology precisely because of its potential. From the implementation of smart contracts and automatic booking, to tracing the origin of food products so we know more about what we’re eating, to blockchain-based voting apps that remove the opportunity for fixed elections while delivering immediate results: once blockchain is properly developed, its eventual impact on the world will be similar to - if not bigger than - that of the internet.
AI and ML – time to identify the uses and benefits
Says Simon Blunn, VP EMEA at DataRobot.
“As we head towards 2019, industry leaders’ thoughts turn to how to drive greater efficiencies, profits and employee/shareholder satisfaction. Many will be investigating how to gain immediate value from emerging tools and techniques, weighing up the pros and cons of being an early adopter of a new technology.
Almost every executive knows of Artificial Intelligence (AI) and Machine Learning (ML), but many find it difficult to define what they are and how best to use them. Beyond definitions, however, there is a more pressing need to clearly identify the potential uses and benefits.
At DataRobot we see clients around the world innovating with automated machine learning to become leaders in their market.
With this in mind, our 2019 predictions for how AI will influence business include:
Uninterruptible power supplies (UPS) are a vital component of any organisation’s business continuity strategy, providing battery-based backup should service from the mains become disrupted.
By Marc Garner, Vice President, IT Division, Schneider Electric, UK.
The essential requirements of a UPS are that it is reliable, easy to maintain, efficient to operate or control and unobtrusive—one wants to know that it is there, but one would prefer that it never be needed.
Increasingly, given the high levels of automation and remote management in today’s data centres, UPS systems must have IoT enabled capabilities to ensure their status can be monitored, and where possible their operation be controlled, from outside the white space. Furthermore, any maintenance that is necessary must be performed as quickly and easily as possible, with an absolute minimum of downtime.
Schneider Electric’s Easy UPS 3S series is designed to meet all these requirements for small and medium businesses. Available in power configurations ranging from 10 kVA to 40 kVA, the three-phase UPS is designed with simplicity in mind, ensuring such aspects as installation, startup, maintenance, monitoring, management and scalability are made simple.
It is intended for use in applications including small and medium data centres, on-premise computer rooms, manufacturing facilities, telecommunications, commercial buildings or healthcare.
Installation and startup
Easy UPS 3S models are available in a range of sizes and can be configured with internal or external batteries, depending on size constraints and computer-room layout. Mounted in wheeled enclosures, they can be rolled into place easily, and an Easy Loop Test allows performance to be verified before the load is connected, so that any obvious faults can be detected and rectified in good time.
To simplify installation, input, output and bypass breakers are included, as is an Emergency Power Off switch, all of which speed up deployment.
The UPS enclosures are easily accessible from the front to simplify servicing and maintenance. For example, the dust filters sit right behind the magnetic front panel, so they can be easily removed and replaced, which may be a frequent requirement, especially in harsh or ruggedised environments.
Expansion and resilience
The Easy UPS 3S series can be deployed with up to four models arranged in a parallel design to provide increased capacity or improve redundancy in situations where downtime is critical.
For high-availability systems, the UPS can be deployed in a variety of parallel or distributed redundant designs, so that the failure of any one UPS can be accommodated by the other units in the configuration without risk of losing power to the load.
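The parallel-redundancy arithmetic is worth making concrete: in an N+1 design, one module's worth of capacity is held in reserve, so only the remaining units count towards the protected load. The function below is a simple illustration of that sizing rule, not a Schneider Electric tool.

```python
def protected_load_kva(unit_kva, units, redundant_units=1):
    """Usable capacity of a parallel UPS group that must ride through the
    failure of `redundant_units` modules (N+1 when redundant_units == 1)."""
    if units <= redundant_units:
        raise ValueError("need more units than the redundancy level")
    # Reserve the redundant modules; only the rest may carry the load.
    return unit_kva * (units - redundant_units)


# Four 40 kVA modules in an N+1 design: one may fail at any time,
# so only three can be counted towards the load, i.e. 120 kVA.
```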
The Easy UPS 3S series also offers an advanced economy mode of operation, which reduces operating costs while maintaining very high levels of protection. It uses Li-Ion battery technology, which provides a smaller footprint and a better total cost of ownership compared with traditional valve-regulated lead-acid (VRLA) batteries.
Depending on the level of availability required, the systems can be deployed in fully online mode, where the battery and power inverters are permanently connected in the power path from utility to load. Alternatively, they can be deployed in economy mode, where a modicum of reliability is traded for greater energy efficiency.
Double-conversion online UPS topologies are most commonly found within large data centres, where mains power is rectified to DC before charging the backup battery. Afterward it passes through an inverter to be converted back to AC, hence double conversion.
When in double-conversion mode, power output from the UPS will always pass through the inverter, providing a regular conditioned supply to the load. There is no loss of power in the event of a mains outage or blackout because the load is always connected to the inverter and battery backup. Operating in this mode however, means there is constant wear on the power components, with attendant reduction in Mean Time Between Failure (MTBF) and a knock-on effect on reliability.
In economy mode, or ECO Mode, a manual bypass switch overrides the double conversion path and connects the load directly to the mains input. This can help to save energy, but power protection is reduced as the IT load is exposed to raw utility mains power - without the conditioning normally provided by the double-conversion, online UPS. The UPS must therefore continuously monitor the mains power and quickly switch to the inverter when a problem is detected and before it can affect the critical load.
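The ECO-mode transfer decision described above can be sketched as a simple rule: stay on bypass while the mains is healthy, fall back to the inverter otherwise. For illustration the sketch uses the 280-415 V input window quoted for the Easy UPS 3S as the sole health criterion; a real UPS also checks frequency and waveform quality, and transfers within milliseconds.

```python
# Illustrative thresholds only, taken from the quoted input-voltage window.
V_MIN, V_MAX = 280.0, 415.0


def eco_mode_path(mains_voltage):
    """Decide the power path in ECO mode: feed the load straight from the
    mains while it is within limits, fall back to the inverter otherwise."""
    if V_MIN <= mains_voltage <= V_MAX:
        return "bypass"    # raw mains, highest efficiency
    return "inverter"      # conditioned double-conversion path
```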
Overall, the Easy UPS 3S series can deliver 96% efficiency in double-conversion mode and up to 99% efficiency in ECO Mode. It also accepts a wide input voltage window, from 280 to 415 V, which further helps save battery power.
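Those efficiency figures translate into a measurable energy difference. Using the quoted 96% (double conversion) and 99% (ECO Mode) values, the sketch below estimates the annual saving for a constant load; the 30 kW load and constant-load assumption are illustrative, not a vendor claim.

```python
def annual_saving_kwh(load_kw, eff_a, eff_b, hours=8760.0):
    """Energy saved per year by running at efficiency eff_b instead of
    eff_a for a constant load (input power = load / efficiency)."""
    return (load_kw / eff_a - load_kw / eff_b) * hours


# Example: a constant 30 kW load.
# Double conversion draws 30 / 0.96 = 31.25 kW of input power,
# ECO Mode draws 30 / 0.99 ~= 30.30 kW, so ECO Mode saves roughly
# (31.25 - 30.30) * 8760 ~= 8300 kWh per year.
```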
Data Driven Monitoring and Management
An intuitive graphical user interface (GUI) on the front panel allows simple system configuration. For more in-depth system monitoring and management from a central location, there is the option to include an SNMP (Simple Network Management Protocol) network card, which allows the UPS to be managed via a Web interface or through a Data Centre Infrastructure Management (DCIM) software application such as Schneider Electric’s EcoStruxure IT.
Use of the network card allows status updates or alerts regarding UPS performance to be sent to a smartphone application or to a central management console. As issues arise, they can be addressed remotely from the console or smart device or, depending on severity, a service professional can be dispatched to deal with the problem in person.
EcoStruxure IT monitoring and management software is available in several variants. EcoStruxure Asset Advisor is a cloud-based application particularly suited to customers who prefer to outsource their data-centre management to third parties. It provides round-the-clock monitoring of hardware assets, including UPS and other critical IT assets, delivering updates directly to a customer’s mobile phone. In addition, it provides analytics services that deliver detailed insights into how well a data centre is performing and offers recommendations based on real-time information.
EcoStruxure IT’s on-premise management suite, namely StruxureWare for Data Centers, is an integrated suite of applications typically enabling businesses to manage their data centres across multiple domains. It allows customers to make decisions regarding how to balance their equipment for an optimal mix of high availability and peak efficiency throughout the entire data centre life cycle.
Both software suites give customers the intelligence to perform maintenance proactively and replace batteries in a timely manner, ensuring that power security is never impaired. The Easy UPS 3S series uses Lithium-Ion battery technology, which provides a longer operational lifetime and a higher number of charge and discharge cycles than traditional lead-acid batteries, as well as taking up less space.
Given its range of advanced technological options, IoT enabled connectivity and compatibility with leading infrastructure management software systems, Schneider Electric’s Easy UPS 3S range makes business continuity a reliable, simple and easy to deploy option for today’s always-on businesses.
Let’s face it: cloud storage is here to stay. When the Storage Networking Industry Association (SNIA) formed its Cloud Storage Initiative (CSI) back in 2009, the working group helped usher in the acceptance of the then-new technology.
Now, however, cloud storage is as ubiquitous as the Internet itself. Twitter alone consumes 300 petabytes of data on Google’s cloud, and companies like Facebook, Amazon, AliBaba, and Tencent all provide and consume massive amounts of data storage.
That’s why SNIA has renamed the group the Cloud Storage Technologies Initiative (CSTI), to signify a newer, more nuanced approach. Adding the T for “Technologies” expands the group’s charter to help support the evolving cloud business models and architectures (like OpenStack, software-defined storage, Kubernetes, and more). The CSTI publishes articles and white papers, offers speakers at industry conferences and presents highly-rated webcasts to thousands of viewers.
SNIA member Eric Lakin joined the CSTI group as soon as he realized that it aligned with some internal initiatives at the University of Michigan, where he manages the storage infrastructure for the University’s 19 schools and colleges. This storage engineering team provides a commodity storage platform for Windows, Linux, Unix, and Mac users across campus to store files using SMB or NFS protocols.
The cloud is an ideal storage platform for Lakin’s team, especially for older data. “We’re recognizing that older data doesn't need to occupy a premium physical storage space in our data centers when we can just kick it out to the cloud and take advantage of cloud economics,” Lakin says. “The business value of data created at the University is the highest right at the time when it's been created. Over time, the value drops. Most of the files we create are saved and stored away on some shared drive and you never go back to them.”
Lakin manages around 10 petabytes of storage. “What we're doing is a very rudimentary information lifecycle management within our on-premise storage system,” he said. “We are scanning the file shares, identifying files that have not been accessed in over 180 days, and then we are moving those files transparently out into cloud storage and leaving behind a stub file.”
These stubs look and act just like the original files, but when a user clicks on one, the system makes an API call out to the cloud provider, which pulls the file back for the end user to access. It’s almost as quick as local access, too.
This process frees up local storage space. “Ultimately, when we're doing this at a multi-petabyte scale,” Lakin said, “each petabyte of old data that I can push out to the cloud will eventually result in four petabytes of on-premises storage that I can decommission or use for something else.”
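Lakin's scan-and-stub pass might be sketched as below. The `upload` callable, the JSON stub format and the hard-coded 180-day threshold are all assumptions for illustration, not the University of Michigan's actual tooling; a companion retrieval path would resolve the stub back into the real file via a cloud API call.

```python
import json
import os
import time

STALE_AFTER_S = 180 * 24 * 3600   # "not accessed in over 180 days"


def tier_to_cloud(root, upload, now=None):
    """Walk `root`, push files whose last access time is older than the
    threshold to cloud storage via `upload(path) -> object_key`, and
    leave behind a small JSON stub recording where the data went.
    `upload` is a stand-in for a real cloud-client call."""
    now = time.time() if now is None else now
    moved = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if now - os.stat(path).st_atime > STALE_AFTER_S:
                key = upload(path)                   # ship bytes to the cloud
                with open(path, "w") as fh:          # replace with a stub
                    json.dump({"stub": True, "cloud_key": key}, fh)
                moved.append(path)
    return moved
```

Running such a pass transparently against shared drives is what lets old data vacate premium on-premises capacity while still appearing in place to users.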
Lakin shares his successes and challenges with other CSTI members via webcasts, like the one he co-hosted titled “Create a Smarter and More Economic Cloud Storage Architecture” for SNIA.
“As we begin to do this at a multi-petabyte scale over the next few years, we fully expect that we will run into some technical and other challenges associated with scale,” Lakin said. The CSTI can then function as a user community within SNIA that other technology teams can turn to for answers, to solve problems and to help each other.
Anyone looking to integrate cloud technology into their storage environment can join the CSTI, participate in these webcasts and, like Lakin, benefit from CSTI membership to help them on their way.
Better yet, it’s a vendor-neutral discussion. Members of the SNIA CSTI are encouraged to think about the aspects of the technologies that solve end-user problems, rather than specific products or companies. “It’s kind of a safe place,” said Lakin. “We can talk with and about the technologies without feeling like we are promoting or selling anything. We can just talk openly to get problems solved.”
Moving forward, the University of Michigan hopes to put four petabytes of data into the cloud over the next five years. In addition, Lakin said, his team is in the very early stages of integrating cloud storage into its data backup environment. His experiences in that realm will also help others working on their own processes.
Lakin’s webcast co-presenter, Alex McDonald, is the Chair of the CSTI and has been in the IT business for over 40 years, most recently creating standards-based initiatives, educational programs, and working with industry groups. McDonald was part of the CSI before the addition of the “Technologies” to its name. Early on, cloud storage was considered only good for things like backup or archiving data.
It took a few years to convince users that cloud storage could be used for more than just deep, cheap and slow storage, and that it was something to be used and developed in its own right as part of a more significant application offering. “When Amazon and Google came along, they created these quite wonderful services sitting in the cloud,” McDonald said. “And over time, bandwidth and latency improved and became cheaper. That’s really the point at which the game started changing and we began to realize that the storage part of the cloud war was won. But there is still an awful lot of education that we have to do to explain how the cloud has changed.”
Now that more advanced technologies exist in the cloud, CSTI can help end users figure out what’s happening with OpenStack, Kubernetes, data services, orchestration, and more. “We can already deliver over-the-cloud block and file storage,” said McDonald. “We know how to do that. We can also deliver key-value type object storage. And we can build new and different cloud applications on top of these storage types.”
No one is sure where this move to the cloud will end up, but McDonald is excited. “Nobody's quite defined the whats and hows of computational storage (where there’s compute directly on the storage units) yet,” he said. “But I see this kind of technology as something that's very important when applied to the cloud. And that's the kind of thing we want to be pushing and promoting over the next few years.”
Still, McDonald notes that CSTI must also continue its basic educational efforts. New people come into the industry every day, looking to figure out the current state of technology and how it works. SNIA and initiatives like CSTI can help them learn what they need to know for their real-life business applications. “It’s these important ‘state of the cloud’ messages that we will go out and deliver to help make the transition to cloud,” said McDonald. “It’s an exciting time of change, and there’s so much that’s new to learn.”
“The important thing about most end users is that they don’t have a great deal of time,” said McDonald. “I think the CSTI education appeals to them because they can get short bursts: an hour-long piece of education that leaves them better informed at the end of it, having learned something that’s of value and applicable to what they’re trying to do. And with a technology rather than a vendor focus, and with our attention on the surrounding technologies rather than cloud storage alone, our education program is more relevant and important than ever.”
About the SNIA Cloud Storage Technologies Initiative
The SNIA Cloud Storage Technologies Initiative (CSTI) is committed to the adoption, growth and standardization of storage in cloud infrastructures, including its data services, orchestration and management, and the promotion of portability of data in multi-cloud environments. To learn more about the CSTI’s activities and how you can join, visit snia.org/cloud.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue will carry the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! This is Part 3.
Can companies reach Digital Zen in 2019?
Asks Eric Schrock, CTO at Delphix.
Nearly three quarters of executives believe they’re being out-innovated by the competition. The fast are eating the slow in today’s digital economy, and companies that are just getting their arms wrapped around becoming a software company are finding that they must evolve to become a data-driven company. Digital services are becoming more intelligent and more data-aware, and new approaches to machine learning and data science are producing unique insights necessary to create a competitive advantage. Data is now central to winning the innovation war.
But most companies are stuck playing data defense, struggling to manage privacy and compliance while battling against their own legacy systems and processes to provide data to the innovators that need it. Organisations are stuck in a virtual data swamp, unable to move forward and unable to escape.
As we look forward, all signs suggest that the greatest opportunity, and the greatest risk, for data is still to come. The continued growth of machine learning, more intelligent digital services and the sheer breadth of data will create more demand and more opportunity. But as more data is gathered and flows to more places, firms will continue to struggle to secure it against an exponentially more complex attack surface, all under the pressure of new regulations and penalties.
With that in mind, here is what I believe is in store for businesses in 2019:
1. The year of the breaches
As British Airways, Cathay Pacific and Ticketmaster will tell you, 2018 wasn’t too kind to companies on the data security front. Unfortunately for them – and for all of us – 2019 will only be worse, remembered for producing a record number of application outages.
The reason is simple. Highly complex IT environments and applications are being placed under more pressure every day: pressure to upgrade, improve, comply and, critically, move to the cloud. All this places a huge burden on already over-stretched IT teams and resources to 'shift left' – to upgrade, migrate and test applications faster. And threat actors have learned that disrupting a business is just as profitable as exfiltrating data, with ransomware and malware able to cripple businesses and even entire municipalities, such as Atlanta, Georgia.
With the General Data Protection Regulation (GDPR) in play, companies that succumb to the pressure will pay large fines. At some point we will see a large data-loss incident that, with the benefit of hindsight, will prove to have been largely preventable. The regulators will want to be seen to be tough and will set a benchmark for businesses. The first big test of GDPR will be a PR disaster for the unfortunate organisation concerned: not only can it expect to receive the full force of GDPR, but the incident will attract far more media coverage than normal and cause significant brand damage.
Such high-profile outages of critical applications will cause massive disruption for businesses and customers, and will also shine a spotlight on the company under fire, thanks to regulations.
2. Gambling on digital transformation will yield winners
2019 will see the financial services and banking industry taking significant strides in their digital transformation journey. With fully digital, mobile-only banks such as Monzo rising in popularity, we are already seeing vast changes in the industry. Traditional brick-and-mortar institutions have had to take a step back and develop a dedicated approach to transforming their operations. Being flexible, available 24/7 and mobile-first while also being compliant to regulations and privacy laws can be a tricky pursuit.
The allure of digital transformation will transcend traditionally technology-led fields to actual agricultural fields. The food and agriculture industries might not be the first to come to mind when digital transformation is mentioned. Yet this industry is already making remarkable progress in embracing a digital-first approach. From soil heat map analysis to drones, this industry is creating new opportunities and paving the way for tech adoption, even in third-world countries.
The reach of digital transformation into domains that are traditionally considered separate from technology is inspiring and interesting to say the least.
3. Technology, and not people or processes, will be key to compliance
Businesses do not yet know how to be fully compliant in the face of new regulations. Taking full advantage of data, and fast, is hard when a global business can have close to 60,000 developers and 140,000 third-party services to manage on top; the pursuit of perfection when it comes to security is close to mission impossible. As data flows into, across and out of a company, securing the edge is simply not enough. Businesses will need to employ technology to mitigate risk within their data. Systems are fallible, but so are humans: someone perfectly authorised to work with data can, intentionally or not, misuse private data for unethical purposes.
What’s more, the expected rise in deployment of AI and machine learning across many industries, from manufacturing to asset management, will have to feed off free-flowing yet secured data to be worthwhile and to preserve innovation. In financial trading, we are seeing that there is no longer any downtime: traders are deploying into live environments with roughly 5-10 seconds to spare for application testing, a complete transformation given the ever-larger volumes of data processed every day and ever-shrinking testing windows.
Technology will be key to managing risk across the entire data supply chain, so developers and data scientists can move at breakneck pace without compromising privacy and compliance.
4. People will interact more with technology and less with computer screens
Remember the days when voice assistants were novel? Voice computing is permeating everything we do, from home assistants to hands-free driving, to scheduling your appointments in conversations with other people.
The days of keyboard and screen-only machine-human interactions are over. It’s not just voice; our tech is wearable, tactile and augmented. These human interactions present not just a new way of interacting with technology, but whole new streams of human data that can present new opportunities, such as diagnosing neurological conditions from voice patterns. Enterprises will need to tap into this data to dramatically improve their digital services and customer care.
We’ll continue to see a rise in technology created to interact with humans not through a screen, taking personalisation and automation to whole new levels.
5. Everyone will become a data scientist
There is no doubt that data is everywhere. Every business – from entertainment to healthcare – will be driven by data to create better customer results and patient outcomes.
Data-rich companies will continue to punch above their weight in delivering against consumer expectations. Adoption of data technologies and analysis in developing and frontier economies will create new opportunities and level the playing field. This is why start-ups are doing so well, where larger enterprises with legacy systems suffer.
If you are investing in a data science team, get them the data management and virtualisation tools they need to be successful. Anyone’s analysis is only ever as good as the data they analyse.
As the year draws to a close, we are seeing more toxic data lakes emerging and threatening to drown companies that can’t swim through the perilous waters fast enough.
The destination is clear - a state of Digital Zen where businesses can be confident that their data is stored securely and their processes are fully automated, 24/7. The journey there, however, is often bumpy and inherently tricky.
It will take businesses more time and a lot more resilience to reach the final destination, but one thing is for sure - the Digital Zen will definitely be worth it.
Sukhi Gill, VP and CTO for DXC Technology in UK & Ireland, has identified six digital trends that will accelerate business transformation in 2019:
1) Enterprises go after digital business moonshots
In 2019, enterprises will make more aligned, bet-the-company execution decisions to accelerate digital business. Expect to see new businesses, business models and technologies built from digital. A unified digital strategy between the business and IT is the only way to unload the compounding technical debt that is holding companies back from exploring moonshot digital initiatives. It’s all about focusing, accelerating digital transformation, having the stamina to succeed and achieving nonlinear growth.
2) Enterprises adopt next-generation IoT platforms
As enterprises map their physical world to an intelligence-rich digital one, smart “things” become a driving force for implementing next-generation platforms in 2019. This advance will enable large quantities of industry-specific data from the internet of things (IoT) to be analyzed, uncovering novel, hyper-dimensional correlations that provide fresh insights, enhanced decision making and better business outcomes.
3) Action at the edge disrupts the cloud
The IT industry continues to build out what we call “the Matrix,” the pervasive, intelligent IT infrastructure that goes beyond cloud to include edge computing, IoT platforms, machine intelligence, augmented reality/virtual reality, blockchain and more. Companies will build completely new ways to leverage the Matrix, including decentralized applications (DApps), shifting power from a small number of central players to a large number of participants. Additionally, a shift toward event-driven applications and serverless architectures allows very small and specific applications to run in lightweight environments such as pocket or wrist devices.
4) Enterprises enter the age of Information Enlightenment
Leveraging information will become a core competency in 2019. Companies that experience Information Enlightenment will realize that artificial intelligence and machine learning can improve service offerings and generate new sources of revenue — but only with the right algorithms, model orchestration, data and infrastructure.
5) Enterprises redesign customer experiences amid stronger data privacy rules
Protecting customers’ personal data will force companies to rethink their digital strategy as the full effects of the General Data Protection Regulation (GDPR) set in. Enterprises must create privacy-centric information ecosystems, with analytics and security as the foundation, as they aim to deliver secure interactions and superior customer experiences.
6) Companies begin closing their data centers
The enterprise data center is becoming less relevant as data and business processing shift to the cloud. To operate more efficiently and derive more value from their data, enterprises are shifting workloads to public cloud providers, who have massive bandwidth and strategically placed data centers. The trend will play out over the next three to five years, as cloud migration gives way to “built for cloud” replacements.
There’s no hiding from, or for, tech in 2019
Whether it's artificial intelligence or facial recognition, tech will be everywhere in 2019. But Silicon Valley may have peaked, and the tech giants will be in regulators' sights in both America and Europe.
Writing for The World in 2019, the annual publication from The Economist, deputy editor Tom Standage and technology correspondent Hal Hodson delve into two key areas they believe will shape the conversation and agenda in technology during 2019.
Tom Standage, Deputy Editor, The Economist
Regulators must respond to AI now: As AI is applied in a growing number of areas, there are legitimate concerns about possible unintended consequences. The immediate concern is that the scramble to amass the data needed to train AI systems is infringing on people’s privacy.
The General Data Protection Regulation was a step in the right direction, giving EU citizens, at least, more control over their data (and prompting some internet companies to extend similar rights to all users globally). The EU will further clarify and tighten the rules in 2019 with its ePrivacy Regulation. Critics will argue that such rules hamper innovation and strengthen the internet giants, which can afford the costs of regulatory compliance in a way that startups cannot. They have a point. But Europe’s approach seems preferable to America’s more hands-off stance. China, meanwhile, seems happy to allow its internet giants to gather as much personal data as they like, provided the government is granted access.
Given how widely applicable AI is—like electricity or the internet, it can be applied in almost any field—the answer is not to create a specific set of laws for it, or a dedicated regulatory body akin to America’s Food and Drug Administration. Rather, existing rules on privacy, discrimination, vehicle safety and so on must be adapted to take AI into account.
As for jobs, the rate and extent of AI-related job losses remains one of the most debated, and uncertain, topics in the business world. In future, workers will surely need to learn new skills more often than they do now, whether to cope with changes in their existing jobs or to switch to new ones. As in the Industrial Revolution, automation will demand changes to education to cope with shifts in the nature of work. Yet there is little sign that politicians are taking this seriously: instead, many prefer to demonise immigrants or globalisation. In 2019, this is an area in which policymakers need to start applying real thought to artificial intelligence.
Hal Hodson, Technology Reporter, The Economist:
Faces become machine readable: The latest advances in machine learning have created software that can determine the unique pattern of a person’s face from imagery or video to a far higher degree of accuracy than older technology.
The world’s CCTV and police cameras are one upgrade cycle away from capturing higher-definition imagery, which will help facial-recognition algorithms work better.
Silicon Valley’s approach to facial recognition, using powerful computers and large datasets of faces to train highly accurate software, is only beginning to percolate into the security market. That will speed up in 2019.
The combination of web-tracking and physical biometrics will mean that spaces in which human beings are not tracked will shrink in 2019.
In America, for example, Major League Baseball will start allowing fans to validate their tickets and enter stadiums via a scan of their face, rather than a paper stub. Singapore’s newest megamall will use the technology to track shoppers and recommend deals to them. Tokyo will spend the year installing facial-recognition systems in preparation for the Olympics in 2020, when it will use the technology to make sure that only authorised persons enter secure areas. Research by SITA, a technology vendor, suggests that three-quarters of all airports and airlines are investing in the technology or carrying out research in the area.
It is worth worrying about facial recognition for the principles it offends and the damage it threatens. Human beings need spaces where their movements are not tracked. A world with ubiquitous facial recognition means one in which no coffee morning, no midnight walk, no trip to the shop can occur without being assigned to a specific face and identity. Strong laws that protect individual rights are the best hope for limiting such tracking.
Top five predictions for software and IoT companies
Cloud, serverless computing and SDN to feature
Richard Blanford, managing director of managed cloud and IT infrastructure expert Fordway, comments:
Although I’m a strong believer in the benefits of cloud for many applications, I expect to see a move back to decentralisation (a.k.a. client/server) for the growing number of intelligent devices such as robotics in manufacturing – what you might term ‘intelligent client mark 2’. These devices are in effect small scale datacentres in their own right and need to process information in real time, so for them the latency of cloud is becoming a major issue and the need to have intelligence at the edge will increase.
I also expect to see an increase in serverless computing, where organisations rent cloud capacity by the transaction rather than by the number of instances, and the use of containers, which make it easier to move applications between cloud providers – although this won’t work for legacy applications, which will need to be fundamentally redeveloped. As all the major cloud providers now support Kubernetes container management, organisations who have (re)developed their applications to containers will be able to take advantage of cloud broking between providers. This could either be an in-house role if an organisation has the capability, or be provided by a third party.
Finally, we will see the continued implementation of Software Defined Networking (SDN) overlays across networks to reduce complexity and increase organisational control, leading to the eventual replacement of proprietary, name brand networking hardware with lower function, lower cost ‘white box’ devices.
Cybersecurity issues show no sign of going away any time soon
Derek Manky, Chief, Security Insights & Global Threat Alliances, Fortinet:
“We are seeing significant advances in cybercriminal tools and services which leverage automation and the precursors of AI. Organizations need to rethink their strategy to better anticipate threats and to combat the economic motivations forcing cybercriminals back to the drawing board. Rather than engaging in a perpetual arms race, organizations need to embrace automation and AI to shrink the windows from intrusion-to-detection and from detection-to-containment. This can be achieved by integrating security elements into a cohesive security fabric that dynamically shares threat information for broad protection and visibility across every network segment from IoT to multi-clouds.”
If there is one buzz-term that’s been recurring and is likely to continue into 2019, it must be Artificial Intelligence (AI). Advancements in technology are transforming the business landscape, enabling functions from marketing to finance to make highly informed, lightning-speed decisions – and HR is, or at least will be, no exception.
By James Akers, Director of Product Management, Thomsons Online Benefits.
While in reality integration of AI into HR solutions will be a slow process, there’s no doubting the technology’s potential to take on elements of the HR function, delivering substantial time and cost efficiencies for professionals and improving the employee experience.
But will there ever be a time when technology makes HR professionals obsolete? Or are there other elements of the function which can simply not be engineered?
The state of play now
HR departments are often criticised for lagging behind when it comes to innovation and technology adoption, but evidence suggests that this is no longer the case. According to our Global Employee Benefits Watch 2018/19 research, 70% of global employers now use benefits management software. This has the capability to automate many manual processes, from scheme enrolment through to administration, deduction and taxation. Historically these were time-consuming, error-prone tasks for HR and benefits teams.
However, while automation is streamlining some HR processes, there is still potential for technology to remove a far greater proportion of the administrative burden from teams – before even touching the strategic or consultative elements of the role.
Take administering private medical insurance (PMI) as an example. While benefits software has expedited and improved the employee experience of selecting PMI cover, insurers often lack the technology needed to automatically receive selections and immediately kick off schemes from the date of election. Instead, employees’ selections are often stuck in a holding pattern, waiting for insurers to enrol them. This is often only processed on a monthly, rather than real-time, basis.
Enterprises need to consider this experience in contrast to that offered to consumers. If you or I needed travel insurance for example, we could log onto a platform and likely receive cover that day. If the HR supply chain were able to automate the end-to-end PMI enrolment process, the same speed and seamless experience could be achieved, while further reducing the administrative burden on HR teams.
The HR departments of tomorrow
If we imagine for a moment that we have overcome technology integration as a hurdle, we can then look forward to a future where technology plays a greater role in HR – going beyond administration to defining strategy.
If we return to PMI as an example: in future, we may see AI-enabled solutions able to evaluate reams of demographic and lifestyle data on a global employee population and determine a list of PMI providers best suited to the employee’s needs. This could take into account the best price point and product offering specific to the employee’s need and likelihood of claim. It could then project cost and product comparisons between schemes and, with the click of a button, enrol the employee instantly into the most cost-effective and relevant scheme for them. Such choice, comparison and hyper-personalisation not only delivers a consumer-grade experience, but also has the promise to disintermediate the employer from the benefit arrangement, yielding further savings and efficiencies while delivering a market-leading benefit offering.
On a grander scale, we could see machine learning and data modelling used to predict the impact of one decision on another. For example, an HR professional may be able to ask, “what might my medical costs be next year in China if I increase my headcount by 10%?” with software able to instantly calculate an answer, considering trends in benefits take-up, salary and insurance premium hikes. In this scenario, technology has the potential to influence where an organisation places head-count, directly impacting HR and business strategy.
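The kind of what-if query described above can be sketched, in deliberately simplified form, as a projection that compounds headcount growth with benefits take-up and premium trends. All figures and factor names below are hypothetical assumptions, not real benefits data:

```python
# Toy what-if projection: "what might my medical costs be next year in China
# if I increase my headcount by 10%?" All inputs are illustrative.

def project_medical_cost(current_cost, headcount_growth,
                         takeup_trend, premium_inflation):
    """Scale current annual cost by headcount growth, benefits take-up
    trend and expected insurance premium inflation."""
    return current_cost * (1 + headcount_growth) \
                        * (1 + takeup_trend) \
                        * (1 + premium_inflation)

# Hypothetical inputs: £500k current spend, +10% headcount,
# +2% take-up, +6% premium hikes.
projected = project_medical_cost(500_000, 0.10, 0.02, 0.06)
print(round(projected))  # prints 594660
```

A real system would of course replace the fixed trend factors with models fitted to historical claims, salary and premium data, but the shape of the calculation is the same.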
The advantages of such solutions, in terms of their ability to reduce administration, optimise the employee experience, reduce organisational costs are limitless – but is this enough to eliminate the need for people in HR?
A place for people
The answer is of course “no”. Developments in automation, data analytics and eventually AI will drive and necessitate the evolution of HR. However, I cannot imagine a near, or mid-term, future where technology is able to replicate the emotional intelligence (EQ) integral to the function, and therefore take on the consultative element of HR’s role.
Human Resources departments fundamentally deal with people, and these people will continue to crave some form of human interaction, particularly when seeking help on a sensitive matter or one linked to their personal health and happiness. Computers are not yet able to emulate empathy, which is ultimately what makes us human. When it comes to benefits, for example, our research found that almost half of global employees (46%) still highly value face-to-face communications. Handing every engagement opportunity over to a machine could therefore alienate a large proportion of the workforce.
Next steps for businesses
Further automation and eventually AI integration into HR is inevitable. This will continue to improve the employee experience, remove the administrative burden from HR teams and even inform business strategy. However, this future will only be realised if HR teams are able to take full advantage of the tools at their disposal. It’s imperative that organisations equip their HR and benefits professionals with the skills and understanding they need to take advantage of the benefits offered by technology and shift into their increasingly consultative role.
Technology will not and should not replace people in HR. As long as people have a role in the workplace, there will always be a need for human interaction and management. A future where human empathy, sympathy and discretion are not needed in the function is a long way off.
Artificial Intelligence was a big talking point at Forum Systems’ London Summit recently, and for good reason. With every IT service now relying on APIs to function, and the speed of API attacks no longer measured in days or weeks, but mere milliseconds, API security gateways must find a way to adapt and evolve to threats in real-time.
By Jason Macy, CTO, Forum Systems.
Additionally, with the sheer amount of API communications showing explosive growth, analysing data trends and predicting events requires that API security becomes more intelligent. With such significant advances in Artificial Intelligence (AI) in recent years, the next logical step is using AI to enhance the interpretation of API data.
Firstly, let’s be clear about the role that APIs play in your network today. APIs are the synapses that connect applications together and enable them to share data. They literally underpin almost everything we do, from banking to shopping to controlling our heating. Without APIs, we would not have seen the exponential growth of cloud computing, the Internet of Things, and even social media. Even our favourite devices – smartphones, tablets, smart watches, fitness trackers etc. all require connections to cloud services to function. In other words, they all use APIs.
With the profound growth and adoption of APIs, API vulnerabilities have become the sleeping giant of our API-led technology world. The vulnerabilities posed by exposed APIs and insecure API Gateways are significant, yet they remain one of the most overlooked threats to information security today. This is largely because API vulnerabilities are not always easy to spot, and require specialised technology for detection and prevention. This is notable in the OWASP Top 10 (the highly respected, peer-reviewed list of the top vulnerabilities facing organisations today), which now references APIs in 9 of the top 10 vulnerabilities.
Is AI ready to help?
So, is AI ready? The consensus within the IT industry is a firm yes. With the convergence of a number of factors, such as the availability of low-cost cloud computing resources, well-established AI algorithms, and in-depth knowledge of bi-directional API information properties, the time is right for AI to extend beyond chess games and be applied to real-world API deployments.
How does AI help APIs?
First and foremost, we see AI as providing an additional layer of analytics and context to the API architecture. AI is not a replacement of an API security gateway, or an alternative approach to securing APIs. The only guaranteed way to secure the traffic that travels through your APIs is to deploy API security gateways that are secure by design. Secure by design means – an operating system that is locked down (no root access, no ability to add 3rd party software, integrity checks at both startup and while running), product architecture that follows strict security architecture principles (e.g. administration, policy storage, sensitive security artefacts such as PKI keys, passwords, etc. must be encrypted locally and during transit), and product components built with mission-critical stability in order to maintain performance during high volume transactions and penetration attacks.
With a fundamental API security gateway architecture in place, you can then add Artificial Intelligence to further enhance the protection, visibility, and predictive analysis that properly designed API AI models can offer.
By leveraging recent advances in artificial intelligence and machine learning, API Security Gateways can begin to predict and protect against attack behaviour in real-time. They are uniquely positioned to make use of AI for a number of reasons:
1. Centralised data gathering
API security gateways represent the information gateway, the conduit of data going in and out of the network. This puts them in the perfect position to capture and analyse this data from a central position. What’s more, by pooling together and analysing threat behaviour across all of your API communications, AI can be used to leverage API analysis far more easily than trying to aggregate logs from disparate services and components across your enterprise.
2. Deep content awareness
The API security gateway already has access to the bi-directional API data on your network, so it has a unique advantage over other technologies. Using deep content-parsing and awareness of the data in the network and the activity on it, the API Security Gateway can intelligently feed AI algorithms for advanced predictive analysis.
3. Granular feature engineering
Feature engineering is one of the most important aspects of AI. With API Security Gateways, the API communication is contextualized into granular features, which are then isolated to build predictive models such as anomaly detection, capacity planning, response profiles, etc. As each set of business APIs is distinctly different, automated and granular feature engineering is required to ensure the AI models are properly and accurately predicting outcomes.
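As a rough illustration of the anomaly-detection case, here is a minimal sketch (not any vendor's implementation) of a baseline model a gateway might fit over per-request features; the feature names, traffic values and z-score threshold are all assumptions:

```python
# Z-score anomaly detection over granular per-request API features.
# Features and thresholds are illustrative, not production values.

from statistics import mean, stdev

def fit_baseline(history):
    """Learn a per-feature (mean, stdev) baseline from historical traffic."""
    features = history[0].keys()
    return {f: (mean(r[f] for r in history), stdev(r[f] for r in history))
            for f in features}

def is_anomalous(request, baseline, threshold=3.0):
    """Flag a request if any feature deviates more than `threshold` stdevs."""
    for f, (mu, sigma) in baseline.items():
        if sigma and abs(request[f] - mu) / sigma > threshold:
            return True
    return False

# Hypothetical traffic: payload size (bytes) and latency (ms) per request.
history = [{"payload": 900 + i % 50, "latency": 40 + i % 10} for i in range(200)]
baseline = fit_baseline(history)

print(is_anomalous({"payload": 910, "latency": 45}, baseline))      # prints False
print(is_anomalous({"payload": 250_000, "latency": 42}, baseline))  # prints True
```

A gateway sits on the request path, so it can compute such features per call and refit the baseline continuously; richer models (capacity planning, response profiles) follow the same fit-then-score pattern.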
There is no substitute for good API security design
In conclusion, by using AI and machine learning to analyse the data that passes through it, the AI-enhanced API Security Gateway will be able to predict the behaviour of API attacks in real-time.
AI is perfectly positioned to enhance the security capabilities of API Security Gateways, but should not be mistaken as a replacement for good security design. Rather, AI has the capacity to enhance and augment the protection and visibility gained by deploying API Security Gateways.
AI is no longer science fiction and has already begun to transform many industries. However, an AI model is only as good as the data you feed it. This is why the combination of AI with API Security Gateways represents the next frontier in advanced analytics.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue will carry the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
Part 4
Tim Sadler, co-founder and CEO at Tessian:
Organisations will need to protect human vulnerabilities in their networks
“Human error is natural and common. It is also a huge cause of financial and reputational damage to organisations across the world as employees, intentionally and unintentionally, reveal valuable proprietary data to rival companies or malicious actors. Despite this threat, employees are still not perceived or protected as a real security concern.
As digitisation of information grows in the workplace, and employees are expected to securely manage and process large amounts of data, the potential for mistakes, and subsequent damage, is only going to increase. Moreover, it is possible that GDPR flexes its muscles even more in the coming year. These changes will force companies to protect their employees in order to secure valuable data.
Human behaviour is unpredictable and dynamic. As new vulnerabilities and weaknesses surface and become more prominent, malicious actors will develop new and more sophisticated tactics to exploit them. To combat this, organisations will need to embrace security technologies, such as machine learning, that continually adapt and respond to changing threats in real time.”
Paul Walker, Technical Director at One Identity:
Biometric authentication goes mainstream
“Identification of users via traditional means, such as usernames and certificates, will increasingly be complemented by biometric methods to ensure two-factor authentication. How we type, our reaction time, and/or how we use systems and consume the services provided to us will become a more prevalent part of identifying a user. These biometric methods will feature in a soaring number of 2019 deployment initiatives across the market, with a direct impact on our day-to-day use of services.”
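A toy sketch of the idea, using keystroke timing as the behavioural signal; the enrolment scheme and tolerance figure are illustrative assumptions, not a production biometric:

```python
# Behavioural biometrics as a second factor: compare a user's
# inter-keystroke timings (ms) against an enrolled typing profile.
# The tolerance value is an assumption, not a calibrated figure.

def enroll(samples):
    """Average several typing samples into a per-gap profile."""
    n = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(n)]

def matches(profile, attempt, tolerance=0.25):
    """Accept if the mean relative deviation from the profile is small."""
    devs = [abs(a - p) / p for p, a in zip(profile, attempt)]
    return sum(devs) / len(devs) <= tolerance

profile = enroll([[120, 95, 140, 110], [118, 100, 135, 112], [122, 93, 145, 108]])
print(matches(profile, [119, 97, 141, 111]))  # same rhythm: prints True
print(matches(profile, [60, 200, 60, 210]))   # different rhythm: prints False
```

Real deployments would use many more features (hold times, error rates, mouse dynamics) and statistical models rather than a fixed tolerance, but the enrol-then-compare flow is the same.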
Maria Lobato, Vice President of Marketing for Secure Access and Fraud at Cyxtera:
Digital trust is make or break for financial institutions
“With nearly two-thirds of global consumers worried about the chance that their bank accounts or bank cards will be hacked, building digital trust has to be as much about culture as it is about anti-fraud technologies. The cultural shift is yet to happen. What’s more, fraud prevention education is normally treated as an afterthought. Even if it weren’t, the truth is that a set of educational projects will not create a customer-centric and trust-driven culture.
In 2019, financial institutions will have the opportunity to engage their customers by developing services that reach an equilibrium between solid security and minimal friction. Organisations need to have a unifying vision and employ anti-fraud technologies that are effective in the short and long term as it is the only way to realize the business value of building digital trust.
Financial institutions that fail to make this shift will grow largely irrelevant as millennials continue to live their lives in an on-demand, Instagram-worthy, Amazon Prime-fast fashion.”
“The nature of cyberwarfare is changing. Russia has led the way in the use of targeted cyber actions as part of larger objectives, and now other nation states are looking to follow the same playbook. While a direct cyberwar is not on the horizon, there will continue to be smaller proxy cyber wars as part of regional conflicts, where larger nation state actors provide material support to these smaller conflicts. These regional conflicts will be testing grounds for new tactics, techniques and procedures as larger nation states determine how cyber warfare integrates into their larger military objectives. Nation states will also start experimenting more this year with ‘disinformation’ campaigns as part of their cyber warfare efforts. The goal of these campaigns is to mask the nation state performing the attack by using the TTPs of a different nation state as part of the attack. These attacks may be more ‘straightforward’, with the goal of being detected and another nation state actor blamed. These kinds of attacks will make true attribution more difficult.”
The World is an IoT Oyster
“It seems that every year now is the year of the “Internet of Things”. An ever more diverse set of items, from electric cars to toasters to pacemakers, is being brought online with varying sets of security measures. As noted by Adam Shostack at Black Hat 2018, these IoT devices have unique sets of real-world properties which can be attacked and exploited remotely. We expect attackers to create exploits that target the physical components of IoT devices with the goal of degrading performance or completely disabling them: remotely causing batteries to discharge rapidly, overloading compressors or heating elements, or causing devices to stop responding. Examples of these exploits could be electric cars running out of battery power on the freeway, toasters catching fire, or, in a worst-case scenario, pacemakers turning off.”
Tim Helming, Director of Product Management at DomainTools:
The World is an IoT Oyster
“A set of security standards for consumer and small business-grade IoT devices will be drafted. This proposal could include something analogous to the UL listing for electrical devices--it would state that a device with the certification meets specific minimum standards for ‘securability.’ Example criteria could include forcing strong administrative passwords, hardening of the OS, not listening on any ports except one or two that require encryption and authentication, etc.”
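Such criteria lend themselves to automated checks. Below is a minimal sketch of a hypothetical ‘securability’ audit against the example criteria quoted above; the encoded rules, port list and password list are illustrative assumptions, not part of any drafted standard:

```python
# Hypothetical IoT "securability" audit against example minimum criteria:
# strong admin password, hardened OS, no open ports beyond encrypted/
# authenticated services. All rules here are illustrative.

WEAK_PASSWORDS = {"admin", "password", "12345", ""}
ALLOWED_PORTS = {443, 22}  # assume HTTPS and SSH only

def securability_issues(device):
    """Return the list of criteria a device configuration fails."""
    issues = []
    if device["admin_password"].lower() in WEAK_PASSWORDS:
        issues.append("weak administrative password")
    insecure = [p for p in device["open_ports"] if p not in ALLOWED_PORTS]
    if insecure:
        issues.append(f"listening on unencrypted/unauthenticated ports {insecure}")
    if not device["os_hardened"]:
        issues.append("OS not hardened")
    return issues

device = {"admin_password": "admin", "open_ports": [80, 443], "os_hardened": False}
for issue in securability_issues(device):
    print(issue)  # prints all three failed criteria
```

A certification scheme of the kind described would presumably run far deeper checks (firmware signing, update channels), but a checklist-style validator is the natural core.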
Corin Imai, Senior Security Advisor at DomainTools:
Crippling Critical Infrastructure
“We need to start thinking of our critical infrastructures as more than just the physical and virtual, and we need to start thinking of the disruption to our democracy as one of those infrastructures. The more our teams talk about election hacking and the impact political campaigns have on our democracy, the more likely that disruption could be the next attack on our critical infrastructure. Our democracy and how the public perceive our government can cause wild disruption to the state of our nation.”
Outpost24: 2019 IT Security Predictions
Martin Jartelius, CSO, Outpost24 on Hacking and Data Breaches:
1. “As GDPR continues to be implemented, we will see a perceived rise in the number of breaches. However, it will be uncertain whether this should be attributed to an increase in breach disclosure, an increase in actual breaches, or breaches against personal data having become financially attractive.
2. Everyone will need a security specialist. Everyone will need a team of application security specialists. But they can’t have them, because “the market” is drained. This gives rise to the proliferation of MSSPs and consultancies, but also hopefully a shift to focus on usability and decision support in security technology, enabling non-security experts to make educated decisions based on advice by their support systems.
3. Organizations will keep talking about defense in depth but keep building a wall around their perimeter and leaving a very soft network inside.
4. As technical security measures continue to make it harder to breach organisations, phishing will continue to rise, and organizations will keep blaming users’ insecurity and gullibility as the problem; however, it will still come down to workstations and internal networks not being hardened.
5. We will see an increased focus on supply chain breaches in web applications due to the substantial success of those attacks in the last year. These attacks differ from normal supply chain attacks: instead of targeting code in the manufacturing line, they exploit components loaded cross-domain and across organisations, so the website security of large organisations will be broken based on their dependency on smaller organisations.”
Sergio Loureiro, Director Cloud solutions, Outpost24 on cloud security:
Simon Roe, product manager, Outpost24 on Application Security:
Suzanne Spaulding, former DHS Under Secretary and Nozomi Networks adviser:
“Unsophisticated attackers will get better at breaking into OT Networks, but will likely lack the level of sophistication needed to have a significant physical impact. Ever more sophisticated tools and techniques for hacking are available for downloading from the web. This means that the number of unsophisticated hackers able to break into systems will rise – but what they’re able to do once they get in is another question. If you look at Russia’s attacks on the Ukrainian power grid, attackers were able to remain undetected and do reconnaissance work for months. To bring down power for nearly 250,000 customers, they had to thoroughly understand the operations at the targeted plant. That level of sophistication can’t be bought and sold on the internet, which means that the real damage will continue to be done by actors with access to the right skills and resources.
The things that have been holding back Russia, China, North Korea and Iran from a critical infrastructure attack on the U.S. could shift. When it comes to nation state threats on U.S. critical infrastructure, we think of four key actors: Russia, China, Iran and North Korea. Each country has been held back from attacking the U.S. for different reasons. Think about a graph with an x and y axis. The x axis represents capabilities and the y axis represents destructive intent. At the moment, Russia and China have the highest capabilities, but they fall lower on the scale of destructive intent. Of the group, they’re more rational and more dependent on their own critical infrastructure. On the other hand, North Korea and Iran have higher destructive intent, but fall lower on the capabilities scale. But it won’t stay this way forever. The level of destructive intent of Russia and China could change overnight – which is a concern given the capabilities they already have. And North Korea and Iran are strengthening their capabilities every day. North Korea’s attack on Sony is a good example. In the news, the focus was on all the embarrassing emails, but the attack was about more than just leaked emails – Sony’s networks were damaged. And Iran made headlines when it pulled off a damaging cyber attack against the Sands Casino. The U.S. has yet to experience a highly-damaging attack on critical infrastructure, but that may not be true for long.”
Bill Lummis, Technical Program Manager at HackerOne:
“I think we're going to start seeing a real uptick in breach insurance with the added pressure of GDPR on top of the existing financial and reputational pressures. I think that in turn will start driving more process maturity as breach insurance starts to be more sophisticated in an actuarial sense. They'll look for specific processes and procedures to be in place to determine rates, and I think that in turn will start to drive better auditing and hopefully better security.”
Cody Brocious, security researcher, HackerOne:
- Drastic increase in attention around privacy-related breaches
- At least two major botnets discovered on IoT/embedded devices
- Falloff in interest for blockchain-related companies/initiatives
“Many large enterprise customers are facing the operational challenges that come with a network-based detection and response system with heavy alerting and human response, while transitioning to a cloud, application and endpoint management strategy. They are still buried in alerts, and network traffic is increasingly encrypted, which means they lose visibility and control. All of our customers also still have to manage their end users, who remain the main point of failure in any security strategy.”
Prediction 1: Managed Security
“Managed security service providers (MSSPs) have been increasing in their popularity over a number of years. Driven by an ever-increasing technology estate that needs to be secured, and a lack of available skilled talent to hire, we see more and more companies turning to managed security providers to help fulfill their security needs.
Within this, Managed Detection and Response (MDR) is a category of managed security that has seen particularly high growth; but one which will probably be absorbed under the MSSP umbrella in due course.”
Prediction 2: Cyber Insurance
“We’ve seen cyber insurance gain traction within the industry. It is driven not just by organisations’ own desire to add an extra layer of business protection; oftentimes it is mandated by large organisations that their smaller partners take out cyber insurance.
Therefore, it’s something we’re likely to see grow exponentially in the small and medium business sectors. It will also have the consequence of pushing SMBs to look at their overall security posture and make improvements.
We can also foresee cyber insurance as being the precursor to wider scrutiny of security vendors - perhaps even driving some to offer some form of assurance or liability protection.”
The first wave of cloud migration (Cloud 1.0) is coming to an end with low-criticality applications being moved to public cloud vendors. The next wave is much more difficult – this is where organisations wish to migrate applications that are running their business and give them competitive advantage.
By Sean O’Donnell, Managing Director EMEA, Virtual Instruments.
Gartner’s infrastructure team managing vice president and chief of research, David J. Cappuccio, recently stated that his planning assumption is that by 2025, 80% of enterprises will have shut down their traditional data centre, versus 10% today. This prediction may or may not come true, but data centre staff are now finding that they don’t have the cloud knowledge or skills demanded of them to ensure a smooth transition from on-premise to cloud-based applications.
Cloud 2.0 or Infrastructure as a Service (IaaS), as it is frequently referred to, needs a lot more planning as it is dealing with application performance, rather than just availability. There are two distinct approaches: a) ‘Lift and shift’, where you move your existing application to a new infrastructure with minimal change, or b) ‘Cloud native’, whereby you either re-write the application for the cloud or use the cloud service provider’s resources to do so. Most business-critical applications are made up of code dating back many years, so re-writing them will not be an easy task. These applications also have dependencies on accessing other applications and resources, so it is critical to have a full understanding of your on-premise hosted applications before migration is considered. For this discussion we will concentrate on ‘lift and shift’.
The primary drivers for moving to the cloud are to reduce infrastructure spending and personnel costs while enabling business agility and scalability. Physical infrastructure such as hardware, floor space, cooling and security can be owned and managed by a third party, and their shared hosting approach should mean a cost reduction – so why is this not always the case? The primary reason is lack of planning. If you simply decide to replicate your on-premise environment at a hosted site then the costs will be similar to where they are today. The planning process should include the question ‘How much will it cost to run my application in the cloud?’ This potentially opens a huge can of worms, because to know what service you expect from the cloud provider you need to start with knowing how your applications are performing now – a very different question to ‘How are my servers/fabric/storage performing now?’, which is how you have always managed the infrastructure.
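The cost question can be made concrete with back-of-the-envelope arithmetic. Every figure below is a hypothetical assumption, chosen only to illustrate the point made above: an unplanned one-for-one replication can cost about as much as staying on-premise, while a profiled, right-sized deployment costs far less.

```python
# Illustrative monthly cost comparison: on-premise vs two cloud scenarios.
# All rates, counts and overheads are invented figures for illustration.
def monthly_onprem_cost(hardware_amortised, floor_space, cooling, staff):
    """Sum the fixed monthly costs of running the estate yourself."""
    return hardware_amortised + floor_space + cooling + staff

def monthly_cloud_cost(instance_rate_hr, instances, storage_gb, gb_rate):
    """Instances billed per hour (30-day month) plus per-GB storage."""
    return instance_rate_hr * 24 * 30 * instances + storage_gb * gb_rate

onprem = monthly_onprem_cost(hardware_amortised=4000, floor_space=800,
                             cooling=600, staff=5000)
# Replicating the on-premise estate one-for-one in the cloud...
lift_and_shift = monthly_cloud_cost(instance_rate_hr=0.40, instances=34,
                                    storage_gb=5000, gb_rate=0.10)
# ...versus right-sizing after profiling the actual workload.
right_sized = monthly_cloud_cost(instance_rate_hr=0.40, instances=12,
                                 storage_gb=2000, gb_rate=0.10)
print(onprem, lift_and_shift, right_sized)
```

Under these assumed figures the unplanned migration lands within a few per cent of the on-premise bill, while the profiled deployment is roughly a third of it, which is the planning argument in miniature.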
The other fundamental is ‘Which cloud service provider is the best choice for my application?’ Very few cloud providers even come close to offering a Service Level Agreement (SLA) for the performance of your application in their environment – yet this is vital to your decision process. If the application is going to run slower than it does currently, this migration decision could affect your organisation’s business. If you have to add more capacity/power/resource to host it, then the cost goes up.
The starting point, therefore, is to analyse the workload behaviour and performance characteristics of the applications you plan to migrate. You need to know how they are performing and what resources they are using before the cloud service provider selection process begins. Each application has an individual workload profile that will perform differently in different environments. When you look at the application workload you can see the peaks and troughs of resource requirements, and you need to gather this information over time – 8:00am on Monday can require different resources compared to 2:00pm on Thursday. Seasonality also needs to be factored in – do you have more throughput at different times of the year/month/week/day?
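A workload profile of the kind described, exposing the Monday-morning peak against the Thursday-afternoon trough, might be sketched as a simple hour-of-week aggregation. The sample data and field shapes below are hypothetical.

```python
from collections import defaultdict
from datetime import datetime

# Sketch: aggregate (timestamp, cpu_percent) samples into an hour-of-week
# profile so weekly peaks and troughs become visible. Data is illustrative.
def hourly_profile(samples):
    """Map (weekday, hour) -> peak CPU observed in that slot."""
    profile = defaultdict(float)
    for ts, cpu in samples:
        key = (ts.weekday(), ts.hour)  # weekday(): Monday == 0
        profile[key] = max(profile[key], cpu)
    return dict(profile)

samples = [
    (datetime(2019, 1, 7, 8), 85.0),    # Monday 08:00 - batch load
    (datetime(2019, 1, 10, 14), 30.0),  # Thursday 14:00 - quiet period
    (datetime(2019, 1, 14, 8), 92.0),   # the following Monday 08:00
]
profile = hourly_profile(samples)
print(profile[(0, 8)])  # Monday 08:00 peak across the observation window
```

In practice you would gather weeks or months of samples per resource (CPU, memory, IOPS, throughput) and size each cloud candidate against the peaks, not the averages.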
In addition to the application’s performance you need to understand its dependencies. If the application is accessing other resources that are not going to be hosted in the cloud, or is managed by a different provider, this can affect performance and cost.
Each cloud provider has strengths and weaknesses when it comes to hosting your application. You need to approach each potential supplier with your eyes wide open. You are putting your business at risk if you get the service wrong, so you should not trust their performance claims. Each application workload needs to be tested against the new environment to ensure you are getting an equal or better service than you currently have. Many organisations have migrated an application then moved it back in-house because of disappointing performance or escalating cost – in the main this repatriation is caused by lack of or poor planning prior to migration.
Most organisations have a cloud initiative, and analysts and advisors are pushing the business to reap the benefits of an IaaS environment. A cloud-hosted infrastructure, if well planned, can bring good financial and performance rewards. Don’t rush to the cloud without answering three questions, which we will take in turn:
Will my application perform as expected in the public cloud? – How do you know which application workloads to migrate and which to retain in an on-premise data centre? How are the applications performing now? Have you assessed your seasonal application workload behavior? If you can either run your application workload in the cloud test and dev environment to ensure performance or synthesise the application workload and run a simulation of it in a live environment, you should get a good idea of what will happen.
How much will it cost to run the application in the public cloud? – How do I simplify and reduce time to migrate a large number of diverse workloads? How do I select the optimal CPU, memory, network, and storage configuration? Be specific with the host. You should be negotiating cost based on a known application workload and a known environment – at no point should you sign up to a service that doesn’t offer you better performance for your applications at lower cost than running it yourself – after all, these are key reasons why you are going down this route.
Which cloud service provider is best fit for my application? - How do I test cloud workload performance before migrating? Does the cloud host have the capability to run your application workload prior to migration so they can prove to you how it will perform? The infinite different host configurations are not guaranteed to perform the same way for your application. Test several hosts back to back to ensure you will get what you expect.
Finally, with all questions answered, and with you happy with the SLA the cloud provider is offering, you can move ahead with the migration. This is usually managed by the cloud provider in conjunction with you. But how do you know the new hosted application is adhering to the agreed SLA once you go live? This is where a performance monitoring solution comes in – again, don’t trust the cloud provider to deliver this; make sure you know. Monitor your application workloads post migration to identify any unforeseen performance or capacity issues, and for peace of mind that all is well.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
Part 5.
From the factory to the field
With Mark Homer, Vice President Global Customer Transformation at ServiceMax from GE Digital:
For industrial equipment manufacturers, we see a significant portion of new sales growth coming from connected equipment with sensors, actuators, and better analytical insights. The ability to exchange critical data with other machines and computer networks will not only fuel growth, it will also give greater value and recognition to the asset and service data itself, as well as better actionable insights.
2019 will also see more technology deployment from the factory floor to out in the field. With the As-a-Service industry growing, we also expect to see more manufacturers transitioning to service business outcomes, supported by predictive maintenance, field service and Industry 4.0 technologies and platforms. This will also shine a light on the importance of soft skills with field service technicians who are at the forefront of operations and customer interactions.
“The database world is rapidly moving towards a database-platform-as-a-service or “dbPaaS” model in which databases are consumed as a service from cloud providers. I anticipate this trend to increasingly also apply to in-memory computing solutions.
In-memory-computing-platform-as-a-service or imcPaaS solutions will enable companies to easily consume in-memory computing platforms as PaaS solutions on major cloud services such as AWS, Microsoft Azure, Oracle Cloud, Huawei Cloud and more. We already see leading companies across a range of industries, from financial services to online business services to transportation and logistics, deploying the GridGain in-memory computing platform on private and public clouds for large-scale, mission-critical use cases. In-memory computing vendors are already making their products available as dbPaaS or imcPaaS solutions, and I predict those solutions will increase in functionality and add new services at an increasing rate in 2019.”
“Gartner defined the ‘in-memory computing platform’ category in December 2017. They recognized the emergence of a new product category which includes in-memory data grids, in-memory databases, streaming analytics platforms, and other in-memory technologies. Since that time, the category has rapidly expanded and evolved and I see it continuing to evolve in 2019 as in-memory computing platform vendors respond to the massive interest in machine-driven decision making by adding machine and deep learning capabilities to in-memory computing platforms. These new machine learning capabilities will allow companies to increasingly deploy in-process HTAP (hybrid transactional/analytical processing) solutions which update their machine learning model in real-time based on new operational data. In-process HTAP enables optimal decision making based on real-time data for mission-critical applications such as fraud detection, credit approvals, and vehicle and package routing decisions. In addition, new integrations between in-memory computing platforms and deep learning systems will allow companies to more easily access and analyze their operational data using artificial intelligence solutions.”
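The in-process HTAP pattern described here, where each transaction is both processed and used to refresh the analytical model in the same pass, can be illustrated with a toy online learner. This is a hedged sketch only: the class, the single "amount" feature and the anomaly threshold are invented for illustration and are not GridGain's API or any vendor's product.

```python
# Minimal sketch of in-process HTAP: each transaction is both handled
# (transactional side) and folded into a continuously updated model
# (analytical side) in one pass. The "model" is simply a running
# mean/variance (Welford's algorithm) used for anomaly scoring.
class OnlineFraudScorer:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations

    def update(self, amount: float) -> float:
        """Score the transaction against the current model, then fold it in."""
        score = self.score(amount)
        self.n += 1
        delta = amount - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (amount - self.mean)
        return score

    def score(self, amount: float) -> float:
        """Standard deviations from the mean; large values look fraudulent."""
        if self.n < 2:
            return 0.0
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(amount - self.mean) / std if std else 0.0

scorer = OnlineFraudScorer()
for amount in [20.0, 25.0, 22.0, 18.0, 24.0]:  # a customer's usual spend
    scorer.update(amount)
print(scorer.score(500.0) > 3.0)  # a wildly atypical amount stands out
```

The essential point matches the prose: the model is never retrained offline against stale data; every operational record updates it in real time, so the very next decision reflects it.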
(Nikita Ivanov, founder and CTO of GridGain Systems)
“In the coming year, the digital transformation trend will continue to grow as companies increasingly push to become digital enterprises. A key enabler of this trend will continue to be in-memory computing. As part of a rapidly growing trend over the past several years, we have seen companies of all sizes increasingly adopt in-memory computing platforms to achieve the application performance and scalability they need to achieve digital transformations. Companies like ING, American Airlines, eTherapeutics, Finastra, and more rely on in-memory computing to drive key components of their business. High performance, in-memory computing is becoming standard technology as companies worldwide deploy technology to enable their digital transformation initiatives and provide their customers with the real-time interactions required to remain competitive in an ever-accelerating business environment. New computing approaches such as hybrid transactional/analytical processing (HTAP) or hybrid operational/analytical processing (HOAP) will increasingly be used to unlock the value of data in real-time and drive business results.”
(Terry Erisman, VP of Marketing, GridGain Systems)
Open source, hybrid Cloud and edge security under scrutiny
Arun Murthy, co-founder and Chief Product Officer at data analytics expert Hortonworks
believes there will be more industry shake-ups on the horizon, specifically in the areas of open source and hybrid cloud, the security of edge computing, and data governance for AI.
1. Death of the CIO?
We expect to see the traditional role of the CIO wither away, and we predict that there will be half as many CIOs in 2021 as there are today. Don’t worry – we’re not saying that businesses will make their CIOs redundant, but rather that the role will evolve out of existence.
Where once their responsibilities were purely technical, the central strategic importance of new technologies such as AI and automation means that CIOs are increasingly straddling the boundary between business and IT. Businesses need their CIOs more than ever, even as the role changes – they should be championing the capabilities of technology at board level, while keeping IT tightly focused on the business’s strategic goals.
2. Experience over youth
The human factor is of paramount importance to successful AI and automation (AI&A) projects, but skills are scarce. In the next 12 months we will see significant investment in training and resources, with much of this money being spent on retraining older workers to become data scientists and solution architects. With technical skills so scarce, no real skills pyramid and little in the way of a recognised career path, we’ll see many of these “new” experts being aged 40 and above, bringing their considerable experience and existing skillsets into their new roles.
3. Rise of the “HR Idol”
With AI&A skills at a premium, the role of HR within businesses will multiply in importance. Not only must they work harder to hire and retain talented, knowledgeable workers, but they will also invest significant time and resources in training / retraining. As such, Infosys Consulting predicts that 2019 will see the beginning of a major wage increase for HR professionals working within technology domains, with competition for the best HR workers also hotting up. The best HR professionals – those that can support their employers’ ambitions to deliver new, intelligent products and services by maintaining a skilled workforce – will become “HR Idols”, valued throughout the company (but especially by management).
4. Supply chains will impede AI gains
Don’t assume that all technologies advance at the same rate. In the next year, one of the biggest AI and automation challenges facing the CIO will be the failure of supply chains (especially in the last mile) to keep up with the improved efficiencies enabled by AI&A. We predict that 20 per cent of the value of these technologies will be lost due to physical constraints such as congestion, infrastructure, and workforce limitations. These challenges will need the direct involvement of the CIO, highlighting how the role is developing to encompass a much wider array of strategic responsibilities.
5. AI becomes the norm for healthcare
AI and automation are becoming more prevalent in healthcare, but 2019 will mark the point at which it becomes a routine part of the patient journey. Infosys Consulting predicts that by mid 2020, two in three patients with any condition will be supported by AI&A-related technologies, either as part of diagnostics, treatment, or administration. While this may be invisible to the patients themselves, healthcare providers from doctors’ surgeries to PCTs will reap the benefits of greater efficiencies as well as better insight into health conditions and their treatment.
The edge is everything?
By taking latency out of the equation, edge computing can enable decision-making to take place closer to the point of origin for the data. It can also reduce consumption of services such as network and storage while improving reliability and customer experience. But what do these benefits look like in the real world? Lal Hussain (Director IT Applications at Insight UK):
Reducing consumption & eliminating bottlenecks:
Current latency times might seem insignificant – for instance, the latency between the Netherlands and California is approximately 150 milliseconds. However, as IoT increasingly enters the mainstream for both consumer and enterprise applications, these minute periods of time could be the difference between success and failure. Take autonomous vehicles – to avoid collisions, vehicles need to make decisions in near-real-time. In this scenario, even an almost imperceptible amount of latency in a decision to avoid an obstacle can pose a serious threat to safety.
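The safety argument can be made concrete with simple arithmetic, using the 150 millisecond figure cited above; the vehicle speed is an assumption for illustration.

```python
# How far a vehicle travels while a round trip to a distant cloud completes.
# Illustrative figures: 150 ms latency (as cited above) at motorway speed.
def distance_during_latency(speed_kmh: float, latency_ms: float) -> float:
    """Metres travelled during the given latency."""
    speed_ms = speed_kmh * 1000 / 3600   # convert km/h to m/s
    return speed_ms * (latency_ms / 1000)

blind_distance = distance_during_latency(speed_kmh=108, latency_ms=150)
print(blind_distance)  # 4.5 metres covered before a remote decision arrives
```

Four and a half metres is longer than most cars, which is why collision-avoidance decisions have to be made at the edge rather than in a remote data centre.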
Another benefit is reducing network traffic and freeing up bandwidth. As IoT adoption continues to grow, the data generated is transmitted from, between and within different cloud computing stacks, increasing network traffic. Not all of the data generated by IoT devices is critical; it doesn’t all need to go back to the central cloud services for storage or analysis. By processing data locally, edge computing significantly reduces the load on networks as less data is pushed back to the core network; it also means less storage is consumed centrally in the Cloud, all of which has a cost impact. In this model, only truly valuable data goes over the network.
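The local-filtering model described above can be sketched in a few lines; the sensor names, data shapes and threshold below are hypothetical illustrations.

```python
# Sketch of an edge node that processes readings locally and forwards only
# the "valuable" data to the central cloud. Values are illustrative.
def process_at_edge(readings, threshold=80.0):
    """Handle routine readings locally; return only anomalies for the cloud."""
    forward = []
    for sensor_id, value in readings:
        if value > threshold:   # anomaly: escalate for central analysis
            forward.append((sensor_id, value))
        # else: acted on (or discarded) locally - no network traffic generated
    return forward

readings = [("turbine-1", 42.0), ("turbine-2", 95.5), ("turbine-3", 61.2)]
to_cloud = process_at_edge(readings)
print(len(to_cloud), "of", len(readings), "readings sent upstream")
```

Only one reading in three crosses the network in this example, which is the bandwidth and storage saving the model promises, scaled up across thousands of devices.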
Also, connectivity is not ubiquitous, and some devices are typically located in remote areas where network access is limited. Edge computing comes into its own with remote devices, giving them the independent ability to process data and take action regardless of connectivity. In the case of, say, a wind turbine in a remote and barren area, this gives it the ability to do its own predictive maintenance. Another example is personalised medical devices, which could interact with localised nodes to enable diagnosis on the fly.
Richer & more reliable customer experiences:
Being able to process and respond to data at the source of that information brings the ability to truly tailor an experience to a customer’s requirements, in near real-time. By linking mobile apps to smart beacons that can identify nearby customers, you can make customer-centric decisions and push content, features and incentives to those customers almost immediately, without the need to send data back to a central cloud service.
Consumers now demand an always-on experience, and this relies on the underpinning services and infrastructure having high availability. By localising processing and decision-making in an edge computing model, you can make your service more tolerant of failures in infrastructure, therefore reinforcing the positive customer experience. In a world where personalisation is the future of client engagement, edge computing brings an additional pathway to success for those businesses capable of harnessing the opportunity.
Intoware 2019 Predictions
James Woodall, co-founder and CTO:
Big businesses are finally embracing digital transformation - Digital transformation is a concept that has been bandied around for some time but is only just being realised by businesses at scale. Until now there has been resistance to change and a desire to cling on to old, tried and trusted processes. 2019 will be the year businesses, especially established multinationals, finally start to embrace this change on an organisation-wide scale. Attitudes are changing: younger individuals are getting into positions of leadership and, subsequently, attitudes around technology have changed. We could have digitised processes years ago, but it’s only now that digitisation is being embraced.
Projects are moving from innovation to reality - Big businesses have been investing in small innovative companies and projects for some time as test cases. Finally, we’re starting to see these projects bear fruit and be ‘believed in’. High-profile success stories are informing more decisions, and we’ll see more projects move from 50-100 users to 50,000 or 100,000.
Google Glass is more commonplace - We’re not going to see wearables replace smartphone or tablet devices anytime soon, but we’ll see them become more commonplace in 2019. Businesses are now seeing the benefits of what has traditionally been a ‘cool, nice to have’ technology. We’ve seen four businesses in the last month move to a purely wearables-based solution.
Data, data, data - Much like business transformation, data is not a new concept, but how businesses use it will be the key to whether they sink or swim. Those that understand it and use it to enact continuous improvement and change will be the ones that survive. Data provides a single source of truth, and it is this that will continue to fuel the fire for areas such as AI, chatbots and machine learning technologies.
Augmented reality will start to become reality - Not dissimilar to the growth of wearables, AR is not something that is new, per se. But it is a technology that is starting to see more widespread adoption. The adoption of this particular technology in the construction industry is about providing safe and hands-free access to digital tools and there are numerous examples where it would be incredibly valuable. For instance, surveying the top of a tall building without the need to handle or be looking down at piles of paper. There’s the added benefit of overlays too, which offer exciting developments in terms of placing digital plans over reality when it comes to things like planning roadworks. From a business point of view, the use of AR improves speed, accuracy and convenience, which all ultimately contribute to the bottom line.
It will be the low-skilled work that becomes specialised - A lot has been said about the rise of automation. Clearly what it will do is get rid of basic repetitive jobs like data entry, but we’re less likely to see this kill off positions. More likely we’ll see those at the lower end of the scale upskill and become more specialised in a particular area pertaining to the technologies that are influencing it.
Address the government drive to get into encryption - The nature of information and its importance is widely lauded, especially when it comes to cybersecurity, which is why how businesses and governments work together on encryption will be a big issue in 2019. I can see why the desire exists for governments to have unrestricted access to information, but I worry about adding a backdoor, as the key inevitably falls into the wrong hands.
If you prefer not to deal with people when shopping, you’re not alone. Mitel’s recent study of 5,000 consumers around the world uncovered that on average a quarter of consumers never want to deal with a human when shopping online.
By Shameem Smillie, Contact Centre Sales Manager UKISA.
As online shopping becomes more popular, are machines, then, the future of customer service?
Yes and no. Technology certainly has the ability to improve customer experiences. Three out of four consumers believe that’s already the case today. Yet, the same study uncovered that consumers want to know that human assistance is on hand if required. One in three consumers felt that live human assistance should be offered as part of the initial online experience, with most consumers feeling it should be reserved for special cases, such as during complex transactions.
The future of customer service isn’t man vs. machine, but rather man and machine working together. This can be seen easily in the case of contact centres. As the point of entry for many customer queries or issues, contact centres represent the front line for many businesses’ customer service engine, with the potential to significantly impact the organisation’s reputation among customers. By automating many of the more generic responses, human customer service experts are left with more time to deal with more complex queries and issues – reducing frustration caused by call waiting times. Our research found that many businesses may be missing crucial areas for improvement in this area, by simply assuming the effectiveness of their customer service. Fewer than half of respondents believe the technology needed to deliver the perfect online buying experience is available. This stands in stark contrast to findings of a previous Mitel survey in which 90% of IT decision-makers optimistically reported progress in improving customer experience through the use of technology.
Why the disconnect?
Most businesses have a multichannel relationship with customers – meaning they have a number of customer touchpoints including online, in stores, through social media and over the phone. For a consumer, these engagements are all part of broader experience, but some businesses tend to still view these engagements in isolation. As a result, customers often find themselves repeatedly prompted for the same information as they pass through different channels. Most consumers we asked (58%) believe that the technology currently deployed by brands has room for improvement when it comes to delivering the perfect online experience.
Bringing Artificial Intelligence (AI) into the customer experience can have a transformative effect on the customer’s relationship with a brand. More than aligning customer data with the right customer, AI can be used to help customer handlers to personalise and improve human-based interactions by “learning” what leads to better outcomes. As an example, an AI-enabled contact centre might intuit which kind of consumers are more likely to prefer a callback rather than waiting for an available attendant, and offer the callback option early in the process. At another level, AI contact centres could prioritise calls based on recent events (e.g., a returned product, an overcharge) to ensure that potentially dissatisfied customers receive top priority and are routed to more experienced agents.
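The event-based prioritisation idea described above could be sketched as a simple scoring function over a call queue. The event names, weights and wait-time factor below are hypothetical illustrations, not any vendor's routing logic.

```python
# Sketch: recent negative events raise a caller's priority so potentially
# dissatisfied customers are answered first and routed to senior agents.
# Event names and weights are invented for illustration.
EVENT_WEIGHTS = {"returned_product": 3, "overcharge": 5, "missed_callback": 4}

def priority(recent_events, waited_minutes: float) -> float:
    """Higher score = answer sooner and route to a more experienced agent."""
    event_score = sum(EVENT_WEIGHTS.get(e, 0) for e in recent_events)
    return event_score + waited_minutes * 0.5  # waiting also raises priority

queue = [
    ("caller-a", ["overcharge"], 2.0),
    ("caller-b", [], 10.0),
    ("caller-c", ["returned_product", "missed_callback"], 1.0),
]
ordered = sorted(queue, key=lambda c: priority(c[1], c[2]), reverse=True)
print([c[0] for c in ordered])
```

An AI-enabled contact centre would learn these weights from outcomes rather than hard-coding them, but the routing decision it feeds is of exactly this shape.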
While technology is a fantastic tool to boost the customer experience, the role of the human is far from dead. Businesses need to take customers’ individual preferences into account and find a way to blend these experiences without losing the benefits of online and offline, whether it’s convenience, selection or service. Those that are able to empower machine-to-human interactions by tying together communications and collaboration capabilities with technologies such as IoT and AI will be best placed to strike the balance. In a hypercompetitive world where customer experience has become just as important as the product or service delivered, corporations that build this approach into their strategies will see a direct impact on their customer retention and, ultimately, business growth.
There is an old inspirational quote that reads “there are three things in life that you cannot recover: a word once it is said, the moment after it is missed and the time after it has gone”. But the author may not have imagined a world where technology captures an ever-increasing volume of our words. When someone leaves a message, talks to staff in a call centre, or logs onto a conference call, they are generating voice data: millions upon millions of hours of it. There is an immense amount of potential value in that data, and yet the majority sits idle in today’s enterprises.
By Peter Majeed, RVP Customer Success and Field Services at Delphix.
So why aren’t companies already leveraging the wealth of potential available to them, and instead choosing to leave it abandoned and untapped?
Voice data is much harder to secure, deliver and analyse than ‘traditional data’. It is also harder to gather clean, representative voice data from which to build usable models. But companies should not be deterred; those that overcome the challenges will reap the rewards from this new frontier.
Here, there and everywhere: enterprise voice data surrounds us
In today’s connected age, the chatbot may have passed its sell-by date, but conversational artificial intelligence (interacting with computers via speech) is visibly on the rise. Some 20 per cent of Google searches are now made via voice control, and Alexa has over 10,000 skills. Furthermore, as the number of Internet of Things (IoT)-enabled platforms grows, so does the number of speech interfaces which can interact with them, such as smartphones and cars.
This goes beyond just speaking with machines – human-to-human interaction can also make up part of this revolution, such as customer service calls or interactions with healthcare providers. Some organisations, such as Pindrop, are already using this form of AI technology to detect fraudulent claims. This, however, is only scratching the surface of the potential of voice data. One day we may be able to mine call centre data to try to predict which customers are most likely to buy which product, and even deliver real-time customer satisfaction metrics.
The road may be bumpy, but that shouldn’t stop your travels
There are a variety of challenges which come from attempting to analyse and understand voice data.
The principal issue facing those who want to get the most from the wealth of voice data their company sits upon is ensuring access to quality data. It has been estimated that data scientists spend up to 80% of their time just acquiring and cleaning their data.
But even once the data has been cleansed and organised, it is not necessarily sufficiently diverse, and voice data brings a whole new spectrum of potential bias. For instance, an algorithm trained on male voices from Manchester will likely have difficulty understanding a female voice from Glasgow.
Challenges like this often result in the proliferation of ‘data capitalism’, providing an advantage to already-established data companies. Apple, Google and Facebook often have a monopoly over this complex form of data, whilst smaller organisations scramble to find sufficient data. There is a silver lining in this evolving technology space, however, as large conglomerates develop open-source software libraries. Google’s TensorFlow and AudioSet (an ontology of over 2 million individual audio files), and YouTube’s YouTube-8M (450,000 hours of video that have been classified and labelled), allow smaller players to build upon these foundations.
The quality of the data we use isn’t the only challenge. Regulation can also prove to be a roadblock to accessing this precious information. Data redaction will have to meet necessary compliance regulations and ensure the secure delivery of data across the enterprise.
GDPR has now been in effect for 6 months, and whilst organisations are familiar with the requirements and the impact of non-compliance, many are still in the midst of understanding and putting in place processes and policies. Increasing pressure on companies to protect personally identifiable data, with the threat of heavy fines for non-compliance, has resulted in companies focusing on production data management and protection. However, all organisations have a wealth of non-production data that is not as securely managed or protected as their production data.
One of the reasons for overlooking a company’s non-production data is that comprehensive security measures, such as masking data in the many test, reporting and analytics systems of a large company, can come at a high price and prove very complex to implement company-wide, especially when dealing with such a complex form of data. However, modern masking solutions with inbuilt data profiling capabilities, which can sift through large amounts of data to detect sensitive information, will help businesses manage their data privacy processes more efficiently. High-end masking solutions take this one step further and recommend masking algorithms, streamlining and accelerating the process of securing data.
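As an illustration of the profiling-plus-masking idea, here is a minimal sketch assuming sensitive values can be found with simple regular expressions. The patterns and the fixed masking token are assumptions for the example; real masking tools go much further, producing format-preserving values and maintaining referential integrity across systems:

```python
import re

# Illustrative patterns for two sensitive field types; real profilers cover many more.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\+44\d{10}|\b0\d{10}\b"),
}

def profile(text):
    """Report which sensitive field types appear in a text sample."""
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}

def mask(text):
    """Replace detected sensitive values with a fixed token."""
    for pattern in PATTERNS.values():
        text = pattern.sub("***MASKED***", text)
    return text
```

Profiling first, then masking, mirrors the workflow described above: find where the sensitive data lives before deciding how to protect it.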
Extending this to voice data simply becomes the next step in any organisation’s data strategy. Organisations hoping to tap into the potential of voice data must carefully consider the ways in which they will provide secure access to this information across their business.
Start now to reap the benefits of voice data
Emerging technologies such as voice-activated devices, artificial intelligence, and machine-learning are constantly opening up new opportunities for organisations to innovate and be competitive in their industries. Early adopters will gain the advantage today if they are able to set the right foundations and framework in place to manage and secure the data fuelling these emerging technologies.
Now is the time to be building on the basics, with the right data platform and tools, to establish where data is stored company-wide, and ensuring that those who need the data have fast and easy access to it.
Leveraging voice data can provide extraordinary advantages to businesses. In order to reap the benefits, businesses must invest time and effort to ensure the right practices and procedures are in place. By building out a framework to manage and secure data using both processes and tools to do so, businesses can build a strategy today to ensure they are ready for tomorrow.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon! Part 6.
Migration from Windows to Android
This year, the journey to Windows Mobile End of Life (EOL) began and in 2019 the migration from Windows to Android will continue to gain momentum. This will result in many warehouse devices currently running on legacy software becoming inefficient and vulnerable to cyberattacks. If this is not dealt with, legacy technology will affect the efficiency of warehouse operations and slow down workers, possibly resulting in system downtime. Many businesses are taking a phased approach to the migration, introducing new technology in manageable chunks rather than upgrading all their devices at the same time. This is beneficial as it is more cost-efficient and easier for existing employees to adapt to the new technology.
Drones in the Warehouse
While public drone use may not be feasible until the 2020s due to prevailing concerns about security and viability, drones are already being used in warehouses. They can be used to identify stock, sort returns and transport smaller items from one end of a mega-warehouse to the other. Mobile scanning technology will be integrated into drones so that they can complete certain tasks, such as scanning barcodes to identify stock levels. For now, drones will work alongside employees, with the next step seeing their development into independent, artificial-intelligence-enabled machines capable of delivering stock within warehouses and eventually to customers. This is one of the long-term solutions to cut delivery times and costs. Some companies have already gone some way towards testing other forms of AI-enabled delivery, such as Ocado, which successfully tested driverless delivery in 2017.
Collaboration Over Automation
While full automation may be on the horizon, retailers will continue to purchase collaborative robots (Cobots) that require human interaction over the next year. Cobots work alongside warehouse employees, combining the benefits of both human and machine, resulting in higher levels of efficiency, productivity and accuracy. This will allow businesses to benefit from skilled workers and the accuracy of a machine.
Another form of automated collaboration is the “bionic worker”, where an employee is physically linked up to a piece of technology, such as a heads-up display like Google Glass. The fear that arises from automation is that robots will replace humans in the workplace, but instead robots will encourage workers to be upskilled by taking the physical, mundane and repetitive jobs away, allowing employees to take on more stimulating, high-value and high-skill tasks.
Interconnectivity and Visibility
The systems within the supply chain are often siloed, making communication and collaboration between them difficult. A key trend for 2019 will be the increased interconnectivity between systems through the digitisation of the supply chain. This will be enabled by a high-quality order management solution (OMS) that will ensure the actions of each system are highly visible to the organisation at any one time. The outcome will be better data sharing and collaboration, resulting in more helpful and meaningful data collection. This will ultimately lead to more informed decision making and will allow retailers to compete better in today’s demanding environment.
Shaping the industrial world
Malta is famously known for being the location of choice for film productions like Game of Thrones, Gladiator and World War Z, but the island is now being recognised for its business-friendly environment. Here, Jonathan Grech, director of industrial automation expert JMartans Automation, explains why the Maltese economy is thriving and what to expect in 2019.
Malta is at the crossroads of contrasting economies and cultures. Its strategic location allows it to serve as a hub to the European Union (EU) and Middle East and North African (MENA) markets. In fact, Malta’s economy was one of the strongest performers among EU member states in 2017, with an average growth rate of six per cent.
With Brexit looming, 2019 will undoubtedly see British and European businesses battle its effects, with no clear understanding of its long-term impact. Alongside the country’s growth, the technology employed in Malta has developed rapidly in the last decade. With new technologies emerging every day, that pace is not expected to slow anytime soon.
One such technological development is Blockchain. Defined by Don and Alex Tapscott as being “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value”, the Maltese government believes the technology could bring exponential improvements to the island’s transportation and education systems.
Globally, there is a lot of speculation on the role of blockchain for the development of Industry 4.0, but some companies are already combining blockchain solutions with 3D printing and other processes to create new manufacturing trends. It’s for this reason that Malta is welcoming more technology start-ups and blockchain-based businesses in 2019, and we expect other countries to follow suit.
The recent surge in the adoption of Ethernet communication protocols across the world has led to forecasts that the IO-Link market will continue to grow significantly beyond 2025.
IO-Link is an open standard serial communication protocol that allows for the bi-directional exchange of data from sensors and devices to various networks and fieldbuses. In the coming year, we expect that more companies outside of Malta will be aware of the advantages of integrating IO-Link systems like those we provide at JMartans.
One such benefit is that IO-Link systems do not require any complicated wiring. Instead, devices can be connected using the same cost-effective standard unshielded 3-wire cables as conventional I/O applications.
An additional benefit is the wealth of data made available through the IO-Link. With access to sensor-level data, plant and facility managers can monitor the operation of system components. This data can then be used to streamline device replacement and plan machine maintenance accordingly. Both of these reduce costs and the risk of downtime.
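A hedged sketch of how that sensor-level data might feed maintenance planning. The device names, readings and alarm thresholds here are invented for the example; a real system would pull live values from an IO-Link master via its fieldbus gateway:

```python
def flag_for_maintenance(readings, vibration_limit=4.0, temp_limit=70.0):
    """Return device IDs whose latest readings breach either illustrative limit."""
    flagged = []
    for device, data in readings.items():
        if data["vibration_mm_s"] > vibration_limit or data["temp_c"] > temp_limit:
            flagged.append(device)
    return sorted(flagged)

# Hypothetical latest readings from two sensors on a warehouse line.
readings = {
    "conveyor_motor_1": {"vibration_mm_s": 2.1, "temp_c": 55.0},
    "conveyor_motor_2": {"vibration_mm_s": 6.8, "temp_c": 61.0},  # excess vibration
}
```

A plant manager could run a check like this on a schedule and order replacement parts before the flagged device causes downtime.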
We are seeing greater steps being taken by companies towards ‘intelligent manufacturing.’ Just as managers are turning to IO-Link to merge operational technologies, we’re seeing technologies, like safety light curtains and two-hand function devices, being employed to improve the safety and efficiency of complex manufacturing equipment.
For example, when a machine operator is required to use both hands to initiate and continue a machine cycle, two-hand control devices are required to keep the user at a safe distance from any hazardous movement of the machine. If one or both hands are removed from the control, then the machine comes to an automatic halt.
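The control logic described above can be sketched in a few lines. Note that real two-hand control devices also enforce simultaneity windows and anti-tie-down behaviour required by safety standards (e.g. ISO 13851), which this toy version omits:

```python
def machine_state(left_pressed, right_pressed):
    """The machine runs only while BOTH controls are held; releasing either halts it."""
    return "running" if (left_pressed and right_pressed) else "halted"
```

The point of the arrangement is purely physical: while both hands are on the controls, neither can be near the hazardous movement.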
While safety should always be a priority, newer technologies like these are making it easier for manufacturers to find and integrate the right device for their facility. This reduces risk to the final product, employees and equipment.
These are just some of the trends and technologies JMartans expects to drive businesses, not just in Malta but across the world in 2019. While the battle to retain a competitive advantage might not be as gruesome as those witnessed in Game of Thrones, businesses can avoid it altogether by implementing the latest technologies as part of their manufacturing process.
For example, Amazon is in the process of rolling out its ‘pay per call minute’ contact centre software in the UK, which will make the move to cloud simpler and quicker. People don’t want multi-million pound software on multi-million pound hardware any more – they want to be able to integrate contact centre and CRM in one place, and scale at will. What’s more, as machine learning becomes more common, it will be far easier to incorporate new capabilities into a SaaS platform than on-premise.
‘Many businesses will wake up to the opportunities presented by blockchain and see beyond the hype. We are all guilty of wanting to implement it ‘just because’ in the past – now, I think it’s the single largest and most practical use-case for sharing credible information.
In the office leasing business, we’re using blockchain to increase transparency and drive better decisions and frictionless transactions. Office real estate is worth $30 trillion, and if you’re burning 2 per cent of that a year on information gathering and due diligence, relying too much on the word of landlords, that’s hundreds of billions.
So, my ultimate goal is to create real estate’s first automated valuation model and have every office property in every major market globally indexed and searchable, available to lease with one click. That kind of platform is going to be invaluable for brokerages, financial information companies, and some of the biggest tenants in the world, who have expressed interest already in joining in.’
2019 IT trend: Action at the edge will disrupt the cloud
In 2019 more apps and data will move to the edge, potentially disrupting the cloud model. As technology becomes embedded in everything everywhere, and data continues to grow exponentially, enterprises will manage apps and data differently. Glen Robinson, Emerging Technology Director at Leading Edge Forum.
Apps at the edge
The IT industry continues to build out what we call “the Matrix,” the pervasive, intelligent IT infrastructure that goes beyond cloud to include edge computing, IoT platforms, machine intelligence, augmented reality/virtual reality, blockchain and more.
The large incumbent tech players are building out this infrastructure daily, enabling end users with new and marvellous ways to interact with each other. Smart devices are now more than just smartphones; they inhabit our homes, our workplaces, our bodies. And whilst all this is going on, we see new entrants into the Matrix: those who are building completely new models and ways for us to leverage it, particularly via new apps. Distributed Apps (DApps) are a potential disruption to the entire cloud model, shifting power away from a small number of central players to many participants.
Blockchain may be the answer to the trust issues which regularly seem to hold the technology industry back; in a world which is becoming ever dependent on data, we’ve never been in more need of a trusted solution that enables us to verify the veracity of data.
Data at the edge
Of course, data is more decentralised as well. The source of data has evolved: from known producers, for which we would build specific infrastructure to manage secure and timely delivery, to today, where everything either has, or is being retrofitted with, the capability to generate data about itself and its surroundings.
The availability of data to us today has never been greater, but with it comes a new set of challenges.
Intelligence at the edge
This brings us back to the Matrix. The fabric that builds intelligence from data is moving ever closer to the sources of data creation. Therefore, we are seeing shifts towards event-driven applications and serverless architectures that allow very small, very specific applications to run in lightweight environments that could be as close as the device in your pocket, on your wrist, embedded in your arm, retrofitted to your desk or outside your house on a pylon.
The decentralisation of data is driving the decentralisation of the intelligence needed to make sense of the data in a timely manner. We’re only at the very beginning of this stage of evolution.
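A minimal sketch of the event-driven, serverless-style pattern described here: a small, specific function invoked once per event on an edge device, rather than a long-running central service. The event fields and the threshold are illustrative assumptions:

```python
def handle_event(event):
    """React to a single sensor event locally, without a round trip to the cloud."""
    if event["type"] == "temperature" and event["value"] > 30.0:
        return {"action": "throttle_device", "device": event["device_id"]}
    return {"action": "none", "device": event["device_id"]}
```

The value of pushing such functions to the edge is latency and autonomy: the decision is made next to the data source, in time to matter.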
Digital transformation will continue to be a high priority for businesses in 2019, but it will increasingly be driven by changes to the current job landscape. Recent research found that 81 per cent of UK respondents believed that having the option to work from home is important to the future of business, and facilitating this is likely to be top of the agenda for businesses, especially as the benefits become increasingly clear. For starters, empowering and trusting employees to take control of their day will likely lead to a happier, healthier and more productive workforce. Secondly, a global survey of business professionals suggested that the home is now the most productive workspace, as it removes many of the distractions that are characteristic of office environments. Finally, offering flexible working is now a perk that many employees will see as a deal-breaker. In order to attract new talent, as well as retain existing staff, companies will need to meet these expectations.
But, it’s important that employees are armed with the right tools to carry out their work efficiently and productively. These can range from instant messaging tools that encourage quick collaboration, to webinar solutions and video conferencing software for internal and external meetings. Flexible working is here to stay, and it will only gain more traction in 2019. Companies should be strategic about the policies and technology they implement, but the main goal in 2019 should be fostering an environment where an employee is encouraged and equipped to work in whatever way is most productive for them, be that working remotely or sticking to the traditional 9-5.
Six Cybersecurity predictions for 2019
From Andrew Hollister, Director of LogRhythm Labs (EMEA).
Businesses have become obsessed with agility. According to Forrester, it’s one of the best ways to secure a long-term competitive advantage in today’s globalised marketplace. And yet, outages can take it away with one snap of the fingers.
By Mark Adams, Regional VP for UK & Ireland at Veeam.
Outages wreak havoc on a suffering company’s critical services and, at a time when customer expectations are skyrocketing, cause significant reputational damage that can take years to repair. In an age governed by information and the ability to access it, unfortunate incidents like those suffered by Telstra and TSB are a pertinent reminder of why businesses can ill afford to let their lights go out. The ability to remain ‘always on’ is crucial.
But it’s a challenge that’s far from simple to solve. Given the widespread use of third party cloud providers, guaranteeing availability is not just a case of a company having their own resilient backup and recovery options in place. They have to be able to trust that their third party providers are following suit. After all, what would be the point of keeping everything in working order if your staff or customers still can’t access the services they need? Thousands of companies have little or no IT staff to depend on, and are almost entirely reliant on external service providers to deliver what they need to meet their agility goal.
However, despite pressure to maintain this level of high-speed functionality on behalf of their customers, major cloud providers too continue to struggle with regular periods of downtime and disruption when trying to maintain service for customers. These are choppy waters. The knock-on effect of such incidents can be extremely costly; our 2017 research showed the average cost of downtime globally for mission-critical applications can rise as high as £60,000 per hour. The average annual cost of downtime sits at more than £16m, which isn’t something that many companies can easily shoulder.
How can businesses mitigate such a costly risk? The best way to ensure survival, as is true with most things in life, is through preparation. Whether a business needs to watch its own back or bears responsibility for the ongoing success of others, having a plan in place to follow if an outage occurs, one that can help it recover and get back online quickly, is key. IDC estimates that 80% of businesses that don’t have what’s defined as a ‘disaster recovery plan’ in place will simply fail when an outage strikes, not to mention suffer an almost incalculable drop in revenue due to falling customer trust.
Planning for the worst
So what does a disaster recovery plan look like? Businesses can start their preparations by first ensuring they understand where disaster recovery sits within the context of their overarching business strategy. This comes from carrying out an impact assessment. Businesses need to take the time to identify which apps and business processes are critical to operating all day, every day, and to calculate the maximum amount of downtime each of these can stand before it fails. Considering factors like these makes it possible to calculate ideal recovery targets for these apps and processes and to identify what measures are needed to meet them.
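The impact assessment described above can be sketched as a simple calculation: cost out downtime per process and rank recovery priority accordingly. The process names and figures are illustrative (the hourly cost echoes the £60,000-per-hour figure cited earlier in the piece):

```python
def downtime_cost(cost_per_hour, hours_down):
    """Estimated loss for a given outage duration."""
    return cost_per_hour * hours_down

def rank_by_priority(processes):
    """Order processes by hourly downtime cost, costliest first."""
    return sorted(processes, key=lambda p: p["cost_per_hour"], reverse=True)

# Hypothetical impact-assessment inputs for two business processes.
processes = [
    {"name": "order_processing", "cost_per_hour": 60000, "max_tolerable_hours": 1},
    {"name": "internal_reporting", "cost_per_hour": 2000, "max_tolerable_hours": 24},
]
```

The ranking, together with each process’s maximum tolerable downtime, is what drives the recovery targets and the spend on measures to meet them.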
The choice of partner to help implement what’s required for an effective disaster recovery plan is also a big decision. Factors such as the partner’s experience and the nature of the service level agreement (SLA) that they can offer are critical to bear in mind. Elements like uptime guarantees, turnaround time on service requests and enquiries, as well as fees and compensation should all be part of even the most bare-bones SLA. With the continued push on businesses to be always-on, these will only become more important considerations. Compliance is also a factor that should not be taken lightly, and any service provider worth taking seriously will be fully compliant with the legal requirements of the territories where they operate.
Location, location, location
Location too is important as part of planning. Choosing between on-premises and offsite data storage can make a real difference to a company’s ability to react, with each having its own strengths and weaknesses. One of the most popular choices we see businesses make is a 3-2-1 strategy: keep three copies of data on two different types of media, with one copy offsite.
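The 3-2-1 rule lends itself to a quick programmatic check; the copy records here are illustrative assumptions:

```python
def satisfies_3_2_1(copies):
    """At least three copies, on at least two media types, with at least one offsite."""
    return (len(copies) >= 3
            and len({c["media"] for c in copies}) >= 2
            and any(c["offsite"] for c in copies))

# A hypothetical backup inventory that meets the rule.
copies = [
    {"media": "disk", "offsite": False},  # primary
    {"media": "disk", "offsite": False},  # local backup
    {"media": "tape", "offsite": True},   # offsite archive
]
```

Dropping any of the three conditions, say losing the offsite copy, breaks the rule, which is exactly the kind of gap a regular audit should catch.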
Offsite data centres can often be more convenient and reliable, as optimal conditions for your servers and equipment are always maintained, and tech support and security are always on hand. If future plans include significant expansion using the cloud, having access to offsite capability that can be quickly scaled up may also be important. However, having critical data physically separated in this way places greater priority on strong network access, so extra or more reliable bandwidth might be needed. We also return to the issue of whether the offsite provider can be trusted, remembering that they too face their own challenges in maintaining service and fostering their own business agility.
Fiercer than ever
Planning for the worst by taking these factors into consideration can make a colossal difference when it comes to mitigating the threat of outages and downtime. However, it should also be noted that a disaster recovery plan alone is still not enough. Businesses also need to be regularly testing the viability and quality of their backups to be certain they are completely recoverable and dependable. The worst time to learn that the backup procedure has not been working properly is when they are the only option.
Economic and political climates remain uncertain, but what is crystal clear is that industry competition has become fiercer than ever. Agility has therefore never been more important, which is why it has become such a powerful competitive differentiator. But this ability to act quickly can disappear just as fast, with catastrophic consequences. Businesses therefore face huge pressure to ensure their services never falter and remain highly responsive. They cannot afford to grind to a halt. Planning for the worst is imperative; recovery is key. This is how businesses achieve and maintain their need for speed.
The manufacturing industry is a key pillar of our economy. Just as electricity radically transformed it, data-driven manufacturing will bring the industry to a whole new level of performance, quality and efficiency, commonly called “Industry 4.0”.
By Olivier Vicaire – Digital business consultant for Orange Business Services UK.
With new technologies like the Internet of Things (IoT) and data analytics, the manufacturing sector is shifting into the era of fully networked factories, where the entire production process, from design, supply chain, production and quality control through to distribution, is integrated and powered by Artificial Intelligence solutions that can provide real-time actionable insights 24/7. AI and advanced automation will provide a “level of accuracy and productivity beyond human ability”, according to the World Economic Forum.
Transform to thrive
There’s a lot at stake here. The main reason many firms are forced into taking action and looking for improvements is the tough market conditions in which they’re operating. Both mature industrialized economies and their emerging counterparts have experienced weak manufacturing production growth in recent years, according to the International Yearbook of Industrial Statistics 2017 produced by the United Nations Industrial Development Organization (UNIDO).
2016 saw the growth of manufacturing value added (MVA) in the world’s industrialized economies fall to under 1 percent. Significantly, even the MVA growth of China, the world’s biggest manufacturing nation, fell to 6.7 percent in 2016, down from 7.1 percent the previous year. Industry 4.0 can help by increasing digitization and interconnection of products, value chains and business models. This in turn enables business decisions, supply chains and ultimately products to be informed by real-time insights generated by data collected throughout the manufacturing process. This can bring about slicker, more efficient operations, reductions in downtime, and savings in costs.
Smart manufacturing needs new skills
This level of innovation needs context and a solid business case for manufacturers to adjust their current processes and thrive. The problem is that the industry is faced with an entirely new way of operating, and most businesses still haven’t reached Industry 4.0’s full potential. Among the main causes identified are a lack of digital skills and of a culture of change.
Moreover, the robots-versus-humans debate is making harmony difficult to find.
Advanced automation powered by artificial intelligence is able to solve and complete complex tasks in real time. That’s all great, but the real value in all of this is the people who will lead the charge in managing and interpreting the data that comes from an Industry 4.0 environment.
Mastering the data chaos to deliver tangible business value
These technologies are not really “off the shelf” and require the necessary expertise to deliver the value that everyone is dreaming of. Creating data is not very difficult; companies have been doing it for years! But how do you generate real business value from this data? How do you manage the growing amount of data piling up in the various Information Technology (IT) and Operational Technology (OT) systems and on paper? How do you analyse data from various sources to achieve a given business outcome? And in which technology should you invest?
You can start initiatives here and there at various levels of the production line, but you will soon realise that you need to rely on a partner, or an ecosystem of partners, to guide you through your data journey. Business consultants will help you to draw the big picture and identify where the opportunities are. Focusing on business impacts, they craft the vision of what good looks like and start to pull together the various building blocks.
IoT experts and data scientists will then advise on where to find the right data and how to collect it, store it and manage it, according to the business strategy. With that in hand, it becomes easy to spot what kind of sensor to put where to collect the missing data. These experts are able to create or configure algorithms to generate real time dashboards or insights, as well as automation through artificial intelligence and machine learning.
For example, we did some work for a European car manufacturer because they were losing items in their process. The company had a complex supply chain which included tens of thousands of boxes which contained materials for production. Each of these boxes had a value of 400 euros and all in all, the company was haemorrhaging tens of millions of euros due to the boxes that got lost within the supply chain. And that was just one aspect of a much wider supply chain with other concerns they had to manage.
We came up with a solution to track one box in ten with a battery-powered sensor that updated its location twice a day. It provided enough information to catch the odd stray box that went missing along the way. More importantly, the company was also able to triangulate the precise location of one box and, in doing so, found an entire stash of boxes that had been misdirected. As a result, it was able to reduce the number of boxes going missing and address the huge associated costs. Granted, not all of a business’s problems will be so neatly bounded. Having a comprehensive, real-time view of the full process provides valuable insights and is the first step towards full automation.
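The sampled-tracking approach can be sketched as a missed-ping check: with updates expected twice a day (every 12 hours), a tracked box silent for more than a few update windows is flagged as a likely stray. The box IDs, timings and threshold here are invented for the example, not taken from the project described above:

```python
def find_stray_boxes(last_ping_hours_ago, max_silence_hours=36):
    """Flag tracked boxes silent for longer than three expected 12-hour update windows."""
    return sorted(box for box, hours in last_ping_hours_ago.items()
                  if hours > max_silence_hours)

# Hypothetical hours since each tracked box last reported its location.
pings = {"box_0012": 6, "box_0840": 48, "box_1200": 11}
```

With one in ten boxes instrumented, a flagged sensor often points to a whole consignment gone astray, which is exactly how the misdirected stash above was found.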
Strong foundations required
Many companies are trying to build cathedrals on the sand, with business units leading construction without engaging IT in the initial phases of the project.
Connectivity is an essential enabler of Industry 4.0. Without reliable and secure connectivity, there’s no data. Today, there are lots of solutions available, from low-power wireless to very high speed wired connectivity. Tomorrow, 5G will be a standard for wireless connectivity. The right connectivity, delivering the best bandwidth for the money, is key to the performance and cost-effectiveness of collecting data. One technical solution might not be enough. Network integration is key.
Orange recently partnered with Siemens to support its transformation to a next-generation network (SDN/NFV) underpinning its IoT strategy. The two companies will collaborate to provide a range of solutions around asset tracking and asset monitoring to optimize the supply chain and improve efficiencies. They will also develop digitally enhanced products which offer the opportunity to drive new business models. Underlying all of this, and indeed what should be at the heart of any successful Industry 4.0 implementation, are deep, actionable insights. By connecting machines and physical infrastructure to the digital world, businesses can translate the wealth of data they produce into business results.
Advanced analytics and digital services will help manufacturing companies increase productivity and efficiency across their business. Installing the technology isn’t a goal in and of itself, so you will need conversations with your technology provider beforehand to define the business value and agree objectives, metrics and approaches. But just as important are the champions within the business who verify success and continually use the data to inform and improve processes.
In response to our request for technology predictions for 2019, we received more than 150 submissions. The December issue of DW featured the first batch, this January issue contains the second, and the February issue the final instalment. We hope you enjoy reading about what’s going to be happening in the workplace sometime soon!
Part 7
The year of multi-Cloud
According to Kevin Deierling, VP of Marketing, Mellanox Technologies:
For over 40 years the IT industry has been soaring like an eagle on an upcurrent. Moore’s Law has been that upcurrent – until recently. We are now approaching the physical limits for shrinking transistors, and the industry is looking for alternative solutions to maintain progress: such as 3-D stacked chips, specialist chips optimized for certain functions, new materials and technologies.
This is not so much the end of progress as the end of reliable, predictable progress – even as the call for digital transformation gathers momentum. Moore’s Law made it relatively easy for the industry to plan ahead; now we are at the mercy of the winds of innovation. And maybe global politics and economics too, as China seeks to challenge US semiconductor supremacy and Silicon Valley continues to de-centralise.
Without that promise of ever faster chips, 2019 will see increased demand for faster networks to connect expanding clusters of hundreds or thousands of computers as efficiently as possible. So this year my money’s more on the high-speed networking market than on the computers being linked. It no longer makes sense to be installing 10G Ethernet when the cost of 25G is approaching parity, and more critical applications are already going for 100G or even 200G – another nail in the coffin of Fibre Channel. Demand for bare-metal cloud and security is also pushing software-defined functionality into programmable adapters – another way to lift the burden on expensive, over-worked servers. So SmartNICs will surely become mainstream in 2019.
Storage itself is getting faster. NVMe flash solid state drives will overtake sales of SAS/SATA drives this year, accelerated by the availability of integrated single-chip storage controllers and NVMe all-flash-array platforms. Faster storage is yet another driver for faster networks, so the rise of flash further accelerates the adoption of higher-speed networking. It is, however, too soon for persistent memory storage – still constrained by high costs and the need for major software changes. What’s more, the major incumbent providers of flash memory are reducing their SSDs’ latencies, further paring down the advantages of persistent memory.
Above all, 2019 will be the year of multi-cloud. Businesses want quicker, easier solutions for secure connectivity between public cloud and their own private clouds, and increasingly between multiple different public clouds too. Bullet-proof security and Data Center Interconnect technologies such as EVPN will become vital for multi-cloud connectivity this year.
Reflections and Predictions - From Brad Parks, VP of Business Development at Morpheus
In 2018 we saw some substantive changes in the hybrid cloud market… some of which were easy to predict and others which came out of nowhere. If I had to give the past year a summary headline it might go something like “Industry consolidation marks the end of first-generation cloud management.”
Cloud management is a broad topic and, to be honest, it’s been a bit nuts. Vendors have used this term to describe a fragmented array of products ranging from optimisation to security to automation to migration and more. The last year has been a mix of failures, acquisitions and consolidation as the market shook out quite a bit. News hit over the summer that CMP startup Apprenda was shutting operations, while more recently CMP grandfather RightScale was gobbled up by Flexera to strengthen its asset optimisation suite. Cloud cost analytics player CloudHealth met a similar fate when VMware acquired it back in August to round out its own multi-product CMP stack.
What might we read into these occurrences? The market is clearly demanding more full-stack solutions… it’s no longer enough to merely turbo charge or cost optimise some VMs. Cloud management is getting ready to enter a more mature phase where the demands are greater and not everybody is able to play.
Here are a few areas to keep an eye on in 2019.
Dealing with data
Simon Sharp, vice president international at ObserveIT, explores some data-sensitive issues:
2019 – the birth of the passwordless society
and other predictions from Jesper Frederikson, VP at Okta:
“You would have been hard pressed to remember a day in 2018 not marked by news of a data breach. What this highlighted was the diminishing importance and strength of password authentication, and the growing shift towards a passwordless society. Having more human signals such as biometrics, usage analytics and device recognition will remove the reliance on simple and repeated passwords, and in turn, better secure systems.
In addition to this, organisations should move to a discrete and modern identity system that removes any reliance on personal information, such as passwords, to increase security. This will ensure safety as stolen personal information would become worthless on the black market, acting as a deterrent to hackers.”
Increase in cloud will see a rise in cyber attacks
“Cyber criminals are by their very nature opportunistic and will go where the money goes. The rate of switching from on-premises to cloud-based security solutions will increase exponentially in 2019, but weaknesses in the controls organisations deploy will lead to rising cyber attacks.
To help organisations better safeguard systems, machine-learning techniques will grow in maturity for both predictive models and response automation. This will help organisations deal with rising attacks and enable human intervention for more serious attacks.”
Automation-first mindset will propel the DevOps agenda
“As every company becomes a software company in 2019, more and more investment will be placed on an agile DevOps strategy, fuelled by the need to automate. This automation-first mindset will be a major change for most IT teams and will need to be driven by the CIO, along with an injection of fresh talent or a major investment in training for existing IT teams.
On top of this, CIOs should avoid developing their own technology as much as possible. DevOps is about automation, but there are a growing number of excellent tools that engineers can string together to be successful. After a good DevOps model has been adopted by the CIO of an organisation, it’s important to transition to a mindset where the team learns how to leverage existing technologies instead of developing everything on their own.”
“Over the last five years the root cause of the vast majority of data breaches has been human failure. It often gets presented as a chain of complex technical events - but this is to divert blame, soften the blow of fines or obfuscate from customers what really happened. People tend to simplify their thinking on cyber-defences. In reality, once a chink in the armour is found, the attackers bounce around 'inside' looking for another weak link. Today 'inside' is hard to define, since the effective 'inside' is likely outsourced at least once.
Data processing might not be a core business function, but data is core to business.
The main problem with 'Phone Home' IoT devices is that their 'Home' can be spoofed. In the simple case, they leak data to the wrong place. IoT devices are notorious for being designed for function and usability - often brought to market too early because they can be upgraded in the field using 'Phone Home'. When 'Home' is spoofed they can be 'upgraded' to whatever the attacker wants. The most common case is leaving the functionality as is, but adding a backchannel proxy so that the attacker can enter an organisation and appear to be on the organisation's network.
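One common mitigation, sketched below purely for illustration (it is not any particular vendor's mechanism), is for the device to authenticate every update before applying it, so that a spoofed 'Home' cannot push arbitrary 'upgrades'. The key, image names and scheme here are all assumptions.

```python
import hashlib
import hmac

# Hypothetical symmetric key provisioned into the device at manufacture
# and shared with the legitimate update server.
DEVICE_KEY = b"factory-provisioned-secret"

def sign_image(image: bytes) -> bytes:
    """MAC computed by the legitimate update server over a firmware image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def accept_update(image: bytes, tag: bytes) -> bool:
    """Device-side check: reject any image whose MAC does not verify.
    compare_digest gives a constant-time comparison."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

genuine = b"firmware-v1.3"
print(accept_update(genuine, sign_image(genuine)))              # True
print(accept_update(b"attacker-proxy-payload", sign_image(genuine)))  # False
```

In practice asymmetric signatures (e.g. Ed25519) are preferred, so the device holds only a verification key and no secret worth extracting; but even this minimal check defeats the simple spoofed-'Home' upgrade attack described above.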
GDPR will bite some organisations; the message that filters through to others is that organisations are responsible for their data - a contract with an outsourcer is not going to help in a breach. I expect to see organisations taking more proactive steps: first to ensure that there is more encryption of cloud data, and secondly to reduce the number of people that can access that data in the first place.”
Energy efficiency optimisation
Hiren Parekh, Director of Cloud and Hosting, OVH:
“I think a continuous top datacentre trend for me is how to optimize energy efficiency. It has always been a challenge for providers to enhance their power usage effectiveness (PUE) rating, but now a growing call from the industry to reduce carbon footprints has placed a spotlight on the issue. While being green was once a nice-to-have attribute, I think it will be deemed highly important in 2019. At the same time, as infrastructure and apps become more ubiquitous and consequently datacentres become more powerful, businesses are beginning to wake up to the importance of ensuring all datacentre usage is as carbon efficient as possible. At OVH we are proud to have deployed a 70% water-cooling and 30% air-cooling approach across our datacentres.”
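PUE itself is a simple ratio: total facility power divided by the power that actually reaches the IT equipment, with 1.0 as the unattainable ideal. A quick sketch, using illustrative figures rather than any provider's actual numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 would mean every watt drawn goes to IT equipment; cooling, lighting
    and power distribution losses push real datacentres above that.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative only: 1,200 kW drawn by the whole facility,
# of which 1,000 kW reaches the IT equipment.
print(round(pue(1200, 1000), 2))  # -> 1.2
```

Water cooling improves this ratio chiefly by shrinking the denominator's overhead: less energy spent on chillers and fans means the facility total sits closer to the IT load.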
The Future of Networking is Software-Defined
According to Evan Kenty, Managing Director EMEA at Park Place Technologies:
The network layer of the data centre has so far resisted the tide of digitisation; expect this to change in 2019 and beyond. Businesses today are travelling to where their customers are, rather than waiting for customers to come to their location. Retail establishments, restaurants, banking companies and many others are beginning to offer their customers options such as pop-up locations at concert venues or offices inside larger stores. To do this, they need to bring a robust network with them – something that’s only possible with wireless.
SDN implementations have already begun to transform networks from static, overprovisioned and inflexible to efficient, agile and programmable, and they’re expected to evolve and accelerate as wireless technologies such as Gigabit LTE and 5G develop. Wireless enables businesses to stay connected while retaining all the requirements of a wired network, including reliability, security, quality of service, and application continuity. By opening with one or two wireless connections, organisations still get reliable connectivity from day one, and they can venture further than locations where a wired network connection already exists.
Changes in the European Channel
Over the past year, the channel has become a more technical sell. With OEMs having locked down their technology, reselling hardware now delivers minimal margins for resellers. This is leading to a more innovative channel culture, with more and more partners having to really innovate with the services and solutions that they offer in order to remain competitive.
As this is coming to a head, large global channel partners are playing a more diluted role in the wider ecosystem, and this is beginning to frustrate manufacturers and customers alike. In 2019, we will see the channel continue to shift from simple value-added reselling to providing stronger front-end management consulting to help customers get more from their technology. The channel will take on a broader role, acting as more than just a point of product procurement, likely leading to significant M&A activity in the coming year.
Driverless Cars and the Maintenance Industry
With a ‘digital highway code’ already being considered for driverless cars in the UK, there’s no doubt that autonomous vehicles have the potential to revolutionise the TPM industry. As we move into 2019, we can expect driverless cars to create a more cost-efficient way of transporting hardware and automating data centre maintenance.
However, the task for all service organisations in the wave of automation is ensuring that customer relationships are managed effectively without losing the human touch completely. It’s likely we will see an increase in organisations looking to add value by diversifying their service offering and to encourage face-to-face interaction with clients outside of traditional touchpoints such as delivery and collection.
IT in the UK Public Sector
Following the announcement of the UK’s Autumn Budget, the Institute for Fiscal Studies warned that many public services will continue to feel squeezed for some time to come. Yet IT spending in certain parts of the public sector could see a substantial increase.
Indeed, the health sector is one area where there is a significant amount of work to be done. A recent report for the regulator NHS Improvement found that a slow uptake of technology was a key factor in hampering the ability of the NHS to answer 999 calls quickly. Digital technology to access patient records is still in its infancy, and there is a lack of auto-dispatch technology to speed up responses to cardiac arrests.
With the government’s announcement to provide £20 billion worth of extra funding for the NHS, we should start to see real change in the coming year. Following the report’s recommendations, this should be in the form of vital improvements to the NHS’s wider infrastructure, which will help staff work as efficiently and effectively as possible.