This issue of Digitalisation World includes a major focus on the Internet of Things (IoT), where we asked a whole range of vendors what role they thought that IoT would play in the enterprise. Perhaps unsurprisingly, none of the respondents were anything other than extremely positive as to the future of IoT.
Apart from the need to come up with a better name (Internet of Things is so clunky!), the major issue that has yet to be overcome with IoT is how/when/where it will co-exist with and/or replace ‘traditional’ technology in so many areas of our lives. Like most of us I’m sure, I spend a good deal of time contemplating the technology future and how this will impact on our everyday lives, for better or worse. While we can never turn back the clock nor, I suspect, ban any particular technology developments, I do just wonder whether the technology developers ought to be somehow bound up with the consequences of their inventions. Impossible, of course, but if the downside of any innovation had to be addressed by the inventing company, perhaps they would not be so mad keen to rush forward with the idea that developing AI, virtual reality, driverless cars and the like is only a good thing.
DW is not a philosophy paper, but someone needs to think through the chain of events that we are already witnessing as technology replaces more and more human beings and seems to be placing more and more power and money in fewer and fewer hands across the globe. It might be rather crude to say to an inventor: ‘Your new machine has just put 100,000 people out of a job, now you’re going to have to pay the social care bill for these people until such time as they find alternative jobs’, but there’s no doubting that many of the world’s current humanitarian problems are being exacerbated, if not caused, by the growing social unrest over the effects, direct or indirect, of technology.
It is naïve of the manufacturers of such technology to claim that they have no social responsibility, but it’s equally naïve to imagine that governments alone can act to put some kind of checks in place. And, of course, if we’re all happy to make use of the many great things that the digital world has already delivered, can we complain about the downsides?
On a more basic level, I’m interested in how some of the new ideas are actually going to work in practice. Driverless cars are my current ‘obsession’. I’m not concerned with whether they are a good or bad thing, no, rather, with the practicalities of their introduction onto our roads. I’ve spent plenty of hours thinking about this and can only conclude that, unless we end up with roads for driverless cars and roads for human-driven cars, there is going to have to be one particular day in each and every country where the complete switch is made. To have driverless and human-driven cars on the same roads can make no sense – so long as safety continues to be the prime selling point of the driverless car.
Of course, the reality is that, when every logistics company discovers it can sack all its drivers and use driverless trucks instead, they’ll be signing up pretty quickly, but if these lorries are on the same roads as human-driven cars, the safety argument goes out of the window.
So, I guess the issue is one of responsibility. The responsibility that the tech giants have, not just to make money for themselves and their shareholders, but also to understand that they will enjoy success for just as long as their customers are content; the responsibility that governments have to ensure that, somewhere in the complex global economy, there is some consideration given over to the human element in the technology chain; and the responsibility that we all have to understand what we are buying – where it came from, who made it, and at what cost?
An update to the Worldwide Semiannual IT Spending Guide: Line of Business from the International Data Corporation (IDC) forecasts worldwide corporate IT spending funded by non-IT business units will reach $609 billion in 2017, an increase of 5.9% over 2016. The Spending Guide, which quantifies the purchasing power of line of business (LoB) technology buyers by providing a detailed examination of where the funding for a variety of IT purchases originates, also forecasts LoB spending to achieve a compound annual growth rate (CAGR) of 5.9% over the 2015-2020 forecast period. In comparison, technology spending by IT buyers is forecast to have a five-year CAGR of 2.3%. By 2020, IDC expects LoB technology spending to be nearly equal to that of the IT organization.
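For readers who like to check the arithmetic, the growth figures quoted here follow from the standard CAGR formula. The sketch below is purely illustrative: the dollar values are IDC's, the projection is ours.

```python
# Illustrative CAGR arithmetic for the figures quoted above.
# CAGR = (end_value / start_value) ** (1 / years) - 1

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

def project(value: float, rate: float, years: int) -> float:
    """Grow a value forward at a constant annual rate."""
    return value * (1 + rate) ** years

lob_2017 = 609.0                        # $bn, the 2017 LoB forecast above
lob_2020 = project(lob_2017, 0.059, 3)  # ~$723bn if the 5.9% CAGR holds
print(f"Implied 2020 LoB spend: ${lob_2020:.0f}bn")          # -> $723bn
print(f"Recovered CAGR: {cagr(lob_2017, lob_2020, 3):.1%}")  # -> 5.9%
```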
"Companies' adaptation of Innovation Accelerators, such as Internet of Things, Cognitive/AI systems, and 3D Printing, together with the four Pillar technologies of the 3rd Platform, to both new product and service developments and day-to-day business operations has fundamentally increased Line of Business spending on IT," said Naoko Iwamoto, senior market analyst with the IDC Japan IT Spending Group. "The Innovation Accelerators have put the line of business units in the frontline of the digital transformation and have forced them to work either alone with the ecosystem outside of the IT organization as 'shadow IT' or in closer collaboration with the IT department than ever before."
IDC's Line of Business taxonomy identifies two major types of technology spending – purchases funded by the IT organization and purchases funded by technology buyers outside of IT. Joint purchases can be funded by either IT or the functional business unit while "shadow IT" projects are funded from the functional area budget without the knowledge, involvement, or support of the IT department. Although some technology categories are dominated by IT spending, most involve outlays from both IT and the business units. For example, worldwide IT spending on servers, storage, and network equipment is forecast to total $114.1 billion this year, while LoB spending on these items will total $52.9 billion. However, IT is not the primary source of funding for all hardware purchases. Business unit spending on PCs, monitors, mobile phones, printers, and tablets will total $83.8 billion worldwide this year compared to $76.2 billion spent by the IT department. And line of business buyers will spend more on software applications in 2017 ($150.7 billion) than IT buyers ($64.7 billion).
The technology categories that will see the most spending from LoB buyers in 2017 will be applications ($150.7 billion), project-oriented services ($120.3 billion), and outsourcing ($70.3 billion). The categories that will receive the most spending from IT buyers this year will be outsourcing ($149.2 billion), project-oriented services ($82.2 billion), and support and training ($79.8 billion). Combined IT-LoB purchases of outsourcing and project-oriented services ($422 billion) will represent nearly one third of all technology spending worldwide in 2017. The technology categories that will see the fastest growth in spending over the 2015-2020 forecast period are tablets (16.2% CAGR for IT and LoB purchases combined) and midrange enterprise servers (14.7% combined CAGR). LoB buyers will also continue to invest aggressively in applications and application development and deployment (8.5% and 9.3% CAGRs, respectively).
In 2017, IDC expects LoB technology spending to be larger than IT organization spending in five industries: discrete manufacturing, healthcare, media, personal and consumer services, and securities and investment services. By 2020, this number is forecast to grow to nine as the insurance, process manufacturing, professional services, and retail industries see LoB purchases move ahead of IT purchases. The industries with the fastest growth in LoB spending are professional services (6.9% CAGR), healthcare (6.6%), and banking (6.5%). However, LoB technology spending is forecast to grow faster than that of the IT organization in all 16 industries covered in the spending guide.
On a geographic basis, the IT organization will be the largest source of technology spending throughout the forecast in all but four countries: the United States, Canada, Saudi Arabia, and the United Arab Emirates. And like the industry trend, LoB spending is forecast to grow at a faster rate than IT-led technology spending in nearly every country. The countries that will experience the fastest LoB spending growth include Indonesia and the Philippines (each with a 12.2% CAGR), Argentina (11.1% CAGR), Peru (8.7% CAGR), and India (8.4% CAGR).
"Explosive cloud and other 3rd Platform technology adoption is enabling U.S. lines of businesses to rely less on enterprise IT than any other country to fund their technology purchases," said Eileen Smith, program director, Customer Insights and Analysis. "On average, U.S. line of business will fund 62% of their technology purchases in 2017. Looking to increase productivity and reduce organizational costs, IDC expects supply chain, human resources, and sales executives will fund the largest share of their companies' technology purchases over the forecast period."
"While the LoB-funded IT spending shows steady growth of 3.1% CAGR in the forecast period in Japan, almost 70% of technology spending comes from IT with a 1.3% CAGR," said Iwamoto. "As the competition escalates in the worldwide marketplace as well as with the disruptors from different industry segments, Japanese companies are trying to hold their position by employing a globally standardized IT and business processes initiated at the headquarters. The reinforcement of the IT governance among Japanese large enterprises will keep the higher ratio of IT funded."
According to the International Data Corporation (IDC) Worldwide Quarterly Converged Systems Tracker, the worldwide converged systems market revenues decreased 1.4% year over year to $3.09 billion during the fourth quarter of 2016 (4Q16). The market consumed 1.6 exabytes of new storage capacity during the quarter, which was up a moderate 4.0% compared to the same period a year ago. For the full year 2016, worldwide converged systems market revenues increased 5.8% to $11.3 billion when compared to 2015.
"The converged systems market is going through a period of change," said Eric Sheppard, research director, Enterprise Storage & Converged Systems. "We are seeing strong growth from products with new architectures, increased levels of automation, and heavy use of software-defined technologies. This growth has been offset by reduced spending on traditional converged systems and a conscious decision by some vendors to terminate some parts of their product portfolio."
Converged Systems Segments
IDC's converged systems market view comprises four segments: integrated infrastructure, certified reference systems, integrated platforms, and hyperconverged systems. Integrated infrastructure and certified reference systems are pre-integrated, vendor-certified systems containing server hardware, disk storage systems, networking equipment, and basic element/systems management software. Integrated platforms are integrated systems sold with additional pre-integrated packaged software and customized system engineering, optimized for functions such as application development, databases, testing, and integration tooling. Hyperconverged systems collapse core storage and compute functionality into a single, highly virtualized solution. A key characteristic that differentiates hyperconverged systems from other integrated systems is their scale-out architecture and their ability to provide all compute and storage functions through the same x86 server-based resources.
During the fourth quarter of 2016, the combined integrated infrastructure and certified reference systems market generated revenues of $1.57 billion, which represented a year-over-year decrease of 15.7% and 50.8% of the total market. Dell Technologies was the largest supplier of this combined market segment with $705.1 million in sales, or 44.9% share of the market segment.
Top 3 Vendors, Worldwide Integrated Infrastructure and Certified Reference Systems, Fourth Quarter of 2016 (Revenues are in $ Millions)

Vendor | 4Q16 Revenue | 4Q16 Market Share | 4Q15 Revenue | 4Q15 Market Share | 4Q16/4Q15 Revenue Growth
1. Dell Technologies* | $705.1 | 44.9% | $1,043.4 | 56.0% | -32.4%
2. Cisco/NetApp | $486.7 | 31.0% | $336.7 | 18.1% | 44.5%
3. HPE | $228.9 | 14.6% | $324.1 | 17.4% | -29.4%
All Others | $149.7 | 9.5% | $158.6 | 8.5% | -5.6%
Total | $1,570.3 | 100% | $1,862.8 | 100% | -15.7%

Source: IDC Worldwide Quarterly Converged Systems Tracker, March 23, 2017
* Note: Dell Technologies represents the combined revenues for Dell and EMC sales for all quarters shown.
Integrated Platform sales declined 8.6% year over year during the fourth quarter of 2016, generating $823.5 million worth of sales. This amounted to 26.6% of the total market revenue. Oracle was the top-ranked supplier of Integrated Platforms during the quarter, generating revenues of $406.9 million and capturing a 49.4% share of the market segment.
Top 3 Vendors, Worldwide Integrated Platforms, Fourth Quarter of 2016 (Revenues are in $ Millions)

Vendor | 4Q16 Revenue | 4Q16 Market Share | 4Q15 Revenue | 4Q15 Market Share | 4Q16/4Q15 Revenue Growth
1. Oracle | $406.9 | 49.4% | $362.5 | 40.2% | 12.2%
T2* IBM | $70.4 | 8.5% | $116.7 | 12.9% | -39.7%
T2* HPE | $70.1 | 8.5% | $74.8 | 8.3% | -6.3%
All Others | $276.1 | 33.5% | $347.3 | 38.5% | -20.5%
Total | $823.5 | 100% | $901.3 | 100% | -8.6%

Source: IDC Worldwide Quarterly Converged Systems Tracker, March 23, 2017
* Note: IDC declares a statistical tie in the worldwide converged systems market when there is a difference of one percent or less in the vendor revenue shares among two or more vendors.
Hyperconverged sales grew 87.3% year over year during the fourth quarter of 2016, generating $697.4 million worth of sales. This amounted to 22.6% of the total market value. The hyperconverged system market surpassed $2.2 billion in global revenue during the full calendar year (2016), which was 110% higher than 2015.
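The shares and growth rates in these tables can be re-derived from the raw revenue figures. Again, a purely illustrative sketch: the numbers are IDC's, the arithmetic is ours.

```python
# Re-deriving the segment shares quoted above from the raw 4Q16
# revenues (in $M). Illustrative arithmetic only; figures are IDC's.

segments_4q16 = {
    "Integrated infrastructure / certified reference": 1570.3,
    "Integrated platforms": 823.5,
    "Hyperconverged": 697.4,
}

total = sum(segments_4q16.values())  # ~3,091, i.e. the $3.09bn quarter
for name, revenue in segments_4q16.items():
    print(f"{name}: {revenue / total:.1%} of the market")
# -> 50.8%, 26.6%, 22.6%, matching the shares in the text

# Growth runs the same way in reverse: 87.3% year-over-year growth
# implies 4Q15 hyperconverged revenue of roughly 697.4 / 1.873.
print(f"Implied 4Q15 hyperconverged revenue: ${697.4 / 1.873:.0f}M")  # ~$372M
```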
An update to the Worldwide Semiannual Big Data and Analytics Spending Guide from International Data Corporation (IDC) forecasts worldwide revenues for big data and business analytics (BDA) will reach $150.8 billion in 2017, an increase of 12.4% over 2016. Commercial purchases of BDA-related hardware, software, and services are expected to maintain a compound annual growth rate (CAGR) of 11.9% through 2020 when revenues will be more than $210 billion.
"After years of traversing the adoption S-curve, big data and business analytics solutions have finally hit mainstream," said Dan Vesset, group vice president, Analytic s and Information Management. "BDA as an enabler of decision support and decision automation is now firmly on the radar of top executives. This category of solutions is also one of the key pillars of enabling digital transformation efforts across industries and business processes globally."
The industries that will be making the largest investments in big data and business analytics solutions in 2017 are banking, discrete manufacturing, process manufacturing, federal/central government, and professional services. Combined, these five industries will spend $72.4 billion on BDA solutions this year. They will also be the largest spenders in 2020 when their total investment will be $101.5 billion. The industries that will experience the fastest growth in BDA spending are banking (13.3% CAGR) and healthcare, insurance, securities and investment services, and telecommunications, each with a CAGR of 12.8%. It should be noted, however, that all but two of the industries covered in IDC's BDA Spending Guide will experience double-digit CAGRs from 2015-2020.
"The three industries that comprise the financial services sector – banking, insurance, and securities and investment services – all show great promise for future spending on big data and business analytics. This technology can be applied across key use cases throughout these financial institutions from fraud detection and risk management to enhancing and optimizing the customer's journey," said Jessica Goepfert, program director, Customer Insights and Analysis.
"Outside of financial services, several other industries present compelling opportunities," Goepfert added. "Within telecommunications, for instance, big data and analytics are applied to help retain and gain new customers as well as for network capacity planning and optimization. Meanwhile, the media industry has been plagued by massive disruption in recent years thanks to the digitization and massive consumption of content. Here, big data and analytics can help firms make sense of and monitor their readers' habits, preferences, and sentiment. Vendors targeting the big data and analytics opportunity would be well-served to craft their messages around these industry priorities, pain points, and use cases."
BDA technology investments will be led by IT and business services, which together will account for more than half of all big data and business analytics revenue in 2017 and throughout the forecast. Services-related spending will also experience the strongest growth with a five-year CAGR of 14.4%. Software investments will grow to more than $70 billion in 2020, led by purchases of End-User Query, Reporting and Analysis Tools and Data Warehouse Management Tools. Non-relational Analytic Data Store and Cognitive Software Platform will experience strong growth (CAGRs of 38.6% and 23.3% respectively) as companies expand their big data and analytic activities. BDA-related purchases of servers and storage will grow at a CAGR of 9.0%, reaching $29.6 billion in 2020.
From a company size perspective, very large businesses (those with more than 1,000 employees) will be responsible for more than 60% of all BDA spending throughout the forecast and IDC expects this group of companies to pass the $100 billion level in 2018. Small and medium businesses (SMBs) will also be a significant contributor to BDA spending with nearly a quarter of the worldwide revenues coming from companies with fewer than 500 employees.
On a geographic basis, the United States will be the largest market for big data and business analytics solutions with spending forecast to reach $78.8 billion in 2017. The second largest region will be Western Europe with spending of $34.1 billion this year, followed by Asia/Pacific (excluding Japan) at $13.6 billion. Latin America and APeJ will experience the fastest growth in BDA spending with five-year CAGRs of 16.2% and 14.4% respectively.
Nimbus Ninety’s research, in partnership with Ensono, identifies leadership and vision as large contributors to failed digital transformation projects
Nimbus Ninety, the UK’s independent research community for disruptive business and technology leaders, have released their latest research into digital transformation in partnership with Ensono, a leading cloud solutions and Hybrid IT services provider. The Digital Trends Report interviewed 251 senior stakeholders responsible for digital transformation initiatives and found that 36% view increased competition from digitally driven companies as a challenge, yet more than half (52%) of respondents rated their organisations’ progress toward achieving their digital ambitions as adequate or poor.
The report, which reviews forces and barriers to digital transformation, revealed that 30% still do not feel well-equipped to seize the opportunities presented by digital. Conversely, 46% stated their organisation has significantly transformed over the past 12 months, suggesting a widening gap between those who are embracing digital transformation and those who have yet to start.
The report also measured barriers to digital transformation, with legacy systems taking an overwhelming lead in blocking progress.

Accenture's Dynamic Digital Consumers Survey revealed that if a global digital giant such as Google or Amazon launched an offer comparable to their mobile operator's, 44 percent of respondents would leave their provider, painting a gloomy outlook for Communications Service Providers (CSPs).
The introduction of the eSIM also brings new challenges for CSPs as switching between providers becomes even easier. The survey found that 68 percent of online consumers would be interested in using a device with an eSIM, with 50 percent saying it was because of the ability to switch from one provider to another more easily and quickly to get a cheaper call or data plan, accelerating the race to the bottom for data and connectivity and further commoditizing CSPs' core business. Furthermore, competition will increase as new players enter the market by embedding eSIMs into devices at the point of manufacture, bypassing the network connectivity provider and going direct to the customer.
“The traditional CSP business is becoming commoditized and as a result they must dramatically accelerate their shift into new markets or the window of opportunity will close,” said Francesco Venturini, global industry managing director for Media and Communications, Accenture. “There are opportunities ripe for the taking if CSPs invest in their core business so it remains healthy, while at the same time pivot to accelerate innovation and expand their reach to provide new digital services, offering real value to their customers. To do this they must change their operating models and break down the silos they currently work in to be able to move with the speed and adaptability required to succeed.”
An important differentiation CSPs have over the digital giants is the richness of data available to them because they control the end-to-end delivery platform right through from back office functions to the devices, and top-to-bottom from the applications to the network. If they apply analytics and use this data to their advantage, it allows them to unlock a broader range of innovative new monetization opportunities and to further optimize the customer experience.
One area where this can be applied is within digital video, which is increasingly in demand and provides a huge opportunity not only to drive new revenue through the content, but also considerably from digital advertising. With 37 percent of respondents saying they would turn to CSPs for Pay TV channels and 34 percent for Video on Demand (VOD) services, the door is evidently open for CSPs to succeed in this space.
The Internet of Things (IoT) is also creating fresh growth opportunities for CSPs to offer new services. While it is relatively early days, the connected home, which revolves around machine-to-machine communication offering cost and time savings to consumers, is quickly becoming a reality. But with so many connected devices emerging, consumers face issues with interoperability, usability and security.
The survey found that once consumers do invest in this connected technology, over half (54 percent) experience challenges. Issues include being too complicated to use (14 percent), cannot connect to the internet (13 percent), the set-up does not work (12 percent), among others such as a lack of personalization, privacy concerns or customer support.
The home is becoming a connected and personalised ecosystem of services and CSPs have a significant opportunity to be that single provider to manage the ecosystem in the home:
71 percent of online consumers globally who own or plan to purchase connected home services would choose a telecom operator, if they offered it.
CSPs ranked in second position as the preferred providers for education and learning, home monitoring, online security and storage for smart devices and services.
49 percent of consumers would choose a CSP for connected health services.

"The race is on to better serve the evolving and emerging digital ecosystem in the home," said Venturini. "But with competition advancing, if CSPs do not make the necessary changes to offer hyper-personalized services quickly, they are at risk of losing market share and becoming organizations that purely offer connectivity with no added value, and that means shrinking revenues."
Industrie 4.0 has been underway for more than five years, and while many businesses have begun some promising Industrie 4.0 projects, key challenges remain that are making algorithms the heartbeat of these projects, according to Gartner, Inc. By 2020, Gartner predicts that at least 30 per cent of Industrie 4.0 projects will source their algorithms from leading algorithm marketplaces, a significant rise from less than 5 per cent in 2016.
"Industrie 4.0 projects are facing two significant challenges," said Thomas Oestreich, managing vice president at Gartner. "First — in the connected world of cyber-physical systems — they need to deal with the sheer volume, real-time velocity and diversity of data. Second, in order to drive new value and differentiating innovations, new algorithms need to be developed. This is making algorithms the pulse of Industrie 4.0 initiatives."
Mr Oestreich added that developing new algorithms requires skills and competencies that most companies do not yet have. To improve time to market and speed up the development process, some organizations employ service providers and combine this with using algorithm marketplaces.
Reusable Algorithms Can Reduce Development Time
Analytics vendors have started creating marketplaces for software components, such as analytical algorithms, to bring greater flexibility and choice to end users. These marketplaces will bring the benefits of the app economy to software development. They will radically lower software distribution costs and improve access to thousands, if not millions, of available algorithms.

Algorithm marketplaces offer reusable algorithms, which help organizations speed up their development processes and cope with the transformational changes introduced with digital business. "Reusing prebuilt algorithms and applying them to a specific use case can significantly reduce development time and will offer an important library, expanding the possibilities for in-house development teams," said Mr Oestreich.
"We encourage CIOs to build a task force with data and analytics leaders to evaluate algorithm marketplaces, and then create their own library of available and potentially useful algorithms," said Mr Oestreich.
Early adopters of Industrie 4.0 are also renovating their enterprise resource planning (ERP) solutions. ERP systems are connected to Internet of Things (IoT) infrastructure that consists of sensors and actuators, middleware to collect and store data, and applications and analytics to make decisions and trigger actions.
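A minimal sketch of that chain, under invented names and thresholds, might look like the following; a real deployment would use message queues and an actual ERP API rather than in-process calls.

```python
# A minimal sketch of the sensor -> middleware -> ERP chain described
# above. All names and the threshold are assumptions; a real deployment
# would use message queues and an actual ERP API, not in-process calls.

THRESHOLD_C = 8.0  # assumed maximum safe temperature for a monitored asset

def collect(sensor_readings):
    """Middleware role: gather and keep only valid readings."""
    return [r for r in sensor_readings if r is not None]

def decide_and_act(readings, erp_create_order):
    """Analytics role: trigger an ERP action when a rule fires."""
    if any(t > THRESHOLD_C for t in readings):
        erp_create_order("maintenance", "cooling unit out of range")

readings = collect([4.2, None, 5.1, 9.7])  # one reading breaches the rule
decide_and_act(readings,
               lambda kind, detail: print(f"ERP order raised: {kind} - {detail}"))
```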
"Many ERP solutions are old, and they cannot cope with the amount of data and transactions to be processed, and the level of granularity in business transactions," said Christian Hestermann, research director at Gartner
The music industry is a good example of how an industry has gone through transformation. Customers went from buying complete albums in a record store to streaming individual songs, each of which triggers an immediate invoice for the micro-amounts due. "ERP could fast become the bottleneck of digital business, not allowing a business to act quickly enough to grasp digital business opportunities in a fast-changing business world," Mr Hestermann added.
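To see why this granularity strains batch-oriented systems, consider a toy sketch of per-event micro-billing, with the rate and track names invented for illustration.

```python
# Toy illustration of per-event micro-billing: each individual stream
# accrues a tiny receivable. The rate and track names are invented.

from collections import defaultdict

RATE_PER_STREAM = 0.004  # assumed payout per play, in currency units

plays = ["song-a", "song-a", "song-b"]  # one event per individual stream

ledger = defaultdict(float)
for track in plays:
    ledger[track] += RATE_PER_STREAM  # micro-amount accrued per event

for track, amount in ledger.items():
    print(f"{track}: {amount:.4f} due")  # song-a: 0.0080, song-b: 0.0040
```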
CIOs need to develop digital business moments to grow their business. Signals coming from sensors inside products or from external sources could be used to offer additional services to customers. "This will likely require the modernisation of the ERP solutions involved, as older ones will not support the level of granularity and the volumes of microtransactions required," said Mr Hestermann.
Gartner said that, by 2020, 50 per cent of the companies that have renovated their ERP core and migrated their IoT infrastructure to a standardised platform will increase customer interactions by over 20 per cent.
"CIOs should determine where IoT and digital business play a role in their business scenarios, and develop Industrie 4.0 value chains by modelling the business capabilities that their organisations need," concluded Mr Hestermann. "They also need to assess their current state and their needs for renovation on all layers of the IoT architecture and take the necessary measures to improve."
As cloud computing evolves, it should move away from experimentation and towards enterprise-wide implementation. The first in a regular series of comment pieces from Gartner.
Cloud computing was originally a place to experiment, and has come a long way as a critical part of today’s IT. After 10 years, companies should look for even wider scale investments.
In its first decade, cloud computing was disruptive to IT; looking into the second decade, it is becoming mature and an expected part of most disruptions. Over the past 10 years, cloud computing has changed the expectations and capabilities of the IT department, but now it is a necessary catalyst for innovation across the company.
As the technology matures, objections to cloud computing are lessening, although myths and confusing technology terms continue to plague the space.
“As it enters its second decade, cloud computing is increasingly becoming a vehicle for next-generation digital business, as well as for agile, scalable and elastic solutions,” said David Mitchell Smith, vice president and Gartner Fellow. “CIOs and other IT leaders need to constantly adapt their strategies to leverage cloud capabilities.”
It’s not too late to begin planning a roadmap to an all-in cloud future. Mr. Smith provides a few predictions about what that future will look like.
By 2020, anything other than a cloud-only strategy for new IT initiatives will require justification at more than 30% of large-enterprise organizations.
During the past decade, cloud computing has matured on several fronts. Today, most security analysis suggests that mainstream cloud computing is more secure than on-premises IT. Cloud services are more often functionally complete, and vendors now offer migration options.
Importantly, innovation is rapidly shifting to the cloud, with many vendors employing a cloud-first approach to product design and some technology and business innovations available only as cloud services. This includes innovations in the Internet of Things and artificial intelligence.
As the pressure to move to cloud services increases, more organizations are creating roadmaps that reflect the need to shift strategy. At these organizations, projects that propose on-site resources are considered conservative, as reduced agility and fewer innovation options erode competitiveness. Enterprises will begin to pressure IT departments to embrace cloud computing.
Keep in mind that not all projects can utilize cloud services due to regulatory or security concerns or even the money that has been invested in the projects. Also, some enterprises might lack the correct skill sets and talent.
By 2021, more than half of global enterprises already using cloud today will adopt an all-in cloud strategy.
The key to an all-in cloud strategy is not to "lift and shift" data center content. Instead, enterprises should evaluate which applications within the data center can be replaced with SaaS, refactored or rebuilt. However, an all-in strategy will have more impact on IT than a cloud-first or cloud-only strategy.
By and large, companies that have shifted to all-cloud have not returned to traditional on-premises data centers, with even large companies embracing third-party cloud infrastructure.
Enterprises should begin to plan a roadmap for their cloud strategy, and ensure that lift and shift is only being done when necessary, such as part of data center consolidation efforts.
Shortlist confirmed.
Online voting for the DCS Awards opened recently and votes are coming in thick and fast for this year’s shortlist. Make sure you don’t miss out on the opportunity to express your opinion on the companies that you believe deserve recognition as being the best in their field.
Following assessment and validation by the panel at Angel Business Communications, the shortlist for the 24 categories in this year's DCS Awards has been put forward for online voting by our readership. The Data Centre Solutions (DCS) Awards reward products, projects and solutions, and honour the companies, teams and individuals operating in the data centre arena.
The DCS Awards are delighted to be joined by MPL Technology Group as our Headline Sponsor, together with our other sponsors Eltek, Vertiv, Comtec, Riello UPS, Starline Track Busway and Volta Data Centres, and our event partners Data Centre Alliance and Datacentre.ME.
Phil Maidment, Co-Founder and Owner of MPL Technology Group said: "We are very excited to be Headline Sponsor of the DCS Awards 2017, and have lots of good things planned for this year. We are looking forward to working with such a prestigious media company, and to getting to know some more great people in the industry."
The winners will be announced at a gala ceremony taking place at London’s Grange St Paul’s Hotel on 18 May.
All voting takes place online and voting rules apply. Make sure you place your votes by 28 April, when voting closes, by visiting: http://www.dcsawards.com/voting.php
The full 2017 shortlist is below:
Data Centre Energy Efficiency Project of the Year
New Design/Build Data Centre Project of the Year
Data Centre Management Project of the Year
Data Centre Consolidation/Upgrade/Refresh Project of the Year
Data Centre Cloud Project of the Year
Data Centre Power Product of the Year
Data Centre PDU Product of the Year
Data Centre Cooling Product of the Year
Data Centre Facilities Management Product of the Year
Data Centre Physical Security & Fire Suppression Product of the Year
Data Centre Cabling Product of the Year
Data Centre Cabinets/Racks Product of the Year
Data Centre ICT Storage Hardware Product of the Year
Data Centre ICT Software Defined Storage Product of the Year
Data Centre ICT Cloud Storage Product of the Year
Data Centre ICT Security Product of the Year
Data Centre ICT Management Product of the Year
Data Centre ICT Networking Product of the Year
Excellence in Service Award
Data Centre Hosting/co-location Supplier of the Year
Data Centre Cloud Vendor of the Year
Data Centre Facilities Innovation of the Year
Data Centre ICT Innovation of the Year
Data Centre Individual of the Year
Voting closes : 28 April
www.dcsawards.com
The successful Managed Services & Hosting Summit series of events is expanding to Amsterdam in April, assessing the impact of market trends and compliance on the MSP sector in Europe. Expert speakers from Gartner and a leading legal firm involved in assessing the impact of the EU General Data Protection Regulation (GDPR) will provide keynote presentations, as evidence emerges that many MSPs need to "up their game", particularly in sales and customer retention.
Customers are demanding more, so Bianca Granetto, Research Director at Gartner, will examine new research into digital business and digital transformation market dynamics and what customers are really asking about.
Another keynote speaker, Renzo Marchini, author of Cloud Computing: A Practical Introduction to the Legal Issues, is a partner in law firm Fieldfisher's privacy and information law group and has over 20 years' experience in advising clients across different sectors, ranging from start-ups to multinationals. He has a particular focus on cloud computing, the "Internet of Things", and big data.
Finally, for those seeking guidance on the high level of merger and acquisition activity in the sector, David Reimanschneider, of M&A experts Hampleton, will look at where the smart money is going in the MSP business and what the real measures of value and time are, and when to sell.
The European Managed Services & Hosting Summit 2017, which will be staged in Amsterdam on 25th April 2017, builds on the success of the UK Managed Services & Hosting Summit, now in its seventh year. It will bring leading hardware and software vendors, hosting providers, telcos, mobile operators and web services providers involved in managed services and hosting together with channel organisations, including Managed Service Providers (MSPs), resellers, integrators and service providers, seeking to develop their managed services portfolios and sales of hosted solutions.
The European Managed Services & Hosting Summit 2017 is a management-level event designed to help channel organisations identify opportunities arising from the increasing demand for managed and hosted services, and to develop and strengthen partnerships aimed at supporting sales. The event has attracted a strong line-up including many of Europe's leading suppliers to the MSP sector, such as Datto, SolarWinds MSP, Autotask, Kingston Technology, RingCentral, TOPdesk, ASG, Cato Networks and Kaseya.
For further information or to register please visit: www.mshsummit.com/amsterdam
FLASH FORWARD - A one-day end-user conference on flash and SSD storage technologies and their benefits for IT infrastructure design and application performance.
1st June 2017 – Hotel Sofitel Munich Bayerpost
Since the very early days of flash storage, the industry has gathered pace at an increasingly rapid rate, with over 1,000 product introductions; today, one SSD is sold for every three HDD equivalents. According to Trendfocus, over 60 million flash drives shipped in the first half of 2016 alone, compared to just over 100 million in the whole of 2015.
FLASH FORWARD brings together leading independent commentators, experienced end-users and key vendors to examine the current technologies and their uses and, most importantly, their impact on application time-to-market and business competitiveness.
Divided into four areas of focus, the conference will review the technologies and the applications to which they are bringing new life, and examine who is deploying flash and where the current sweet spots lie in data centre architecture.
The conference will also examine the best practices that can be shared among users to gain the most advantage and avoid the pitfalls that some may have experienced, and will conclude by discussing future directions for these storage technologies.
The keynote speakers and moderators delivering the main conference content are confirmed as respected analyst Dr. Carlo Velten of Crisp Research AG; Jens Leischner, founder of the end-user organization sanboard, The Storage Networking User Group; Bertie Hoermannsdorfer of speicherguide.de; and André M. Braun, representing SNIA Europe.
Sponsors include Dell/EMC, Fujitsu, IBM, Pure Systems, Seagate, Tintri, Toshiba, and Virtual Instruments and the event is endorsed by SNIA Europe.
Registration is free for IT managers and professionals from end-user organisations via the website www.flashforward.io/de using the promo code FFDEEB1. Resellers and other representatives of channel organisations are also welcome to attend for a small registration fee.
Vendors interested in sponsoring the event should contact paul.trowbridge@evito.net.
Top three tips to get value from digital-first strategies.
By Richard Whomes, Senior Director Sales Engineering, Rocket Software.
2016 was heralded by some as the year of digital transformation for businesses. You only need to look as far as your local post office or supermarket, both of which are likely to have replaced regular tills with self-service checkouts, to see the impact digitisation is having on everyday businesses. It's no surprise that digital transformation has become a buzzword in the last 12 months, with leaders across industries championing its ability to revolutionise company operations. However, the reality is that modernising business processes to bring them into the digital age presents a serious challenge for many; it's not always easy to see what digital transformation looks like for your own business.
Recent research has revealed that 81 per cent of CIOs believe legacy systems are having a negative impact on their businesses. But although there is clearly a case for infrastructure investment, the need to disrupt needn't be tantamount to huge expenditure. As Jason Kay, Chief Commercial Officer at IMS Evolve, explains, if the Internet of Things (IoT) solution you are considering for your business requires any kind of rip and replace, you should think twice before taking the plunge, as your existing infrastructure may be smarter than you think.
The Internet of Things is at the heart of many digital transformation strategies – streamlining processes and improving efficiencies. Yet, for those looking to reap the rewards of IoT, ripping out supposedly outdated systems and replacing them with new, state-of-the-art facilities won't do the business any favours. Not only will this exercise be extremely costly, but it will also remove the infrastructure that is already there – infrastructure that holds a mountain of data that can be put to work improving the business.
The potential business benefits of the Internet of Things are truly transformational; not only can it increase productivity and save money, it can generate more by adding value to the business's core purpose. This is fantastic in principle, but how many are willing to explore this potential if there's a huge initial investment attached to it? Take refrigeration and cooling within the food retail industry as an example. It seems unlikely that replacing every in-store fridge, freezer or food delivery van is going to appeal to a multiple retailer whose core business focus is selling produce, as the cost will far outweigh the immediate benefits. For businesses undergoing digital transformation, rather than investing in brand new equipment, the answer lies in existing infrastructure. The data that is an increasingly vital part of many enterprises' digital strategies is already being generated; businesses simply need a way of extracting it, understanding it and releasing its value.
Using an IoT layer, businesses can tap into the data locked within legacy machines. By integrating this data with supply chain and merchandising systems, as well as the fridge control systems, in real time, the temperature of each fridge can be automatically managed to suit its specific contents. As a result, not only is energy consumption reduced, but a higher quality product can be achieved, resulting in a better customer experience.
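As a deliberately simplified sketch of that integration pattern, assuming hypothetical product categories, setpoints and a callable control-system interface:

```python
# A deliberately simplified sketch of the IoT-layer pattern described
# above: read what merchandising says each fridge holds, then push a
# matching setpoint to the legacy controller. Categories, setpoints and
# the control-system callable are all assumptions for illustration.

SETPOINTS_C = {"dairy": 3.0, "meat": 1.0, "produce": 7.0}  # assumed targets

def sync_setpoints(fridge_contents, set_fridge_temp):
    """fridge_contents: {fridge_id: category} from the merchandising system."""
    for fridge_id, category in fridge_contents.items():
        target = SETPOINTS_C.get(category, 4.0)  # conservative default
        set_fridge_temp(fridge_id, target)       # call into the control system

sync_setpoints({"f1": "dairy", "f2": "meat"},
               lambda fid, temp: print(f"{fid} -> {temp} C"))
```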
Likewise, within the food manufacturing industry, IoT technology at the edge can provide the insight needed to monitor each stage of the process when creating foods in batches. Consistency of both ingredient quantities and environmental factors can be regulated, and the available data from each stage of the process united to ensure the highest quality, most profitable end product every time.
There’s no denying that the Industrial Internet of Things is gaining momentum, but legacy equipment should not be a setback to its fruition – it should be at the heart of it. Many enterprises have the data needed to modernise sitting unused in their current systems, and now is the time to unlock it. IoT offers a world of opportunity but it must be implemented in a way that makes business sense. By taking full advantage of existing infrastructure with an IoT layer, enterprises can efficiently and effectively transform their organisation in a way that supports their core business purpose.
It's been an interesting time in the app market. Last year Apple boasted a record year for downloads from its app store, amidst industry cries that apps are dead. Apps such as Nintendo's Super Mario Run and the cultural phenomenon of Niantic's Pokémon Go have also reinvigorated hope that apps still appeal to the mass market. So, are the older statistics suggesting that smartphone users download only one app per month now redundant? And has the Lazarus that was the app come back to life?
By Mark Armstrong, VP and MD EMEA at Progress.
It does look like the clouds are clearing over the long-term future for apps. However, rather than this being a consequence of consumers rejecting 'app fatigue', it is in fact a result of new and advancing technologies that are changing how we interact with apps.
Enterprises are catching up when it comes to delivering the same technologies that their employees use at home. Consumer apps offer sophisticated functionalities that business apps often can’t match due to finite resources. Yet, as VR and AR technologies become more affordable, businesses have the opportunity to make their apps as compelling or engaging as those that their employees or customers are used to.
However, to ensure adoption, businesses need to prioritise their app development. Users expect quality and have limited tolerance for apps that don't perform, which is why 80 per cent of apps are deleted after just one use. So, spreading resources thinly to develop apps to solve every need won't work. Concentrating on solving challenges that align with business priorities will help to narrow down the deliverables and increase the likelihood of developing an engaging app.
Additionally, businesses should look to popular consumer apps such as Pokémon Go and Super Mario Run for inspiration. Both apps bring something people have wanted for years – Mario on their smartphones – while Pokémon Go brought a new kind of experience in AR (both have also been helped by lashings of nostalgia).
E-learning apps have had the most success to date in replicating such experiences through gamification. For example, Kineo worked with its client McDonald's UK on a Till Training Game, which delivered an engaging and memorable learning experience to support the launch of a new till system across 1,300 restaurants, as well as trending on social media sites as learners set up self-styled leaderboards to compete against each other.
Now businesses have the opportunity to layer AR and VR experiences on top, to capture users' attention and demonstrate that they are a digitally forward enterprise.
New technologies – which apps can tap into and integrate with – deliver excitement, interactivity and an extra level of utility. And this type of interactivity is changing. The global Intelligent Virtual Assistant (IVA) market size is expected to reach $12.28 billion by 2024, according to a new report by Grand View Research. The virtual assistant is set to become the next new interface for applications. Gartner says: "Conversational systems shift from a model where people adapt to computers to one where the computer 'hears' and adapts to a person's desired outcome."
So, apps built to capitalise on voice recognition technologies and AI are where enterprise apps will see real success, leveraging powerful decision-making intelligence and the natural input method of voice recognition to deliver improved interactivity, experience and utility. The information provided by the digital assistant is personalised, accurate and relevant – in other words, every app's dream scenario.
2017 will see tech giants double down on AI products that use user and context data, such as location and time, and combine it with advanced search and powerful decision making to respond directly to voice requests and questions. Google opened up its assistant to developers in December 2016, allowing them to build "Direct and Conversation Actions" for Google Assistant. This is a huge opportunity for a new wave of applications.
The future of apps is looking more exciting than ever before. New technologies have reinforced their relevance. The ways in which apps are adapting to how users want to use them, while driving ease of use, are increasing their popularity once more. Businesses and app developers have the opportunity to take 'best practice' lessons from consumer applications and apply these to their long-term app development strategy. In doing so, we will truly start to see a meta-app era where apps are truly smarter, genuinely interactive and can anticipate our needs in both the consumer and business domains.
Cybersecurity breaches are happening at an industrial scale. The unabated volume, scale, and magnitude of these breaches are forcing the entire industry to re-think how security should be managed, deployed, and evaluated.
By Adrian Rowley, Lead Solutions Architect EMEA, Gigamon.
While it is simply unacceptable to be complacent when it comes to cybersecurity, there still seems to be some confusion among many organisations between their perceived level of security and their actual cyber-readiness. In fact, last year an IT security study found that only 55 percent of UK government organisations have an IT budget dedicated to security solutions, which just isn't enough. This lack of commitment is especially worrying given the increasing number of attacks across vital industry sectors – such as energy, water, telecoms, financial services, transport, defence and government – commonly referred to as Critical National Infrastructure (CNI). So where do investments in security need to be made?
While these types of threats and attacks are not new by any means, an increasing amount of CNI continues to move online, and more devices are connecting to networks with questionable levels of security. Previously, the Supervisory Control and Data Acquisition (SCADA) architecture in these CNI systems was isolated from the outside world and therefore more difficult for hackers to infiltrate. Now, however, attackers don't need to expend nearly as much energy to hack CNI organisations, as the proliferation of the internet in this sector has made their networks much more accessible. One of many examples of the impact of an attack on CNI was seen in Ukraine in December 2015, where the electricity grid was taken down by a cyber-attack affecting nearly a quarter of a million citizens. It doesn't take a vivid imagination to envisage how an attack on a country's construction, finance, telecoms, transport or utilities systems could have devastating consequences. Unfortunately, despite companies' best defences, increasingly organised hackers are not only getting through, they are staying hidden and undetected on networks for longer.
Organisations and security vendors must avoid complacency and instead fight smarter to identify, isolate and eliminate cyber threats faster. Companies need to constantly examine the way that their data security models are deployed and managed and ensure they are fit for purpose.
There are inherent issues in trying to stem the rise in attacks on CNI systems, and the problem goes back to the fundamentals of the security model itself. The traditional way in which organisations set up their security models has led to cybersecurity systems that are simply inadequate at addressing the level of cyber breaches organisations face today.
For a while, the main focus in keeping out hackers was to bolster the perimeter, on the simplistic assumption that what was outside the perimeter was unsafe and what was inside was secure. That perimeter security typically consisted of a firewall at the internet edge and endpoint security software, such as an antivirus solution, at the user end. However, most perimeter firewalls and endpoint security solutions rely on rules and signatures to identify malware. Today, many cyber breaches exploit zero-day vulnerabilities: flaws for which no patch yet exists. Consequently, it is increasingly difficult for traditional perimeter-based solutions to prevent malware and threats from breaking in. Ultimately, this means hackers are bypassing perimeters and staying on networks.
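A toy illustration of the limitation, with payloads and signatures invented for the example: signature matching can only catch what is already on the list.

```python
# Toy illustration of why signature matching misses zero-days:
# detection only works for payloads whose hash is already on the list.
# Payloads and signatures here are invented for the example.

import hashlib

KNOWN_BAD = {hashlib.sha256(b"old-known-malware").hexdigest()}

def signature_check(payload: bytes) -> bool:
    """Return True if the payload matches a known-bad signature."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD

print(signature_check(b"old-known-malware"))    # True: caught at the perimeter
print(signature_check(b"novel-zero-day-code"))  # False: sails straight through
```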
Another aspect of the original model was a high reliance on employee trust: employees were considered trustworthy while everyone else was a threat. However, many offices now have employees who use personal computing devices, such as smartphones, for business use, or a workforce that consists of more than just employees, with consultants, contractors and suppliers all needing to access the network and IT resources. This creates multiple points of entry for potential hackers and makes the simple trust model unrealistic, as the threat could just as easily come from within.
Furthermore, security appliances were deployed at fixed locations. Typically, these would assume a fixed perimeter or a set of fixed “choke” points at which traffic was expected to traverse and be monitored for threats. However, with the advancement of IoT, BYOD, and the general mobility of users and their devices, the predictability of traffic patterns and these fixed “choke” points has diminished. Additionally, the adoption of the cloud has blurred the edge and perimeter boundaries. This is making the workplace a far more dynamic environment with far less predictability on where the boundaries and choke points lie. Consequently, the ability to consistently and comprehensively identify all threats based on the static deployment of security appliances at fixed locations has been severely impaired.
Despite these issues, and the fact that cybercriminals are becoming much more sophisticated in their approach, many organisations are still using traditional security architectures to prevent network breaches. Criminals have set their sights on bigger targets with much greater fallout. Today's threats are far stealthier, more sophisticated and destructive at an industrial scale. Many of them are grouped under the umbrella of Advanced Persistent Threats (APTs), so named because they compromise the network and take up residence there for lengthy periods of time, and they are the source of many of the recent large-scale breaches.
Modern security strategies have to be forged on the assumption that breaches are inevitable. In other words, there must be a growing emphasis on detecting and containing breaches from within, in addition to preventing them. Since the network is the primary medium bridging physical, virtual and cloud environments, network traffic is increasingly critical as the window through which malware and threats enter and move around the enterprise. Organisations need persistent visibility to analyse network traffic for threats, anomalies and the lateral movement of malware. There needs to be a structured, platform-based approach that delivers traffic visibility to a multitude of security appliances in a scalable, pervasive and cost-effective manner. Such a platform would deliver visibility into the lateral movement of malware, accelerate the detection of exfiltration activity, and could significantly reduce the overhead, complexity and costs associated with such security deployments.
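What "visibility into lateral movement" can mean in practice is sketched below, assuming flow records arrive as (source, destination) IP pairs and using an arbitrary fan-out threshold; real platforms work at far greater scale and subtlety.

```python
# A minimal sketch of east-west (lateral movement) visibility: count
# internal-to-internal connections per source host from flow records
# and flag outliers. The threshold and flow format are assumptions.

from collections import Counter
import ipaddress

def is_internal(ip: str) -> bool:
    return ipaddress.ip_address(ip).is_private

def flag_lateral_movement(flows, threshold=50):
    """flows: iterable of (src_ip, dst_ip) pairs from a network tap."""
    fanout = Counter(src for src, dst in flows
                     if is_internal(src) and is_internal(dst))
    return [host for host, count in fanout.items() if count > threshold]

# One host fanning out to 200 internal peers looks like a scan.
flows = [("10.0.0.5", f"10.0.1.{i}") for i in range(200)]
print(flag_lateral_movement(flows))  # -> ['10.0.0.5']
```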
With the current threat landscape including industrialised and well-organised cyber threats on a national level, it is no longer sufficient to focus on the security applications exclusively. Focusing on how those solutions get deployed and how they get consistent access to relevant data is a critical piece of the solution. A security delivery platform in this sense is a foundational building block of any cyber security strategy.
The changing cyber security conditions are driving a need for a fundamental shift in security. As organisations accept the inevitability of network breaches, their focus should shift to security architectures that detect malware and threats within the organisation and respond to mitigate risk. Doing this requires far deeper insight and far greater coverage across the infrastructure than was traditionally feasible, and consequently a new model for deploying security solutions. This model must have pervasive reach and insight into the network, equipping organisations to better protect themselves against hackers who have raised the stakes.
The changing nature of today’s work environment, combined with constantly evolving technology, is causing organisations to evaluate the best ways to enable employee communications and collaboration. Dispersed workforces, mobile workers, telecommuting, increased travel disruption and expenses are creating demands for more efficient real-time interactions. These all point to a very logical and justifiable move toward video conferencing as a required tool in any company’s IT offering.
By Andy Nolan, Vice President UK, Ireland and Northern Europe at Lifesize.
Many organisations are having to manage a workforce that may be located outside the main office, country and time zone. They are therefore experiencing greater demand for collaboration technology from employees that adopt an agile working approach and managers that need to improve efficiency and productivity whilst maintaining seamless communications.
This has seen an explosion in integrated Software as a Service (SaaS) conferencing solutions that deliver easy-to-use, consumer-like services. Many organisations that have adopted this technology see immediate benefits, thanks to an instantly available service and a quick adoption process that enables users to communicate seamlessly with colleagues, customers and partners alike.
Many of these organisations are seeing real tangible benefits in terms of employee recruitment, retention, and an increase in productivity, leading to a significant competitive advantage. At Lifesize, we monitor customer feedback very closely once they adopt a solution. For example, one of our customers has reported that they have seen a clear enhancement in their supply chain communications, as well as significant improvement in collaboration across dispersed teams. In addition, using a cloud-based solution has resulted in minimal internal IT costs for most of them.
As with any move to a new technology solution, a successful transition to cloud-based video communication technology requires a few key steps:
First, ensure you’re making the decision that best meets the needs of your organisation. Most businesses move to a cloud solution in search of the following benefits:
1. Reduced Cost — IT departments are under pressure to reduce expenses. Investing in on-premises video conferencing infrastructure and hardware can cost hundreds of thousands of dollars. A cloud-based video conferencing service extends IT budgets much further than a lump sum, upfront payment for an on-premises solution.
2. Increased Flexibility — It is difficult to deliver and support video conferencing sessions that may be scheduled with little to no notice, but this kind of flexibility and agility is required in a dynamic business environment. With an intuitive, cloud-based application, IT administrators can get valuable time back by enabling users to become more self-sufficient. In addition, enterprise-class, cloud-based video communication platforms offer more flexibility to scale up or down as per business and user requirements than those services that grew out of consumer applications.
3. Simplified Automation — IT admins who choose a cloud-based video conferencing vendor need no longer struggle with the decision of whether to build and manage the video conference infrastructure in-house, freeing up valuable time, money and effort. A web-based admin console gives IT administrators easy access to dashboards to check usage statistics and to customise settings.
4. Increased Device Diversity — Video conferencing software needs to support a wide range of “bring your own device” (BYOD) technology. It can be challenging for support staff to stay current on all of the different smartphones and operating system upgrades, so organisations often prefer to give that task to a cloud provider who can also handle seamless integration and support. Professional cloud-based video conferencing solutions bring collaboration to the devices employees use every day, from browser- and desktop-based applications for laptops and tablets to mobile applications for smartphones. No need to worry about software versions or complicated updates.
5. Interoperability — On-premises video conferencing products usually lack interoperability and still live in a proprietary software world that requires users to purchase an expensive gateway appliance to call third-party video communication devices. Organisational decision makers who move video conferencing to the cloud make interoperability the vendor’s responsibility and avoid being saddled with incompatible equipment that leaves users in isolated silos.
Once you’ve determined that a cloud-based video conferencing solution is right for your organisation, it’s time to vet your vendors. Decision makers should pay close attention to the vendor’s service level agreement and support, ask about outage levels and discuss vendor availability. This is a business-critical tool that must be backed by the highest levels of service and support.
Those who want to keep offering employees traditional face-to-face meetings in a conference room, with the option of remote-party dial-in, should look at a hybrid model: a cloud-based video conferencing solution combined with a conference room-based hardware component, offering employees the best of both worlds.
Implement Effectively
Lastly, effective implementation is key — even if the software is a perfect fit for your organisation, it won’t work if it isn’t implemented correctly. Effective implementation includes:
1. Choosing a Point Person — Selecting the right team and establishing who does what and when mitigates the likelihood of a stalled implementation and ensures that everyone knows who has the final word.
2. Establishing a Timeline — Imposing a clearly defined timeline and ensuring the vendor will be available when needed eliminates the risk of ongoing implementation with no end in sight.
3. Maintaining Focus — Resisting the temptation to think about every detail rather than focusing on the big picture is vital to moving the implementation forward. Take care of the high-level issues first and enjoy the additional features after implementation.
4. Listening to the Expert — Clients often expect software vendors to configure the new software to match their internal operations, which can lead to long-term negative consequences. Remembering who the expert is — the vendor when it comes to the new software and the organisation when it comes to business operations — is vital to implementation success.
5. Gaining User Buy-in — After a long vetting process, it’s common to assume the internal sales process is over. Nothing could be further from the truth. The employees who will be using the new software the most may not have been involved in the approval process. Ensuring that they value — and champion — the use of the new software will increase the likelihood of a timely implementation. It is important to gain the trust and buy-in of the users and give them a say in the implementation.
As businesses continue to employ distributed workforces around the world, the demand for an enterprise-ready video communication solution that can provide a seamless, connected experience will continue to increase. Organisations that go “all in” and commit to improving employee communication and collaboration stand to benefit from a workforce that returns the favour with increased productivity, less staff churn and great employer-employee relationships.
DW asked the ICT vendor community for their views on how IoT will (or will not) impact on the enterprise over the coming months and years. Here we publish their thoughts and comments. We’ve also been running a series of IoT blogs on the Digitalisation World website, which are well worth checking out at: https://digitalisationworld.com/blogsall
Connected consumer devices have become widespread in smart homes, and they are making their way into enterprises at a rapid rate. However, as manufacturers rush creative concepts to market, security is often an afterthought and these devices can quickly become low-hanging fruit for hackers looking for an easy way into corporate networks. Hackers are now able to scan networks for vulnerable devices – often protected with very basic credentials – before uploading malicious code and hijacking the device. While a compromised gadget may pose a challenge on a home network, it can have truly disastrous consequences for businesses as they begin to share, transmit and connect to increasingly valuable data.
A good example of how vulnerable Internet of Things (IoT) devices can be is a recent attack at an unnamed university, in which 5,000 IoT devices, including vending machines and light sensors, were hijacked. On their own, they would add little value for hackers; however, these devices were connected to the university’s administrative network, which essentially provided access to the university’s entire IT infrastructure. Had it not been spotted in time, the attack could have brought down the entire campus network.
Monitoring and securing the endpoint continues to be the key solution to managing IoT, as it was with BYOD. IoT is about creating connectivity for a wider range of devices; what we see now with watches and glasses could conceivably extend to any object. In fact, Cisco predicts that there will be 50 billion connected devices worldwide by 2020. With any object offering the capability to send, receive and store data, enterprise data security will need to extend a lot farther than it does now.
IoT can create new efficiencies and productivity in the long run; however, it’s becoming increasingly clear that these new endpoints come with new risks. Organisations would therefore be wise, first, to ensure that every ‘thing’ connected to their network is necessary. Does it add value to the business, or is a kettle that turns on remotely merely a novelty? Where such devices must be connected, putting IoT systems on separate networks may be the best option.
Secondly, research needs to be conducted into the manufacturers of internet-enabled devices and objects, to ensure that they have adequate security measures in place from a manufacturing standpoint. A full evaluation should be carried out to assess the risks.
As with network security, IoT requires a layered approach to defend against the multitude of risks. Most endpoint devices are poorly protected in comparison with network defences, and often there is little oversight or management of those protections once installed. IoT requires that organisations have a means to track and monitor all these new devices to secure enterprise data and network access. Policies must be in place that prioritise data security for employees; in the event that a security incident does occur, data needs to be safeguarded and steps taken to prevent a recurrence. With GDPR pending, enterprises will soon need to prove that data is protected in a breach situation, so having a persistent connection to each device to prove compliance will be essential.
The IoT’s ability to gather and leverage data on a vast scale means that it now sits at the forefront of business innovation. From delivery vehicles, to ATMs, to air conditioning systems, the scope for connecting devices in industry appears to be as big as, if not bigger than, the comparable consumer opportunity. However, for the business world, adopting the IoT comes with major barriers: currently, the tools that help businesses deploy and secure IoT devices are not fit for purpose.
Some of the main issues in today’s connected enterprises centre around weak encryption and authentication, which leave IoT devices vulnerable to hacking. Some device systems are ‘closed’, meaning that they are hard to remotely maintain and update. Once organisations have numerous IoT devices, it becomes difficult from an operational standpoint to get physical access to each device to fix any flaws. When the size of the IoT network goes into thousands, deploying the device and a security solution for it becomes a logistical nightmare.
Businesses require next-gen firewall technology, with its advanced traffic inspection options, because traditional firewall products were never designed with mass rollouts in mind. This has made security vendors rethink some of the traditional design paradigms, to improve scalability and ease of use from an operations standpoint, whilst not giving up on any of the required technical capabilities.
Many startups in the M2M space have created products that meet the form factor requirements, the need for very simple handling and support for wireless WAN connectivity. However, they fall short of meeting security requirements and sustainable scalability throughout product lifecycles. Meanwhile, the majority of firewalls available today are so expensive that it is simply not feasible to implement them at every connected device. Other technologies attempt to run an application that encrypts the data. In this case, there is no DoS protection, so infrastructure is not properly secured.
As there is no ‘one size fits all’ solution, companies need to approach IoT security case by case. A good place to start is to build a clear picture of all network components: How are devices connected? What sits between them? Is there any remote monitoring taking place? Companies should also look at segmentation, which ensures that only those who need access to certain zones have it. Insufficient segmentation can present a serious security risk, particularly where third parties are concerned. IT teams must create a plan for the network and security architecture and look at both existing security technologies and the built-in security of their devices. Only at this point is it possible to make an informed decision about suitable security tools.
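As a concrete illustration of that inventory-and-segmentation starting point, the sketch below checks a device list against a simple zone policy. The inventory format, device types and zone names are invented for the example; a real deployment would draw them from discovery tooling.

```python
# Illustrative sketch: audit a device inventory against a segmentation
# policy. All names and zones here are assumptions for the example.
POLICY = {
    "iot": {"allowed_zones": {"iot-vlan"}},          # sensors, cameras, kettles
    "corporate": {"allowed_zones": {"office-vlan"}}, # laptops, workstations
    "third_party": {"allowed_zones": {"guest-vlan"}},
}

devices = [
    {"name": "lobby-camera-1", "type": "iot", "zone": "office-vlan"},
    {"name": "finance-laptop-7", "type": "corporate", "zone": "office-vlan"},
]

def segmentation_violations(devices, policy):
    """Return devices sitting in a zone their type is not allowed to occupy."""
    return [d for d in devices
            if d["zone"] not in policy[d["type"]]["allowed_zones"]]

for d in segmentation_violations(devices, POLICY):
    print(f"{d['name']} ({d['type']}) should not be on {d['zone']}")
```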
Why master data management is crucial for creating business value from information to support digital transformation.
By Dr Rado Kotorov, VP Products and chief innovation officer at BI and data analytics pioneer, Information Builders.
‘Industry 4.0’ and ‘Enterprise 4.0’ are the terms being used to describe the current digital transformation of enterprises based on data analytics, mobility, the cloud and the internet of things (IoT). Within the next three years it is anticipated that enterprises will be consuming and analysing data from billions of connected devices.
The PricewaterhouseCoopers (PwC) report, ‘Industry 4.0: building the digital enterprise’, states that “Data analytics and digital trust are the foundation of Industry 4.0.”
To have maximum impact, data needs to be understandable to all employees, not just the preserve of data scientists. However, unless your employees on the front line can trust the data in front of them, they cannot be empowered to use that data to make decisions that assist your customers, generate value, add revenue and encourage repeat business.
In the book, ‘Organizational Intelligence’, Gerald Cohen, president and CEO of Information Builders, writes, “The path from raw data to data monetization all starts with data capture, which involves many diverse technologies that have evolved to capture specific data types: transactional data, logistic data, marketing data, customer data. The more integrated this data is, the more complete the picture. Transactional data combined with customer data can reveal patterns and trends that can be monetized in marketing operations.”
He warns: “Like any asset, data has quality characteristics and thus requires special attention to maintain it. As the saying goes, “garbage in, garbage out” – if the raw material is of inferior quality, then the finished product will be of poor quality.”
Data cleansing and the creation of the “golden record” are the first steps in creating business value from information to support digital transformation. Organisations need to be able to profile, cleanse and enrich data to support better decision-making: ensuring that enterprise information is consistent, accurate and complete.
Since data analytics are a critical component of Industry 4.0, organisations need to be able to identify and correct bad data, in real time, before it enters the enterprise.
Claudia Imhoff, a business intelligence analyst with Boulder BI Brain Trust (http://www.bbbt.us/) has written, “Master data management (MDM) is the fundamental underpinning for a trustworthy and reliable analytics environment. MDM not only ensures consistent information about customers, products, and other major data domains across all systems, but also ensures that business users obtain results that they can unequivocally believe in.”
Using an automated rules engine to dynamically cleanse data allows data quality rules to be defined once and then applied across the information landscape, triggering instant changes to existing data to eliminate mistakes. By ensuring the consistency and accuracy of data, it can then be enriched to drive better decision-making.
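A minimal sketch of what such a rules engine might look like follows. The rules, field names and corrections are invented for illustration; they are not a description of Information Builders' actual implementation.

```python
# Illustrative rules engine: data quality rules defined once as
# (description, predicate, correction) triples and applied uniformly.
RULES = [
    ("trim whitespace", lambda r: r["name"] != r["name"].strip(),
     lambda r: r.update(name=r["name"].strip())),
    ("normalise country", lambda r: r["country"] in {"UK", "U.K."},
     lambda r: r.update(country="United Kingdom")),
    ("drop negative amounts", lambda r: r["amount"] < 0,
     lambda r: r.update(amount=None)),
]

def cleanse(record):
    """Apply every matching rule; return the record plus an audit trail."""
    applied = []
    for description, is_dirty, fix in RULES:
        if is_dirty(record):
            fix(record)
            applied.append(description)
    return record, applied

record, audit = cleanse({"name": " Acme Ltd ", "country": "UK", "amount": -5})
print(record, audit)  # cleansed record and the rules that fired
```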
Our core philosophy has always been to use data to enable insights that drive business forward, through pervasive, user-friendly analytics that are accessible to operational employees. Forty years ago, Information Builders pioneered self-service analytics on the mainframe, a task that formerly required Cobol programming skills. The ensuing decades have seen our data analytics technology evolve from the mainframe, to client-server, to cloud and now to mobile analytics and big data generated by IoT.
Analysts predict that the next three years will see dramatic growth in customer-facing analytics. Gartner has predicted that, by 2018, half of customer service agent interactions will be influenced by real-time analytics.
However, unless they trust the data, employees cannot use it in their daily work to serve the needs of partner organisations, customers, patients and citizens. Without operational staff acting upon information and insights, organisations are unable to reap a return on investment from their business intelligence.
For the second consecutive year, Gartner has surveyed the reference customers of 17 vendors about the tools and platforms they use for data profiling, cleansing, visualisation and workflow functionality, and has recognised Information Builders as a leader in its Magic Quadrant for Data Quality Tools.
Once raw data has been cleansed, processed and made more accessible to your operational staff, your data can begin to generate added value for your organisation.
In our book, Organizational Intelligence, we describe how First Rate Investments monetised its data by creating an InfoApp, called ExecView, which transforms First Rate’s massive data set into informational data points that help clients to figure out how their businesses are trending.
Deborah Repak, general manager of the Products group at First Rate Investments comments:
“We are sitting on a treasure trove of information in our databases, and we asked ourselves: How can we help our clients glean insight from all of this data?” By offering an InfoApp that helps customers to obtain their own insights from data, First Rate created a new business.
Gartner estimates that there were 6.5 billion connected devices in use this year and that this number will increase to 21 billion IoT devices by 2020. Cisco believes this figure will be much higher, predicting 50 billion connected devices by 2020. IoT devices will contribute a huge volume of data, making it a critical requirement to cleanse it before it enters your organisation.
The PwC report, ‘Industry 4.0, building the digital enterprise’, found that 33% of organisations have already achieved advanced stages of digitisation and predicts that 72% of organisations expect to have reached advanced stages of digitisation by 2020.
The report states that to achieve advanced digitisation, “Companies will need robust data analytics capabilities; that will mean making significant changes. They’ll need to focus on developing the people and culture to drive transformation.” The report acknowledges that this digital transformation will require substantial effort and investment. However, PwC predicts that organisations which get this right will achieve ROI within two years.
In its blueprint for digital success, PwC advises that Industry 4.0 organisations must “become virtuosos in data analytics.” We agree, but before you can begin to play well with data, you must first ensure that the instrument is in tune. Look after the data and the analytics will play true.
Recent statistics from the Office for National Statistics (ONS) reveal that the UK’s productivity puzzle remains unsolved: productivity, in fact, lags 18% behind other G7 nations. Given that the UK is one of the most technologically advanced countries in the world, it is surprising to see technology hasn’t helped cure the UK’s productivity gap, especially as IT underpins the way in which we work today.
By Russell Acton, VP & GM International, Capriza
A huge part of the productivity challenge is how businesses are structured. Too often this is ‘vertically’, around functions such as Finance, Sales, Engineering, Marketing and HR. This can lead to organisational silos, making it hard for the company to operate ‘as one’. You can see this manifest itself in the different enterprise applications for each line function, such as SAP for Finance, PeopleSoft for HR or Siebel for Sales. The list goes on.
What many organisations fail to realise is that they need to learn how to operate ‘horizontally’ across the entire business, not in their respective silos. The problem for them is that many of their customers believe they already operate this way. In the eyes of a customer, a business is expected to behave as one entity, not as an operationally disjointed one. In fact, many of the frustrations customers experience can often be attributed to these disjointed functional silos.
How a company is organised is supposed to support this objective, not create friction. Those who understand this recognise that each line of business function can operate better by accessing the information required, much of it already to be found in another enterprise application. Serving that data up to those who need it, at the time they need it, anywhere, on any device, can have a profound impact on corporate productivity.
Conversely, when they don't, you see all too often the 'heat' or ‘friction’ that builds leading to employee frustration, customer frustration and lost opportunities. It's not unusual to even see a whole job role created just to ensure tasks can be joined up and completed.
Take one Capriza customer, whose field engineers install telecoms infrastructure and recorded their hours on paper. By the end of the week, not all the data entered into the systems would be accurate. Because information was being transferred on bits of paper, job codes could be mixed up and actual time spent would sometimes be guessed. This could lead to job cards showing an underspend or an overspend compared with the planning process. Not a huge issue in itself. Or is it?
It was for another part of the business: Finance. The data inaccuracies recorded from the field were creating a much bigger problem. Perhaps it took twice as long to install a cell tower due to conditions on the ground – hidden rocks that needed removing first, or any one of numerous issues. In the customer’s purchasing team, which places hundreds of orders, unusual invoice line items would be noticed. The natural reaction is to query the resulting invoice with the supplier’s accounts team. Now there are two teams involved, across two companies, to sort it out.
All too often, the supplier has to swallow the disputed amount because empirical data often does not exist: the engineers were just recording hours completed, not why. With Capriza, they extended the existing ERP solution to the engineers on their mobiles. Providing simple, easy data entry not only put a smile on the faces of the engineers; it also meant data could be captured at source, at the point the job is completed. In addition to being more accurate and reliable, the information is also enriched.
Notes can be easily included so, for instance, that pesky rock can be recorded in a photo. Critically, the information is available ‘horizontally’ across the entire business, enabling finance teams to invoice quicker. The system now tells the finance teams why there may have been extended working hours, so they can inform the customer exactly what occurred. In essence, the information input by the engineers now flows through the organisational silos all the way to the customer. The better availability of data and increased detail has helped improve customer service, reduce invoice queries and speed up payment.
The larger the organisation, the more fraught the friction points can be – but also the greater the opportunity for improvement. Take the example of the second-largest US city, Los Angeles. Its 48 departments, employing a total of 48,000 people, each used their own custom-built technology systems, none of which supported standardised mobile devices, making access to information difficult. The city needed an enterprise mobility app solution that could work across any of these systems, on any device, be cost effective, and be easy for the IT team to deploy.
Friction in thousands of individual processes spanning numerous departments adds up to tremendous amounts of ‘operational drag’, which reduces productivity. If operational drag can be fixed, productivity is increased and the organisation naturally becomes more responsive. But how do organisations work at speed when, in many cases, existing organisational structures and, importantly, the technology built up to support them, holds them back?
Organisations need to better understand how the baton is passed between each stage of a process so they can operate fluidly. For example, it is often extremely valuable for sales teams to know what’s in stock and what isn’t, and for field engineers to know what the production pipeline is when committing to servicing customers. All too often, it’s time-consuming to get this information. Generally, though, greater visibility into such processes across an entire business leads to more informed employees who can perform better for their organisation.
By implementing a micro-app created by Capriza, instead of investing in hardware, infrastructure and development resources, the City of LA enabled its employees to be productive, kept city officials up to date, and Angelenos connected to the services their taxes are paying for.
The best organisations implement systems to support the working processes ‘as is’, not the vertical functions on the organisation chart, which were designed for a different era. They are the ones really driving productivity in the market. Instead of putting a rigid system at the centre of their world, organisations should think about what their workers want to do and then design a support mechanism, built on flexible technology, which helps them to do their job as it is today.
Critically, this shift in attitude, where the employee is put at the centre of the universe, relegates the system from being the main attraction. This user-centric change is all about delivering the right information, at the right time, to get the job done simply and easily. People matter, so why let friction caused by technology get in the way of them doing their job?
The 2015-2020 Global Mobile Workforce Forecast predicts a rise from 1.32 billion mobile workers in 2014 to 1.75 billion in 2020, by which point mobile workers will account for 42% of the global workforce. Mobile applications designed with the user in mind will therefore be a great enabler for workers, especially as organisations embracing a mobile strategy can help employees become 34% more productive, gaining 240 hours of work per year.
With micro app technology now readily available, organisations should reassess their mobile requirements. Those operating legacy software, not built for modern-day speed and mobility, can easily build micro apps to quickly transform themselves from digital dinosaur to mobile predator, helping to modernise and update processes that plague many older organisations. Particularly for organisations struggling to digitally transform legacy systems, their entire survival may depend on unlocking their processes to deliver in a micro-service world.
Enabling employees to make ‘Uber-quick’ decisions will remove ‘blocks’ across the different vertical silos and will significantly improve the employee's ability to deliver value. Critically, that value can be turned into profit.
Employees today face information overload. It’s hard for them to wade through the maze of system screens to extract the important information, and simplifying things can make a big difference. Moving from a reactive email culture to a more proactive way of working, using push notifications and alerts synced to the user’s key performance indicators (KPIs), is an easy win.
Imagine an employee sifting through vast amounts of unread emails, which can be like drinking from a firehose. Often, important information gets lost in the never-ending flow. Being able to filter the most important things through smart alerts and notifications helps highlight key tasks and puts users in control when things need to be done. This helps them be most effective. Merely giving them this power promotes quicker, more efficient responses because they can quickly ‘Swipe-left, Swipe-right’, using their professional judgement to decide how best to increase their productivity.
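By way of illustration, a KPI-driven alert filter could be as simple as the sketch below. The KPI names, thresholds and message format are assumptions for the example rather than a description of any particular product.

```python
# Illustrative sketch: surface only the events that breach a KPI
# threshold the user has subscribed to, instead of pushing everything.
KPI_SUBSCRIPTIONS = {
    "open_invoice_queries": {"threshold": 10, "direction": "above"},
    "on_time_delivery_pct": {"threshold": 95, "direction": "below"},
}

def alerts(metrics, subscriptions=KPI_SUBSCRIPTIONS):
    """Yield push-notification messages only for breached KPIs."""
    for kpi, value in metrics.items():
        rule = subscriptions.get(kpi)
        if rule is None:
            continue  # not one of this user's KPIs -- stay silent
        breached = (value > rule["threshold"] if rule["direction"] == "above"
                    else value < rule["threshold"])
        if breached:
            yield f"{kpi} is {value} ({rule['direction']} threshold {rule['threshold']})"

for message in alerts({"open_invoice_queries": 14, "on_time_delivery_pct": 97}):
    print(message)  # only the breached KPI generates a notification
```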
Giving employees the ability to realise their true potential is essential in a world where robots and artificial intelligence threaten tomorrow’s jobs. Perhaps by removing the burden which legacy technology places on workers, they can show their true value to their employers. People are vital for all businesses. Enabling them with modern technology can help unlock the productivity puzzle and will power them for the future.
DW asked the ICT vendor community for their views on how IoT will (or will not) impact on the enterprise over the coming months and years. Here we publish their thoughts and comments. We’ve also been running a series of IoT blogs on the Digitalisation World website, which are well worth checking out at: https://digitalisationworld.com/blogsall
I believe the IoT has tremendous potential to transform how we work by helping us to improve the way we do things.
The future workplace could – and should – be like a sports stadium where, with the help of TV cameras covering every angle, professional sportsmen and women can watch replays of their performance over and over again in order to learn from their mistakes and become better at what they do. They can look at tiny details and identify potential for marginal gains which, if implemented, will make the difference between good and exceptional performance.
The same principles could easily be applied to the world of work. Everything we do – whatever our job – could be watched over by video cameras. Connect these with cloud-based observational management systems such as Cloudview and there is nothing to stop this happening today. Such systems are secure, reliable, convenient, easy to use and inexpensive to install and run.
The technology to do this is already available – all that’s preventing us is our mind-set. It’s time to stop thinking of video surveillance as an arcane security product readily abused by Big Brother or a boss eager to monitor the behaviour of their workforce.
Think instead of how video can be used to help streamline workflow in a factory or ensure that people lift heavy loads correctly to avoid injury. What about the potential applications in a hospital, studying the performance of surgical or A&E teams to improve the placing of equipment so that it’s readily to hand when time is of the essence?
The use of cloud-based observational management could also help to sell products and services. These days we are all conscious of the supply chain and want to know the provenance and composition of what we eat. Rather than just a pretty picture on the packaging of our pizza to tell us about the maker’s brand values, how about a QR code that takes our smartphone to live video of the manufacturing line to win our trust?
We can – and must – embrace cloud-based video surveillance as a key tool to help us replay what we do and so learn from our mistakes.
As mobile devices, sensors and the Internet of Things (IoT) proliferate, businesses are using the cloud to transform themselves and their customers’ experiences. The result is a hyperconnected world where communities collaborate to solve challenges and every company must re-think its fundamental assumptions about the business it is in, the services it provides and its relationships with customers, competitors and the world.
Hyperconnected networks offer companies competitive advantage by linking buyers, sellers and users. For some of the most innovative companies, physical assets have become unnecessary. For example, Uber owns no cars, but is the world’s largest ride-for-hire company, and Airbnb is the world’s largest accommodation provider but owns no rooms.
One of the compelling benefits of cloud operation is that it serves as an integration backbone for enterprise collaboration allowing data to be exchanged at any time and from anywhere. This enables better and faster decision-making throughout the value chain and drives innovation efficiently within a networked and connected enterprise.
Companies today are faced with the challenge of a complex and ultracompetitive global market. Strong competition from emerging economies coupled with increasing costs of innovation make product and brand differentiation challenging. For these companies to thrive globally, they need to speed up internal decision making throughout the value chain. That would allow them to better respond to market demand with more innovative and better differentiated products.
Companies must be willing to replace current systems and processes in order to use data in better ways. And companies that do use big data have been found to generate higher revenues than those that do not, often through developing entirely new revenue streams based on the data that they hold.
For many companies, security concerns are a significant barrier to cloud adoption. They believe that on-premises data is more secure than data hosted in the cloud. However, evidence suggests that the opposite is true and that professionally managed cloud data is safer: according to the Intel Security Group, around 43% of security breaches come from within organisations – a figure that may be much higher in smaller companies. The technology deployed by cloud hosting companies to govern data access is as robust as any achievable on-premises, meaning that hackers on the outside face state-of-the-art security and those on the inside are far more likely to be discovered quickly.
Because no-one can accurately predict what businesses will look like in the years to come, companies need to keep their data current and available. Rather than locating it in silos that restrict access, data needs to become a useable and useful business asset that is deployed for profit.
The only route to this level of sustainable innovation, added security and better returns is to adopt cloud hosting, and thereby reveal and release the business benefits of hyperconnectivity.
Researchers have estimated that 25 years ago, around 100GB of data was generated every day. By 1997, we were generating 100GB every hour, and by 2002 the same amount of data was generated every second. We’re on a trajectory – by 2018 – to generate 50TB of data every single second, the equivalent of 2,000 Blu-ray discs (at 25GB per disc) – a simply mind-boggling amount of information.
By Barry Bolding.
While the amount of data continues to skyrocket, data velocity is keeping pace. Some 90% of the data in the world was created in the last two years alone, and while data growth and speed are occurring faster than ever, data is also becoming obsolete faster than ever.
All of this leads to substantial challenges in identifying relevant data and quickly analyzing complex relationships to determine actionable insights. That certainly isn’t easy, but the payoff can be substantial: CIOs gain better insight into the problems they face daily, and ultimately manage their businesses better.
Predictive analytics has become a core element behind making this possible. And while machine learning algorithms have captured the spotlight recently, there’s an equally important element to running predictive analytics – particularly when both time-to-result and data insight are critical: high performance computing. “Data intensive computing,” or the convergence of HPC, big data and analytics, is crucial when businesses must store, model and analyze enormous, complex datasets very quickly in a highly scalable environment.
Firms across a number of industry verticals, including financial services, manufacturing, weather forecasting, life sciences & pharmaceuticals, cyber-reconnaissance, energy exploration and more, all use data intensive computing to enable research and discovery breakthroughs, and to answer questions that are not practical to answer in any other way.
There are a number of reasons why these organizations turn to data intensive computing, three of which are outlined below:
In manufacturing, the convergence of big data and HPC is having a particularly remarkable impact. Auto manufacturers, for example, use data intensive computing on both the consumer side and the Formula 1 side. On the consumer end, the auto industry now routinely captures data from customer feedback and physical tests, enabling manufacturers to improve product quality and driver experience. Every change to a vehicle’s design impacts its performance; moving a door bolt even a few centimeters can drastically change crash test results and driver safety. Slightly re-curving a vehicle’s hood can alter wind flow which impacts gas mileage, interior acoustics and more.
In Formula 1 racing, wind flow is complicated by the interplay of wind turbulence between vehicles. During a race, overtaking is difficult by nature: drivers are trying to pass on a twisting track in close proximity to one another, where wind turbulence becomes highly unpredictable. To understand the aerodynamics between cars travelling at over 100 miles per hour on a winding track, engineering firms have turned to data intensive computing to model and visualise these interactions.
Simulation and data analysis enables auto manufacturers to make changes far more quickly than when running physical tests alone, as they try to address new challenges by altering a car’s material components and design layout. On the consumer side, this leads to the development of more fuel-efficient and safer vehicles. On the Formula 1 side, modeling is key to producing safer and faster supercars.
The promise of data intensive computing is that it brings together the newest data analytics technologies with traditional supercomputing, where scalability is king. This marriage of technologies empowers the development of platforms to solve the most complex problems in the world.
Developed for supercomputing, globally addressable memory and low latency network technologies bring the ability to achieve new levels of scalability to analytics. Achieving application scalability can only be done if the networking and memory features of the systems are large, efficient and scalable.
Notably, two of the cloud’s chief virtues are feature richness and flexibility. To maximize these virtues, the cloud sacrifices user architectural control and consequently fails to meet the challenge of applications that require scale and complexity. Companies across all verticals need to find the right balance between the flexibility of the cloud and the power of scalable systems. Finding the proper balance results in the best ROI and ultimately separates the leaders in a highly competitive business landscape.
Just as the cloud is a delivery mechanism for generic computing, data intensive, scalable system results can now be delivered without necessarily purchasing a supercomputer. Deloitte Advisory Cyber Risk Services – a breakthrough threat analytics service – takes a different approach to HPC and analytics. Deloitte is using the high performance technologies Spark, Hadoop and the Cray Graph Engine, all powered by the Urika-GX analytics engine, to provide insights into how an organization’s IT infrastructure and data look to an outside aggressor. Most importantly, this service is available through a subscription-based model as well as through system acquisition.
Deloitte’s platform combines supercomputing technologies with a software framework for analytics. It is designed to help companies discover, understand and take action against cyber attackers, and the US Department of Defense currently uses it to glean actionable insights on potential threat vectors.
In the end, the choice to implement a data intensive computing solution comes down to the amount of data an organization has, and how quickly analysis is required. For those tackling the world’s most complicated problems, gaining unknown insights into data provides a distinct competitive advantage. Fast-moving datasets help spur innovation, inform strategy decisions, enhance customer relationships, inspire new products and more.
So if an organization is struggling to maintain its framework productivity, data intensive computing may well provide the fastest, most cost-effective solution.
There is no denying that in 2017 Artificial Intelligence (AI) is the technology industry’s big buzzword. With IDC predicting that the market will grow to be worth $5.05 billion by 2020 and Gartner believing that 85 percent of customer interactions will be handled without a human by 2020, analysts too are fueling anticipation for this technology to become more mainstream.
By Steve Garrity, founder, Hearsay Systems.
Not limited to the technology world, AI seems to be making waves across the board. In healthcare, for example, recent research from CBI found that the number of healthcare-focused AI startups went from fewer than 20 in 2012 to nearly 70 in 2016, a high growth trajectory that is set to continue. The automotive industry is seeing similar take-up of AI, notably Uber acquiring the AI start-up Geometric Intelligence and Ford investing $1bn over the next five years in the AI company Argo AI. Uber intends to use the technology to better predict supply and demand of cars, as well as work towards the roll-out of a self-driving car fleet, while Ford is also working on its autonomous vehicle.
For the financial services industry, artificial intelligence is likely to take a slightly different form. Robo-advisers are already being used to manage investment portfolios, and chatbots are appearing in the back office. The financial industry is particularly suited to taking advantage of AI because of the amount of data it accumulates and has to analyse. From econometrics, peer analyses and foreign and domestic tax rates to bank fees, interest rates and cash flow, all will have an impact on an investment decision and as such need to be correlated and analysed collectively.
As is clear from the different types of AI developments, this technology is very industry-specific and there is far from a “one size fits all” recipe for success. Enterprises need to tailor their data analytics to the specific industry they’re in and base AI services on the data they can harness, analyse and use to meet real market demand.
There are a number of reasons why this shift to vertical-specific AI services will dominate 2017 technology development:
Lack of in-house data experts – Current employees who do have deep industry knowledge often don’t have the analytical skills required to turn data into actionable insights; a 2015 MIT Sloan Management Review study showed that four in 10 companies report the lack of analytical skills as a critical challenge, but only one in five have done anything about it. This is worsened in the UK, where the technology sector already faces a considerable skills gap; indeed, a report into the UK digital skills crisis highlighted that 93 percent of technology companies believed the digital skills gap affects their commercial operations and talent acquisition (research by TechUK).
With AI becoming mainstream in both the consumer and enterprise worlds, companies that continue to do nothing will risk irretrievably falling behind. Moreover, any outside analytical support will have to find ways to apply their general data models to specific industry needs. Accenture’s partnership with Amazon Web Services is one recent example of an attempt to help clients marry industry expertise with robust data capabilities.
Vertical-specific cloud and AI tools can more quickly provide customised applications – For example, an AI system that swims in a data pool containing only credit card transactions will not only become expert at detecting fraud, but will also be able to translate that metadata into proactive suggestions (a toy illustration follows this list). If your metadata shows you’re a frequent traveller, your bank will not only know not to deny you a coffee after your flight to Hong Kong, it might also prompt you to switch to a credit card with more frequent flyer rewards points. Critically, data can inform an AI tool to do more than just one thing, making it even more valuable in everyday life, as users utilise the technology for convenience.
Companies are getting smarter about their technology investments – Technology vendors will no longer be able to make general appeals to the enterprise or try to woo consumers with aesthetics and style. The software solutions that succeed in 2017 must be able to map to specific customer and business paths, and that varies widely by industry. Research firm IDC projects worldwide revenues for information technology products and services to grow to $2.7 trillion in 2020, and a large proportion of that momentum will come from third-party platforms that aid companies in verticals such as financial services and manufacturing.
Complex sales cycles mean engineers need to be more than engineers – Selling a highly integrated SaaS service means sales cycles are longer and engineers need to get involved right away, engaging directly with customers and prospects. Anyone who acts solely as a liaison between the two groups should be concerned about job security, even if they have great “people skills.” The key here will be to specialise as much as possible and train your team to understand the intricacies of a product, as well as how to work together with engineers to add value.
Big companies are shedding bloatware – Big companies will eschew bloatware for applications that provide essential industry-centric applications – and nothing else. For too long investment has been ploughed into the next new shiny innovation, which means 2017 will see a much more strategic approach to technology procurement where value to the bottom line is a focus above all else; new solutions will therefore have to be a “perfect fit.” IDC predicts worldwide spending on cloud applications will increase from $70 billion in 2015 to more than $141 billion in 2019, with the vast majority of growth in industry-specific applications.
A new star company will emerge in 2017 that harnesses the power of vertical expertise – The darling of 2016, Slack, aggressively incorporated third-party apps to help its users build their channels into something much more than email. As Aaref Hilaly of Sequoia Capital argues, services like Slack that focus on integration and automation with existing systems will be used more often. With daily services capturing vertically relevant data, it is only a matter of time before an AI company lands an exceptionally impressive win within its target market.
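To make the credit card example above concrete, the toy sketch below scores a new transaction against a customer's own spending history. Real fraud models draw on far richer features; the point is simply how vertical-specific data yields a vertical-specific signal. All figures are invented.

```python
# Toy fraud signal: standard-score a new amount against past transactions.
from statistics import mean, stdev

def fraud_score(history, amount):
    """How many standard deviations this amount sits from past spending."""
    if len(history) < 2:
        return 0.0  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else (amount - mu) / sigma

past = [12.40, 8.99, 15.00, 11.25, 9.80]   # invented sample history
for amount in (10.50, 450.00):
    flag = "FLAG" if fraud_score(past, amount) > 3 else "ok"
    print(amount, round(fraud_score(past, amount), 1), flag)
```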
Ultimately, an AI-empowered business will be a stronger competitor in the market. When also tailored to a specific industry, it will be able to access analytics that informs a truly “above and beyond” strategy. Disruptors such as Airbnb and Uber have already proved that it is no longer only about offering a good product or service. In fact, it can often be the same service or product, delivered in a much better way, thanks to the use of data analytics. The consumer world is already benefiting from data that has been collected and harnessed over years; it’s now time for the business world to take that one step further. Given the precedent set by this new generation of data-focused companies, it would be foolish to underestimate the potential market shift that is on the horizon.
87% of people believe digital will disrupt their industry in some way – but it’s not other businesses that will bring this disruption, it’s the employees themselves.
By Mike Guggemos, Chief Information Officer, Insight.
The latest technology used to be found only in the workplace, but digitally savvy consumers now own it themselves and are adapting to the digital tide faster than organisations. A decade of smart technology has seen the integration of mobiles, laptops and tablets into mainstream society, leading to the emergence of a hyper-connected workforce; yet many organisations have not adapted to this shift.
I take a look at how mobile is set to transform the way we work in the next decade, and how businesses can prepare for a digital tomorrow.
The ‘always on’ employee lives beyond their desk; they’re accessing emails and documents on the go, carrying everything they need on their mobile, tablets or laptops and using digital tools to manage workflow, productivity and decision making.
At Insight, we expect the next few years will see the emergence of mobile as our primary work device. For many this is already happening, but the remainder will likely follow suit when mobile applications that can pair with multiple screens emerge. Businesses need to think now about how they should adapt their business models to reflect these new savings, efficiencies and levels of connectivity.
Technology such as voice enablement should be utilised to create efficiencies within the office environment. Consumers are already using voice-activated devices such as Alexa in the home, but a virtual office assistant could be the start of a big change in the way we work. With voice commands driven by voice recognition, employees could power up projectors and even locate documents. Though still in its early stages, businesses should begin to trial how they could incorporate voice technology into the workplace, as voice was and remains this generation’s killer application.
Innovations in infrastructure and travel are set to bring whole new meanings to our understanding of globalisation. However, digital advancements are already making consumers – of all ages – ‘global’ in new ways, and none more so than video.
Once again a tool that started in the consumer sector, following the introduction of FaceTime and Skype, video-calling is now mainstream. With the power to create face-to-face meetings across diverse locations, businesses that can reduce their reliance on email and integrate video-calling into how they communicate with colleagues, clients and prospects will enhance the relationships that have the most impact on their bottom line. At this stage, the secret is to invest in mobile devices that have video calling enabled, but also to build out the network capabilities needed to handle this additional flow of data.
One of the perks of a hyper-connected workplace is the freedom to access the system remotely. However, this comes with security risks. For me, one of the simplest ways to utilise mobile devices and their applications is as a security authenticator. With the right developments, you could essentially have a thumbprint reader, iris scanner and voice recognition in your pocket, all on one device. With this practicality, the mobile device essentially becomes a workplace passport, enabling the identification of employees as they access company environments, wherever they may be.
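A simple, widely deployed form of this idea is the time-based one-time password (TOTP, RFC 6238) that underpins most phone authenticator apps. The sketch below shows the core of the scheme; the secret is a throwaway example value.

```python
# Core of RFC 6238 TOTP: HMAC over the current 30-second time step,
# dynamically truncated to a short numeric code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval          # time step since epoch
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # same secret on phone and server -> same code
```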
As we continue to use advanced technology in our homes and shops, it’s becoming increasingly apparent that the majority of work environments are missing out on these benefits. The growth of mobile adoption in the workplace through BYOD will continually encourage businesses to invest in how they can effectively capitalise on the trend, so it’s time to start planning for that digital tomorrow, now.
Is your security evolving alongside the IoT threat?
Asks John Ferron, Chairman of Ivanti – Powered by HEAT Software.
The paranoia surrounding IoT heightened with last year’s major DDoS attack on Dyn, and today barely a week goes by without a warning that any connected device, from the office fridge to university vending machines, can be weaponised. Whilst this may initially appear to be yet another fear-mongering tactic used by tabloids to sell more newspapers, this very futuristic form of cybercrime is not something that should be taken lightly. For example, in early February 2017 a DDoS attack flooded the Austrian Parliament’s website, shutting it down for almost half an hour – luckily no data was lost. Even more worryingly, hackers were able to disable 123 of 187 security cameras in Washington DC in the week of Trump’s inauguration. With feelings running so high on many different sides of the political spectrum, the lack of video surveillance could potentially have put people’s lives in grave danger. It’s safe to say that whilst the rise in IoT or connected devices has enabled leaps in personal and professional productivity, their vulnerabilities mean that cybercriminals are now able to attack not only data, but mission-critical services, which could put human life at stake.
As you are most probably aware, malware is a collective term for all malicious software that can infect connected devices – this term includes a mass of cyber-weapons that plague the internet, from viruses, to Trojan horses, to modern day ransomware.
The first ever malware weapon was aptly titled “Creeper” and lived out its life in 1971 infecting computers on ARPANET – it is believed to still be inspiring black hat scoundrels to this day. One of the more famous malware executions was the 2000 ILOVEYOU computer worm. Exploiting both computer vulnerabilities and base human emotion, the ILOVEYOU worm attacked tens of millions of Windows personal computers with a simple email message that had “I Love You” innocently written in the subject line. The “LOVE-LETTER-FOR-YOU.txt.vbs” attachment proved too much for the enamoured masses to resist, costing businesses over US$15 billion.
Late 2016 and early 2017 have seen these deadly but simplistic cyber-weapons evolve into more sophisticated forms of malware – for example the new Satan strain, which encrypts victims’ files with RSA-2048 and AES-256 encryption – as well as the growing industrialisation of malware. These new weapons are able both to steal data from IoT devices, such as tablets and mobile phones that may be connected to a business network, and to hold connected devices to ransom.
DDoS, or Distributed Denial of Service, attacks have also started getting a lot of public attention. DDoS attacks see cybercriminals attacking a single device or network by flooding it with data from multiple devices. The IoT has made DDoS attacks that much more powerful as cybercriminals are able to hack into connected devices, such as webcams, and use them as foot soldiers within a mass botnet army. This makes life only too easy for cybercriminals with a bit of IT expertise, as the constantly evolving IoT web is fraught with vulnerabilities. Manufacturers are so obsessed with keeping up with consumers’ demand for more and more connected devices that unfortunately security drops to the bottom of the priority list.
Cybercrime has therefore become a lucrative business model, with the risk of getting caught extremely low and more and more businesses prepared to pay ransoms. However, the real issues arise when hackers start to attack devices – from smart lights in a connected city to IoT devices within a hospital – that could put people’s lives in danger. This illustrates that the security of the IoT is now not just an IT problem, but one that should be recognised by governments and all individuals who own a connected device.
As the above examples make painfully clear, it’s not just PCs that IT service and support staff must worry about, even though these have been the core focus of business IT security in the past. From IP security cameras and door readers through to mobiles, tablets and wearable technologies, there is a proliferation of devices capable of introducing new threats to the corporate network. The IoT is affecting business strategy, risk management and network control, and managing it demands an extensive range of technologies and skills that many organisations have yet to master. Added to this, many IoT devices manufactured in China are built with profit, rather than privacy and security, in mind, which provides easy pickings for hackers.
One of the thorniest security issues surrounding the consumerisation of IT, or the proliferation of IoT devices, therefore concerns device management: how do you successfully manage and protect your organisation against such a diverse and insecure set of devices?
No-one is suggesting that IT leaders or even governments try to ban consumer IoT devices outright – such a retrograde step would be neither practical nor successful. Smart watches, head-mounted displays and other wearables can significantly enhance productivity and staff well-being, and even help with employee retention. Try to block usage, and it will only go underground, creating an even more dangerous shadow IT risk. Business is booming in the global IoT market, with Research Nester claiming that it will reach a mind-blowing US$724.2 billion by 2023, so attempting to stop its growth would be a fruitless exercise. Rather, concerned parties should be raising awareness of the damage that vulnerabilities within the IoT can cause (if the media isn’t doing a good enough job of that already!) and educating manufacturers and IT departments on how to ensure that their devices are secured.
A big step for IT leaders is to put more faith in the IT service desk. Operating on the frontline of IT, this function is in a great position to coordinate endpoint visibility and control efforts – but, crucially, only if it has the right, unified set of automated tools at its disposal. These should be able to discover all devices connecting to the network and then enforce a stringent company- or government-agreed policy – for example, controlling access to corporate resources and automatically enforcing role-based configurations.
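In practice, that discovery-and-enforcement loop can start very simply. The sketch below – with an invented device inventory and invented role policies, purely for illustration – flags unknown devices and maps known ones to role-based access:

```python
# Sketch: flag devices on the network that aren't in the approved
# inventory, and map known ones to role-based policies. All data here
# is illustrative.

APPROVED = {
    "3c:5a:b4:01:02:03": {"owner": "finance-laptop-07", "role": "staff"},
    "b8:27:eb:aa:bb:cc": {"owner": "lobby-camera-02", "role": "iot"},
}

ROLE_POLICY = {
    "staff": {"corporate_resources": True,  "patch_window": "nightly"},
    "iot":   {"corporate_resources": False, "patch_window": "weekly"},
}

def review(observed_macs):
    for mac in observed_macs:
        device = APPROVED.get(mac.lower())
        if device is None:
            print(f"ALERT: unknown device {mac} - quarantine pending review")
        else:
            policy = ROLE_POLICY[device["role"]]
            print(f"{device['owner']}: corporate access = "
                  f"{policy['corporate_resources']}")

# In a real deployment the MAC list would come from DHCP leases or
# switch tables rather than being hard-coded.
review(["3C:5A:B4:01:02:03", "de:ad:be:ef:00:01"])
```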
Effective patch management is another vital element, which can eliminate most known threats, and here it’s important to find a provider that can support a wide range of device and system types. Combine that with application whitelisting to combat zero-day threats, and encryption to ensure data is kept safe no matter where it ends up. The aim is to create a truly layered defence strategy that will prove incredibly difficult for cybercriminals to penetrate. Contrary to popular belief, many cybercriminals are far less skilled than first assumed, and are only able to wreak so much havoc because the vulnerabilities within networks and IoT devices are blindingly obvious and very easy to attack. Traditional tools like AV and firewalls are, of course, still important, but mainly as a last resort when all else has failed.
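Application whitelisting is worth unpacking, because it inverts the traditional antivirus model: rather than blocking known-bad software, it only runs known-good software. A minimal sketch follows; the digest shown is a placeholder, not a real approved binary:

```python
# Sketch of hash-based application whitelisting: an executable may run
# only if its SHA-256 digest appears on an approved list.
import hashlib
from pathlib import Path

# Digests of approved binaries (the value below is a placeholder).
WHITELIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def may_execute(path: str) -> bool:
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return digest in WHITELIST

# A zero-day dropper would produce an unknown digest and be refused,
# even though no antivirus signature for it exists yet.
```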
Finally, make sure that you embrace automation. Whilst it is a controversial term that to many means a loss of jobs, its critical purpose here is to help stretched IT teams stay on top of threats as the proliferation of smart devices causes an explosion in endpoints. IT leaders would also do well to encourage real-time collaboration between service and security teams. Your IT service desk is in the perfect place to spot and link incidents which may indicate a wider attack on the organisation. That intelligence needs to be escalated straight to security teams, just as they should pass on details of unpatched vulnerabilities that need remediating. Just as the growth of connected devices is allowing emergency services to work together to make cities safer, IT and security teams need to unite to defend their devices and services against the threats posed by the developing IoT.
DW asked the ICT vendor community for their views on how IoT will (or will not) impact on the enterprise over the coming months and years. Here we publish their thoughts and comments. We’ve also been running a series of IoT blogs on the Digitalisation World website, which are well worth checking out at: https://digitalisationworld.com/blogsall
An Accenture report in 2014 revealed that 63% of insurance executives across nine countries thought that wearable technologies would be widely adopted by the insurance industry within the next two years.
In 2017, it would appear that we’ve exceeded those expectations. We have already seen this with companies like Vitality Health, which was one of the first insurance companies to offer policies based on data gathered directly from policyholders via wearables. This first-hand data coupled with existing data sources enables insurers to make smarter decisions that are tailored to the individual and give the consumer more control over their premiums.
There are plenty more potential uses, such as Google Glass being used as an independent witness in property damage claims and health wearables being used to show the impact of a crash injury on a personal trainer’s ability to work or provide proof of the impact in a personal injury claim.
The automotive industry is embracing IoT in interesting ways. Easycar Club, a peer-to-peer car sharing service, is a great business example of how success can stem from interconnectivity between people and devices.
Amazon’s partnership with Ford, making Amazon the first tech provider in-car, is another fantastic opportunity – this time to make voice search and voice control part of the connected-car conversation, which could well be the best accelerator for both the technology and IoT.
Connected devices are transforming the world of insurance as we know it. With the evolution of machine learning and artificial intelligence, we’re moving away from a world focused on IoT-enabled products to a world filled with IoT-enabled cities, households and individual lives. In this new age, the insurance-as-a-service model that telematics first put on our radar will surely reign supreme, giving the customers the value they seek from us.
Increasingly connected homes and devices will mean the insurance sector - quite rightly - moves away from a focus on restitution towards a focus on prevention. This in turn will lead to customers having more control over not only their policy but also the quality of their life – effectively removing the need for a policy pay-out.
The Internet of Things refers to the interconnectedness of devices via the internet so they can send and receive data. Many devices are now connected to the internet – not just the obvious ones, like mobiles, tablets, laptops and PCs, but also appliances like toasters and music systems! Hive, which lets you control your central heating from your mobile device, is a good example.
The collection of all this data being sent and received enables us to understand human behaviour and patterns better, meaning we can understand consumers and clients. We can provide a service catered to their needs, based on anticipating and solving issues before they happen. It also means you can target your advertising at the right audience and increase your sales. The key phrase from Microsoft is ‘digital transformation’, as businesses of all sizes can use data to discover, streamline and automate their processes.
However, the collection of data does bring with it some security concerns and data protection issues. Should you be allowed to harvest this data and use it to target people? Is this ethical? What data should be collected, if any? All these devices and systems collect a lot of personal data about people and their devices – where they go, what they do and other potentially incriminating information. If hacked, this data could reveal the hours you are typically away from home, for example, or your usual haunts. So the IoT has a lot of potential, but also security concerns.
The other security concern is the link between the devices and the datacentres which store the data they collect. Can this easily be breached? Devices share information and communicate with each other, and the data they exchange is stored. But who should be responsible for the security of devices and datacentres? Should the user who invests in the device also invest in its security and maintenance?
It seems we have a lot more questions than answers when it comes to the IoT. The Internet of Things is presenting opportunities for businesses to grow their revenue and improve their services. However, it also comes with challenges to businesses, including unique security challenges such as monitoring and maintaining physical devices.
Machine learning isn’t new, but it is certainly gaining momentum. The time has come for enterprises across the board to be aware of its application and potential to transform the way they work in this modern age.
In basic terms, machine learning is the science of getting computers to perform a task without being explicitly programmed, using algorithms that allow the machine to ‘learn’ from the data. Machine learning represents a major step on the journey towards human-level artificial intelligence (AI), and it is an area of technological development that is attracting a lot of interest and investment.
Machine learning is already being applied more often than many realise, from Google’s self-driving cars to online recommendation offers. It is also used in fraud detection, while wearable technology companies are exploring its potential to save the NHS money by creating devices that can diagnose patients faster, without the need for a visit to the surgery.
While not all enterprises will be creating products that benefit from machine learning applications, there is serious potential for all to benefit from machine learning systems that help businesses defend themselves from what is one of the growing areas of threat: cyber attack.
No business is small enough to be immune from the threat of cyber attack, and the cost can be crippling in terms of lost data, customer trust and money. As cyber criminals become increasingly confident and their attacks increasingly complex, machine learning offers one of the only viable routes to improving defences. Manual cyber security is no longer sufficient, due to the vast amounts of data and the vast number of sources – the latter a consequence of the rise of the Internet of Things (IoT) – and the fact that humans are often slower and less precise than computers.
Machine-learning systems can offer the personalised approach that is necessary in this era of sophisticated cyber attack. Enterprise networks aren’t neat, structured environments where a simple security policy is enough to deliver protection, but dynamic places where patterns and threats emerge and evolve on an ongoing basis. Machine learning DNS monitoring tools, such as Nominet’s turing 1.3, learn from users and patterns to react and refine over time, delivering personalised insight that can ensure organisations are doing all they can to protect themselves from cyber attack. It’s time to let the algorithms do the hard work of keeping your business safe!
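To make the idea concrete, here is a generic sketch of ML-based DNS monitoring – emphatically not Nominet’s actual algorithm – in which a model learns each host’s normal query behaviour and flags outliers. All figures are invented:

```python
# Generic sketch of ML-based DNS monitoring: learn normal per-host
# query behaviour from historical data, then flag outliers.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Features per host: [queries/minute, unique domains, avg name length].
# Synthetic "normal" traffic for 500 hosts, invented for illustration.
normal = rng.normal(loc=[60, 25, 14], scale=[10, 5, 2], size=(500, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A host suddenly issuing huge numbers of long, unique names (a pattern
# associated with DNS tunnelling) scores as anomalous.
suspect = np.array([[900, 850, 45]])
print(model.predict(suspect))  # -1 => anomaly, 1 => normal
```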
MSP continues its ‘journey’ with Kaseya VSA; benefits, including increased efficiency, continue to grow.
Remote monitoring and management (RMM) is what makes today’s managed services businesses viable and effective. Consequently, choosing the right solution is pivotal to a managed services provider’s (MSP’s) success.
This is especially the case when the company in question is, like P2 Technologies, renowned for its customer service and its honest, impartial advice. To instil confidence in its customers, it must be sure of its own decisions and software choices before counselling others.
UK-based P2 Technologies provides complete managed IT support to around 40 professional businesses across the country, acting as an entire IT department or complementing a customer’s in-house IT team. Around four years ago, the business was expanding, but its legacy RMM software wasn’t capable of growing with it.
Says Martin Page, P2 director: “It wasn’t scaling and we couldn’t really do all the advanced things with it that we needed, such as patching. Also, the support from the vendor was poor.” He recognised that having a “single pane of glass” – or a management display that integrated all parts of a client’s IT infrastructure with help desk and RMM all in one place – would help the team provide a faster and more accurate response to customers.
Although business was growing, the company at that time was still small with only five staff. So P2 faced a difficult decision. Should they be bold, take a risk and make a large investment in their RMM solution, even though they couldn’t be certain their expansion would continue? Or play it safe finance-wise, but end up with a solution that might only be fit for purpose for a short time?
“What we liked about Kaseya VSA was that it was being used by much larger MSPs than us, so we could be assured it would scale up successfully as required,” says Page. Kaseya VSA is a fully-integrated IT management platform, which automates all IT from a single dashboard – the ‘single pane of glass’ high on P2’s wish-list.
Included in the solution is a service desk management application providing a flexible, workflow-driven approach to service management, including incident remediation and escalation policies. System monitoring provides instant notification of problems or changes such as low disk space, processor spikes and memory issues.
“It was around three times the cost of our old system, but we believed we could get five times the value from it. We could also see that it was a huge solution and we knew we would build up our use of it stage by stage,” Page explains.
“Obviously investing in the Kaseya solution was a giant leap for us, but we knew it would give us far more capability and as a result more to offer our clients as we grew – it was a clear choice to select VSA.”
That was four years ago – and P2’s entrepreneurial spirit has obviously paid off. Its service team can now view and assess the state of each client’s infrastructure in “one real quick way,” according to Page. “We can remote control it, patch it, apply procedures to it very easily from here, without having to interrupt the customer.
“It provides many productivity benefits for ourselves and our customers too. Four years ago, we were processing around 800 service desk tickets per month. Now, the customer base and service staff have doubled, and we are still processing the same number of tickets. We have spent a lot of time automating procedures for common tasks and using service desk data to find repeat issues and eradicate them, improving efficiency and customer service simultaneously,” says Page.
Additional benefits include P2’s ability to deliver more proactive service; for example, the system will monitor if a power supply is going down on a server and almost instantly the service team can begin to remedy the situation. “Nothing gets missed these days,” says Page. “This makes it so much easier to achieve and maintain our service level agreements (SLAs).”
Although P2 isn’t a 24/7 business, it can now act as though it is, with VSA monitoring customer infrastructures around the clock. Its service team can also schedule routine maintenance, such as an out-of-hours reboot, to minimise disruption to a customer’s business.
Customers also rely on P2’s advanced reporting. “If we are in a service review with a customer and we want 30 days’ worth of data, instead of examining a hard disk on a server, we can quickly pull that information out of the Kaseya system to give the customer an overview. Previously this would have taken so much effort; now it’s instant and easy.”
Just as P2’s use of Kaseya VSA is growing, so is the software itself. “We started on version 6.3 and we’re now on 9.3. We’ve gone through every upgrade without a single problem. And the good thing is that the product evolves with every iteration and we gain extra benefits. For example, version 9.0 gave us the world’s fastest remote control. This speed is a real advantage that allows us to effectively and efficiently service our customers,” says Page.
P2 has always seen its relationship with Kaseya as long-term – and is increasingly engaging with other users to “understand their challenges and where the software could take us.”
Page explains that for P2 the Kaseya software is complementary to their service. “We don’t use it as a sales tool. But increasingly there are saleable components included - and as Kaseya continues to go down this route this could be very advantageous to us and something we may well explore more fully in the future.”
It’s clear that Page believes the software has enormous further potential for P2.
DW asked the ICT vendor community for their views on how IoT will (or will not) impact on the enterprise over the coming months and years. Here we publish their thoughts and comments. We’ve also been running a series of IoT blogs on the Digitalisation World website, which are well worth checking out at: https://digitalisationworld.com/blogsall
This year will see an ever-growing urge across the tech industry to leverage the Internet of Things (IoT) for anything from automating the collection of data to programming manual actions in the physical world. However, there is a legitimate and growing concern that these devices aren’t inherently protected and continue to pose huge security risks. The most striking examples were the DDoS attacks launched through IoT devices such as cameras and DVRs against the Post Office, TalkTalk and Dyn, knocking thousands of users offline. Just recently, we saw another attack on a US university, which suffered when an IoT malware infection hit its vulnerable vending machines. The possibilities for infiltration are only broadening with insecure IoT.
Existing and incoming regulations complicate the issue even further. The looming General Data Protection Regulation (GDPR) will change the way all organisations handle personal data, and is something many businesses aren’t prepared for. Companies will need to walk the tightrope between collecting data to drive business objectives and gathering it securely enough to avoid huge fines. It will pay for companies to keep up to date on changing regulation and security developments in the IoT landscape. This way, businesses can use IoT to their advantage, not their downfall.
One of the most important elements to consider is ensuring that connected devices only navigate trusted networks. It’s worth carving out a separate network for IoT, to ensure connected devices can only reach the resources intended for them. This way, in the event of an attack, the damage is minimised as far as possible. Enterprises may want to digitise existing Machine-to-Machine (M2M) or Operational Technology (OT) deployments to further extend their ROI, but they should do so with caution – the teams that deployed these legacy systems may not have wide-ranging knowledge of information security, and threats have evolved since those systems were deployed years ago. Look to systems that can help establish trust at the boundary between legacy and IT networks, and that ensure core security requirements are met.
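Segmentation is only useful if it is verified. A minimal audit sketch along the lines described above – the subnets, flows and the “controllers only” rule are all invented for illustration – checks that devices on the IoT segment never reach anything outside their allowed destinations:

```python
# Sketch: verify that devices on a dedicated IoT segment only talk to
# the resources intended for them. Subnets and flows are illustrative.
from ipaddress import ip_address, ip_network

IOT_SEGMENT = ip_network("10.20.0.0/16")               # IoT network
ALLOWED_DESTINATIONS = [ip_network("10.20.255.0/24")]  # controllers only

def audit(flows):
    for src, dst in flows:
        if ip_address(src) in IOT_SEGMENT and not any(
                ip_address(dst) in net for net in ALLOWED_DESTINATIONS):
            print(f"VIOLATION: IoT device {src} reached {dst}")

# e.g. a camera contacting a server outside its segment is flagged:
audit([("10.20.3.7", "10.20.255.10"),   # permitted
       ("10.20.3.7", "10.0.5.20")])     # violation
```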
Some companies are just a few steps away from implementing IoT devices on a significant scale but the pace of the growth in this technology means it’s essential to stay ahead of the curve. Factoring security in from the beginning empowers companies to benefit from the business value that IoT brings without fearing the risks.
In the near term, IoT devices can revolutionise simple tasks such as note-taking, wireframing and almost anything we currently do by hand. It won’t be long before it is the norm to use digital notepads and to upload and share work with ease. These tools and apps already exist, but we’d expect them to become increasingly ubiquitous.
For those working in larger spaces, IoT technology can allow employees to connect and communicate face-to-face where previously they couldn’t. Take Lego’s HQ, for example, where connected devices allow employees to see where their colleagues are and how open they are to interruption. It only gives people an idea of the area (as opposed to an invasive tracking system), and people can turn it off for solo work. However, it can facilitate face-to-face communication, and ultimately collaboration, which can only be a good thing.
IoT technology perhaps has the most potential for impact on those working outside the office. Cloud storage and sharing tools have revolutionised collaboration with remote workers, along with tools such as video conferencing – although none can truly replace the immediacy of being in the office. At the top end of the IoT-workplace relationship, however, come projects such as the iRobot Ava 500, which combines video conferencing tech with a physical, moving presence in the workplace.
For coworking spaces
Of course this may lead to the dissolution of the office as we know it, with the workforce spread over numerous locations. Indeed this is already happening, with businesses, big and small, increasingly using co-working spaces as a base for remote teams - away from regional and/or national HQs.
Most excitingly, within these, there are opportunities for cross-company collaboration - something we have always advocated. It’s not impossible to dream of a scenario in which people can put tasks into a database and their fellow co-workers have alerts appear on a device, through which they can sign up to provide the service. It’s a bold vision, but an exciting one.
Gartner predicts that 8.4 billion connected “things” will be in use in 2017, up 31% from 2016. Following the hype around smart homes and smart cars, the next stop for the IoT power train is the smart office. An intelligent, connected office goes beyond automating simple admin; it offers a means to make significant changes to fundamental business processes and applications, quickly and continuously, throughout the day. The revolution of IoT in the workplace lies in the technology’s potential to change the way people work and interact.
Machine-to-machine (M2M) technology is changing productivity patterns. IoT is enabling a new era of virtual assistants, whereby algorithms, voice recognition technology and cognitive apps will be available across a range of interfaces, using data mining, pattern recognition and natural language to mimic the way the human brain works. Cognitive application development means software is written to be responsive to commands – able to recognise and respond to them before they are even inputted.
One of the key areas where IoT will drive improvement within businesses is supply chain management. Having full insight into every stage of the process globally, and being able to respond instantly to the smallest variable, gives those who invest in this infrastructure an advantage over competitors. Beyond supply chains, any business with multiple touch points for partners and customers will benefit from 24/7 communication between all of this data. Apply cognitive applications and advanced analytics to all this insight, and you have the means to spot trends that would be near impossible for the human eye alone.
The cost of this technology can be considerable, so organisations need to ensure that they select and implement the solution that best fits their needs. IoT is all about building relationships between devices, systems and people, and having the right ecosystem in place is the first “smart” step. The beauty of cognitive applications is that, once in place, they will begin to evolve and improve themselves, resulting in improved efficiency, agility and cost savings.
According to an Accenture study, artificial intelligence (AI) could add a further US$814 billion to the UK’s economy by 2035, lifting the annual growth rate from 2.5 percent to 3.9 percent.
By Talend CTO, Laurent Bride.
We have seen a machine master the complex game of Go, previously thought to be one of the most difficult challenges for artificial processing. We have witnessed vehicles operating autonomously, including a caravan of trucks crossing Europe with only a single operator to monitor systems. We have seen a proliferation of robotic counterparts and automated means of accomplishing a variety of tasks. All of this has given rise to a flurry of people claiming that the AI revolution is already upon us.
However, while there is no doubt that there have been significant advancements in the field of AI, what we have seen is only a start on the path to what could be considered full AI.
Understanding the growth in the functional and technological capability of AI is crucial for understanding the real-world advances we have seen. Full AI – that is to say, complete, autonomous sentience – involves the ability for a machine to mimic a human to the point that it would be indistinguishable from one (the so-called Turing test). This type of true AI remains a long way from reality. Some would say the major constraint on the future development of AI is no longer our ability to develop the necessary algorithms but, rather, having the computing power to process the volume of data necessary to teach a machine to interpret complicated things like emotional responses. While it may be some time yet before we reach full AI, there will be many more practical applications of basic AI in the near term that hold the potential to significantly enhance our lives.
With basic AI, the processing system, embedded within the appliance (local) or connected to a network (cloud), learns and interprets responses based on “experience.” That experience comes in the form of training, using data sets that simulate the situations we want the system to learn from. This is the confluence of machine learning (ML) and AI. The capability to teach machines to interpret data is the key underpinning technology that will enable more complex forms of AI that can be autonomous in their responses to input. It is this type of AI that is getting the most attention. In the next ten years, the use of this kind of ML-based AI will likely fall into two categories:
There is no doubt about the commercial prospects for autonomous robotic systems for applications like online sales conversion, customer satisfaction, and operational efficiency. We see this application already being advanced to the point that it will become commercially viable, which is the first step to it becoming practical and widespread. Simply put, if revenue can be made from it, it will become self-sustaining and thus continue to grow. The Amazon Echo, a personal assistant, has succeeded as a solidly commercial application of autonomous technology in the United States.
Autonomous vehicle technology is one of the most publicised and one of the most needed applications of AI. There were 1,730 reported road deaths in Great Britain in 2015 and a further 22,144 serious injuries. Autonomous vehicle technology has the potential to significantly reduce this figure and greatly improve availability and efficiency of transportation for everyone.
In addition to the automation of transportation and logistics, a wide variety of additional technologies that utilise autonomous processing techniques are being built. Currently, the artificial assistant or “chatbot” concept is one of the most popular. By creating the illusion of a fully sentient remote participant, it makes interaction with technology more approachable. There have been obvious failings of this technology (the unfiltered Microsoft chatbot, “Tay,” being a prime example), but the application of properly developed and managed artificial systems for interaction is an important step along the route to full AI. It is also a hugely important application of AI because it will bring technology to those who previously could not fully engage with it for any number of physical or mental reasons. By making technology simpler and more human to interact with, you remove some of the barriers that cause difficulty for people with various impairments.
The use of AI for development and discovery is just now beginning to gain traction, but over the next decade, this will become an area of significant investment and development. There are so many repetitive tasks involved in any scientific or research project that using robotic intelligence engines to manage and perfect the more complex and repetitive tasks would greatly increase the speed at which new breakthroughs could be uncovered.
There is also the tantalising possibility that as we increase the capability of our AI systems, they could actually perform research and discover new avenues to explore. While this is still a long way away, it could greatly accelerate the discoveries needed for many advancements that could improve and extend our lives.
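The “learning from experience” idea at the heart of both categories can be shown in a few lines. The sketch below, with invented toy data, never encodes a fraud rule explicitly; the model infers one from labelled examples, which is precisely the training process described above:

```python
# Minimal illustration of "learning from experience": the classifier is
# never told the rule, it infers one from labelled examples.
from sklearn.tree import DecisionTreeClassifier

# Toy training set (invented): [hour_of_day, amount] -> fraudulent?
X = [[3, 950], [2, 1200], [14, 30], [11, 60], [4, 870], [15, 45]]
y = [1, 1, 0, 0, 1, 0]  # 1 = fraud, 0 = legitimate

model = DecisionTreeClassifier().fit(X, y)
print(model.predict([[2, 1000]]))  # a 2am high-value payment: flagged
```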
The dystopian vision of robots assuming complete control of society is unlikely; the nuances of perception, intuition, and plain old “gut-check reactions” still elude machines. Learning from repetition, improving patterns, and developing new processes is well within reach of current AI models, and will strengthen in the coming years as advances in AI – specifically machine learning and neural networking – continue. Rather than being frightened by the perceived threat of AI, it would be wise to embrace the possibilities that AI offers.
This year will, in all likelihood, see more successful hacking efforts than any preceding year. This is not due to 2017 being in any way special but simply because we are not doing enough of the right things to combat hackers.
By Brian Chappell, Director, Technical Services EMEAI & APAC, BeyondTrust.
Over the past 30 years we have seen some significant changes in the approach to security. In 1987 (when I started in the IT industry) directly networked systems (beyond modem access) were the preserve of the elite, hacking was what people who wrote code did and it was crackers that sought to gain unauthorised access to systems and applications. In that environment, defences were focused around the physical security of systems.
Move forward 30 years and we find ourselves in an infinitely connected environment: everything is communicating with everything else (or very nearly) all the time, no longer limited by cables but wirelessly exchanging information of all types.
Over this period we’ve seen many approaches to Cyber Security, but one common thread through many Cyber Security projects is the attempt to solve the whole ‘problem’ at once. Approaching Cyber Security as a monolithic project is likely to result in failure or, worse still, a badly implemented solution.
In many organisations, the Cyber Security team is seen as the team that says “No”, and the whole concept is seen as a necessary evil that is going to make it harder to be productive. What has that got to do with onions? Let’s think of constructing our Cyber Security model as a collection of layers, just like the skins of an onion. When we look at an onion, we take the role of the external malicious actor: we can see the attack surface but we can’t see through to the core. Each semi-transparent skin combines to present an opaque defence to prying eyes. If we were inside the onion, we might be able to peer through two or three skins with relative ease, but even from within, any attempt to see through more than a few skins is going to result in our progress being blocked.
It’s easy to see how this forms a good model for planning our Cyber Security strategy. Thin layers of security, each relatively simple and lightweight, that combine to provide a barrier to unauthorised access. Day-to-day activity which involves interacting with only a few security layers is painless, only minor additional effort is required from the user. The more layers involved, the greater the effort required to get through the controls. Of course, not all layers are equal and some layers will be ‘thicker’ the closer you get to sensitive resources and information.
Tackling security in this way helps not only by breaking the implementation into manageable layers, but also by keeping each layer simple. Simpler layers are easier to implement, easier to manage and more likely to succeed. It also helps the organisation accept change, by presenting it incrementally. Keeping the scope of each project smaller makes it easier to identify the stakeholders and their use cases, and to get their buy-in for the project as a whole. Addressing security in layers also allows us to control the impact on the user base and prevent any unreasonable loss of productivity.
Some Cyber Security layers are probably already present within your infrastructure. Your skin, or attack surface, consists of your firewalls, proxies, VPN servers and other edge devices that either present services to the broader internet or provide access points into your networks. At the simplest level it’s relatively straightforward: we know which accesses will be allowed and which denied. It’s our first layer, and one that’s often managed separately from the rest of the infrastructure.
Identity is another layer within the Cyber Security onion, a simple layer which takes credentials from the user and provides authentication and authorisation for other systems. Identity management (IDM) provides us with tools to help manage this layer and ensure that authorisations granted provide the capabilities intended and no more. As the users’ primary interface with the environment, ensuring we have control over that identity is vital. Wherever possible try to keep the identity layer within a single directory service with a single identity for each user, this simplifies the management of this layer dramatically.
In many organisations, identity is hampered by the need for privileged access to various systems. We look to the authorisation of identity to provide the necessary privileges, but there is the opportunity to extract privileged access into a separate layer. It’s commonly required for only a subset of users, and it presents an unnecessary complication for managing identity.
Privileged access requirements tend to come in two forms: that required to administer systems and that required to allow users to run privileged processes and applications. In a Windows context, we use the examples of users who manage servers and users who need to install software, change their IP settings or any other task that needs admin privileges. When we address this requirement for privilege through identity we give the user privilege rather than privileged access to the task it’s needed for.
If we separate out privileged access into another layer – in fact, two layers – then identity stays simple and privilege becomes simple too. If we look at the first natural use of privileged access, managing the infrastructure, we can create a layer, accessed through our standard user account, that contains the model defining how, where and when we can get privileged access, and to what. That creates a layer with a clearly defined scope, giving us more control through a simpler, more targeted security model than assigning privilege through identity. We don’t have to worry about misuse of privileged access because it’s constrained in this layered model, which links the user to the asset where privilege will be granted. The user account is never altered to provide privilege; access is provided through a lightweight mechanism with full auditing.
The other use of privileged access – the need to run processes and applications that require privilege to execute properly – can similarly be delivered through another layer in our Cyber Security model. This layer changes how we address privilege at the base level: it defines the privilege required by the process or application to execute properly, and provides the ability to run applications with the least privilege necessary for the user to be productive. This layer does not alter the privilege of the user account; it allows the process or application to run with the necessary privilege in a fully secure manner. This keeps identity separate, and the security model for this layer simple.
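A toy sketch makes the first of these layers concrete. In the model below – all names and policies are invented for illustration – privilege attaches to a user/asset pair for specific actions, every decision is audited, and the user’s account itself is never elevated:

```python
# Toy sketch of privilege as a separate layer: the policy links a user
# to an asset and the actions allowed there; the account itself is
# never elevated. All names are invented for illustration.
POLICY = {
    ("mpage", "web-server-01"): {"restart_service", "apply_patch"},
    ("jsmith", "workstation-17"): {"install_software"},
}

def request_privileged(user, asset, action):
    allowed = POLICY.get((user, asset), set())
    granted = action in allowed
    # Full auditing: every request is logged, granted or not.
    print(f"AUDIT: {user} on {asset}: {action} -> "
          f"{'granted' if granted else 'denied'}")
    return granted

request_privileged("mpage", "web-server-01", "restart_service")  # granted
request_privileged("mpage", "db-server-02", "restart_service")   # denied
```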
While the identity layer can be used to control access to the privilege layers (usually through group memberships) it doesn’t define the privilege or how it’s provided.
Hopefully this will get you thinking in terms of layers and the clear benefits of keeping each one simple. What will become apparent is that the whole is greater than the sum of its parts – that’s where the biggest benefit comes from, and it happens without even trying. Each layer, properly implemented, will contribute to the whole.
There are other layers that would form part of a good Cyber Security strategy, including but not limited to vulnerability management, event monitoring and behavioural analysis. When we consider the primary opponents we are at war with – the cyber criminals – their methods of entry and exploitation need to be key when we are defining the basic layers we need to implement. What we can’t lose sight of is the basic premise of IT systems: to enable us to do business. Any action we take needs to maximise the opportunity to do business while minimising the impact of those controls and ensuring the whole thing is manageable. That is a lot to do all at once, but relatively simple to approach in layers.
Introduced in the UK, Germany, Australia, and the Netherlands at the turn of the century, the first network sharing agreements were conceived as a way of helping wireless operators to offset the high cost of launching 3G services in hard-to-cover areas.
By Dr Mohamed Nadder Hamdy, director mobility network engineering, CommScope.
Despite the opportunities for CapEx and OpEx savings that network sharing represents, however, the initial surge in interest in the model soon declined as most operators chose to build their own 3G networks.
Potential cost savings continue to be the main driver behind network sharing, of course, although the savings available to an operator depend on the depth of the sharing arrangement.
As more equipment is developed to support passive forms, such as site sharing, and active forms, in which a common RAN, spectrum resources and core networks may be shared among mobile network operators (MNOs), so the range of options available to operators grows.
Ultimately, any move to network sharing will be driven by the need to maximise enterprise value, the major benefit being a net reduction of between 10 and 40 percent in CapEx and OpEx, dependent on the option chosen.
A common practice around the world, passive network sharing concerns the sharing of passive non-electronic infrastructure and facilities.
Shared assets might include the real estate upon which a cell site is located, the tower space, equipment cabinets, or buildings at the base of a tower, or the power, lighting and air-conditioning systems that support this equipment.
In some markets, site sharing has even become a regulatory mandate. But whether mandatory or voluntary, passive sharing can save MNOs up to five percent on CapEx and as much as ten percent on OpEx over the course of five years.
To illustrate the strength of interest in its adoption, consider North American tower companies such as American Tower, Crown Castle and Global Tower Partners, which have each invested billions of dollars in acquiring passive infrastructure from operators in the hope of brokering passive sharing agreements with MNOs.
Active network sharing refers to the sharing of active infrastructure and radio spectrum, and includes a number of models involving elements in the RF path such as antennas, base station equipment, and transmission lines.
Certain sharing strategies can take the partnership between MNOs deeper, enabling operators to share radio spectrum, infrastructure management systems, and administrative resources such as billing systems and even customer service platforms.
Traditionally, active infrastructure has been less commonly supported than its passive counterpart, but is beginning to become more widely considered – especially because of the potential benefits it represents for rural broadband.
In addition, it’s expected that the number of active RAN sharing joint ventures between operators is set to increase due to the high cost of deploying LTE. As a result, active RAN sharing is likely to be the next evolutionary step in infrastructure sharing, unlocking even greater CapEx and OpEx savings than passive RAN sharing.
Roaming between operators within the same country can reduce investments by geographically dividing the cost of the necessary infrastructure between the operators involved. It can also allow new operators without physical radio access networks to roam on the networks of other operators, meaning that guest operators can then provide services in new markets without having to deploy additional infrastructure.
However, while national roaming may be the easiest and least costly model of network sharing, it provides the least amount of control and flexibility for the guest operator. It also consolidates the overall number of mobile networks, homogenising retail offerings and quality of service, and making it harder for a single operator within a market to differentiate itself from its competitors.
What’s more, competition may be restricted around pricing, as the retail tariffs charged by the roaming operator will, to a large extent, be based on the wholesale charges paid to the operator being visited.
While the potential cost savings and benefits grow as the depth of sharing increases, so too do the complexity and the risks faced by operators.
Sometimes, particularly on sites with limited space or subject to health and safety regulations, operators may find themselves forced to share the same antenna. Alternatively, in a bid to reduce power usage, emissions and aesthetic impact, a number of countries, including Brazil, Canada and Jordan, have stipulated that operators looking to deploy new services must be willing to share passive and/or active elements of their networks, including antennas.
Solutions are available, however, to counter the challenges that site sharing can present.
The deployment of low-loss combiners (LLCs), for example, enables operators to share existing sites with others, while experiencing minimal insertion loss compared to traditional hybrid units.
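To put insertion loss in perspective (the figures here are typical values assumed for illustration, not CommScope specifications), loss in decibels maps to retained transmit power as follows: a conventional hybrid combiner’s roughly 3 dB loss halves the power reaching the antenna, while a 0.5 dB loss retains about 89 percent of it.

```latex
% Retained power for an insertion loss of L decibels:
\[
\frac{P_\mathrm{out}}{P_\mathrm{in}} = 10^{-L/10},
\qquad 10^{-3/10} \approx 0.50,
\qquad 10^{-0.5/10} \approx 0.89
\]
```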
The benefits of an LLC include new revenue without the need for a major CapEx investment; the cost savings that come from sharing the full RF path as well as the tower structure, with the associated reduction in rent; faster deployment due to easier installation; and faster network rollout, avoiding the lengthy, complicated and often uncertain site acquisition phase of deployment.
Technology is evolving, transforming the way we live and do business. At the heart of this transformation lies the network. Operators and technology providers are rethinking the purpose, roles and usage of networks in helping customers to increase bandwidth, expand capacity, enhance efficiency, speed deployment and simplify migration.
From remote cell sites to massive sports arenas, vital infrastructure is necessary for the network to flourish and for businesses to succeed. Network sharing, in its various forms, offers operators the means with which to extend their coverage while ensuring that costs remain low. And, as technology continues to evolve, so the options will increase, along with the benefits they deliver.
DW asked the ICT vendor community for their views on how IoT will (or will not) impact on the enterprise over the coming months and years. Here we publish their thoughts and comments. We’ve also been running a series of IoT blogs on the Digitalisation World website, which are well worth checking out at: https://digitalisationworld.com/blogsall
The Internet of Things is a very complex mixture of technologies and services, with security issues at every link of the delivery chain. Here are some of them:
- Vulnerabilities in gadgets. Most new IoT gadgets are developed and produced as fast as possible, with no security testing. Even known vulnerabilities often go unfixed, because a manufacturer uses third-party components or simply doesn’t want to spend the money. For example, in 2013 our security experts detected several critical DVR vulnerabilities that allow attackers to access DVRs remotely and use them for botnets. The vulnerable firmware, including Samsung Web Viewer, was used in many DVRs sold under dozens of brands and widely available via the internet. So it was no surprise to us when, three years later, we saw reports of botnets made up of millions of infected DVRs and CCTV cameras.
- Vulnerable communication protocols. Our study shows that more than 60% of popular 3G/4G modems are vulnerable to Remote Code Execution. In other IoT devices, you may see unprotected Wi-Fi or Telnet with default settings. There are known cases of medical devices like defibrillators and injection pumps with default Bluetooth passwords.
- Flawed web applications. As part of our Smart Grid security research, our experts were able to find online control panels for many individual smart grid systems via the internet (200,000 solar power stations and 1,000,000 inverters connected to one server). Most of these web pages weren’t protected by passwords, or the passwords were easy to retrieve. After bypassing authorisation, an intruder can gain access to system parameter controls and even interfere with mechanical systems (disabling inverters, fire alarms, etc.).
- Automation and over-connectivity mean vulnerable IoT gadgets can be attacked really fast. With desktop PCs, infection required some kind of interaction between the user and the malware: you had to open a letter or a web page. With IoT, that step is completely removed: hackers discover and penetrate thousands of vulnerable devices instantly with an automatic scanner or a special search engine like Shodan.io.
- Lack of security policy and education. Most people can easily install an antivirus on a PC, or update flawed PC software, but with IoT gadgets it can be tricky, as many of them lack an easy interface. It’s hard even to tell whether an IoT device is compromised, as users aren’t educated to look for the warning signs in devices such as home routers, web cameras or electricity meters. Vendors advertise simplicity and “plug-and-play” models of IoT usage, but don’t bother to explain the security issues (a quick self-audit sketch follows this list).
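On that last point, even a non-specialist can check their own kit. The sketch below probes an address range you are authorised to audit for exposed Telnet, the service most abused in building IoT botnets; the range shown is a placeholder, not a recommendation:

```python
# Defensive audit sketch: check your own address range for devices
# exposing Telnet (port 23), the service most abused by IoT botnets.
# Only scan networks you are authorised to audit.
import socket

def telnet_open(host, timeout=1.0):
    try:
        with socket.create_connection((host, 23), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative range; substitute your own management subnet.
for i in range(1, 6):
    host = f"192.168.1.{i}"
    if telnet_open(host):
        print(f"{host}: Telnet exposed - disable it or change defaults")
```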
With all these security problems in mind, we expect IoT attacks to rise in 2017. The range of targets will widen to include smart TVs, cars and other transport systems, home appliances, medical equipment and wearable gadgets.
In addition to known types of attack (DDoS, ransomware, information and money theft), we may also see vulnerable IoT used as a first step in more serious targeted attacks on critical infrastructure – attacks that could lead to physical disasters. According to our research, building automation and energy management systems are the most common of the vulnerable control systems available online.
The massive growth of mobile usage and the almost unrestricted expansion of IoT devices means that we’re set to grapple with a series of security issues as the Internet of Things grows. Most of these devices, whether it’s an electronic lock, alarm or heating system, can be controlled by a mobile app. This is where problems can arise: hackers are increasingly using mobile as a medium through which to conduct their activities.
An example of this can be seen in a project we carried out at Promon in November. By making use of a fake app and social engineering techniques, we demonstrated how it would be possible to steal a Tesla driver’s personal details and subsequently use them to locate, unlock and start the engine of a Tesla car.
Remotely controlling and stealing a Tesla car is a particularly dangerous example of just what can be done, but in theory, any app without the necessary protection in place could be affected. This goes for any IoT device that can be controlled via an app – for example, a house alarm could be deactivated by a hacker, or an electronic lock could be disabled.
Makers of these devices have little or no control over the malware that might be residing on users’ mobile devices. What they can do, however, is prioritise the implementation of app security. One way of doing this is by introducing self-defending software that protects the app from the inside out. If the app is protected, then IoT devices are in turn shielded from a variety of attacks coming from mobile. With very few IoT standards in place at the moment, more needs to be done on the app side to provide an effective safety net.
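Self-defending app technology of the kind Promon describes works inside the app binary itself and is well beyond a short sketch, but one complementary, minimal safeguard on the app-to-device link is authenticating every command with a keyed HMAC, so a fake app that lacks the key cannot forge requests. A generic illustration, with key handling heavily simplified:

```python
# Sketch: the device rejects commands lacking a valid HMAC tag, so a
# fake app without the key cannot unlock anything. Key provisioning
# and storage are heavily simplified for illustration.
import hashlib
import hmac
import os

KEY = os.urandom(32)  # provisioned to the genuine app and the device

def sign(command: bytes) -> bytes:
    return hmac.new(KEY, command, hashlib.sha256).digest()

def device_accepts(command: bytes, tag: bytes) -> bool:
    # Constant-time comparison prevents timing attacks on the tag.
    return hmac.compare_digest(sign(command), tag)

cmd = b"unlock:frontdoor"
print(device_accepts(cmd, sign(cmd)))      # True - genuine app
print(device_accepts(cmd, b"\x00" * 32))   # False - forged request
```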
The ubiquitous smartphone may soon be replaced by its smarter industrial-grade cousin, the IoT. Facilities managers, architects and manufacturers are just beginning to recognise the new possibilities in a workplace where everything – from a lighting fixture to a photocopier – is a rich data source and ripe for automation.
As sensors become cheaper and smaller, much of the infrastructure and machines around us will be able to talk to each other - as well as talking back to us.
For example, from a facilities management perspective, the revolving doors or the lifts in your building will be self-aware of their performance status, and capable of sending an alert out to a service technician that an essential part will need replacing in the next two weeks, based on the volume of its usage. This predictive alert will trigger a visit by a field service technician to provide the necessary maintenance before a failure occurs.
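A toy version of that predictive alert shows how little logic is needed to turn usage telemetry into a proactive service visit. The wear model, rated lifetime and threshold below are all invented for illustration:

```python
# Toy predictive-maintenance alert: estimate the remaining life of a
# part from its usage rate and raise a service ticket two weeks ahead.
# The wear model and all numbers are invented for illustration.

RATED_CYCLES = 1_000_000  # assumed rated lifetime of the door part

def weeks_remaining(cycles_used, cycles_per_week):
    return (RATED_CYCLES - cycles_used) / cycles_per_week

used, per_week = 965_000, 21_000  # telemetry from the revolving door
weeks = weeks_remaining(used, per_week)
if weeks <= 2:
    print(f"Dispatch technician: part due in ~{weeks:.1f} weeks")
```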
It’s also likely to dramatically change the design of our workplaces. The volume of data available from IoT will give architects and designers unprecedented insight into how we really use our workspaces, based on day-to-day, real-world observation. This will guide everything from how cubicles are configured to how buildings are constructed and how infrastructure and machines are serviced.
As the Internet of Things (IoT) becomes more of a reality, it opens up new possibilities that didn’t exist before. We have fridges that can order your food, cars that can drive themselves and homes that light up and heat up at your instant command. However, as more devices become connected to the Internet, they rely heavily on cloud services, which could leave organisations open to data breaches.
As IoT continues to grow, we are seeing more and more devices of questionable security enter the workplace. These can be infected by malware and used to disrupt business services in a DDoS attack. Unlike traditional DDoS attacks, which are launched from computers, Mirai botnets are made up of IoT devices such as tablets, phones and cameras. These can be manipulated by hackers and used to target servers, causing overload and operational downtime. Last year DNS provider Dyn fell victim to this, and as a result several major businesses, such as PayPal and Twitter, suffered downtime. Organisations running services on-premises were not affected, however, because the attack did not touch any internal network environments. Choosing to run infrastructure and services internally certainly mitigates the risk of outage from external forces.
It’s an example that helps illustrate the importance of infrastructure strategy to any organisation working in and around IoT technology or applications. Many organisations choose cloud services for simplicity and convenience, because traditional IT infrastructure, even with virtualisation, is complex and can be difficult to implement, particularly for small and midsize organisations. Only recently has hyperconvergence made on-premises datacentres as simple to use as the cloud, and it will continue to evolve alongside other emerging technologies. For the moment, organisations have a choice: easy-to-use hyperconvergence for increased security and stability, or cloud providers for completely hands-off management and reliance on a third party.