Change fuelled by digital transformation and the data revolution.
In its inaugural Voice of the Enterprise (VotE) Digital Pulse survey, 451 Research finds that IT leaders are embracing a new model of off-premises, service-oriented IT solutions and will be looking to harness data in new ways to differentiate themselves in 2018. Respondents revealed that the top three IT initiatives for 2018 are all data-centric: business intelligence, machine learning/artificial intelligence and big data.
The survey finds that IT organizations’ ability to exploit digital transformation is uneven, with over 60% of organizations having no formal transformation strategy in place and many admitting they face challenges in achieving optimal business-IT alignment.
Sixty percent of enterprises surveyed for Digital Pulse say they will run the majority of their IT outside the confines of enterprise datacenters by the end of 2019, chiefly using off-premises service provider environments such as public cloud infrastructure and SaaS.
Accordingly, the largest spending increase in 2018 is for IT delivered ‘as a service,’ at the expense of the traditional on-premises model. 451 Research finds information security is also high on the IT agenda, with 16% of organizations saying that area is getting the largest budget increase.
Providers such as Microsoft and Amazon Web Services (AWS) are emerging as enterprises’ most strategic technology suppliers; 35% of organizations say Microsoft will be their most strategic partner by the end of 2019, compared to 33% today, while 17% say AWS will hold that position two years from now compared to 7% today.
The Digital Pulse survey also highlights a revolution in how organizations will harness data to differentiate themselves and create new value. The top three IT initiatives for 2018 were all data-centric: 45% of respondents pointed to business intelligence, 29% mentioned machine learning/artificial intelligence, while 28% said big data.
The growth opportunity around data is clear, with almost 30% of organizations saying ML/AI is a top priority in 2018 while just 12% of respondents use these solutions today.
Meanwhile, usage of much-hyped technologies such as blockchain remains very low, but more organizations will begin to move from tire kicking to actual deployment over the next year, with 12% of Digital Pulse respondents citing blockchain as a top IT priority for 2018.
“The survey suggests that many – but certainly not all – organizations are finally reaching the point where they can focus on endeavors that help differentiate the business, instead of merely keeping the lights on. In 2018 we expect to see much of this effort focused around a new set of approaches to data optimization and analysis,” said Melanie Posey, Research Vice President and General Manager, Voice of the Enterprise, 451 Research.
Worldwide IT spending is projected to total $3.7 trillion in 2018, an increase of 4.5 percent from 2017, according to the latest forecast by Gartner, Inc.
"Global IT spending growth began to turn around in 2017, with continued growth expected over the next few years. However, uncertainty looms as organizations consider the potential impacts of Brexit, currency fluctuations, and a possible global recession," said John-David Lovelock, research vice president at Gartner. "Despite this uncertainty, businesses will continue to invest in IT as they anticipate revenue growth, but their spending patterns will shift. Projects in digital business, blockchain, Internet of Things (IoT), and progression from big data to algorithms to machine learning to artificial intelligence (AI) will continue to be main drivers of growth."
Enterprise software continues to exhibit strong growth, with worldwide software spending projected to grow 9.5 percent in 2018, and it will grow another 8.4 percent in 2019 to total $421 billion (see Table 1). Organizations are expected to increase spending on enterprise application software in 2018, with more of the budget shifting to software as a service (SaaS). The growing availability of SaaS-based solutions is encouraging new adoption and spending across many subcategories, such as financial management systems (FMS), human capital management (HCM) and analytic applications.
Table 1. Worldwide IT Spending Forecast (Billions of U.S. Dollars)
[Table not reproduced: segment-level spending and growth figures for 2017-2019, including Data Center Systems and 2019 growth (%).]
Source: Gartner (January 2018)
The devices segment is expected to grow 5.6 percent in 2018. In 2017, the devices segment experienced growth for the first time in two years with an increase of 5.7 percent. End-user spending on mobile phones is expected to increase marginally as average selling prices continue to creep upward even as unit sales are forecast to be lower. PC growth is expected to be flat in 2018 even as continued Windows 10 migration is expected to drive positive growth in the business market in China, Latin America and Eastern Europe. The impact of the iPhone 8 and iPhone X was minimal in 2017, as expected. However, iOS shipments are expected to grow 9.1 percent in 2018.
"Looking at some of the key areas driving spending over the next few years, Gartner forecasts $2.9 trillion in new business value opportunities attributable to AI by 2021, as well as the ability to recover 6.2 billion hours of worker productivity," said Mr. Lovelock. "That business value is attributable to using AI to, for example, drive efficiency gains, create insights that personalize the customer experience, entice engagement and commerce, and aid in expanding revenue-generating opportunities as part of new business models driven by the insights from data."
"Capturing the potential business value will require spending, especially when seeking the more near-term cost savings. Spending on AI for customer experience and revenue generation will likely benefit from AI being a force multiplier — the cost to implement will be exceeded by the positive network effects and resulting increase in revenue," said Mr. Lovelock.
Gartner, Inc. forecasts worldwide enterprise security spending to total $96.3 billion in 2018, an increase of 8 percent from 2017. Organizations are spending more on security as a result of regulations, shifting buyer mindset, awareness of emerging threats and the evolution to a digital business strategy.
"Overall, a large portion of security spending is driven by an organization's reaction toward security breaches as more high profile cyberattacks and data breaches affect organizations worldwide," said Ruggero Contu, research director at Gartner. "Cyberattacks such as WannaCry and NotPetya, and most recently the Equifax breach, have a direct effect on security spend, because these types of attacks last up to three years."
This is validated by Gartner's 2016 security buying behavior survey*. Of the 53 percent of organizations that cited security risks as the No. 1 driver for overall security spending, the highest percentage of respondents said that a security breach is the main security risk influencing their security spending.
As a result, security testing, IT outsourcing and security information and event management (SIEM) will be among the fastest-growing security subsegments driving growth in the infrastructure protection and security services segments (see Table 1).
Worldwide Security Spending by Segment, 2016-2018 (Millions of Current Dollars)
[Table not reproduced: spending figures by segment, including Identity Access Management, Network Security Equipment and Consumer Security Software.]
Source: Gartner (December 2017)
Gartner analysts said that several other factors are also fuelling higher security spending.
Regulatory compliance and data privacy have been stimulating spending on security during the past three years, in the US (with regulations and frameworks such as the Health Insurance Portability and Accountability Act, the National Institute of Standards and Technology guidelines, and Office of the Comptroller of the Currency requirements), more recently in Europe with the General Data Protection Regulation coming into force on 25th May 2018, and in China with the Cybersecurity Law that came into effect in June 2017. These regulations translate into increased spending, particularly in data security tools, privileged access management and SIEM.
Gartner forecasts that by 2020, more than 60 percent of organizations will invest in multiple data security tools such as data loss prevention, encryption and data-centric audit and protections tools, up from approximately 35 percent today.
Skills shortages, technical complexity and the threat landscape will continue to drive the move to automation and outsourcing. "Skill sets are scarce and therefore remain at a premium, leading organizations to seek external help from security consultants, managed security service providers and outsourcers," said Mr. Contu. "In 2018, spending on security outsourcing services will total $18.5 billion, an 11 percent increase from 2017. The IT outsourcing segment is the second-largest security spending segment after consulting."
Gartner predicts that by 2019, total enterprise spending on security outsourcing services will be 75 percent of the spending on security software and hardware products, up from 63 percent in 2016.
Enterprise security budgets are also shifting towards detection and response, and this trend will drive security market growth during the next five years. "This increased focus on detection and response to security incidents has enabled technologies such as endpoint detection and response, and user entity and behavior analytics to disrupt traditional markets such as endpoint protection platforms and SIEM," said Mr. Contu.
2020 will be a pivotal year in AI-related employment dynamics, according to Gartner, Inc., as artificial intelligence (AI) will become a positive job motivator.
The number of jobs affected by AI will vary by industry; through 2019, healthcare, the public sector and education will see continuously growing job demand while manufacturing will be hit the hardest. Starting in 2020, AI-related job creation will cross into positive territory, reaching two million net-new jobs in 2025.
"Many significant innovations in the past have been associated with a transition period of temporary job loss, followed by recovery, then business transformation and AI will likely follow this route," said Svetlana Sicular, research vice president at Gartner. AI will improve the productivity of many jobs, eliminating millions of middle- and low-level positions, but also creating millions more new positions of highly skilled, management and even the entry-level and low-skilled variety.
"Unfortunately, most calamitous warnings of job losses confuse AI with automation — that overshadows the greatest AI benefit — AI augmentation — a combination of human and artificial intelligence, where both complement each other."
IT leaders should not only focus on the projected net increase of jobs. With each investment in AI-enabled technologies, they must take into consideration what jobs will be lost, what jobs will be created, and how it will transform how workers collaborate with others, make decisions and get work done.
"Now is the time to really impact your long-term AI direction," said Ms. Sicular. "For the greatest value, focus on augmenting people with AI. Enrich people's jobs, reimagine old tasks and create new industries. Transform your culture to make it rapidly adaptable to AI-related opportunities or threats."
Gartner identified additional predictions related to AI’s impact on the workplace:
AI has already been applied to highly repeatable tasks where large quantities of observations and decisions can be analyzed for patterns. However, applying AI to less-routine work that is more varied due to lower repeatability will soon start yielding superior benefits. AI applied to nonroutine work is more likely to assist humans than replace them, as combinations of humans and machines will perform more effectively than either human experts or AI-driven machines working alone.
By 2022, one in five workers engaged in mostly nonroutine tasks will rely on AI to do a job.
"Using AI to auto-generate a weekly status report or pick the top five emails in your inbox doesn't have the same wow factor as, say, curing a disease would, which is why these near-term, practical uses go unnoticed," said Craig Roth, research vice president at Gartner. "Companies are just beginning to seize the opportunity to improve nonroutine work through AI by applying it to general-purpose tools. Once knowledge workers incorporate AI into their work processes as a virtual secretary or intern, robo-employees will become a competitive necessity."
Leveraging technologies such as AI and robotics, retailers will use intelligent process automation to identify, optimize and automate labor-intensive and repetitive activities that are currently performed by humans, reducing labor costs through efficiency from headquarters to distribution centers and stores. Many retailers are already expanding technology use to improve the in-store check-out process.
Through 2022, multichannel retailer efforts to replace sales associates through AI will prove unsuccessful, although cashier and operational jobs will be disrupted.
However, research suggests that many consumers still prefer to interact with a knowledgeable sales associate when visiting a store, particularly in specialized areas such as home improvement, drugstores and cosmetics, where informed associates can make a significant impact on customer satisfaction. Though they will reduce labor used for check-out and other operational activities, retailers will find it difficult to eliminate traditional sales advisers.
"Retailers will be able to make labor savings by eliminating highly repetitive and transactional jobs, but will need to reinvest some of those savings into training associates who can enhance the customer experience," said Robert Hetu, research director at Gartner "As such most retailers will come to view AI as a way to augment customer experiences rather than just removing humans from every process."While many industries will receive growing business value from AI, manufacturing is one that will receive a massive share of the business value opportunity. Automation will lead to cost savings, while the removal of friction in value chains will increase revenue further, for example, in the optimization of supply chains and go-to-market activities.
In 2021, AI augmentation will generate $2.9 trillion in business value and recover 6.2 billion hours of worker productivity.
However, some industries, such as outsourcing, are seeing a fundamental change in their business models, whereby the cost reduction from AI and the resulting productivity improvement must be reinvested to allow reinvention and the pursuit of new business model opportunities.
"AI can take on repetitive and mundane tasks, freeing up humans for other activities, but the symbiosis of humans with AI will be more nuanced and will require reinvestment and reinvention instead of simply automating existing practices," said Mike Rollings, research vice president at Gartner. "Rather than have a machine replicating the steps that a human performs to reach a particular judgment, the entire decision process can be refactored to use the relative strengths and weaknesses of both machine and human to maximize value generation and redistribute decision making to increase agility."
According to the International Data Corporation (IDC) Worldwide Quarterly Cloud IT Infrastructure Tracker, vendor revenue from sales of infrastructure products (server, storage, and Ethernet switch) for cloud IT, including public and private cloud, grew 25.5% year over year in the third quarter of 2017 (3Q17), reaching $11.3 billion.
Public cloud infrastructure revenue grew 32.3% year over year in 3Q17 to $7.7 billion and now represents 30.2% of total worldwide IT infrastructure spending, up from 26.3% one year ago. Private cloud revenue reached $3.6 billion for an annual increase of 13.1%. Total worldwide cloud IT infrastructure revenue is on pace to nearly double in 2017 when compared to 2013. Traditional (non-cloud) IT infrastructure revenue grew 8.0% from a year ago, although it has been generally declining over the past several years; despite the declining trend, at $14.2 billion in 3Q17 traditional IT still represents 55.6% of total worldwide IT infrastructure spending.
Public cloud also represented 68.0% of the total cloud IT infrastructure revenue in 3Q17. The market with the highest growth in the public cloud infrastructure segment was Storage Platforms with revenue up 45.1% compared to the same quarter of the previous year, and making up 42.0% of the revenue in public cloud. Compute Platforms and Ethernet Switch public cloud IT infrastructure revenues were up 24.8% and 23.2%, respectively. Compute Platforms represented 43.9% of public cloud IT infrastructure revenue. Private cloud infrastructure revenue was driven by the Storage Platforms growth of 16.1% year over year.
"2017 has been a strong year for public cloud IT infrastructure growth, accelerating throughout the year," said Kuba Stolarski, research director for Computing P latforms at IDC. "While hyperscalers such as Amazon and Google are driving the lion's share of the growth, IDC is seeing strong growth in the lower tiers of public cloud and continued growth in private cloud on a worldwide scale. In the near term, new Intel and AMD platforms released during 2017 should aid in refresh and infrastructure expansion throughout the cloud IT infrastructure segment."
Except for Latin America revenue, which grew 5.0% from a year ago, all other regions in the world grew their cloud IT Infrastructure revenue by double digits. Asia/Pacific (excluding Japan) and Central and Eastern Europe (CEE) saw the fastest growth rates at 50.1% and 35.3%, respectively. Canada (22.5%) and Western Europe (24.6%) had annual growth in the twenties, while the U.S. (18.7%), Japan (17.5%), and Middle East & Africa (MEA) (15.8%) had annual growth in the teens.
Top Companies, Worldwide Cloud IT Infrastructure Vendor Revenue, Market Share, and Year-Over-Year Growth, Q3 2017 (Revenues are in Millions)
[Table not reproduced: 3Q17 and 3Q16 revenue (US$M), market share, and 3Q17/3Q16 revenue growth for the leading vendors, headed by 1. Dell Inc and 2. HPE/New H3C Group**.]
Source: IDC's Quarterly Cloud IT Infrastructure Tracker, Q3 2017, January 11, 2018
* IDC declares a statistical tie in the worldwide cloud IT infrastructure market when there is a difference of one percent or less in the vendor revenue shares among two or more vendors.
** Due to the existing joint venture between HPE and the New H3C Group, IDC will be reporting external market share on a global level for HPE as "HPE/New H3C Group" starting from Q2 2016 and going forward.
Research shows that managed services may be the only chance for growth in the IT industry’s channel; resellers are still switching to this sales model in large numbers, but the sales process and customer relationships are very different, and some of the technology issues demand new skills.
Managed services in 2018 will need to deal with a number of issues – some, like security, are external; others, like the changes needed in sales processes and customer engagement, are internal. One of the main pressures will continue to be the availability of skilled resources, both in sales and in the area of security, where GDPR, coming into force in May 2018, will provide the main impetus for most MSPs to re-analyse their positions.
Research from IT Europa ( http://www.iteuropa.com/?q=market-intelligence/managed-service-providers-europe-msps-top-1000) and others shows that there is a continuing race to scale as the economics of managed services depend on having a large customer base, but at the same time, because of the expertise needed to deliver specific vertical market applications, many are having to build on their strengths and specialise further.
The changing nature of managed services….
The bigger MSPs are growing fastest, says the research, and in Europe, the Netherlands has overtaken Germany in numbers of large MSPs. The Netherlands has seen a dramatic acceleration in the number of data centres situated there in recent years. The UK is still biggest, and now has 36% of Europe’s largest managed services providers and is the largest individual market. The technology is changing as well: when asked about what is on the horizon, MSPs say Internet of Things (IoT) has started to appear as an MSP solution area.
This latest study of Europe’s managed services providers shows increased consolidation as well as more specialisation by application area. In the study of the top 1500 MSPs 2017, the listed companies – 112 in number - saw their sales rise by 7.5% yr/yr. The smaller independents by contrast managed a lower 5.5% growth. One reason for the changes has been the rush for scale among managed services companies, with a high rate of mergers among the small players, and acquisitions by larger firms. There seems to be no shortage of available funding, either from the industry itself or venture capital.
These results and a wider discussion on the changing nature of the MSP will be a feature of the Managed Services & Hosting Summit – Europe, at the Novotel Amsterdam City, Amsterdam, on 29th May 2018 (http://www.mshsummit.com/amsterdam/index.php).
This is an invitation-only executive-level conference exploring the business opportunities for the ICT channel around the delivery of Managed Services and Hosting. Topics for discussion will include sales and marketing processes, GDPR, building value in a business with an eye on the mergers and acquisitions market, and skills development to get into those higher margin areas. This is a timely event as the rapid and accelerating change in the way customers wish to purchase, consume and pay for their IT solutions is forcing the channel to completely redefine its role, business models and relationships.
The Managed Services & Hosting Summit is firmly established as the leading managed services event for channel organisations. Now in its eighth year as a UK event, the Managed Services & Hosting Summit Europe is being staged for the second time in Amsterdam and will examine the issues facing Managed Service Providers, hosting companies, channel partners and suppliers as they seek to add value and evolve new business models and relationships.
The Managed Services & Hosting Summit – Europe 2018 features conference session presentations by major industry speakers and a range of breakout sessions exploring in further detail some of the major issues impacting the development of managed services.
The summit will also provide extensive networking time for delegates to meet with potential business partners. The unique mix of high-level presentations plus the ability to meet, discuss and debate the related business issues with sponsors and peers across the industry, makes this a must attend event for any senior decision maker in the ICT channel.
The next Data Centre Transformation events, organised by Angel Business Communications in association with DataCentre Solutions, the Data Centre Alliance, The University of Leeds and RISE SICS North, take place on 3 July 2018 at the University of Manchester and 5 July 2018 at the University of Surrey.
For the 2018 events, we’re taking our title literally, so the focus is on each of the three strands of our title: DATA, CENTRE and TRANSFORMATION.
The DATA strand will feature two Workshops on Digital Business and Digital Skills together with a Keynote on Security. Digital transformation is the driving force in the business world right now, and the impact that this is having on the IT function and, crucially, the data centre infrastructure of organisations is something that is, perhaps, not as yet fully understood. No doubt this is in part due to the lack of digital skills available in the workplace right now – a problem which, unless addressed urgently, will only continue to grow. As for security, hardly a day goes by without news headlines focusing on the latest, high profile data breach at some public or private organisation. Digital business offers many benefits, but it also introduces further potential security issues that need to be addressed. The Digital Business, Digital Skills and Security sessions at DTC will discuss the many issues that need to be addressed, and, hopefully, come up with some helpful solutions.
The CENTRE strand features two Workshops on Energy and Hybrid DC with a Keynote on Connectivity. Energy supply and cost remains a major part of the data centre management piece, and this strand will look at the technology innovations that are impacting on the supply and use of energy within the data centre. Fewer and fewer organisations have a pure-play in-house data centre real estate; most now make use of some kind of colo and/or managed services offerings. Further, the idea of one or a handful of centralised data centres is now being challenged by the emergence of edge computing. So, in-house and third party data centre facilities, combined with a mixture of centralised, regional and very local sites, makes for a very new and challenging data centre landscape. As for connectivity – feeds and speeds remain critical for many business applications, and it’s good to know what’s around the corner in this fast moving world of networks, telecoms and the like.
The TRANSFORMATION strand features Workshops on Automation and The Connected World together with a Keynote on Automation (AI/IoT). IoT, AI, ML, RPA – automation in all its various guises is becoming an increasingly important part of the digital business world. In terms of the data centre, the challenges are twofold. How can these automation technologies best be used to improve the design, day to day running, overall management and maintenance of data centre facilities? And how will data centres need to evolve to cope with the increasingly large volumes of applications, data and new-style IT equipment that provide the foundations for this real-time, automated world? Flexibility, agility, security, reliability, resilience, speeds and feeds – they’ve never been so important!
Delegates select two 70 minute workshops to attend and take part in an interactive discussion led by an Industry Chair and featuring panellists - specialists and protagonists - in the subject. The workshops will ensure that delegates not only earn valuable CPD accreditation points but also have an open forum to speak with their peers, academics and leading vendors and suppliers.
There is also a Technical track where our Sponsors will present 15 minute technical sessions on a range of subjects. Keynote presentations in each of the themes together with plenty of networking time to catch up with old friends and make new contacts make this a must-do day in the DC event calendar. Visit the website for more information on this dynamic academic and industry collaborative information exchange.
This expanded and innovative conference programme recognises that data centres do not exist in splendid isolation, but are the foundation of today’s dynamic, digital world. Agility, mobility, scalability, reliability and accessibility are the key drivers for the enterprise as it seeks to ensure the ultimate customer experience. Data centres have a vital role to play in ensuring that the applications and support organisations can connect to their customers seamlessly – wherever and whenever they are being accessed. And that’s why our 2018 Data Centre Transformation events, Manchester and Surrey, will focus on the constantly changing demands being made on the data centre in this new, digital age, concentrating on how the data centre is evolving to meet these challenges.
Couchbase helps revolutionise digital engagement for the fashion industry with Tommy Hilfiger.
In the global fashion industry, digital innovation plays a critical role in keeping in step with the needs of both consumers and customers. In the era of instant gratification, brands need to deliver a “wow” factor through engaging experiences, which keep them at the forefront of not only consumers’ minds, but also those of retail and wholesale partners. Staying competitive can mean offering consumers new ways to shop or optimizing the sales process to reduce time to market for new collections. Tommy Hilfiger is one example of a global brand that is harnessing the potential of digital transformation to improve its partners’ experience.
The Challenge: Streamline sales and create a more attractive experience for retailers while maintaining sustainability and corporate responsibility.
As technology continues to revolutionize the way customers shop, success in the industry means keeping up with this at every single stage, from the showroom to the shop floor. As part of its digital strategy, Tommy Hilfiger aims to streamline its sales processes and shorten the window between retailer previews of new collections and actually delivering those new products to stores. At the same time, the company seeks to minimize the need to produce and transport samples: reducing costs while maintaining the company’s ongoing drive towards sustainability and corporate responsibility by minimizing the environmental impact that comes with sample creation and shipping throughout the supply chain.
Couchbase’s data platform has supported Tommy Hilfiger in realizing this ambition with the introduction of global Digital Showrooms. By allowing buyers to browse collections, view pieces, and create custom laydowns and orders via touchscreen workstations and a theatre of ultra-high-definition, 4K screens, the Digital Showroom offers a forward-thinking approach to the sales process. It removes the need to create, examine, and deliver samples to retail locations around the world for every new collection. As a result, Tommy Hilfiger can deliver a transformational, engaging experience to partners as they browse and buy the season’s new collection.
Couchbase recognized that in order for the Digital Showroom to provide a truly tailored experience for retail and wholesale partners, it needed a data platform to support this new approach to sales, delivering an easy-to-use, reliable experience to partners. The platform had to:
• Access and share product specifications in real time according to each customer’s needs
• Support a one-click ordering system to complete sales and deliver products onto store shelves faster
• Add new product lines and functionalities as they are created
• Support in-depth analysis of customer orders to help develop a more targeted, relevant, and successful sales experience
• Allow the Digital Showroom to be easily replicated in various locations around the world
The Solution: Couchbase underpins the Digital Showroom with the most powerful NoSQL platform.
Couchbase provided the ideal data platform to underpin the Digital Showroom and help build an engaging digital experience that had never existed before. As the most powerful NoSQL platform available, Couchbase has the flexibility, scalability, and power to support the Digital Showroom’s needs and ambition. Couchbase’s ease of use also meant that the technology rollout was very straightforward.
The Result: A scalable, digitally engaging buying experience and faster time to market for new collections.
Faster sales, greater sustainability
Couchbase’s technology has contributed to the success of the Digital Showroom since its launch in 2015. The expected benefits of a faster sales process and reduced sample production are already being realized. For instance, when Tommy Hilfiger’s Asia-Pacific team visited Europe for a buying session, the visit was significantly shortened from the usual three days to just one. It is also recording sales increases, with pre-Fall sales for the Middle East, Africa, and the Netherlands already growing.
Anywhere, anytime engagement
With Couchbase, Tommy Hilfiger can develop and deliver a universally engaging experience regardless of device, location, or connectivity. The company can add, access, and combine data in real time due to Couchbase’s NoSQL architecture, so retailers can not only inspect, modify, and create orders as they browse collections, but also place their final order and arrange delivery immediately. At the same time, the Couchbase data platform can reliably operate offline without depending on constant network access to a central data store.
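Couchbase’s own replication and sync machinery is not described in this article, but the offline-first pattern the showrooms rely on can be sketched in a few lines: every write lands in a local store first, and a background pass pushes queued records to a central endpoint whenever connectivity returns. The sketch below is a minimal, hypothetical Python illustration; the class name, table schema and sync URL are invented for the example and are not Tommy Hilfiger’s or Couchbase’s actual implementation.

```python
import json
import sqlite3
import urllib.request

class OfflineFirstStore:
    """Illustrative offline-first pattern: writes always succeed locally,
    and queued records are pushed to a central endpoint when online."""

    def __init__(self, db_path="showroom.db", sync_url="https://example.invalid/orders"):
        self.sync_url = sync_url  # hypothetical central endpoint
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(id TEXT PRIMARY KEY, body TEXT, synced INTEGER DEFAULT 0)"
        )

    def save_order(self, order_id, order):
        # The local write succeeds even with no network connection.
        self.db.execute(
            "INSERT OR REPLACE INTO orders (id, body, synced) VALUES (?, ?, 0)",
            (order_id, json.dumps(order)),
        )
        self.db.commit()

    def sync(self):
        # Push any unsynced orders; a failure leaves them queued for the next pass.
        rows = self.db.execute("SELECT id, body FROM orders WHERE synced = 0").fetchall()
        for order_id, body in rows:
            try:
                req = urllib.request.Request(
                    self.sync_url,
                    data=body.encode(),
                    headers={"Content-Type": "application/json"},
                )
                urllib.request.urlopen(req, timeout=5)
                self.db.execute("UPDATE orders SET synced = 1 WHERE id = ?", (order_id,))
                self.db.commit()
            except OSError:
                break  # offline again; retry on the next sync pass

store = OfflineFirstStore()
store.save_order("ord-001", {"style": "TH-FW18-042", "qty": 120})
store.sync()  # no-op if the endpoint is unreachable; the order stays queued
```

The point of the pattern is that showroom workstations never block on the network: orders are captured locally the moment a buyer confirms them and reconciled with the central system later.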
Scaling and expanding to keep pace with growth
The Couchbase data platform scales quickly and easily, supporting growth and expansion of the Digital Showroom as well as other innovation initiatives that help the company improve business processes. Tommy Hilfiger can also continuously add to the number of collections available through the Digital Showroom and expand the concept to locations across the world. Since the first showroom launched in Amsterdam, the concept has grown to 24 Digital Showroom theatres with 59 workstations, in nine cities across the globe: Amsterdam, Milan, Paris, London, Dusseldorf, Stockholm, Copenhagen, New York, and Hong Kong.
Under an ambitious rollout plan, the aim is to have Digital Showrooms in more than 25 locations worldwide with over 100 workstations by the end of 2018. With Couchbase powering the Digital Showroom, Tommy Hilfiger can offer a consistently engaging, integrated, and seamless brand buying experience across every showroom, regardless of market or customer size, anywhere in the world.
“Our Digital Showroom revolutionizes the buying and selling journey for our retail customers and internal sales teams,” said Daniel Grieder, CEO, Tommy Hilfiger. “We are passionate about providing our clients with the best service, experience, and quality. Our Digital Showroom concept completely reimagines the traditional buying approach and establishes a new fashion industry benchmark for business-to-business sales. The concept also supports our ongoing focus on efficiency and will significantly streamline and enhance the Tommy Hilfiger sales experience.”
Wayne Carter, chief architect of mobile at Couchbase said: “Ultimately, the future of retail is digital. It’s no exaggeration to say that Tommy Hilfiger and its Digital Showroom are changing the fashion industry forever and its success is a model for all other brands to follow. Ten years ago, this kind of project would have been unimaginable, with the digital experience sorely lacking. Today, however, digital is as good as real; and often better. In time, both consumers and retailers will expect the digital experience and in-person experience to overlap seamlessly. The ability to access data, and use it to engage with these audiences, will be critical to ensuring success.”
Boosts service quality while reducing costs compared to previous operating system.
In the global cloud hosting arena, City Network stands out from the pack thanks to superb service quality and a unique focus on industry-specific regulatory compliance. Always on the lookout for new ways to enhance this competitive edge, City Network realized Ubuntu offered a highly compelling alternative to its existing, legacy Linux operating system. Easier to use, easier to manage, and backed by Canonical’s expert support, Ubuntu is helping City Network reach new heights of quality and customer satisfaction.
Marrying compliance, transparency, and agility
Today, agility is the key to business success. Companies in every industry are striving to deliver new services more quickly, and they are constantly looking for ways to increase the pace and cost-effectiveness of innovation.
This trend has led to a significant rise in the popularity of infrastructure-as-a-service (IaaS). One of the primary elements of any digital transformation, cloud hosting can deliver the scalability, performance, and flexibility to dramatically improve time-to-market on new products and services. Yet, for many businesses, stringent laws and regulations make it difficult to adopt IaaS while remaining compliant. This is a problem in particular for highly regulated sectors such as financial services, but with the EU General Data Protection Regulation looming, compliance is becoming an increasingly widespread concern.
This is where City Network comes in – bridging the gap between agility and compliance.
Johan Christenson, CEO of City Network, explains: “The IaaS market is dominated by giants, yet those giants largely ignore the compliance niche. In addition to our flagship public cloud, we also offer semi-private, private, and hybrid ‘Compliant Cloud’ services. These services are tailored to satisfy a vast number of international ISO standards and industry-specific regulations, so our customers can enjoy the advantages of cloud hosting without having to worry about compliance.”
For City Network, quality is everything, so it is always alert to new ways that it can improve its services and sharpen its competitive edge. As a strong proponent of openness and transparency, in 2014 City Network became the first European hosting provider to offer OpenStack to its customers, and today City Network runs the most OpenStack-based public cloud nodes in the world. Yet recently, the company decided that its existing operating system was not providing the best platform for its OpenStack services.
“We knew that there was room to improve when it came to our underlying operating system,” continues Johan Christenson. “It’s important to us to be as nimble as possible, yet the Linux distribution we were using could be unwieldy and it lacked responsive support. What’s more, it wasn’t especially cost-effective. It was clear that the time had come to find a new solution.”
New platform, new partner
City Network set its sights on finding an alternative operating system that was easier to manage, more reasonably priced, and better supported, and Ubuntu emerged as the ideal solution.
“We chose Ubuntu for several reasons,” explains Johan Christenson. “The first is quality: the processes that Canonical has put in place made us confident that Ubuntu is the best option for supporting OpenStack on a global scale.
“Second is ease-of-use. Ubuntu is highly intuitive, so our devs can work quickly and accurately, and simplicity brings quality. Additionally, the flexibility of how Canonical delivers OpenStack makes it much easier to implement compliance in a scalable way.
“Canonical’s support were also more responsive to our needs, and they were keen to work with us to address our specific challenges.
“Ultimately, we found that Ubuntu was less expensive and more effective than our existing operating system. Gradually switching to Ubuntu was a unanimous, corporation-wide decision.”
Once City Network received the go-ahead from its regulatory team, the company immediately went ahead with its plan to make Ubuntu its platform of choice, beginning the process of implementing the operating system at all of its 20 data centres worldwide.
To manage its Ubuntu deployments and streamline new cloud launches, City Network is utilising the tools and support delivered by Canonical through the Ubuntu Advantage package. As an Ubuntu Certified Public Cloud Partner, City Network has the further benefit of regularly updated, certified Ubuntu cloud images, which guarantee optimal performance.
Partnering with Canonical also gives City Network the option to sell Ubuntu Advantage on to its own users. This opens up additional revenue streams for City Network, and enables its customers to enjoy direct support from Canonical.
Johan Christenson comments: “Banks and insurance companies demand a very high level of security and support. So being able to offer Ubuntu Advantage is critical for us.”
Despite the challenges involved in migrating such a large production environment to a new platform, the process is going as smoothly as can be expected, and City Network is on track to complete the move by the end of the year.
Quality, flexibility, value
So far, City Network has transitioned seven data centres over to OpenStack on Ubuntu, and it is already seeing considerable benefits.
“Few companies run OpenStack operationally on the scale that we do,” says Johan Christenson. “Being able to run so many connected data centres is massive. With Ubuntu we can keep the quality up and upgrade without downtime. It works really well.”
Since Ubuntu is so much easier to work with, City Network’s employees are significantly happier. The company’s operating costs are also lower, and it is able to pass on these savings to its users.
Most importantly, Ubuntu is leading to happier customers, as Johan Christenson explains: “We’re coming in with a package that customers really appreciate. We’re delivering high quality at a fair price, and our clients put their trust in our services: not just in City Network itself, but in the operating system and support as well.”
Switching to Ubuntu positions City Network perfectly to support digital transformation in the financial services sector. As the drive towards agility grows, financial institutions are becoming increasingly interested in using Ubuntu as their core operating system.
“Recently, one of Sweden’s leading banks engaged us to host the infrastructure for the heart of their business,” confirms Johan Christenson. “This is the first time City Network will be hosting the mission-critical applications of such a large bank, and Ubuntu was essential in securing the deal. Like us, Canonical are nimble and fairly priced – so together we can provide the flexibility that the bank requires, combined with compliance and value.”
Johan Christenson concludes: “Ubuntu is an ideal fit, both functionally and philosophically. From our perspective, there’s no better operating system for OpenStack.”
Since the earliest ecommerce sites went live online, journalists, retail pundits and internet commentators have spoken about the so-called “death of the high street”. But with two decades having passed since those early days of the internet, the vast majority of retailers are yet to see a complete transition to online shopping.
By Geoff Galat, CMO at Clicktale.
While ecommerce may not yet have forced all traditional retailers out of business, there is no denying that the high street is in a sharp period of decline. According to the latest ONS retail statistics, high street shopping is at a near all-time low, with online sales rising by 21% in the last four years. Seeing this decline, retail commentators predict that half of the UK’s shop premises will have disappeared by 2030, forcing ever more retailers – and customers – online.
Faced with this migration into the digital space, retailers must begin to look for ways to provide their customers with the same standards of experience that they have come to expect in more traditional high street stores.
Simply focusing on converting browsers into customers is no longer enough. Businesses must now look to understand their customers’ digital experiences – tailoring their approach to individual mindsets, personas and buying moods. The “success” of an online store cannot be measured by conversion rates alone; it must be defined by customer experiences, incorporating insights from consumer behaviours, shopping trends and long-term browsing habits.
So how can brands meet this demand and get on board with the digital shift in order to create better customer experiences? Here are five ways that businesses can get started:
To understand and interpret customer needs, brands must reflect on information received from their ‘digital body language’. In our day-to-day lives we obtain information from those around us: non-verbal cues, facial expressions and gestures. In a traditional high street store, this body language can be vital in deciding how and when to approach a customer or how to tailor their experience for the best possible results. In an online environment, however, the inability to read a customer’s body language can make it extremely difficult for retailers to tailor their experiences.
This is where digital body language comes in. Through the use of Experience Analytics retailers can anonymously analyse subconscious online behaviours such as micro-signals and gestures – helping to gain an understanding of the customers’ happiness or displeasure, then adapting accordingly.
This layer of insight is advantageous when creating tailor-made digital experiences for customers visiting the retailer’s online store. Instead of being given a one-size-fits-all interaction, they will have an experience better suited to their personal needs, with the path to purchase becoming more linear and ultimately providing a better business outcome.
Regardless of the data available to them, retailers cannot change the mind-sets of the customers arriving on their websites. What they can do however, is use their insights to create a better environment around these customers.
By analysing a customer’s digital body language, retailers can begin to bridge the gap between customer expectations and the realities of online shopping. Taking on the role of a salesperson on the shop floor, such insights can be used to aid a confused customer, resolving potential issues instantly and preventing them from happening again. Each time a problem is resolved, there is potential for the customer journey to improve – a rewarding outcome for both customers and brands.
Online retailers must learn to accept that every customer is unique, with personality traits, moods and different ways of browsing online environments. The process of understanding digital body language must be carefully achieved over a period of time, as each customer provides something original and valuable. By tracking visitor patterns and clicks, and using visualisation tools such as heat-maps, brands can uncover a wealth of rich ‘body language’ data that traditional analytics would never uncover.
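As a rough illustration of what this kind of tracking produces, the short Python sketch below (an illustrative assumption, not any vendor’s actual analytics engine) bins raw click coordinates into a coarse grid so that the hot and cold regions of a page become visible:

```python
import numpy as np

def click_heatmap(clicks, page_width, page_height, grid=(20, 20)):
    """Aggregate raw (x, y) click coordinates into a coarse grid.

    Hot cells show where attention concentrates; persistently cold cells
    can point to content that visitors scroll past or never notice.
    """
    rows, cols = grid
    heat = np.zeros(grid)
    for x, y in clicks:
        r = min(int(y / page_height * rows), rows - 1)
        c = min(int(x / page_width * cols), cols - 1)
        heat[r, c] += 1
    return heat / max(heat.max(), 1)  # normalise to 0..1 for rendering

# Example: two clicks clustered near a call-to-action, one stray click lower down
grid = click_heatmap([(40, 30), (45, 35), (900, 700)], page_width=1000, page_height=800)
print(grid.max(), grid.argmax())
```

Layered over session recordings, mouse movement and scroll depth, even this simple aggregation starts to expose the ‘body language’ signals described above.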
For many retailers, digital body language remains a new and largely unexplored concept. In reality however, it is only the tip of a much larger trend – the shift towards big data and Experience Analytics.
Once a retailer has developed enough data about the digital body language of its customers, it can begin to cross-examine that data, finding patterns and trends across the brand’s entire customer base. As these insights develop, customer archetypes can be narrowed down and filtered into detailed categories which can become increasingly refined over time. These in-depth insights can prove crucial for understanding the different states of mind that customers experience when they shop.
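A minimal way to picture this archetype-building step, assuming a handful of simple per-session behaviour features rather than any vendor’s actual model, is to scale the features and cluster them:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one visitor session: [pages viewed, dwell time (s),
# scroll depth (%), rage-click count, items added to basket]
sessions = np.array([
    [3,  12,  40, 0, 0],
    [15, 95,  90, 0, 2],
    [4,   8,  30, 6, 0],
    [18, 110, 95, 1, 3],
    [5,  10,  35, 5, 0],
])

# Scale features so no single one dominates, then group sessions into
# behavioural archetypes (e.g. quick browsers, engaged buyers, frustrated visitors).
X = StandardScaler().fit_transform(sessions)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

Refining the feature set and the number of clusters over time is what gradually turns raw behavioural data into the detailed customer categories described above.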
Over time it will become clear how important it is to understand digital body language and have customer experience embedded at the very heart of online retail brands. The ability to construct an extensive database of customer knowledge gives retailers scope to build rapport with customers, allowing both parties to ultimately benefit from this improved relationship.
Retail is evolving at a faster pace than any time in recent memory, with mobile firmly established as the platform of the future. Retailers that are still merely ‘cobbling’ platforms together risk being left behind as customers want a quick and seamless omni-channel experience and will go elsewhere if brands cannot provide this.
By Scott Clarke, Chief Digital Officer and Global Consulting Leader for Retail, Consumer Goods, Travel and Hospitality, Cognizant.
So what is the key to maintaining a competitive edge? Simply put, it is technology: the catalyst for getting consumers into stores, the brains behind every interaction, and shaping every angle of the customer experience. As omni-channel blurs the line between online and offline, retailers should make it their mission to improve the customer’s shopping experience on both. Here, we discuss the five biggest trends set to boost the customer experience and brand loyalty in retail over the coming year.
1. Conversational AI
By 2020, more than two billion people will use conversational AI on a regular basis. It will become a vital – and possibly the primary – tool for finding, researching and buying products.
In a conversational AI world, virtual assistants will search, open, fetch, command and engage the dozen or more websites, portals, apps and systems we all interact with on a daily basis. “My virtual agent does that” will become the new “there is an app for that”. And whether businesses sell value meals, luxury clothes, airline tickets, sports cars or hotel rooms, there are two important considerations to make a success of the voice-enabled marketplace. Firstly, retailers must make it easy for their customers to find them when they are simply using their voice. Secondly, they must make the buying process seamless once consumers connect to the brand. While some retailers have already made headway, others need to update their strategies to make sure they are prepared for this voice enabled reality.
2. Ultra-fast delivery
Speed and convenience have been key drivers in the retail sector for years. Research shows that 72 percent of consumers would spend more if they could be sure of same-day delivery, while figures indicate the UK high street missed out on almost £5 billion in sales mainly due to not having a same-day delivery option. But distribution is about to be super-charged, with Amazon already talking about 30-minute drone delivery as the capabilities of drones improve day-by-day.
Advancements in autonomous piloting, ‘sense and avoid’ technologies and increased battery life mean that delivery drones show promise to be the next disruptive technology within the retail supply chain. In fact, the market is projected to reach $5.59 billion by 2020. Businesses would do well to start investigating strategies now to see if they would fit their business model. Whether to speed up retail deliveries or bring key parts and equipment to heavy industry projects, drones have the potential to enhance process efficiency and reduce costs.
3. Providing richer customer experiences
As technology drives greater innovation and access to new products, the differentiator becomes all about the customer’s experience with the brand. We anticipate more and more retailers will increase their focus on creating omni-channel shopping experiences that are highly personalised, contextual and compelling. The focus will be on delighting customers at each point of interaction, regardless of channel.
With more shoppers engaging in an omni-channel process before purchasing, it will become increasingly important to integrate physical and digital retail experiences. Utilising new technologies such as virtual agents, mobile wallet payments, beacons, face and object recognition, magic mirrors, and smart shelves, will help provide richer customer experiences, drive more footfall to physical stores and increase sales conversions.
4. Robotic tailoring
Personalisation is the battleground for various industries, including retail. However, in many cases, too much choice has diminished the experience of the customer, not enhanced it. We all feel this from time-to-time: too many different brands, colours, shapes and materials to choose from can actually be off-putting and stop us from shopping altogether.
In the future, retailers will produce bespoke and tailored clothing items in a matter of hours, rather than days or weeks. Consumers will no longer go into a store and buy a small or medium-sized jumper. 3D body imaging in-store will allow retailers to retain customers’ measurements and preferences, enabling them to create tailored clothing quickly and en masse.
5. Smart stores
Admittedly, many brands have ploughed money into integrating technology into their physical stores to compete for footfall. However, it may not have taken off as expected, because many went head first into the investment without ensuring that the appropriate back-end infrastructure, or training for sales assistants, was in place to make the most of the benefits that in-store technology can bring. There is simply no point in retailers spending huge amounts of money on in-store technology if it does not work properly and nobody knows how to use it.
However, when done well, smart stores have the ability to boost the bottom line. High-end retailer Rebecca Minkoff saw sales shoot up by more than 200 percent following the introduction of interactive touch-screens that let shoppers choose products to be sent to their dressing rooms. These dressing rooms also include interactive mirrors that can adjust lighting and contact sales associates. Screens will allow people to view the clothes they are trying on in different colours, sizes and looks, completely personalising the customer experience.
Start preparing now, the future is not that far away
If this decade thus far has taught us anything, it is that technological disruption is unpredictable. Mobile shopping may be the driving force in retail today, but in another decade, virtual and augmented reality could be shaping consumer trends. In retail, as in fashion, no one size will fit all, so retailers must use the data that they have at their fingertips to tailor their services to every individual customer’s preferences.
Our recent survey – The Digital Transformation PACT – found that many organizations are starting to leverage new technology which will radically change the way they do business. We tend to think of these organisations as financial institutions experimenting with blockchain, or manufacturers utilising IoT. What we don’t often think of is how digital is transforming traditional industries like agriculture.
By James Maynard, Offering Management Director – Innovative IoT Business Unit at Fujitsu.
Our survey found that 86% of businesses leaders believe that the ability to change will be crucial to their business’ survival in the next five years. As such the pairing of technology with farming and agriculture is what will shape and drive the agricultural industry in the years to come.
The forces driving this shift are, in many ways, bigger even than technology. The Food and Agriculture Organization of the United Nations predicted in 2009 that globally we need to produce 70% more food for an additional 2.3 billion people by 2050. The socio-economic pressures of this type of demand on our resources are huge, and in this case technology can act as an enabler for solutions to what is a very serious and complex problem.
To cope with demand, and drive efficiency in the production process, farmers are increasingly turning to more advanced technology than they used just five years ago. At Fujitsu we’ve been experimenting with human-centric IoT initiatives like our UBIQUITOUSWARE solutions; what we didn’t expect was to develop something for livestock: a ‘Connected Cow’ to support livestock farmers.
The Connected Cow is a system whereby a pedometer monitors the steps a cow takes in a 24-hour period. It sends this data to the cloud, analyses it, and then accurately identifies when oestrus – the period of fertility – starts. This data goes to the farmer’s smartphone, tablet or PC, and lets them know when they can artificially inseminate the cow within the optimal time frame.
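Fujitsu’s production model is not described in detail in this article, but the underlying idea can be illustrated with a much-simplified sketch that flags any day whose step count rises well above a recent rolling baseline; the window and threshold values here are illustrative assumptions only.

```python
def flag_elevated_activity(daily_steps, window=7, threshold=1.5):
    """Flag days whose step count exceeds the rolling average of the
    previous `window` days by `threshold` times (a simplified stand-in
    for the real oestrus-detection model).

    daily_steps: list of step counts, one per 24-hour period.
    Returns the indices of days with unusually elevated activity.
    """
    flagged = []
    for i in range(window, len(daily_steps)):
        baseline = sum(daily_steps[i - window:i]) / window
        if baseline > 0 and daily_steps[i] > threshold * baseline:
            flagged.append(i)  # elevated activity: candidate oestrus onset
    return flagged

# Example: a week of normal activity followed by a sharp spike on day 8
steps = [6200, 5900, 6100, 6000, 6300, 5800, 6100, 10400]
print(flag_elevated_activity(steps))  # -> [7]
```

An alert pushed to the farmer’s phone or tablet would be driven by flags of this kind, turned into a notification within the optimal insemination window.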
With this Connected Cow technology the success rate of artificially inseminating cows rises from 44% to 90%. It has been shown that improving the detection of oestrus in dairy cows by 10% above the national average can improve profitability by 0.97p /litre.
It is an example of how the Internet of Things (IoT) as a whole is less about the individual components that collect the data, and more about the systems that can solve problems or industrial challenges in a practical way.
With increasing demand for more advanced technology, it comes as no surprise that a recent survey by McDonald's of UK farmers found that tech talent is becoming crucial for the farming industry.
Speaking to farmers across the country, 61% said they believe technology will have an impact on their business over the next five years.
Three quarters said they would need more access to digital and technology skills and more than half to data and coding knowledge. A further 81% went on to say that access to the right skills is their top priority over the coming 12 months.
In an industry that every individual in the world relies upon, farmers simply cannot afford to stand still - so it’s encouraging to see British farmers are being front-footed in their investment in technology and skills.
Our Digital Transformation PACT survey looked at the ingredients businesses need to successfully digitally transform, and found that businesses must focus on four strategic elements: People, Actions, Collaboration and Technology. It’s the people element that those in the farming industry are struggling with most: merging access to technology with the ability to apply it day-to-day.
Across the board, while organisations recognise the role of people in digital success and are taking steps to increase skills, there remains a problematic skills gap, and businesses are conscious of the further impact of technological change.
As a technology provider we understand the need to support both industries in terms of solutions, but also the importance of working with educators to ensure there are enough people coming into the talent pool with the right skills to cope with the socio-economic pressures that industries like agriculture are facing.
Indeed schools are increasingly collaborating with industry partners to bring technology into the learning experience, and to provide learning that is practical and brings a sense of realism to subjects that can be difficult to comprehend.
At a time when British farming is facing a number of challenges, understanding how technology can be applied to empower farmers to achieve better results - while ensuring they have access to the right technology skills - is a vital step towards ensuring UK farming thrives.
The wearables market is one of the most talked about industries today, with its uses spanning far and wide across both consumers and businesses. Furthermore, it looks set to be a highly lucrative industry, with CCS Insights predicting it will be worth $20 billion by the year 2020.
By Dr. Shane Rooney, GSMA.
Although initially holding a reputation for being a ‘fun’ consumer-orientated market, wearables are now having a marked impact on a number of industries, proving to offer an enticing business opportunity. It seems consumers are also recognising the benefits, with the worldwide wearables market showing positive shipment growth at 10.3% year over year, reaching 26.3 million during the second quarter of 2017 alone, according to IDC.
While there are many examples of the growth of wearable technology across different sectors, healthcare is perhaps one industry that is seeing the greatest benefit, where devices are being used to track not only general fitness, but the well-being and safety of people across the globe.
Connected wearable devices, such as wristbands and heart monitors, also offer benefits to the elderly, increasing access to health information and driving healthcare efficiencies. Heart monitors, for example, can help to monitor vital signs such as temperature and heart rate, but they are also proving indispensable when it comes to signalling when elderly people suffer a fall. If the wearer of a connected device falls, the device can automatically alert friends and relatives, reducing the length of time they spend on the floor and the likelihood that they will need to be admitted to hospital.
Why mobile matters
Connected wristbands need to be fully mobile, rather than relying on a nearby hub or phone to communicate data. A fully mobile device can track the wearer’s location and status both inside and outside the home. To minimise the need to replace or recharge batteries, wearable devices also need to be very power-efficient. Employing low power wide area (LPWA) connectivity could increase a wearable’s battery life fivefold in comparison to conventional 2G cellular connectivity, according to Machina Research, meaning they work in a better and safer way, for whoever is using them.
LPWA networks are an emerging, high-growth area of the Internet of Things that complement and extend conventional wide area networks that make use of 2G, 3G and 4G cellular technologies. These types of networks are designed for low cost applications that have low data rates, long battery lives, long reach and operate in remote and hard to reach locations where existing mobile technologies are unable to penetrate.
Some of these gains could be used to streamline the form factor of the wearable device, as well as to increase the frequency of transmitted sensor readings, improving the related analytics. If they are equipped with a voice-capable LPWA technology, such as LTE-M, wearables can also support mobile voice communications to make emergency calls.
An example of this is happening right now in Denmark, where TDC Group is piloting a wristwatch that can monitor the wearer’s vital signs, providing live healthcare data to clinics and hospitals. TDC’s joint venture with Leikr and a local start-up MedHub is supplying the watches for the healthcare pilot using NB-IoT technology.
Just as the technology within the wearable device itself continues to evolve, it is clear that so too will the way in which wearables are used, to meet ever-changing needs and use cases. However, what will not change is the need for these connected devices to be mobile, always connected and fully operational – even more so as they move further into the healthcare and assisted living space, where real-time updates are fundamental. The unique attributes of NB-IoT technology and LPWA networks mean this type of connected infrastructure has the potential to support a number of wearable technology use cases – allowing innovators and operators to ensure successful deployment of new devices as the IoT industry takes hold.
The world today has a vast problem with food – from a lack of biodiversity to excessive wastage, from poor health linked to over-consumption to massive food poverty. We grow enough food to feed 12 billion – far in excess of the seven billion population – yet more than one billion people are underfed. The UN estimates that, on our current path of food consumption and waste, by 2050 we will reach a tipping point and the world will be in a food crisis.
The problems extend from agriculture all the way through the food supply chain to the home, where food wastage – in more economically developed countries at least – is excessive. The UN target calls for the world to cut per capita food waste in half by 2030 – but while changing consumer education and expectation is essential, as is the drive to increase biodiversity, it is within the food supply chain that these changes will come together. Without democratising an incredibly consolidated food supply market, it will be impossible to reduce wastage, embrace innovation and change consumer behaviour. Systemic change is essential.
The way in which consumers have been educated to purchase food – both in store and in restaurants – has changed radically over the past few decades. Following significant consolidation, both retail and restaurant markets are dominated by a small number of organisations delivering a consistent and stable customer experience, one that offers products of identical size, shape and price irrespective of season or country of origin.
Of course, a sizeable proportion of fresh produce will never meet these unrealistic criteria. By creating a consumer expectation for blemish-free goods and specific size, food purveyors have built a market predicated on waste. Even if these ‘non-perfect’ items can be reallocated to sauces or ready meals, damage will occur at each stage of sorting and sifting that will result in further wastage.
Yet what has been achieved by this approach? Economically it is flawed, with subsidised agriculture and incredibly low margins for producers and retailers alike. Consumer populations – certainly in more economically developed countries – are less healthy, due in no small part to excessive consumption and the increasing use of excessive processing to address food safety concerns, especially regarding fresh food, and to extend shelf life. Yet, while much of the population feasts on unhealthy, processed food, by 2027 the world could be facing a 214 trillion calorie deficit. Something has gone awry with the global model of food production and consumption.
Over the past 50 years, the economies and ethics of food production have fallen out of sync. Farmers do not want to produce food that is wasted but every aspect of this low margin model results in wastage. Fears regarding food safety combined with failure of cold chain equipment leads inevitably to food being destroyed. But basic process failures are just one aspect of the problem.
The sheer cost of managing suppliers to ensure product consistency and safety makes it difficult for retailers to embrace new, innovative providers; whilst those with existing contracts cannot afford any risks associated with late delivery or under supply, and hence build in significant contingency. The result is not only more wastage but also minimal opportunity to invest in innovation, to explore opportunities for new, healthier food options or embrace automation to improve efficiency.
Clearly the systemic change required if the world is to avoid the predicted food crisis cannot be achieved overnight. In a difficult, low margin market, with small numbers of players fighting hard to retain share, it is incumbent upon innovators and disruptive market players to leverage digitisation to drive that change.
The most obvious role of digitisation is in minimising avoidable waste. When one in three freight journeys in the UK involves food, the use of real-time information to improve routing and distribution planning is key to improving resource utilisation. In addition, using existing sensors on refrigeration units, heating units and air conditioning systems to raise alarms when problems occur – enabling immediate rerouting or reallocation of items – plus the use of predictive maintenance to avoid equipment downtime, can have a very significant impact on food wastage.
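As a simple illustration of the first of these ideas, the following sketch (with assumed field names and thresholds, not any particular vendor’s telemetry format) turns raw refrigeration-unit readings into alarms that could trigger rerouting or a maintenance call.

```python
# A minimal sketch of turning refrigeration telemetry into actionable alarms.
SAFE_RANGE_C = (-2.0, 5.0)  # illustrative chilled-goods temperature range

def check_reading(reading):
    """reading: dict like {"unit_id": "TRL-42", "temp_c": 7.3, "door_open_mins": 12}"""
    alarms = []
    low, high = SAFE_RANGE_C
    if not (low <= reading["temp_c"] <= high):
        alarms.append(f"{reading['unit_id']}: temperature {reading['temp_c']}°C outside safe range")
    if reading.get("door_open_mins", 0) > 10:
        alarms.append(f"{reading['unit_id']}: door open for {reading['door_open_mins']} minutes")
    return alarms

for alarm in check_reading({"unit_id": "TRL-42", "temp_c": 7.3, "door_open_mins": 12}):
    print(alarm)  # in practice this would trigger rerouting or an engineer call-out
```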
This approach is already being used by forward-thinking organisations that are using digital and automation strategies today to reduce avoidable loss of food, achieve huge reductions in reactive maintenance costs and even cut customer complaints. Add in the use of real-time data to support a comprehensive energy management strategy incorporating a range of different metrics, from seasonal differences to equipment reliability, and organisations can radically reduce annual power consumption. Together these changes result in a reduction in revenue expenditure of tens of millions and, in large estates, percentile point gains on capital employed can run into many hundreds.
Critically, this is being achieved by layering digitisation over existing infrastructure – clearly it is not feasible for retailers to rip and replace control infrastructure across hundreds or thousands of locations. The impact on both profit and customer experience would be hugely damaging.
Instead, by leveraging edge-based processing to ensure information from existing equipment throughout the supply chain is both actionable and actioned to make immediate changes, retailers are able to achieve IoT capacity at pace and with no downtime. It is this frictionless approach to digital adoption that will be key to releasing measurable value.
With this approach organisations can achieve a significant revenue uplift - without the need for massive investment. Indeed, it is the compelling ROI from this initial step of leveraging existing equipment that will be key to providing the investment that will underpin the next level of digitisation – the use of traceability systems to manage the advocacy, source and safety of food.
With the ability to confirm not only that products have been correctly produced but that they have followed the correct processes at every stage of the supply chain, from farm to retailer, digitisation provides a full audit trail of trusted information. This approach delivers low cost governance, radically reducing the cost of supplier ownership for retailers and opening up new opportunities for suppliers to enter the supply chain and create the democracy that is essential to enable innovation.
And it is this innovation that will be key to moving away from the entrenched practices of food procurement that have embedded consumer expectations and misunderstanding. A democracy of participation within the food market will help to educate consumers, improve understanding of food quality and the implications to health, and facilitate the introduction of new products and practices, including biodiversity, that deliver a new consumer experience.
A more predictable marketplace will also encourage investment, enabling SMEs to enter and embrace automation to replace the reliance upon cheap labour to improve productivity. The result should be not only less wastage and a fairer distribution of food globally but also a better consumer experience with access to fresher, healthier and less heavily processed food. In effect, the adoption of IoT to minimise avoidable waste within the retail cold food chain is the essential first step towards full digitisation throughout the food production lifecycle – digitisation that will underpin the global response to the developing food waste crisis.
A fundamental change to the global supply chain will take time. But there are very significant changes that can be made today that not only begin to address the wastage endemic within the food chain but also release the investment required to support the adoption of digitisation throughout the infrastructure that will be key to transforming the end to end business model.
It is by embracing digitisation to improve food safety and advocacy that the market can democratise access in order to generate the innovation key to making fundamental change, from automation to enhanced productivity to improving consumer education and supporting essential change in global food production and consumption.
There are great efficiencies to be reaped by manufacturers who are willing to incorporate the latest technological innovations into their existing manufacturing systems. The Internet of Things (IoT) has been well-documented as a likely fundamental pillar of the upcoming industrial revolution, set to upend the way we live and work. However, its applications within the manufacturing sector are limited without the involvement of artificially intelligent software.
By Chris Proctor, CEO of Oneserve.
The manufacturing industry has had a somewhat paradoxical relationship throughout history with technological innovation. In some instances, the manufacturing industry was at the forefront of developments, with inventions such as the linear assembly line driving an industrial revolution that defined the 18th century. Conversely, in recent times, manufacturing machinery has drawn a reputation for being old and cumbersome.
As a new technological era rapidly approaches, businesses are becoming acutely aware of how new software could revolutionize the way they work. It’s crucial that manufacturers look to incorporate new technologies within their systems, not only to stay competitive within the market, but also to make use of the vast financial and temporal efficiencies that they offer.
The Internet of Things (IoT) refers to the interconnection of devices and applications via the internet. Essentially, IoT is a comprehensive network of devices that collect and transfer data to and from one another over the internet. The most commonly used example of IoT in action is connected household appliances like smart fridges or smart energy meters, but the reach of IoT is vast and pervasive, and has particularly useful applications in the world of manufacturing.
A recent research study predicted that there would be 50 billion connected devices by 2020, putting the total global worth of IoT at $6.2 trillion. The manufacturing industry is set to contribute $2.3 trillion to this total by 2025 – a dramatic indication of the capacities these technologies have in the sector.
These devices collect data in enormous quantities via the ‘dumb’ sensors embedded in them – hence the term ‘Big Data’. When harnessed correctly, this data can provide extremely valuable insight for businesses into how their devices are operating.
By nature, due to the huge amount of machinery and devices that it employs, the manufacturing industry has potential access to extensive amounts of data detailing the operational capacities of each system. The problem, though, is that the data sets are so large that they are near impossible to interpret using existing processing systems, and without interpretation, they are of little use.
In short, there is a glaring discrepancy between the potential insight that manufacturers could have using the data they already hypothetically possess, and the observations that they are able to make using their current, legacy processing systems. This is where AI, or machine learning, is crucially required.
AI software, and in particular machine learning programs, is able to close the gap between IoT and manufacturing by allowing businesses to better understand just how their machinery is working.
Sophisticated algorithms are able to interpret historical data sets much faster than existing systems are capable of, and in almost real-time. The analysis allows them to identify patterns and trends, which they can then use to inform later interpretation.
In essence, these algorithms are constantly 'learning' which offers an increasingly insightful interpretation over time. These algorithms are becoming so advanced that it is predicted they will eventually be able to function without any human or manual involvement, but we're still a long way from that.
Perhaps the most valuable application of AI in a manufacturing context is its ability to provide predictive maintenance. Intelligent machine learning algorithms are able to interpret historical data sets, and monitor activity to the extent that future failures can be predicted before they happen. By identifying patterns, this technology understands what the warning signs are for a potential system failure, and will alert the user to this before the failure happens, even autonomously arranging the appropriate specialist to come in and rectify the issue.
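A minimal sketch of that idea is shown below: a classifier is trained on historical sensor readings labelled with whether a failure followed, then used to score the latest reading. The features, figures and model choice are illustrative assumptions rather than any specific vendor’s system.

```python
# A minimal predictive-maintenance sketch: learn from labelled historical
# sensor readings, then estimate the failure risk of the latest reading.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Columns: vibration (mm/s), bearing temperature (°C), motor current (A)
X_history = np.array([
    [1.2, 55, 10.1],
    [1.3, 58, 10.4],
    [4.8, 81, 14.2],
    [1.1, 54, 10.0],
    [5.2, 85, 15.0],
    [1.4, 57, 10.3],
])
y_failed_within_week = np.array([0, 0, 1, 0, 1, 0])  # 1 = failure followed

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_history, y_failed_within_week)

latest_reading = np.array([[4.5, 79, 13.8]])
risk = model.predict_proba(latest_reading)[0][1]
if risk > 0.5:
    print(f"Failure risk {risk:.0%}: schedule an engineer before the line stops")
```

A production system would train on far larger data sets and feed the prediction into a scheduling workflow, but the pattern – learn the warning signs, score new readings, act before the failure – is the one described above.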
Applying AI to the vast data that is collected by a seemingly endless network of interconnected devices also helps businesses make better decisions. With the analysis provided by these algorithms, businesses have tangible and sophisticated numerical evidence to support decisions that would otherwise have had to be based, at least in part, on intuition and estimation – for example, whether a particular aspect of a manufacturing system is prohibitive in its functionality and whether it needs to be replaced.
This has unprecedented potential to save time and money for manufacturing companies, mostly because machine downtime is extremely costly for businesses in both financial and temporal terms. A recent research study found that machine downtime can cost businesses up to £18,000 per minute in lost productivity. Our research found that by employing predictive maintenance systems, over a third of businesses (38%) can save around 30 minutes of downtime per day, saving them around £525,000 per year, per company. These systems have the potential to streamline efficiencies so greatly that they could reduce machine downtime almost completely.
It is clear that there are vast financial and temporal benefits to manufacturers who can not only embrace AI but link it to their existing IoT-enabled machines. However, it is also crucially important that businesses recognize just how pervasive the reach of such technologies will be in the near future. In order to stay competitive in a market that is becoming increasingly contested as manufacturing tech continues to progress, businesses must be fully aware of the disruptive potential AI and IoT have, and their capacity to define another industrial revolution.
Many businesses are sitting on a tremendous number of data sets that, when harnessed appropriately, provide a host of benefits. The analysis that is available through harnessing AI can provide businesses with discernable evidence to guide them through complex decision making, and can diminish machine downtime saving huge amounts of time and money.
Fundamentally, these benefits cannot be reaped without the application of AI. The machine learning offered by these systems provides insight and analysis that simply cannot be replicated by humans at the same level, on the same time frame. On the verge of another industrial revolution, it is imperative that manufacturers look to the potential AI has for them in closing this information void.
In the data centre industry huge advances in critical infrastructure technology are made year on year.
By Steven Carlini, Sr Director-Innovation, Data Center IT Division, CTO Office, Schneider Electric.
Looking back to 2017 a trend that grew to be a dominant force in the industry was Edge Computing. Cisco estimates that within five years 50 billion devices or ‘things’ will be connected to the Internet, and according to IDC, by the year 2020, around 1.7 megabytes of new information will be created every second for each human being.
As more devices become equipped with IoT-ready sensors – a market that’s predicted to be worth around $7.1 trillion within the next five years – we can expect the stats around data from both Industrial IoT and Data Centre applications to increase further. All of this digital information will still need to be managed, processed, protected and analysed locally, meaning that Edge will continue to be adopted widely.
At Schneider Electric we also believe that Edge will continue to play a crucial role in supporting companies beginning to adopt or utilise Hybrid-Cloud services. Quick-to-deploy, prefabricated solutions such as Schneider Electric’s award-winning Micro Data Center Xpress™ will play a key part in the infrastructure environments of the future, especially those where latency, connectivity and service availability are critical to the success of the business.
2017 also saw a huge shift towards off-premises IT provided by internet giants, colocation and telecoms companies. A report by 451 Research’s Principal Analyst, Penny Jones, predicted that the multi-tenant data centre (MTDC) market would continue to grow throughout 2017, with an additional 423,000 square feet of colocation space estimated to be available by year-end. It would therefore not be unreasonable to predict that this market will continue to grow throughout 2018, with many new players also emerging.
In 2017 Lithium-Ion chemistry became another well-established and widely adopted battery technology for 3-Phase UPS Systems in hyperscale data centres. What’s also interesting is that throughout 2018 you’ll see Li-Ion begin to emerge within Single-Phase, IoT-ready UPS Systems such as Schneider Electric’s Smart-UPS On-Line.
Lastly, we believe that scalability and speed of deployment will become even more critical factors in today’s data centre environments. In a recent analytical study by Schneider Electric’s Data Center Science Center it was estimated that IT Pod Frames could provide CAPEX savings of up to 15%, whilst reducing time to deployment by 21%. Therefore, technologies utilising IT Pod Frames and rack-ready IT architectures such as Schneider Electric’s HyperPod™ may become more widely adopted within the industry throughout 2018.
Commercial industry has been radically changed by the application of digital technologies, and the pace of innovation continues to march quickly on. The Internet of Things (IoT), which provides the ability to map the physical world into a digital model, is now fundamentally re-defining how businesses across the globe operate.
By Roi Carmel, Senior VP of product and strategy for Perfecto.
Ground-breaking innovations like AI are fast transforming into reality – from cognitive chatbots like RBS’s Luvo, to trailblazing healthcare firms like BenevolentAI, which harness AI-enabled research to make incredible medical breakthroughs.
And it’s particularly good news for Brits as the United Kingdom takes the lead as the strongest AI ecosystem in Europe. A recent report has counted 121 AI firms in the UK, with London clearly the largest hub: more than double the next highest, Germany, which sits at 15.
So digital growth spells opportunity for many, but for some the flip side of digital innovation is digital disruption. This means that established companies and startups alike are enlisting new technologies in the fight to dislodge incumbents, protect entrenched positions - and to better battle their rivals. And the fierce competition that comes with digital disruption means that companies can no longer be complacent. They can either seize the opportunity - like game-changers Netflix or Instagram - or see their business disappear - like Kodak or Blockbuster.
Digital winners are those organisations that leverage technology to redefine how they serve and engage with customers. Software development teams are rewriting how brands deliver on their promise. No longer is a customer’s first meeting with a company in a shop -- or on the phone -- but instead it has become a digital interaction. Of course, the digital landscape is much more than just apps - physical products which harness connected technology are increasingly popular -- but the common theme is that the most successful companies are moving out of the physical world and into an online environment.
And - when we think of digital winners, it isn't hard to provide examples of the companies who get it right. In retail, Amazon has dwarfed its competition - doing a great job of helping consumers realise that their first port of call for anything they need is through their digital channel.
Tomorrow's winners are placing their transformation bets right now. In traditional sectors such as insurance, companies such as Splice Labs are adapting to the ride-sharing and home-sharing trends by offering on-demand, limited-duration policies to fit today’s lifestyles. The same is true for automotive.
The automotive industry is being redefined today. Ride sharing and the treasure trove of data being captured by 100+ sensors embedded in cars will transform how revenue is measured – from units sold to customer lifetime value – and how new data-driven revenue streams are created. But who will be tomorrow’s winners and losers?
For us, one key example of the next stage of digital disruption is in enabling more natural online connections with organisations – as we move from interacting with brands through apps, to interacting with these apps through a human voice interface. Chatbots and voice enabled technology solutions are growing ever more popular as consumers again gravitate towards the easiest channel to deal with brands, and chatbots put additional onus on a company to get it right.
Where traditional touch/click interfaces required the user to understand and follow the logic of the app, chatbots change the dynamic. Companies are now taking responsibility for understanding contextually what customers are trying to achieve and getting them through that process as quickly as possible. No longer must a user navigate to the correct place, or search for a feature – the pressure is now on the supplier to know what they want. This might seem like a small change on the surface, but it introduces a sizeable cognitive difference in how consumers interact with brands, tech and specifically with apps.
Because of this change, chatbots can open new lines of revenue generation. They can prompt consumers towards actions, based on their behaviour and queries, that they might not otherwise initiate. But, like other emerging technologies, while chatbots create opportunities, that opportunity also comes with another layer of complexity to navigate. A growing set of call centre functions, together with the difficulty of processing open-ended conversations, means that integrating new technologies like chatbots into existing systems introduces new challenges.
Voice activated technology and chatbots need to accommodate a broad dictionary, and in the future, support imagery inputs. Languages, voice variation, accents and speech impediments all need to be catered for – as well as steering clear of negative or provocative language - and lastly, usage of “millennial slang” adds another level of complexity to a burgeoning industry. Chatbots are complicated, and the requirements of this technology are continually changing – as customer expectations grow and competition becomes increasingly fierce.
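To make the shift concrete, here is a deliberately naive sketch of intent matching – the intents and keywords are invented for illustration, and a production system would need the machine-learned language understanding described above to cope with slang, accents and open-ended phrasing.

```python
# A toy sketch of intent matching: the supplier infers what the customer wants
# from free text instead of making them navigate menus. Intents and keywords
# here are illustrative assumptions, far simpler than a real NLU model.
INTENTS = {
    "check_balance": ["balance", "how much", "funds"],
    "report_lost_card": ["lost", "stolen", "missing card"],
    "make_payment": ["pay", "transfer", "send money"],
}

def match_intent(utterance):
    text = utterance.lower()
    scores = {
        intent: sum(keyword in text for keyword in keywords)
        for intent, keywords in INTENTS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "handover_to_human"

print(match_intent("I think I've lost my card on the train"))   # report_lost_card
print(match_intent("Can you explain my statement in detail?"))  # handover_to_human
```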
So, defining “what is a digital winner” in an industry is just the beginning. The real challenge is how to get to that winning position.
Can your software development teams consistently deliver flawless innovation?
For lots of companies, one of the greatest benefits of digital innovation is the ability to set big - aspirational - ambitions that are iteratively realised. In a digital world, there are no limits - and goals can be vast. Experts advise that even the largest companies try to “think like start-ups”, embracing a nimble approach where speed and agility is key. Remember:
1. The market is changing fast. Define ambitious goals, then incrementally deliver value. Being able to make changes to adapt is vital
2. You will not get everything right, so being in close touch with your users and being able to change and tweak quickly is key - fail fast and fix fast
3. Customer experience trumps the cool factor. Consumers have an abundance of choices and little patience. Bake quality into planning, development and monitoring cycles - Teams must embrace continuous quality.
Transforming digital experience provides organisations with opportunities and challenges that place enormous pressure on improving how software development teams deliver innovation. But the demand for new products and services from consumers and businesses, and the speed at which digital innovation moves, mean that software, apps and products are often rushed to market, and quality issues inevitably arise. For businesses, bugs and failures can result in lost revenue, disappointed customers and, in the worst-case scenario, brand damage. For individuals, tech that doesn’t work leads to frustration and brand switching. We won’t buy products which don’t do their job, we’ll switch brands (the costs of doing so are ever diminishing) and we’ll share our negative experiences with others. And when we interact with digital services, we expect them to be “always on” and ultra-reliable.
In an environment where Amazon reviews and Twitter posts can make or break a company, ensuring a flawless user experience is paramount. But in a continuously changing market landscape, companies will simply not get everything right. Technology is changing the market more quickly than businesses are used to, and developing with that fundamental truth in mind is crucial.
And although quality is critical to the user experience (more to come on that), some defects will escape to production, potentially disrupting customer experiences. Being able to pinpoint issues, then fix and deploy quickly, is fundamental. Staying close to your customer and understanding their changing requirements and aspirations is crucial.
Of course, software teams focused on delivering innovation fast all know by now that the key to ensuring a compelling and flawless user experience is continuous testing – which integrates fast and complete feedback test coverage – and where products are tested in various real-life scenarios. To inject reality into the development process, developers on agile teams embracing DevOps need a stable test environment capable of recreating all reasonably expected user conditions.
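As a small illustration of what recreating expected user conditions can look like in practice, the sketch below parameterises one test across assumed device and network combinations; the checkout_flow function and the condition matrix are placeholders, not Perfecto’s tooling.

```python
# A minimal pytest sketch of covering varied user conditions in one suite.
import pytest

def checkout_flow(device, network):
    # Stand-in for driving the real app; returns True when the flow completes.
    return network != "offline"

@pytest.mark.parametrize("device", ["iPhone 12", "Pixel 6", "desktop Chrome"])
@pytest.mark.parametrize("network", ["wifi", "3g", "offline"])
def test_checkout_completes(device, network):
    if network == "offline":
        pytest.skip("offline behaviour covered by a dedicated resilience test")
    assert checkout_flow(device, network)
```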
There are four pillars of success:
1. Bring the end user environment into the process - Maximise digital test coverage (have access to the right web/mobile platforms within the test environment)
2. Go Fast - Maximise % of automation (achieve +80% test automation)
3. Learn Fast - Build a fast feedback loop (deliver fast feedback to development teams on quality status)
4. Bake quality in - Embed quality across Dev-QA-Ops (tools that don’t fit slow things down)
Deliver innovation in small chunks of value - Fail fast, fix fast, win big
Digital winners are ultimately those companies, and in particular their software teams, that embrace new digital engagement methods as necessary, game-changing moves.
But doing this means that organisations need to be prepared to make mistakes, but to fail quickly, and be nimble enough – and close enough to their customers – to adapt with speed.
So, we urge the dev teams shaping how brands leverage innovation to re-imagine new ways to serve and engage customers to focus on what we have learned to be the four pillars of success. Perfecto partners with enterprises to help them deliver exceptional digital experiences. In a fast-changing market, we show DevOps teams how they can deliver digital quality at high velocity – and help them succeed where others will fail.
 Source: http://asgard.vc/the-european-artificial-intelligence-landscape-more-than-400-ai-companies-made-in-europe/
Successfully competing in today’s digital world requires every business to become agile and flexible. Competitive advantage increasingly relies on software, leading to the rising adoption of DevOps, which removes silos to get people, processes and tools working together to make the application delivery lifecycle faster and more predictive.
By Matt Hilbert, Technology Writer, Redgate Software.
At the same time data has become the lifeblood of business, making managing databases a critical component of information strategy. Ensuring that applications can access and update data is vital, meaning that databases need to be incorporated into the DevOps process. Otherwise they will simply become a bottleneck, slowing down development and impacting agility. Yet extending DevOps to the database requires companies to bring together areas that have traditionally been managed separately – how can they achieve this and what are the benefits?
To find out, Redgate Software carried out a global study into DevOps and the database. The 2017 State of Database DevOps survey spoke to managers in 1,000 organisations across the world, half of whom employed 500 people or more. Respondents ranged from C-level executives and IT directors/managers to database developers and administrators. The message that came back from the research was clear – DevOps is a central part of their operations, and the database is integral to the process.
Essentially there are five key findings from the research that are relevant to all businesses:
1 DevOps is now mainstream
The vast majority of organisations have either adopted DevOps already (47%) or plan to over the next two years (33%). That leaves just 20% of respondents who aren’t planning a DevOps future, showing the importance of the methodology to the entire industry. Perhaps naturally, the larger the organisation, the greater the chance it is adopting DevOps – nearly three fifths (59%) of businesses with more than 1,000 employees were already using it.
2 What is holding companies back?
The biggest challenge cited by respondents to successfully adopting DevOps was a lack of the right skills, mentioned by 21% of those surveyed. This was just in front of a lack of alignment between the development and operations teams (20%).
Turning to bringing the database into DevOps, the greatest obstacle was seen as a lack of consistency. With different teams involved in application development and database operations, respondents worried that the different ways of working of each would hold back a seamless and smooth process. 32% said synchronising application and database changes would be the greatest challenge, with a quarter (25%) naming different approaches as an issue.
3 Integration is happening
However, despite worries about different approaches between developers and database administrators, the picture is changing, with silo-based working becoming less of an issue. The majority of companies (58%) said integration between teams was either great or good, and only 12% said it was poor. This is heartening to see, as is the fact that 75% of companies already had teams that included developers that worked across databases and applications. This cross-functional approach is key to successfully adopting DevOps across the entire product delivery lifecycle.
4 Why move to DevOps?
Databases are the lifeblood of businesses, and any problems with them swiftly translate into downtime, aggrieved customers and bottom line costs. Introducing new developments and applications can be seen as a risk, with 26% of respondents saying that avoiding failed deployments was their biggest challenge when making changes to the database using traditional methods. 18% cited slow development and deployment cycles and an inability to respond quickly to changing business requirements as further problems with old style database deployment options.
All of these factors are leading people to focus on introducing end-to-end DevOps processes, as these mitigate the risk of downtime, while providing the agility to innovate and out-compete rivals. Over a quarter (26%) said DevOps would increase the speed of delivery, with 17% saying it would free up developers for more added value work.
5 Bridging the consistency gap
Companies may be breaking down silos between developers and database administrators, but in many cases this is only at a relatively high level, with each team using different tools and techniques. For example, 81% of respondents use version control with their applications, but just 58% apply it to database development. The majority of those who have adopted DevOps are doing issue tracking, continuous integration, test automation and automated deployment for their application code, yet only a fifth of respondents apply the same to their database development.
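To make the idea of database version control concrete, here is a minimal sketch of a migration runner: numbered SQL scripts kept in source control are applied in order, and each applied version is recorded in the database. The folder layout and schema_version table are assumptions for illustration, not Redgate’s tooling.

```python
# A minimal sketch of database change control: numbered SQL scripts kept in
# source control are applied in order, and each applied version is recorded.
import sqlite3
from pathlib import Path

def migrate(db_path="app.db", migrations_dir="migrations"):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_version")}

    for script in sorted(Path(migrations_dir).glob("*.sql")):  # e.g. 001_create_orders.sql
        if script.stem not in applied:
            conn.executescript(script.read_text())
            conn.execute("INSERT INTO schema_version (version) VALUES (?)", (script.stem,))
            print(f"applied {script.name}")
    conn.commit()
    conn.close()

if __name__ == "__main__":
    migrate()
```

Because the scripts live alongside the application code, every database change is reviewed, tracked and deployed through the same pipeline – which is the consistency gap the survey highlights.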
Companies clearly see the necessity for applying DevOps across the entire software lifecycle – and recognise the competitive edge that it brings. So, how can they ensure that it seamlessly spans application developers and database developers? Based on the research results, and our experience at Redgate, there are three areas to focus on:
1 Take a holistic view
DevOps is a business imperative, meaning that it needs to be led from a senior management level. Companies should take a high level view across their entire organisation, and look at end-to-end development processes, applying it across all operations if they are to gain real value from it, and remove any bottlenecks.
2 Encourage cross-team working
The old barriers between departments and roles are breaking down according to the State of Database DevOps report. This is good to see and needs to be encouraged and deepened, so that there is close communication and collaboration between everyone working across the development and deployment process.
3 Drive consistency with the same tools and infrastructure
To deliver the real benefits of DevOps businesses need to ensure that everyone in the process is using the same tools and methods. Introducing tools that integrate with and plug into the existing infrastructure contributes to a seamless flow, from development to deployment, and therefore leads to more efficient and faster operations that reduce risk and drive consistency.
In a digital world, DevOps is increasingly central to how businesses operate when it comes to creating, deploying and managing software. Companies can see the benefits – DevOps enables IT to operate in a faster, more seamless manner and to therefore help businesses to innovate and outpace the competition. The database is at the heart of this – now is the time to look at incorporating it into wider DevOps processes if you want to gain the full benefits of the methodology.
The cyber threat landscape has evolved swiftly and dramatically over the last few years. Major stories involving the likes of the NHS, TalkTalk and most recently Uber have jolted the IT departments of many businesses into action, forcing them to consider cyber security as a top priority.
By Gavin Russell, CEO, Wavex.
A large part of this involves ensuring that all relevant corporate data is stored safely and can be quickly restored in the event of a cyber attack, and with the impending deadline for General Data Protection Regulation (GDPR) compliance, this has never been more important. Come May 25th 2018, all businesses must implement a level of transparency around how they are using customer data, and they must also ensure that customers can have their information removed from company databases should they request it. At this point we are all aware of the importance of GDPR compliance, and the severity of the fines that come with failing to take appropriate action in time.
At first, companies might think they can deliver the level of compliance expected of them by applying regular patches and employing anti-virus protection, and of course this is still vital for cyber security protection. But in reality there is much more involved, and a large part of getting over the compliance hurdle is understanding just how much data your business is holding, and where it’s being held.
For a long time now, regularly performing system backups – alongside having a robust disaster recovery solution – has been one of the best strategies for restoring data in the event of an attack, whether via a ransomware email or a distributed denial of service (DDoS) attack. But it also means businesses often have multiple copies of files. If you consider how the majority of businesses perform these backups – typically using a grandfather-father-son methodology – it is more than likely that a single file could exist in ten or more locations, with even more versions potentially undiscovered. Add to this the fact that businesses often structure their data around their clients and internal departments, and that large copies of data are often kept after system upgrades ‘just in case’, and you start to get an idea of the scale of the problem.
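A quick back-of-the-envelope sketch shows why the copy count grows so fast under such a rotation; the retention periods below are assumptions, but the arithmetic is the point.

```python
# Why one file ends up in many backup sets under a grandfather-father-son
# rotation. Retention counts are illustrative assumptions.
RETENTION = {
    "son (daily)": 7,             # last 7 daily backups kept
    "father (weekly)": 4,         # last 4 weekly backups kept
    "grandfather (monthly)": 12,  # last 12 monthly backups kept
}

copies = sum(RETENTION.values())
for tier, kept in RETENTION.items():
    print(f"{tier}: {kept} copies retained")
print(f"A file unchanged for a year can therefore exist in up to {copies} backup sets")
```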
The end result is that businesses have a lot of data – many thousands, if not millions, of files created over the years – and in many cases no clear picture of where much of it is located.
If businesses want to guarantee compliance before the GDPR deadline arrives there are four steps they must adhere to; whether they choose to do it alone or with the help of a trusted IT services provider is up to them. However, following each of these, in order, will ensure that all bases have been covered - not just for 2018, but also for many years beyond.
The first step involves businesses assessing and identifying in which areas they are strongest when it comes to compliance, and which require more work to get to the same point. For example, one business might have an up-to-date system that features a strong firewall and anti-virus software, but no disaster recovery solution in place in case the data is compromised or goes missing.
This step also involves getting a good grasp on where all of your user-identifiable data is held; once identified, the location(s) must be communicated to all relevant and authorised personnel. This is perhaps the most challenging step of the journey, but it can be made easier with the GDPR discovery tools available on the market. These can help you to identify your strengths and weaknesses extremely quickly, and could even pick up on things that might otherwise be missed.
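As a flavour of what such discovery involves, the sketch below walks a folder and flags files containing likely personal data (here, email addresses); the regular expression and file types are illustrative assumptions, and commercial GDPR discovery tools go considerably further.

```python
# A minimal sketch of data discovery: walk a folder and flag files containing
# likely personal data (email addresses in this illustration).
import re
from pathlib import Path

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_personal_data(root):
    hits = {}
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in {".txt", ".csv", ".log"}:
            matches = EMAIL.findall(path.read_text(errors="ignore"))
            if matches:
                hits[str(path)] = len(matches)
    return hits

for location, count in find_personal_data("./shared_drive").items():
    print(f"{location}: {count} email addresses found")
```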
One of the major talking points around GDPR is that customers must be able to have their personal data removed upon request, and so businesses must formulate appropriate responses for those who exercise their right to be forgotten, as well as finalising the processes that should take place whenever this happens.
The only way to guarantee compliance is through thorough self-assessment, and this step ensures that businesses have a pre-defined method of ensuring that all appropriate steps have been taken to operate in line with the regulation. Whether this is done via reports, face-to-face meetings or otherwise is ultimately down to the business and its specific requirements.
GDPR is not a flash in the pan but a long-term commitment, and so a future-facing approach is required. This final step involves assessing and implementing any mechanisms that can be put in place to protect data and processes for years to come, regardless of what might lie ahead.
It’s true – the GDPR deadline is just a few months away now, but it’s not too late to take action and avoid the hefty fines that threaten businesses that fail to comply. Ultimately, compliance is irrespective of business size and is equally relevant whether you have 3 employees or 3,000, and these four steps make the big step towards compliance so much easier.
Nick Sacke, Head of IoT and Products, Comms365, insists shrink-wrapped services are the key to realising the vision of IoT at scale.
Analysts continue to talk up the potential of the IoT market – but it is hard to see how 21 billion devices will be connected by 2020 given the current, highly bespoke IoT deployment model. There is simply no way a handful of, albeit large, suppliers delivering highly bespoke solutions can realise the full potential of IoT.
Today, the sheer complexity of these long, drawn-out, custom-built IoT developments is massively constraining adoption. Even the much-vaunted Smart City projects are losing momentum, with many applications stalled at proof of concept. The potential of IoT will never be achieved until we have ubiquitous delivery – and that means developing solutions that can be deployed to companies of every size by a strong reseller channel. From smart parking to smart warehousing, a new generation of channel-friendly ‘IoT as a Service’ solutions is set to transform IoT adoption.
Instead, resellers that have been keen to become the first to provide IoT solutions to their customers have rebadged the IoT services from mobile carriers. Unfortunately, not only do these services do little to build on legacy Machine to Machine (M2M) offerings, they don’t maximise the true value of the technology. Furthermore, the major carriers are in many cases working directly with enterprises on the largest and most lucrative M2M deployments, so where does this leave resellers in a congested, price-driven market?
It is no wonder that IoT has yet to truly take off with resellers. Where is the revenue stream? What is the value of investing in IoT knowledge and expertise when operators are taking by far the biggest piece of the pie? And where does that leave the vision of a connected world; of millions of sensors providing data that can be captured and analysed to drive new efficiencies, cut costs and uncover revenue streams?
What is in it for the channel? With this approach, a reseller can become a trusted IoT advisor for the end customer – in a similar manner to the way resellers have embraced Unified Communications. By adding IoT knowledge and expertise, plus access to several shrink-wrapped IoT solutions, a reseller can build on its existing strength in understanding a customer’s challenges and creating a business case.
In addition to advocating the service, a reseller can opt to project manage the whole process, even supporting the installation if required. Essentially, the ‘as a Service’ approach provides resellers with a chance to realise customers’ IoT objectives – but without having to undertake any complex, high-risk, bespoke development.
A smart parking system, for example, can be used not only to improve traffic management but also to reveal new revenue streams. One car park in Cambridge uncovered significant missed parking revenue due to customers not paying the minimum one-hour fee when only making a quick stop. IoT-informed analytics resulted in the creation of lower charges for very short stays – generating £500,000 in additional revenue.
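The analysis behind that kind of decision can be surprisingly simple, as the sketch below suggests; the stay durations and tariff are invented for illustration rather than taken from the Cambridge car park.

```python
# A sketch of the analysis behind a short-stay tariff: bucket observed stay
# durations (minutes) from bay sensors and see how many visits are very short.
stays_minutes = [12, 45, 75, 18, 130, 22, 9, 55, 240, 35, 14, 95, 28, 11, 62]

short_stays = [s for s in stays_minutes if s <= 30]
share = len(short_stays) / len(stays_minutes)
print(f"{share:.0%} of visits last 30 minutes or less")

short_stay_tariff = 1.50  # hypothetical price for stays of up to 30 minutes
print(f"Potential new revenue per {len(stays_minutes)} visits: "
      f"£{len(short_stays) * short_stay_tariff:.2f}")
```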
Of course, the sheer logistics of implementing and supporting IoT en masse is daunting for any organisation – which is why the IoT as a Service model relies on an ecosystem of expert companies. Just as the LoRa Alliance is a group of small and large companies successfully working together to deliver LPWANs globally, the IoT ecosystem will drive industry standards for networks, sensors and best-practice deployment. The new generation of IoT as a Service providers will use this ecosystem to ensure resellers have full and complete access to the expertise and capacity required – from sensor manufacturers onwards – to deliver IoT at scale.
And this is key. IoT can deliver efficiency, cost savings and revenue generation – but it will never inspire the trillions in investment predicted by the analysts if every deployment is bespoke. IoT will only become mainstream if the ‘as a Service’ model is adopted. In addition to building on proven, business-driven applications, shrink-wrapping will release IoT from the constraints of expensive, bespoke projects and provide the channel with an immediate opportunity to explore IoT’s compelling potential revenue streams.
I recently read an article which began “you can’t predict a disaster, but you can be prepared for one!” It got me thinking, I can hardly remember a time when disaster recovery was a bigger challenge for infrastructure managers than it is today. In fact, with ever increasing threats to IT systems, a reliable Disaster Recovery strategy is now absolutely essential for an organisation, regardless of their vertical market.
By Richard Stinton, Enterprise Solutions Architect, iland.
What does all this have to do with availability zones, I hear you cry? Furthermore, what is an availability zone and is it a good disaster recovery strategy? The purpose of availability zones is to provide better availability while protecting against failure of the underlying platform (the hypervisor, physical server, network, and storage). They give customers more options in the event of a localised data centre fault. Availability zones can also allow customers to use cloud services in two regions simultaneously if these regions are in the same geographic area.
Let us begin our discussion about availability zones by looking at the core capabilities that provide availability and resilience. Dynamic Resource Schedulers (DRS) provide Virtual Machine (VM) placement. That is, which host should run a given VM? A DRS also moves VMs around a cluster based on usage in order to balance out the cluster. High Availability (HA) provides the capability to restart VMs on other hosts in a cluster when either a host fails, or a VM crashes for any reason.
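To picture the placement decision a DRS makes, consider the toy sketch below, which simply puts a new VM on the least-loaded host that has room for it; the host figures are invented, and real schedulers weigh many more factors.

```python
# A toy sketch of DRS-style placement: choose the least-loaded host with room
# for the new VM. Host figures are illustrative, not any vendor's scheduler.
hosts = [
    {"name": "esx-01", "cpu_used": 62, "mem_free_gb": 48},
    {"name": "esx-02", "cpu_used": 35, "mem_free_gb": 96},
    {"name": "esx-03", "cpu_used": 80, "mem_free_gb": 16},
]

def place_vm(hosts, vm_mem_gb):
    candidates = [h for h in hosts if h["mem_free_gb"] >= vm_mem_gb]
    if not candidates:
        raise RuntimeError("no host has capacity; cluster needs rebalancing")
    return min(candidates, key=lambda h: h["cpu_used"])

print(place_vm(hosts, vm_mem_gb=32)["name"])  # -> esx-02
```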
Now, let us look at the advantages that availability zones offer, as well as areas where they may fall short of constituting an effective disaster recovery strategy. This analysis of availability zone effectiveness will be divided based on three key challenges that cloud providers face: handling crashes or downtime, performing maintenance, and offering sufficient storage.
With this in mind, many service providers talk about a ‘Design for Failure’ model when designing resilient services. In a nutshell, this means designing cloud infrastructure on the premise that parts of it will inevitably fail. Resiliency is provided at the application level. At the very least, this requires the doubling up of all applications, and for many deployments this necessitates additional licensing and additional costs for the VMs themselves.
Another crucial area to factor into this analysis is persistent storage. In the past, storage was protected using RAID techniques. Yet as we move to the public cloud, object storage has appeared as a popular way of storing data. This method uses the availability zone topology to protect data – but only if you choose it and pay for it. To protect against individual disk failure, three copies of the data are spread across the storage subsystems.
For virtual machines requiring persistent storage, Elastic Block Store (EBS) is often used, and is replicated within the availability zone to protect against failure of the underlying storage platform.
EBS storage is not always replicated to other regions. Regardless, having data replicated to another region does not mean that the VMs are available there. It only guarantees back-up storage. VMs would need to be created from the underlying replicated storage. It is also important to note that replicating storage to another availability zone or region only protects against storage subsystem failure. It does not protect against storage corruption, accidental deletion, or recent threats such as ransomware encrypting the files within the storage. To that extent, it does not constitute a Disaster Recovery solution.
As the role of chief data officer (CDO) continues to gain traction within organizations, a recent survey by Gartner, Inc. found that these data and analytics leaders are proving to be a linchpin of digital business transformation.
The third annual Gartner Chief Data Officer survey was conducted July through September 2017 with 287 CDOs, chief analytics officers and other high-level data and analytics leaders from across the world. Respondents were required to have the title of CDO, chief analytics officer or be a senior leader with responsibility for leading data and/or analytics in their organization.
"While the early crop of CDOs was focused on data governance, data quality and regulatory drivers, today's CDOs are now also delivering tangible business value, and enabling a data-driven culture," said Valerie Logan, research director at Gartner. "Aligned with this shift in focus, the survey also showed that for the first time, more than half of CDOs now report directly to a top business leader such as the CEO, COO, CFO, president/owner or board/shareholders. By 2021, the office of the CDO will be seen as a mission-critical function comparable to IT, business operations, HR and finance in 75 percent of large enterprises."
The survey found that support for the CDO role and business function is rising globally. A majority of survey respondents reported holding the formal title of CDO, revealing a steady increase over 2016 (57 percent in 2017 compared with 50 percent in 2016). Those organizations implementing an Office of the CDO also rose since last year, with 47 percent reporting an Office of the CDO implemented (either formally or informally) in 2017, compared with 23 percent fully implemented in 2016.
"The steady maturation of the office of the CDO underlines the acceptance and broader understanding of the role and recognizes the impact and value CDOs worldwide are providing," said Michael Moran, research director at Gartner. "The addition of new talent for increasing responsibilities, growing budgets and increasing positive engagement across the C-suite illustrate how central the role of CDO is becoming to more and more organizations."
Budgets are also on the rise. Respondents to the 2017 survey report an average CDO office budget of $8 million, representing a 23 percent increase from the average of $6.5 million reported in 2016. Fifteen percent of respondents report budgets of more than $20 million, contrasting with 7 percent last year. A further indicator of maturity is the size of the office of the CDO organization. Last year's study reported total full-time employees at an average of 38 (not distinguishing between direct and indirect reporting), while this year reports an average of 54 direct and indirect employees, reflecting the federated nature of the office of the CDO design.
With more than one-third of respondents saying "increase revenue" is a top three measure of success, the survey findings show a clear bias developing in favor of value creation over risk mitigation as the key measure of success for a CDO. The survey also looked at how CDOs allocate their time. On a mean basis, 45 percent of the CDO's time is allocated to value creation and/or revenue generation, 28 percent to cost savings and efficiency, and 27 percent to risk mitigation.
"CDOs and any data and analytics leader must take responsibility to put data governance and analytics principles on the digital agenda. They have the right and obligation to do it," said Mario Faria, managing vice president at Gartner.
According to the survey, in 2017, CDOs are not just focused on data as the title may imply. Their responsibilities span data management, analytics, data science, ethics and digital transformation. A larger than expected percentage of respondents (36 percent) also report responsibility for profit and loss (P&L) ownership. "This increased level of reported responsibility by CDOs reflects the growing importance and pervasive nature of data and analytics across organizations, and the maturity of the CDO role and function," said Ms. Logan.
In the 2017 survey, 86 percent of respondents ranked "defining data and analytics strategy for the organization" as their top responsibility, up from 64 percent in 2016. This reflects a need for creating or modernizing data and analytics strategies within an increasing dependence on data and insights within a digital business context.
The survey results provided insight into the kinds of activities CDOs are taking on in order to drive change in their organizations, with several areas showing a notable increase in CDO responsibilities compared with last year.
Gartner predicts that by 2021, the CDO role will be the most gender diverse of all technology-affiliated C-level positions and the survey results reflect that position. Of the respondents to Gartner's 2017 CDO survey who provided their gender, 19 percent were female and this proportion is even higher within large organizations — 25 percent in organizations with worldwide revenue of more than $1 billion. This contrasts with 13 percent of CIOs who are women, per the 2018 Gartner CIO Agenda Survey. When it comes to average age of CDOs, 29 percent of respondents said they were 40 or younger.
The survey respondents reported that there is no shortage of internal roadblocks challenging CDOs. The top internal roadblock to the success of the Office of the CDO is "culture challenges to accept change" — a top three challenge for 40 percent of respondents in 2017. A new roadblock, "poor data literacy," debuted as the second biggest challenge (35 percent), suggesting that a top CDO priority is ensuring commonality of shared language and fluency with data, analytics and business outcomes across a wide range of organizational roles. When asked about engagement with other C-level executives, respondents ranked the relationship with the CIO and CTO as the strongest, followed by a broad, healthy degree of positive engagement across the C-Suite.