No great technology ideas or insights this month – just a brief message to tell you that we’ve re-designed our ICT publishing portfolio under the Digitalisation World title. We’ve launched a new look website, where blogs, podcasts (DigiTalk) and video content will be very much to the fore. We’ve re-styled our weekly newsletters, under the name of Digital Digests, so that each week subscribers will receive more of a mini-magazine – with articles, video and audio content alongside the more traditional text-based stories. And the DW magazine is starting to evolve along similar lines, so while articles still dominate at the moment, we’ll be making increasing use of podcasts and video content over the coming months.
As with most industry sectors, nobody quite knows how or when traditional ideas and technologies will be replaced by the digital world, but it makes sense to start to move towards this objective sooner rather than later. IP Expo was our first foray into the video interview arena, and it seemed to work well. The results will be appearing on the website and in the newsletters/magazines over time, with more planned in the coming weeks.
Do let us know if you have enjoyed our new look publications, and if podcasts and videos are easier to digest than text-based content.
In the meantime, please do enjoy this issue of Digitalisation World, and I hope that you find the extensive blockchain coverage helpful in terms of understanding the potential it does (or doesn’t) offer your organisation.
Philip Alsop
Global spending on cognitive and artificial intelligence (AI) systems is forecast to continue its trajectory of robust growth as businesses invest in projects that utilize cognitive/AI software capabilities. According to a new update to the International Data Corporation (IDC) Worldwide Semiannual Cognitive Artificial Intelligence Systems Spending Guide, spending on cognitive and AI systems will reach $77.6 billion in 2022, more than three times the $24.0 billion forecast for 2018. The compound annual growth rate (CAGR) for the 2017-2022 forecast period will be 37.3%.
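For readers who want to sanity-check figures like these, the compound annual growth rate is simply the constant yearly growth factor that links a start value to an end value. A minimal Python sketch using the IDC figures quoted above (note that IDC's headline 37.3% is calculated over the 2017-2022 period, so applying the formula to the 2018 and 2022 figures gives a somewhat lower rate):

```python
# Compound annual growth rate: the constant yearly rate that takes
# `start` to `end` over `years` years.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

# IDC figures quoted above: $24.0 billion forecast for 2018, $77.6 billion for 2022.
print(f"{cagr(24.0, 77.6, 4):.1%}")  # ~34.1% over 2018-2022
# IDC's headline 37.3% CAGR is quoted for 2017-2022, i.e. five years from a 2017 base.
```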
"The market for AI continues to grow at a rapid pace," said David Schubmehl, research director, Cognitive/Artificial Intelligence Systems at IDC. "Vendors looking to take advantage of AI, deep learning and machine learning need to move quickly to gain a foothold in this emergent market. IDC is already seeing that organizations using these technologies to drive innovation are benefitting in terms of revenue, profit, and overall leadership in their respective industries and segments."
Software will be both the largest and fastest growing technology category throughout the forecast, representing around 40% of all cognitive/AI spending with a five-year CAGR of 43.1%. Two areas of focus for these investments are conversational AI applications (e.g., personal assistants and chatbots) and deep learning and machine learning applications (employed in a wide range of use cases). Hardware (servers and storage) will be the second largest area of spending until late in the forecast, when it will be overtaken by spending on related IT and business services. Both categories will experience strong growth over the forecast (30.6% and 36.4% CAGRs, respectively) despite growing slower than the overall market.
The cognitive/AI use cases that will see the largest spending totals in 2018 are automated customer service agents ($2.9 billion), automated threat intelligence and prevention systems ($1.9 billion), sales process recommendation and automation ($1.7 billion) and automated preventive maintenance ($1.7 billion). The use cases that will see the fastest investment growth over the 2017-2022 forecast are pharmaceutical research and discovery (46.8% CAGR), expert shopping advisors & product recommendations (46.5% CAGR), digital assistants for enterprise knowledge workers (45.1% CAGR), and intelligent processing automation (43.6% CAGR).
"Worldwide Cognitive/Artificial Intelligence Systems spend has moved beyond the early adopters to mainstream industry-wide use case implementation," said Marianne Daquila, research manager Customer Insights & Analysis at IDC. "Early adopters in banking, retail and manufacturing have successfully leveraged cognitive/AI systems as part of their digital transformation strategies. These strategies have helped companies personalize their relationship with customers, thwart fraudulent losses, and keep factories running. Increasingly, we are seeing more local governments keeping people safe with cognitive/AI systems. There is no doubt that the predicted double-digit year-over-year growth will be driven by even more decision makers, across all industries, who do not want to be left behind."
Banking and retail will be the two industries making the largest investments in cognitive/AI systems in 2018 with each industry expected to spend more than $4.0 billion this year. Banking will devote more than half of its spending to automated threat intelligence and prevention systems and fraud analysis and investigation while retail will focus on automated customer service agents and expert shopping advisors & product recommendations. Beyond banking and retail, discrete manufacturing, healthcare providers, and process manufacturing will also make considerable investments in cognitive/AI systems this year. The industries that are expected to experience the fastest growth on cognitive/AI spending are personal and consumer services (44.5% CAGR) and federal/central government (43.5% CAGR). Retail will move into the top position by the end of the forecast with a five-year CAGR of 40.7%.
On a geographic basis, the United States will deliver more than 60% of all spending on cognitive/AI systems throughout the forecast, led by the retail and banking industries. Western Europe will be the second largest region, led by banking and retail. China will be the third largest region for cognitive/AI spending with several industries, including state/local government, vying for the top position. The strongest spending growth over the five-year forecast will be in Japan (62.4% CAGR) and Asia/Pacific (excluding Japan and China) (52.3% CAGR). China will also experience strong spending growth throughout the forecast (43.8% CAGR).
Capita and Citrix survey reveals cost, security concerns, and legacy technology as biggest barriers to workspace agility.
The vast majority (84%) of organisations believe an inability to quickly roll out new services and applications to their workforce is impacting their ability to stay ahead of the competition, according to new research from Capita and Citrix. The survey of 200 CIOs and senior IT decision makers reveals that cost (48%), security and compliance concerns (47%), and legacy technology (44%) are the biggest barriers to creating the agile workspaces that today's organisations strive for.
The overwhelming majority (93%) of organisations say that younger employees are driving demand for more flexible technology and ways of working. Furthermore, 91% believed the IT user experience was important in attracting and retaining new talent, with 55% of mid-size organisations stating it was 'very important', underlining the value of retaining talented employees for smaller enterprises. However, despite demand from employees, 88% of organisations also admitted that dated capex budgeting models were making it more difficult for them to create agile workspaces.
"The digital revolution has had a substantial impact on how we work and what our expectations of our working environments are. As organisations look to move away from a traditional desktop IT environment to a more flexible one that caters for mobile, remote, and more digital-savvy employees, the appeal of an agile workspace has grown," said James Bunce, Director, Capita IT Services. "The nirvana for many organisations is creating an agile workspace that puts users first giving them everything they need, to work productively, in one single place from applications, to shared data and documents, to self-service support. However, stitching these facets together is not an easy task, particularly as IT environments have grown in complexity. As the research shows, an agile workspace is not just a luxury for organisations but is fundamental to their success and ability to compete."
Bring your own device or bring your own problem?
As organisations look to create a more user-centric workspace, increasing numbers are turning toward a 'Bring Your Own Device' (BYOD) approach.
On average, respondents to the survey think the number of IT support requests due to remote and mobile working has increased by a quarter (25%). Further, as 'shadow IT' has continued its rise, 83% of organisations admitted it is a challenge to keep their remote and mobile working policy up to date.
Is help at hand?
A key factor in user-centricity is assessing how employees feel about their working environment. The research reveals that more than three-quarters (79%) of organisations are currently measuring the IT user experience. On average, organisations measure the IT user experience six times a year, but more than a quarter (28%) say they are only monitoring it once or twice a year.
In recent years, the ability to 'self-serve' has played a central role in improving the user experience, and IT support is no exception.
However, the survey reveals there is still work to be done in ensuring that self-service technology upgrades the user experience, as the majority of organisations (83%) say they still typically find out about users' experience of IT through calls to the helpdesk.
"The findings show that while many organisations have adopted BYOD the full benefits are yet to be realised, as security and support concerns remain. A reluctance to give staff access to their preferred technologies holds back enterprises from becoming truly user-centric, therefore organisations must seek out solutions the enable BYOD in a secure manner while minimising the support burden," added Bunce. "A truly agile workspace should allow IT teams to continually monitor user experience, yet for many it remains a reactive rather than a proactive function, with the onus on users to report issues. Without tools in place to monitor environments in real-time and enable IT teams to rectify issues more quickly, organisations risk creating a dispersed workforce of unhappy employees."
The legacy problem
From a technology perspective, the majority of organisations (87%) say legacy applications are slowing their journey to creating an agile workspace, with the cost of re-architecting or transforming applications (68%), disruption to the user experience (43%), and a lack of in-house skills to modernise applications (36%) cited as the main causes.
Evolving alongside this application challenge has been the shift towards cloud computing, with organisations looking to Software-as-a-Service (SaaS) applications to increase workspace agility. Yet only a quarter (25%) of organisations think SaaS applications meet their requirements, with this figure dropping to 17% in mid-size organisations.
"An agile workspace is one in which employees can access all the applications and data they need, empowering them to collaborate with colleagues over any connection. This is easier said than done for most organisations because migrating from legacy remains a challenge," said Justin Sutton-Parker, Director, Partners, Northern Europe at Citrix. "Ultimately, businesses are looking for cost-conscious, scalable and flexible solutions to enable productivity and deliver efficiency, often combined with a cloud-first approach to help tackle legacy issues and embrace the digital workspace."
Calls for international regulation and upskilling increase.
Leading figures from business, industry, academia and policy have stressed the urgent need for greater global collaboration and transparency, as international governments and institutions adapt to the proliferation of AI technologies across all sectors.
Gathering in London for The Economist Events Innovation Summit 2018, industry leaders discussed the need for ethical, political and economic regulation as AI continues to shape our society and the everyday lives of millions of people. With challenges in public trust and fears over employment security – coupled with current global issues in disease, inequality, climate change and political instability – the need for responsible AI advancement was universally advocated.
During the keynote panel discussion on AI's impact on the employment landscape, Heath P. Terry, MD of Goldman Sachs, stated, "We need to reassure the public that AI is not going to replace humans, but work in tandem with us and handle the mundane parts of our jobs. It is a sort of advisor that sits alongside us, freeing us up to perform more difficult tasks."
The following panel went on to discuss the implications of AI and humans working together in this way, with the technology becoming ever more sophisticated. Morag Watson, Vice President and Chief Digital Innovation Officer at BP, explained, "AI will revolutionise the future employment market, creating jobs we won't recognise and can't even conceive of today. This new breed of job risks creating its own form of skills shortage, with employees requiring retraining and upskilling."
Other key themes explored throughout the day included the risks and ethics of AI development, with speakers recognising the potential for AI to be exploited by bad actors and to perpetuate society's existing biases.
Allan Dafoe, Director of GovAI, Future of Humanity Institute at University of Oxford, explained the enormous opportunity we are now being presented with, "You can't trace decisions made by a human brain. But with AI, you can audit every single line of code, offering an opportunity to identify and remove bias by following and understanding the entire decision-making process."
Jo Swinson, MP for East Dunbartonshire and Deputy Leader of the Liberal Democrats, suggested tangible action to commit the industry to ethical standards, "We should consider a form of Hippocratic Oath for those working in AI – a 'Lovelace Oath' – outlining a set of ethics to abide by, which will become an integral part of being a programmer or data scientist. Although we subcontract our judgement to AI, it's still just lines of code, and we need to maintain a level of human accountability for the decisions a machine might make."
Concluding speaker Demis Hassabis, co-founder and chief executive of DeepMind, spoke openly about his global vision for AI, "I'd be pessimistic about the world if something like AI were not coming down the road. We're not making enough progress on the major challenges facing society - whether that's eradicating disease or poverty or fighting for greater equality - so how will we address them? Either through an exponential improvement in human behaviour, or an exponential improvement in technology. Looking around at politics and recent events, it seems that exponential improvement in human behaviour is unlikely. So, we're looking for a quantum leap in technology like AI".
One of the moderators, Vijay Vaitheeswaran, US Business Editor of The Economist, summarised the summit's general consensus:
"The connection between the global challenges we face such as water and food shortages and AI means that we now have new powerful tools to solve these problems. The history of innovation has never taken place in a straight line but the idea that AI can be applied diligently and with purpose to tackle intractable problems should give us all hope for the future."
Companies in Europe are enhancing always-on, omnichannel customer service as more and more consumers embrace AI-driven experiences.
Companies across Europe are deploying artificial intelligence (AI) technologies to revolutionise customer service as more and more consumers show high acceptance of AI-driven experiences, reveals a new research report from ServiceNow and Devoteam.
The report, "The AI revolution: creating a new customer service paradigm", explores how AI is driving a new revolution in service delivery, drawing on research* carried out with 770 IT professionals responsible for the customer service function in 10 European countries.
It reveals that nearly a third (30%) of European organisations have introduced artificial intelligence (AI) technologies to customer service and 72 per cent of those are already seeing benefits that include freeing up agents' time, more efficient processing of high-volume tasks and providing always-on customer support.
"The majority of organisations are offering omnichannel experiences to customers, but many are struggling to keep up with increasing consumer demand for service across these channels," said Paul Hardy, Chief Innovation Officer EMEA, ServiceNow. "Early adopters are reaping the benefits of using AI technologies to deal with common tasks and requests, freeing agents to shift away from a reactive role to really driving proactive, meaningful engagement."
Customer service teams in Europe struggle to keep pace with customer demand
According to survey respondents, providing service and support 24/7 is their number one customer service challenge. Customers are being offered multiple service channels, but they expect responses at any time of the day, and this is pushing organisations to breaking point.
AI will reinvent customer engagement
AI will allow organisations to move beyond handling more queries more efficiently, to anticipating and acting on customer needs.
"We are only at the beginning of the AI-driven customer service revolution," said Debbie Elder, Principal Consultant, Devoteam. "A powerful development is the ability of AI to help transform high-stress moments into positive experiences for customers that build loyalty. For example, in the case of a flight cancellation, AI can detect the customer starting a live chat and indicate it is likely to be due to the cancellation. It can then immediately escalate the interaction to a human agent to arrange an alternative and deliver a superior service."
AI will empower customer service agents
While the adoption of AI will increase, these technologies will only serve to augment the role of the human agent at the front line of delivering 'wow' customer experiences.
"AI technologies will enable our customer service agents to focus on the customer interactions where the human touch is needed the most. This gives them greater job satisfaction, enabling them to focus on VIP customers and high priority enquiries, as well as focus on more strategic contributions within organisations," said Clive Simpson, Head of Service Management, CDL.
Twenty-eight percent of spending within key enterprise IT markets will shift to the cloud by 2022, up from 19 percent in 2018, according to Gartner, Inc. Growth in enterprise IT spending on cloud-based offerings will be faster than growth in traditional, non-cloud IT offerings. Despite this growth, traditional offerings will still constitute 72 percent of the addressable revenue for enterprise IT markets in 2022, according to Gartner forecasts.
“The shift of enterprise IT spending to new, cloud-based alternatives is relentless, although it’s occurring over the course of many years due to the nature of traditional enterprise IT,” said Michael Warrilow, research vice president at Gartner. “Cloud shift highlights the appeal of greater flexibility and agility, which is perceived as a benefit of on-demand capacity and pay-as-you-go pricing in cloud.”
More than $1.3 trillion in IT spending will be directly or indirectly affected by the shift to cloud by 2022, according to Gartner (see Table 1). Providers that are able to capture this growth will drive long-term success through the next decade.
Gartner recommends that technology providers use cloud shift as a measure of market opportunity. They should assess growth rates and addressable market size opportunities in each of the four cloud shift categories: system infrastructure, infrastructure software, application software and business process outsourcing.
Table 1: Cloud Shift Proportion by Category
| Category | 2018 | 2019 | 2020 | 2021 | 2022 |
| --- | --- | --- | --- | --- | --- |
| System infrastructure | 11% | 13% | 16% | 19% | 22% |
| Infrastructure software | 13% | 15% | 17% | 18% | 20% |
| Application software | 34% | 36% | 38% | 39% | 40% |
| Business process outsourcing | 27% | 28% | 29% | 29% | 30% |
| Total | 19% | 21% | 24% | 26% | 28% |
Source: Gartner (August 2018)
The largest cloud shift prior to 2018 occurred in application software, particularly driven by customer relationship management (CRM), according to Gartner. CRM has already reached a tipping point where a higher proportion of spend occurs in cloud than in traditional software. This trend will continue and expand to cover additional application software segments, including office suites, content services and collaboration services, through to the end of 2022. Application software will retain the highest percentage of cloud shift during this period.
By 2022, almost one-half of the addressable revenue will be in system infrastructure and infrastructure software, according to Gartner. System infrastructure will be the market segment that will shift the fastest between now and 2022 as current assets reach renewal status. Moreover, it currently represents the market with the least amount of cloud shift. This is due to prior investments in data center hardware, virtualization and data center operating system software and IT services, which are often considered costly and inflexible.
“The shift to cloud until the end of 2022 represents a critical period for traditional infrastructure providers, as competitors benefit from increasing cloud-driven disruption and spending triggers based on infrastructure asset expiration,” said Mr. Warrilow. “As cloud becomes increasingly mainstream, it will influence even greater portions of enterprise IT decisions, particularly in system infrastructure as increasing tension becomes apparent between on- and off-premises solutions.”
According to the International Data Corporation (IDC) Worldwide Quarterly Cloud IT Infrastructure Tracker, vendor revenue from sales of infrastructure products (server, enterprise storage, and Ethernet switch) for cloud IT, including public and private cloud, grew 48.4% year over year in the second quarter of 2018 (2Q18), reaching $15.4 billion. IDC also raised its forecast for total spending (vendor recognized revenue plus channel revenue) on cloud IT infrastructure in 2018 to $62.2 billion with year-over-year growth of 31.1%.
Quarterly spending on public cloud IT infrastructure has more than doubled in the past three years to $10.9 billion in 2Q18, growing 58.9% year over year. By end of the year, public cloud will account for the majority, 68.2%, of the expected annual cloud IT infrastructure spending, growing at an annual rate of 36.9%. In 2Q18, spending on private cloud infrastructure reached $4.6 billion, an annual increase of 28.2%. IDC estimates that for the full year 2018, private cloud will represent 14.8% of total IT infrastructure spending, growing 20.3% year over year.
The combined public and private cloud revenues accounted for 48.5% of the total worldwide IT infrastructure spending in 2Q18, up from 43.5% a year ago and will account for 46.6% of the total worldwide IT infrastructure spending for the full year. Spending in all technology segments in cloud IT environments is forecast to grow by double digits in 2018. Compute platforms will be the fastest growing at 46.6%, while spending on Ethernet switches and storage platforms will grow 18.0% and 19.2% year over year in 2018, respectively. Investments in all three technologies will increase across all cloud deployment models – public cloud, private cloud off-premises, and private cloud on-premises.
The traditional (non-cloud) IT infrastructure segment grew 21.1% from a year ago, a rate of growth comparable to 1Q18 and exceptional for this market segment, which is expected to decline in the coming years. At $16.4 billion in 2Q18 it still accounted for the majority, 51.5%, of total worldwide IT infrastructure spending. For the full year, worldwide spending on traditional non-cloud IT infrastructure is expected to grow by 10.3% as the market goes through a technology refresh cycle, which will wind down by 2019. By 2022, we expect that traditional non-cloud IT infrastructure will only represent 44.0% of total worldwide IT infrastructure spending (down from 51.5% in 2018). This share loss and the growing share of cloud environments in overall spending on IT infrastructure is common across all regions.
"As share of cloud environments in the overall spending on IT infrastructure continues to climb and approaches 50%, it is evident that cloud, which once used to be an emerging sector of the IT infrastructure industry, is now the norm. One of the tasks for enterprises now is not only to decide on what cloud resources to use but, actually, how to manage multiple cloud resources," said Natalya Yezhkova, research director, IT Infrastructure and Platforms. "End users' ability to utilize multi-cloud resources is an important driver of further proliferation for both public and private cloud environments."
All regions grew their cloud IT Infrastructure revenue by double digits in 2Q18. Asia/Pacific (excluding Japan) (APeJ) grew revenue the fastest, by 78.5% year over year. Within APeJ, China's cloud IT revenue almost doubled year over year, growing at 96.4%, while the rest of Asia/Pacific (excluding Japan and China) grew 50.4%. Other regions among the fastest growing in 2Q18 included Latin America (47.4%), USA (44.9%), and Japan (35.8%).
Top Companies, Worldwide Cloud IT Infrastructure Vendor Revenue, Market Share, and Year-Over-Year Growth, Q2 2018 (Revenues are in US$ Millions)

| Company | 2Q18 Revenue (US$M) | 2Q18 Market Share | 2Q17 Revenue (US$M) | 2Q17 Market Share | 2Q18/2Q17 Revenue Growth |
| --- | --- | --- | --- | --- | --- |
| 1. Dell Inc | $2,386 | 15.5% | $1,433 | 13.8% | 66.4% |
| 2. HPE/New H3C Group** | $1,665 | 10.8% | $1,389 | 13.3% | 19.8% |
| 3. Cisco | $1,016 | 6.6% | $947 | 9.1% | 7.2% |
| 4. Lenovo* | $822 | 5.3% | $254 | 2.4% | 223.5% |
| 4. Inspur* | $691 | 4.5% | $299 | 2.9% | 131.0% |
| ODM Direct | $5,337 | 34.6% | $3,508 | 33.7% | 52.2% |
| Others | $3,525 | 22.8% | $2,577 | 24.8% | 36.8% |
| Total | $15,442 | 100.0% | $10,407 | 100.0% | 48.4% |

Source: IDC's Quarterly Cloud IT Infrastructure Tracker, Q2 2018
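As a quick illustration of how the shares and growth rates in the table are derived, here is a minimal Python sketch using the Dell row and the totals above (small differences from the published figures reflect rounding in the reported revenues):

```python
# Market share = vendor revenue / total market revenue;
# year-over-year growth = (current revenue / prior revenue) - 1.
# Revenue figures (US$M) are taken from the IDC table above.
dell_2q18, dell_2q17 = 2_386, 1_433
total_2q18, total_2q17 = 15_442, 10_407

print(f"Dell 2Q18 market share: {dell_2q18 / total_2q18:.1%}")       # ~15.5%
print(f"Dell YoY revenue growth: {dell_2q18 / dell_2q17 - 1:.1%}")   # ~66.5%
print(f"Total market YoY growth: {total_2q18 / total_2q17 - 1:.1%}") # ~48.4%
```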
F5 Networks has unveiled EMEA's first ever Future of Multi-Cloud (FOMC) report, highlighting game-changing trends and charting adaptive best practice over the next five years.
The F5-commissioned report was produced by the Foresight Factory and features exclusive input from influential global cloud experts specialising in entrepreneurialism, cloud architecture, business strategy, industry analysis, and relevant technological consultancy.
"The Future of Multi-Cloud report is a unique vision for how organisations can successfully navigate an increasingly intricate, cloud-centric world. The stakes are higher than ever, and businesses that ignore the power of the multi-cloud today will significantly struggle in the next five years," said Vincent Lavergne, RVP, Systems Engineering, F5 Networks.
The FOMC report comes at a time of significant cloud receptivity.
According to the figures cited in the FOMC report, 81% of global enterprises claim to have a multi-cloud strategy in place. Meanwhile, the Cisco Global Cloud Index estimates that 94% of workloads and compute instances will be processed by cloud data centres by 2021.
"The multi-cloud is a game-changer for both business and consumers. It will pave the way for unprecedented innovation, bringing cloud architects, DevOps, NetOps and SecOps together to pioneer transformational services traditional infrastructures simply cannot deliver. The outlook for the coming years is bright and full of potential," said Josh McBain, Director of Consultancy, Foresight Factory.
A new era of business innovation
The FOMC consensus is that those delaying multi-cloud adoption will become increasingly irrelevant.
New levels of service specialisation will increasingly allow enterprises to find the best tools for their specific needs, enabling seamless scaling and rapid service delivery innovations. Technologies set to drive this transition include serverless architectures, as well as artificial intelligence-powered orchestration layers and configuration tools to aid data-driven decision-making. Fear of vendor lock-in is expected to continue as a key justification for multi-cloud investments.
"The multi-cloud ramp-up is one of the ultimate wake-up calls in internal IT to get their act together," said Eric Marks, VP of Cloud Consulting at CloudSpectator and a FOMC contributor.
"One of the biggest transformative changes is the realisation of what a high performing IT organisation is and how it compares to what they have. Most are finding their IT organisations are sadly underperforming."
Plugging the skills gap
The FOMC report cautions that the multifaceted logistics of monitoring multiple cloud services, containers, APIs and other processes can be daunting and inhibit technology uptake. A significant skills gap also exists in handling this added complexity, both now and into the future. Across the world, available workforces are not keeping pace with innovation, with potentially damaging results for business productivity and digital transformation capacity. Knowledge silos or a lack of collaboration within businesses may further exacerbate multi-cloud apprehension.
Looking ahead, the FOMC report urges the business community to do more to "tap into the kaleidoscopic potential of youth and promote industry diversity." It also calls on the IT industry to better promote the use of smart, context-driven and automated solutions that can spark attractive new career opportunities and free existing workforces to focus on more strategic and rewarding work.
Safeguarding the future and building trust
Attack surfaces are expanding at exponential rates. The FOMC report emphasises how cybercriminals are no longer tinkering hobbyists but instigators of a new "hacking economy" that can outpace business innovation. To remain competitive, organisations need to confront the security challenge head on without compromising quality of service. Implementing a robust, future-proofed ecosystem of integrated security and cloud solutions will help to build end-to-end IT services that give key stakeholders greater context, control, and visibility into the threat landscape.
Coping with compliance
The FOMC report posits that the intricacies of regulating a borderless digital world are among the biggest challenges facing governments today. Swift and substantive collaborative action between business and government is needed. Ultimately, the FOMC report believes a global standard for data protection is required within five years.
"Eventually, today's tech-conscious consumers and customers will only want to be associated with the most trustworthy data handlers. There is now a big opportunity to differentiate with best practice and service delivery, particularly in the context multi-cloud's potentialadded McBain.
Ninety percent of European organisations expect IT budgets to grow or stay steady in 2019.
Spiceworks has published its annual 2019 State of IT Budgets report that examines technology investments in organisations across North America and Europe. The results show 90 percent of European companies expect their IT budgets to grow or remain flat in 2019. Among those planning to increase budgets, 65 percent are driven by the need to upgrade outdated IT infrastructure.
The results show 34 percent of European organisations expect their IT budgets to increase in 2019, while 56 percent expect them to remain flat year over year. Organisations that expect IT budget increases next year anticipate a 21 percent increase on average. Only 6 percent of European companies expect a decrease in IT budgets in 2019, compared to 9 percent in 2018.
Among European businesses boosting their IT budgets in 2019, growing security concerns were the second biggest driver of budget increases behind the need to upgrade outdated infrastructure. Sixty percent of European businesses reported growing security concerns as a top driver of budget increases, followed by an increased priority on IT projects at 57 percent. Compared to their counterparts in North America, European businesses were more likely to increase IT budgets due to changes in regulations, such as GDPR (45 percent in Europe vs. 30 percent in North America), and due to currency fluctuations (14 percent in Europe vs. 5 percent in North America).
IT budget allocations: European organisations boost hardware and cloud budgets while reducing software spend
European organisations plan to spend 36 percent of their budgets on hardware purchases, up by 5 percentage points year over year. Software budget allocations decreased by 2 percentage points to 24 percent in 2019, while cloud service budgets increased by 1 percentage point to 20 percent. Budgets for managed IT services remained steady year over year at 16 percent.
Compared to their North American counterparts, European businesses are allocating more of their budgets toward hardware and managed IT services, and slightly less on software.
The report also details European budget highlights within each category.
Technology purchase decisions: ITDMs make the decision, BDMs sign the checks
Spiceworks also examined the roles various individuals play in the technology purchase process. Business line directors are involved in the technology purchase decisions in 37 percent of European companies, the owner/CEO is involved in 35 percent of organisations, and finance managers are involved in 30 percent.
However, IT decision makers (ITDMs) are more likely to be the sole decision maker across all technology categories when compared to business decision makers (BDMs). ITDMs are most likely to be the sole decision maker for networking solutions, backup/recovery, computing devices and physical server purchase decisions. When involved, BDMs are more likely to either sign off on final approval or veto the deal after ITDMs have made their vendor and product selection.
“When it’s time to upgrade or purchase new tech, organisations entrust IT professionals to find the best solution to meet the needs of the business,” said Peter Tsai, senior technology analyst at Spiceworks. “For major tech purchases, the CEO or finance manager may be involved to sign on the dotted line, but in most cases, it’s the IT decision maker who conducts the in-depth research, evaluates the vendors, and ultimately chooses the best solution for the business.”
Server Technology's multiple award-winning High Density Outlet Technology (HDOT) has been improved with the company's Cx outlet. The HDOT Cx PDU welcomes change as data center equipment is replaced. The Cx outlet is a UL-tested hybrid of the C13 and C19 outlets, accommodating both C14 and C20 plugs. This innovative design reduces the complexity of the selection process while lowering end costs.
It sounds suspiciously logical, and very, very sensible, but the revelation that channel organisations need to pay more attention to what their customers want was the major theme of the recent Managed Services + Hosting (MSH) Summit.
From the first keynote onwards, the importance of the customer was centre stage at the MSH Summit. While customer relations have always been important, it seems that the ‘sell and forget’ mentality of the traditional channel model – perhaps not strictly accurate, but rather too prevalent nonetheless – guarantees failure. Customers want their channel suppliers to understand their needs, to work alongside them and, above all, to be able to trust them. After all, responsibility for IT is increasingly devolving from the end user (who, once upon a time, bought tin and fitted it all together, or got an SI to do this) to the channel and service providers, who not only sell solutions but then have to run and support them on behalf of the customer. In simple terms, MSPs are becoming the IT departments for many end users.
Bearing in mind this sea change, MSPs need to spend time and effort on researching their (potential) customers’ needs – right from how they choose to source their IT, what IT solutions they require, and what level of ongoing support needs to be provided.
First up, customers want to know about their potential suppliers. Although the quoted figure might vary, it seems safe to report that over half of the end user buying cycle takes place before any human contact is established. This means that customers are researching possible suppliers via a mixture of traditional and modern, multi-media channels. A muddled, messy and out-of-date website is unlikely to have new customers knocking down an MSP’s virtual or physical door, yet all too many suppliers seem content to operate their cyber-presence in this way.
What is required is clear, concise and accurate information about services the MSP provides – with clear emphasis on what makes the organisation different and not just another ‘me too’ provider. Customer service is a major potential differentiator, with more and more Cloud and managed services being based on the commoditised services of the web giants. Vertical industry specialisation is another valid approach – with an MSP demonstrating how well they know the problems faced by a specific industry sector, and how they can help offer targeted solutions.
Additionally, there needs to be the realisation that the IT buying cycle is no longer the sole remit of the IT department. Increasingly, any or all company departments are having a say in the specification and purchasing of IT products and services. Yes, the IT department might have the final say, or veto, but it’s no longer enough to target the professionals when it comes to selling – well-informed ‘amateurs’, armed with knowledge acquired from the technology they use in their private lives, can be key influencers in the final purchasing decision.
To summarise – MSPs need to review their customer intelligence, review their own positioning and review their marketing. The end objective, as more than one speaker explained, is an ‘outside in’ approach – the customer’s needs come first, with everything else falling into place around them.
In terms of specific opportunities, security is clearly the biggest market right now. While huge end user corporations might have their own security professionals capable of carrying the fight to the hackers, the SME market is especially vulnerable to cyber attacks, as it has little or no in-house security expertise. No one can protect against zero day attacks, but a surprising number of companies have a surprising number of vulnerabilities to existing, well-known problems. For example, basic patch management practices are alarmingly absent from many organisations (the infamous NHS attack being one of the more high profile casualties of failing to apply available patches).
Another opportunity for MSPs is, somewhat depressingly, the fact that the incumbent IT supplier may be doing such a bad job that the customer will welcome the approach of an organisation that appears to know what it is doing and can back up this talk with a concrete solution that delivers what is required and, importantly, continues to deliver. In one quoted example, a customer had been unable to access their data for three days, had to contact their IT supplier to tell them this, and the supplier then took a further three days to ‘rescue’ just some of the data. With the bar set so low, a well-organised and resourced MSP might just find plenty of opportunities to win new business!
How these opportunities are developed is crucial to success. Taking weeks, or even months, to talk to the customer from board level right down to the office/shop floor, to understand existing problems, future requirements, the nice-to-haves and the essentials, should ensure that the proposed solution meets the customer’s expectations – especially if a technology roadmap has been agreed, with a timeline and key outcomes defined. Such a comprehensive approach to sales should pay off for both supplier and customer.
And once the business has been won, MSPs need to ensure that they keep in regular, meaningful contact with the customer. One speaker explained how his organisation contacts the customer weekly and monthly, as well as for quarterly and annual strategic reviews.
As for the vertical approach, well, whisper it quietly, in a room where grey, or at least greying, hair and suits proliferated (myself included), it was refreshing to listen to two young entrepreneurs – hosting supplier and customer – talk about the digital agency market. Robert Belgrave and Jim Bowes were the clearest evidence of the day that the IT market is changing rapidly and frequently and that, if suppliers and their customers don’t change with it, then the abyss awaits.
Most notable was Robert’s admission that his hosting company’s initial success was under threat from the commoditisation offered by the hosting giants, so they decided to work with this change, establishing a consulting service that would guide customers – new and potential – through the managed services and cloud maze. The result is that, while some customers stick with Robert’s company’s own hosting service, others go the commoditised route, but with added-value services and support supplied by Wirehive.
So, flexibility is, perhaps, the final piece of the MSP jigsaw. Of course, planning requires certain assumptions to be made as to the technologies and issues that end users need to address. However, game-changing disruption is, potentially, only a sleep away. Wake up to find that the rules have changed and you can either waste time moaning about how unfair the new rules are, or spend your time and energy adapting to the new rules. And that applies for MSPs and their customers alike.
In summary, wherever you sit in the IT ecosystem, if you’re not excited by the future, then it just might be time to head for the door marked ‘EXIT’.
The Storage Networking Industry Association (SNIA) is the largest storage industry association in existence, and one of the largest in IT. It comprises 180 leading industry organizations and over 2,000 active contributing members, along with more than 50,000 IT and storage professionals worldwide.
By Michael Oros, SNIA Executive Director.
Today, SNIA is the recognized global authority for storage leadership, standards and technology expertise. As such, our mission is to develop and promote vendor-neutral architectures, standards, best practices and educational services that facilitate the efficient management, movement and security of information.
These initiatives are vital in the dynamic fields of data and information. IT is constantly evolving and shifting to provide new efficiencies and valuable business insight with increased data set sizes and larger pools of computational and storage resources. Virtualization, Persistent Memory, the cloud, software-defined data center, hyperconvergence, computational storage, the Internet of Things (IoT), Artificial Intelligence (AI) and machine learning are just a few of the latest waves of innovation that are disrupting traditional IT approaches and are continuing to transform all industries. SNIA stays alert to these trends and is involved on the leading edge of such standards and innovation needs. That enables us to isolate areas that need attention, such as missing interoperability standards or newly emerging technology areas that require industry education. In some cases, it is up to SNIA to champion technologies that open up new vistas for the storage industry and beyond.
SNIA’s nine technology focus areas are actively supported by Technical Work Groups and Storage Communities that we refer to as Forums and Initiatives. Individuals from all facets of IT dedicate themselves to programs that unite the storage industry with the purpose of taking storage technologies to the next level.
Through these nine focus areas, SNIA leads the storage industry in developing standards and driving broad adoption of new technologies. Our members pave the way for innovation by establishing the best way for platforms to interoperate. It is up to SNIA to provide a forum that aligns the strategic business objectives of the diverse storage vendor community with the need for interoperability and worldwide standards. By contributing to these standards and interoperability initiatives, our members are fostering timely technology adoption that delivers real value and benefit to IT organizations globally.
To stay informed and updated on how SNIA is accelerating the adoption of next-generation storage technologies, please visit www.snia.org. You can also hear the latest from SNIA by regularly viewing our storage blogs at www.sniablog.org and the SNIA YouTube channel; to learn about SNIA global events and meet-ups, please visit www.snia.org/snia_events.
This month’s DCA journal theme is focused on IoT, Smart Cities, Edge Computing and Cloud. These are all topics which are intrinsically linked to one another, and this month’s articles reflect this. Interxion reports on the role an urban data centre plays in a smart city, Vertiv focuses on edge computing, IMS takes us back to IoT basics and Dr Marcin Budak from Bournemouth University runs through a day in the life of a smart city commuter. First, however, just a few thoughts from me to kick off with….
It has been reported recently that by 2050, 66 percent of the world’s population will live in urban locations. It’s important to address the way services and technology are delivered to the growing inner-city population and ensure that these are provided in the most cost-effective manner. In other words, we need to work smarter not harder.
At the forefront of every city’s concerns is the safety of its citizens, be that from a terrorist threat or from general criminal activity, which tends to rise with increased population density. One area which is seeing rapid acceleration in the effort to combat crime is the use of CCTV cameras. IoT can now increase the ability to monitor citizens and keep them safe.
Some would argue this represents nothing more than a Big Brother invasion of an individual’s privacy; others would argue that ‘if you have nothing to hide, you have nothing to fear’. Clearly the use of CCTV isn’t anything new; however, with an improved digital infrastructure comes the ability to increase the quality and functionality of surveillance devices to deliver so much more.
Rather than the grainy images we’ve all seen on Crimewatch, the roll-out of new HD facial recognition technology could identify suspicious or dangerous individuals prior to a crime being committed. It could also help to quickly identify and prosecute individuals once an unlawful act is committed. The technology to identify a crime in progress is developing further. Although not yet needed in many parts of Europe, I read that in Washington, D.C., for example, they have begun installing “gunshot sound sensors” on CCTV masts. These sensors alert the authorities almost immediately to the sound of gunshots. Rather than having to be called out, the police and emergency services are alerted to firearms incidents in real time, which decreases response times dramatically.
The focus by government and local authorities on IoT-based initiatives is not just restricted to increasing the security of citizens but is equally driven by the need to increase the efficiency of how community services are delivered and, in turn, to reduce the cost of delivering them – with a “more for less” motto being very much the name of the game.
A quick Google search revealed that an increasing number of pilot projects are being launched, all of which are being closely monitored with a view to both regional and national rollout. I found projects such as Smart Parking, Smart Street Lighting and Smart Roads (covering everything from congestion to accidents and road damage). The list continues: I found numerous trials relating to everything from smart bins, air pollution and seismic activity to burst water pipe sensors!
It’s not just the list of potential applications which is endless, but also the data these street-level devices will be generating. A great deal of this data will need to be collected, stored and analysed somewhere; this will be done either locally at the edge or backhauled to larger, centrally located hosting facilities.
These are exciting times for all DCA members, and the potential business opportunities which exist for both DC providers and suppliers are plain to see, in support of an IoT and Smart City revolution which is only just starting to warm up.
We have decided to switch things around for the next edition of the DCA Journal, with a dedicated focus on Liquid Cooling. If you have a view on this subject and would like to submit an article for review, please email Amanda McFarlane at amandam@dca-global.org or call 0845 873 4587 for more details; the deadline is the 20th October 2018.
Finally, I would like to thank everyone who supported and took part in the DCA Charity Golf Day which this year was held at the Warwickshire PGA Golf Course. Many thanks to the team at Angel Business Communications for helping to organise the event for us. Look out for the pictures to follow and don’t forget to pencil it in for next year if you want to take part.
We are moving to a 5G world. It will be one where data is vital to our daily lives. Data now needs to be instantaneous. Any delay, even for an nth of a second, could be catastrophic in an always-on world. That nth could be a tenth of a second, nothing more. But no matter how fast your fibre connection, if information has to travel across several continents to get to your door, there is going to be latency – a time drag, an nth of a second.
Imagine sitting in an autonomous car at 70mph with vehicles whizzing all around you. That nth of a second could be the difference between a collision and a smooth journey home. You are a gamer in an online world. That nth of a second is the difference between a clean shot and a clear miss.
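To put that ‘nth of a second’ into rough numbers: light travels through optical fibre at roughly 200,000 km/s, so distance alone sets a floor on latency before any routing or processing overhead is added. A minimal, illustrative Python sketch (the distances are round-number assumptions, not measurements):

```python
# One-way propagation delay over optical fibre (~200,000 km/s),
# ignoring routing, queuing, and processing overhead.
FIBRE_SPEED_KM_PER_S = 200_000

def one_way_delay_ms(distance_km: float) -> float:
    return distance_km / FIBRE_SPEED_KM_PER_S * 1_000

for label, km in [("Metro edge site", 50),
                  ("National data centre", 1_000),
                  ("Another continent", 10_000)]:
    print(f"{label}: {one_way_delay_ms(km):.2f} ms one way")
# 50 km works out at 0.25 ms, while 10,000 km is 50 ms each way - so an
# intercontinental round trip approaches a tenth of a second before any
# processing happens, which is exactly the gap edge computing closes.
```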
Smart cities, the Internet of Things (IoT), connected cars, online gaming and machine-to-machine (M2M) communications are all driving the demand for instant connections. This, in turn, is driving the trend to edge computing. It’s a term used to describe local data centres that are smaller than the hyperscale centres that have come to characterise the data centre industry, although more critical to the successful deployment of a ‘Smart Anything’.
Edge computing, as the industry calls it, reduces that latency and provides the instant connectivity that the modern world and mission-critical applications demand.
History to lead the future
As a result, the edge will both benefit from, and drive the development of, more reliable and faster internet connections. It makes for a shared future between the edge and 5G. However, it’s not just the future that connects them – these technologies could in fact benefit from key learnings in each other’s past evolution. The implementation of 4G LTE (long-term evolution) is a great example. Whereas 3G could connect society more easily, 4G truly brought high-speed data connectivity to the masses. It expanded quickly and became integral to society’s day-to-day life.
This illustrates that businesses and society will eagerly adopt any new developments in 5G as well. Given its correlation with the edge, it could imply that the edge needs to be 5G ready. But how can you be ready for something that’s undefined at the moment? This is not impossible, as it primarily means the edge needs to be communication ready, and this is already feasible. Vertiv can, for example, offer fully integrated and scalable systems with high-quality communications capabilities.
Another key learning from the 3G/4G/5G evolution is its inherent influence on infrastructure development. Whereas 3G allowed for places of leisure, such as McDonald’s and airports, to become connectivity hubs, 4G enabled the further implementation of technology globally, both in cities and rural areas. With the boost in implementation and usage also came an increased need for more robust, secure edge resources.
Similarly, the edge will see a change in infrastructure needs with the progress of 5G. We don’t know what 5G cellular architectures will look like, yet. However, we know they will enable more consumer applications, as well as potential other applications we haven’t imagined yet. This means that telco, cloud and colocation architectures will evolve to meet 5G and edge needs. It is expected that cloud and colocation companies will continue to expand their footprint to provide services closer to users. This, combined with the continued deployment of edge-ready local infrastructure, matched to archetypes and use cases, will create an edge ecosystem that extends far beyond traditional small-space edge deployments.
The current edge
It seems like the direction of future developments is not as mysterious as one might think. So how ready is the industry for 5G and edge computing to really take hold? In an effort to understand and foresee future demand, Vertiv has analysed a variety of emerging use cases, resulting in the ‘Edge Archetypes’ report, which identifies four primary edge archetypes.
Developments in 5G cellular networks will both enable and drive the evolution of these archetypes, and the edge in general, to allow for the implementation of edge-dependent applications.
It might seem like these developments are still out of reach and need to be government-driven due to their vastness. However, the commercial sector is more likely to truly drive this evolution with the development of consumer-oriented services and apps – increasing demand for the edge’s capabilities.
As a result, with the available knowledge at hand and the opportunity for the commercial sector to drive demand, the realisation of edge computing and 5G cellular networks might just be closer than you think.
The alarm on your smart phone went off 10 minutes earlier than usual this morning. Parts of the city are closed off in preparation for a popular end of summer event, so congestion is expected to be worse than usual. You’ll need to catch an earlier bus to make it to work on time.
The alarm time is tailored to your morning routine, which is monitored every day by your smart watch. It takes into account the weather forecast (rain expected at 7am), the day of the week (it’s Monday, and traffic is always worse on a Monday), as well as the fact that you went to bed late last night (this morning, you’re likely to be slower than usual). The phone buzzes again – it’s time to leave, if you want to catch that bus.
While walking to the bus stop, your phone suggests a small detour – for some reason, the town square you usually stroll through is very crowded this morning. You pass your favourite coffee shop on your way, and although they have a 20% discount this morning, your phone doesn’t alert you – after all, you’re in a hurry.
After your morning walk, you feel fresh and energised. You check in at the Wi-Fi and Bluetooth-enabled bus stop, which updates the driver of the next bus. He now knows that there are 12 passengers waiting to be picked up, which means he should increase his speed slightly if possible, to give everyone time to board. The bus company is also notified, and is already deploying an extra bus to cope with the high demand along your route. While you wait, you notice a parent with two young children entertaining themselves with the touch-screen information system installed at the bus stop.
Once the bus arrives, boarding goes smoothly: almost all passengers are using tickets stored on their smart phones, so there is only one time-consuming cash payment. On the bus, you take out a tablet from your bag to catch up on some news and emails using the free on-board Wi-Fi service. You suddenly realise that you forgot to charge your phone, so you connect it to the USB charging point next to the seat. Although the traffic is really slow, you manage to get through most of your work emails, so the time on the bus is by no means wasted.
The moment the bus drops you off in front of your office, your boss informs you of an unplanned visit to a site, so you make a booking with a car-sharing scheme, such as Co-wheels. You secure a car for the journey, with a folding bike in the boot.
Your destination is in the middle of town, so when you arrive on the outskirts you park the shared car in a nearby parking bay (which is actually a member’s unused driveway) and take the bike for the rest of the journey to save time and avoid traffic. Your travel app gives you instructions via your Bluetooth headphones – it suggests how to adjust your speed on the bike, according to your fitness level. Because of your asthma, the app suggests a route that avoids a particularly polluted area.
After your meeting, you opt to get a cab back to the office, so that you can answer some emails on the way. With a tap on your smartphone, you order the cab, and in the two minutes it takes to arrive you fold up your bike so that you can return it to the boot of another shared vehicle near your office. You’re in a hurry, so no green reward points for walking today, I’m afraid – but at least you made it to the meeting on time, saving kilograms of CO2 on the way.
Get real
It may sound like fiction, but truth be told, most of the data required to make this day happen are already being collected in one form or another. Your smart phone is able to track your location, speed and even the type of activity that you’re performing at any given time – whether you’re driving, walking or riding a bike.
Meanwhile, fitness trackers and smart watches can monitor your heart rate and physical activity. Your search history and behaviour on social media sites can reveal your interests, tastes and even intentions: for instance, the data created when you look at holiday offers online not only hints at where you want to go, but also when and how much you’re willing to pay for it.
Personal devices aside, the rise of the Internet of Things with distributed networks of all sorts of sensors, which can measure anything from air pollution to traffic intensity, is yet another source of data. Not to mention the constant feed of information available on social media about any topic you care to mention.
With so much data available, it seems as though the picture of our environment is almost complete. But all of these datasets sit in separate systems that don’t interact, managed by different entities which don’t necessarily fancy sharing. So although the technology is already there, our data remains siloed with different organisations, and institutional obstacles stand in the way of attaining this level of service. Whether or not that’s a bad thing is up to you to decide.
Think about some of the busiest cities in the world. What sort of picture springs to mind? Are you thinking about skyscrapers reaching into the atmosphere, competing for space and attention? Perhaps you’re thinking about bright lights, neon signs and the hustle and bustle of daily life. Chances are that whichever city you’re thinking about – New York, Tokyo, Singapore – is actually a smart city.
Smart cities are so-called based on their performance against certain criteria – human capital (developing, attracting and nurturing talent), social cohesion, economy, environment, urban planning and, very importantly, technology. Speaking specifically to this last point, real-life smart cities are less about flying cars and more about how sensors and real-time data can bring innovation to citizens.
And the key to being ‘smart’ is connectivity, to ensure that whatever citizens are trying to do – stream content from Netflix, drive an autonomous vehicle or save money through technology use in the home – they can do so with ease, speed and without disruption. Crucial to this, and often overlooked in the connectivity puzzle, is the urban data centre. Urban data centres are the beating heart of all modern-day smart cities, and we’ll explore why here.
Becoming smart
Just as Rome wasn’t built in a day, neither was a smart city. Time for some number crunching. According to the United Nations, there will be 9.8 billion of us on Earth by 2050. Now, consider the number of devices you use on a daily basis, or within your home – smartphones, laptops, wearables, smart TVs or even smart home devices. Multiply that by the number of people there will be in 2050, and you get a staggering number of devices all competing for the digital economy’s most precious commodity – internet access. In fact, Gartner predicted that by 2020 there will be 20.4 billion connected devices in circulation. At a smart city level, this means being able to match this escalating demand for the fastest connection speeds with adequate supply. Connectivity is king, and without it cities around the world will come screeching to a halt. To keep up with the pace of innovation, we need a connectivity hub that will keep our virtual wheels turning – the data centre.
Enter the urban data centre
In medieval times, cities protected their most prized assets, people and territories by building strongholds to bolster defences and ward off enemies. These fortresses were interconnected by roads and paths that would enable the exchange of people and goods from neighbouring towns and cities. In today’s digital age, these strongholds are data centres; as we generate an eyewatering amount of data from our internet-enabled devices, data centres are crucial for holding businesses’ critical information, as well as enabling the flow of data and connectivity between like-minded organisations, devices, clouds and networks. As we build more applications for technology – such as those you might find in a typical smart city – this flow needs to be smoother and quicker than ever.
Consider this – according to a report by SmartCitiesWorld, cities consume 70% of the world’s energy, and by 2050 urban areas are set to be home to 6.5 billion people worldwide, 2.5 billion more than today. With this in mind, it’s important that we address areas such as technology, communications, data security and energy usage within our cities.
This is why urban data centres play a key role in the growth of smart cities. As organisations increasingly evolve towards and invest in digital business models, it becomes ever more vital that they house their data in a secure, high-performance environment. Many urban data centres today offer a diverse range of connectivity and cloud deployment services that enable smart cities to flourish. Carrier-neutral data centres even offer access to a community of fellow enterprises, carriers, content delivery networks, connectivity services and cloud gateways, helping businesses transform the quality of their services and extend their reach into new markets.
The ever-increasing need for speed and resilience is driving demand for data centres located in urban areas, so that there is no disruption or downtime to services. City-based data centres offer businesses close proximity to critical infrastructure, a richness of liquidity and round-the-clock maintenance and security. Taking London as an example, the city is home to almost one million businesses, including three-quarters of the Fortune 500 and one of the world’s largest clusters of technology start-ups. An urban data centre is the perfect solution for these competing businesses to access connectivity and share services, to the benefit of the city’s inhabitants and the wider economy.
The future’s smart
London mayor Sadiq Khan recently revealed his aspirations for London to become the world’s smartest city by 2020. While an ambitious goal, London’s infrastructure can more than keep pace. Urban data centres will play a significant role in helping the city to not only meet this challenge, but become a magnet for ‘smart tech’ businesses to position themselves at the heart of the action. The data centre is already playing a critical role – not just in London, but globally – in helping businesses to innovate and achieve growth. And as cities become more innovative with technological deployments, there’s no denying that smart cities and urban data centres are a digital marriage made in heaven.
The application of IoT is booming, with new use cases arising near enough daily. But despite this growth, the sector risks inertia if businesses lose sight of the key objective digitisation was founded upon – improving day-to-day experiences.
Yes, a big part of IoT is creating more efficient processes. But to truly deliver, those efficiencies must translate into outcomes that resonate with customers, from the quality of the product to meeting environmental pledges and reducing wastage; something that can’t be achieved by automating processes alone, but by automating outcomes – as Jason Kay, CCO, IMS Evolve, explains.
Digitisation Falters
Pinpointing the reason for organisations’ growing failure to make the expected progress towards successful digitisation is a challenge. Choice fatigue, given the diversity of innovative technologies? Over-ambitious projects? An insistence by some IT vendors that digitisation demands high-cost, high-risk rip-and-replace strategies? In many ways, each of these issues plays a role; but they are the symptoms, not the cause. The underpinning reason for the stuttering progress towards effective digitisation is that the outcomes being pursued are simply not aligned with the core purposes of the business.
Siloed, vertically focused digitisation developments typically focus on short-term efficiency and process improvements. They are often isolated, which means as and when challenges arise, it is a simple management decision to call time on the development: why persist with a digitisation project that promised a marginal gain in process efficiency at best, if it fails to address core business outcomes such as customer experience?
Accelerating the digitisation of an organisation requires a different approach and brave new thinking. While disruptive projects and strategies can prove threatening to existing business models, when executed correctly they can in fact create opportunities for new business models, exploration and a new approach to the market. By considering and focusing on the core aspects of the business, organisations can not only identify opportunities to drive down cost, but also deliver measurable value in line with clearly defined outcomes.
Reconsidering Digitisation
In many ways the IT industry is complicit in this situation: on one hand offering the temptation of cutting-edge and compelling new technology, from robots to augmented reality, and on the other insisting that digitisation requires multi-million-pound investments, complete technology overhaul and massive disruption to day-to-day business. It is therefore obviously challenging for organisations to create viable, deliverable long-term digitisation strategies; and this confusion will continue if organisations focus on the novelty element and fail to move away from single, process-led goals.
Achieving the true potential digitisation offers will demand cross-organisational rigour that focuses on the business’ primary objectives. Without this rigour and outcome-led focus, organisations will not only persist in pointless digitisation projects that fail to add up to a consistent strategy but, concerningly, will also miss the opportunity to leverage existing infrastructure to drive considerable value.
Consider the impact of an IoT layer deployed across refrigeration assets throughout the supply chain to monitor and manage temperature. A process-based approach would focus on improving efficiency, and the project may look to utilise rapid access to refrigeration monitors and controls, in tandem with energy tariffs, to reduce energy consumption and cost. However, if such a project is defined only by this single energy-reduction goal, once the initial cost benefits have been achieved there is a risk that the lack of ongoing benefits will lead management to call time on the project. Yet digitisation of the cold chain also has a fundamental impact on multiple corporate outcomes, from customer experience to increasing basket size and reducing wastage; it is – or should be – about far more than incremental energy cost reduction.
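To make the tariff-in-tandem idea concrete, here is a minimal sketch of what tariff-aware setpoint control might look like. The tariff schedule, temperature band and control function are all assumptions for illustration, not IMS Evolve’s implementation:

```python
# Minimal sketch: tariff-aware refrigeration setpoint control (illustrative).
from dataclasses import dataclass

@dataclass
class TariffWindow:
    start_hour: int        # inclusive, 0-23
    end_hour: int          # exclusive
    price_per_kwh: float

SAFE_MIN, SAFE_MAX = 1.0, 4.0   # assumed food-safe band, degrees C

def choose_setpoint(hour: int, tariffs: list) -> float:
    """Pre-cool towards the bottom of the safe band when energy is cheap,
    and let temperature drift towards the top of the band when it is not."""
    window = next(t for t in tariffs if t.start_hour <= hour < t.end_hour)
    cheapest = min(t.price_per_kwh for t in tariffs)
    return SAFE_MIN if window.price_per_kwh == cheapest else SAFE_MAX - 0.5

tariffs = [TariffWindow(0, 7, 0.10), TariffWindow(7, 24, 0.28)]
print(choose_setpoint(3, tariffs))    # 1.0 -> pre-cool on the cheap tariff
print(choose_setpoint(12, tariffs))   # 3.5 -> coast through the peak tariff
```

Even in this toy form, the point stands: the same telemetry that keeps food within a safe band can also shift energy consumption to cheaper periods, which is exactly why a single-goal framing undersells the project.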
Supporting Multiple Business Outcomes
Incorrect cooling can have a devastating impact on food quality. From watery yoghurt to sliced meat packages containing pools of moisture and browning bagged salad, the result is hardly an engaging brand experience. These off-putting appearances can threaten not only customer perception but also basket size, yet the acceptance of this inefficiency is evident in the excessive over-compensation throughout the supply chain. To ensure that the products presented to customers on the shelves are aesthetically appealing, retailers globally rely on overstocking, with a view to disposing of any poorly presented items. The result is unnecessary overproduction by producers and a considerable contribution to the billions of pounds of food wasted every year throughout the supply chain.
Where does this supply chain strategy leave the brand’s equity with regard to energy consumption, environmental commitment and minimising waste? Or, for that matter, the key outcomes of improving customer experience, increasing sales and reducing stock? It is by considering the digitisation of the cold chain with an outcomes-based approach – a project that embraces not only energy cost reduction but also customer experience, food quality, minimising wastage and supporting the environment – that organisations are able to grasp its full significance, relevance and corporate value.
Furthermore, this is a development that builds on an existing and standard component of the legacy infrastructure. It is a project that can overlay digitisation to drive value from an essentially dull aspect of core retail processes and one that can deliver return on investment, whilst also improving the customer experience.
Reinvigorating Digitisation Strategies
If digitisation is to evolve from point deployments of mixed success towards an enduring, strategic realisation, two essential changes are required. Firstly, organisations need to consider what can be done with the existing infrastructure to drive value. How, for example, can digitisation be overlaid onto existing control systems to optimise the way car park lights are turned on and off, supporting environmental brand equity and reducing costs? In the face of bright, shiny disruptive technologies, it is too easy to overlook this essential aspect of digitisation: the chance to breathe new life and value into existing infrastructure.
Secondly, companies need to determine how to align digitisation possibilities not with single process goals but with broad business outcomes – from a better understanding of macro-economic impacts, all the way back through the supply chain to the farmer to battle the global food crisis, to assessing the impact on the customer experience. And that requires collaboration across the organisation. By involving multiple stakeholders and teams, from energy efficiency and customer experience to waste management, a business not only gains a far stronger business case but a far broader commitment to realising the project.
Combining engaged, cross-functional teams with an emphasis on leveraging legacy infrastructure offers multiple business wins. It enables significant and rapid change without disruption; in many cases digitisation can be added to existing systems and rapidly deployed at a fraction of the cost proposed by rip and replace alternatives. Using proven technologies drives down the risk and increases the chances of delivering quick return on investment, releasing money that can be reinvested in further digital strategies. Critically, with an outcome-led approach, digitisation gains the corporate credibility required to further boost investment and create a robust, consistent and sustainable cross-business strategy.
Sustainability has been a theme within the IT industry since the introduction of the Energy Star label for hardware. Since 2007, sustainable digital infrastructure has become an international topic, shaped by a variety of regional organisations running green IT programmes to reduce the footprint of the layers within digital infrastructure, including datacentres. Reducing the overall footprint of the datacentre industry is an impossible challenge, as the rate of growth in the adoption of digital services – and therefore the need for infrastructure – has outpaced the rate at which the energy footprint can be reduced. Industry orchestration will be needed if this challenge is to be met and new standards for datacentre sustainability are to be set. Years have passed and excess datacentre heat reuse is still not delivering on its promise, often because technology, organisational and operational elements cannot be matched. Recently, stakeholders have been aiming to change that by stimulating a new role for datacentres, which is the transformation we still need to see on a large scale: the transformation from energy consumers to flexible energy prosumers in smart cities. This is possible with technologies available today – the future is now!
FROM ENERGY CONSUMER TO ENERGY PRODUCER
The global move to cloud-based infrastructures and the Internet of Things (IoT) generates high demand for datacentre capacity and high network loads. The energy demand of datacentres is rising so quickly that it is causing serious issues for energy grid operators, (renewable) energy suppliers and governments. Grid operators and energy suppliers can hardly keep up with demand in the large datacentre hubs, let alone ensure enough renewable energy generation is available where the demand arises. Not only does this raise questions of sustainability on all levels, the demand for flexibility and high loads requires a different approach to the business model of the datacentre – with the ultimate challenge of becoming an energy-neutral industry.
The key to resolving this challenge is the adoption of liquid cooling techniques in all its forms. Asperitas is committed to approaching this challenge head-on by taking an active role in discarding the limitations of IT systems and datacentre infrastructures of today to find new ways to drastically improve datacentre efficiency. This can help to develop a future where the datacentre is transformed from energy consumer to energy producer. These excerpts from our whitepaper present our vision of the datacentre of the future. A datacentre that faces the challenges of today and is ready for the opportunities of tomorrow.
With all these advantages, liquid offers solutions that are just not attainable in any other way. This is why liquid is the future for datacentres. But what does this future look like? Which liquid technologies are available and what does this mean for the infrastructure of the datacentre?
In the next chapter, we outline the basic liquid technologies operating in datacentres today. After that we explore the most beneficial environment for these technologies: a hybrid temperature chain. Further on, a model of connected, distributed datacentre environments is introduced. With the dedication to liquid, Temperature Chaining and the distributed datacentre model, the datacentre of the future transforms from energy consumer to energy producer. This approach will drastically reduce the carbon impact of datacentres while stimulating the energy efficiency of unrelated energy consuming industries and consumers.
THE HYBRID INFRASTRUCTURE
The introduction of water into the datacentre whitespace is most beneficial within a purpose-built set-up. The focus for the design of the datacentre must be on absorbing all the thermal energy with water. This calls for a hybrid environment in which different liquid-based technologies co-exist to allow for the full range of datacentre and platform services, regardless of the type of datacentre.
Immersed Computing® provides easily deployable, scalable local edge solutions. These allow heat to be rejected to whatever reuse scenario is present, such as thermal energy storage, domestic water or city heating. If no recipient of heat is available, a dry cooler alone is sufficient. This reduces or even eliminates the need for overhead installations such as coolers in edge environments, providing a different perspective on datacentres. Geographic locations become easier to qualify, and high quantities of micro installations can be easily deployed with minimal requirements. These datacentres will be integrated in existing district buildings or multifunctional district centres. A convenient location for a datacentre is a place where heat energy will be utilised throughout the whole year. Datacentres can also be placed as separate buildings in residential and industrial areas. This creates the potential for a connected datacentre web consisting of mainly two types of datacentre environments: large facilities (Core Datacentres), positioned on the edge of urban areas or even farther away, and micro facilities (Edge Nodes), which are focused on optimising the large network infrastructure and are all interconnected with each other and with all core datacentres.
The main purpose of the edge nodes is to reduce the overall network load and act as an outpost for IoT applications, content caching and high bandwidth cloud applications. The main function of the core datacentres is to ensure continuity and availability of data by acting as data hubs and high capacity environments.
This is an excerpt of an Asperitas whitepaper. This whitepaper was originally published in The DCA Journal in June 2017: The Datacentre of the Future, authored by Rolf Brink, CEO and founder of Asperitas. http://asperitas.com/resource/immersed-computing-by-asperitas/
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweetspots.
Part 1 of this article covers the two blockchain reports, with the others bringing you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
The Pistoia Alliance, a global, not-for-profit alliance that works to lower barriers to innovation in life sciences R&D, has published the results of a survey on the adoption of blockchain in the pharmaceutical and life science industries. According to the survey, 60 percent of pharmaceutical and life science professionals are either using or experimenting with blockchain today, compared to 22 percent when asked in 2017; however, 40 percent are not currently looking at implementing, or have no plans to implement blockchain. The biggest barriers identified to adoption are access to skilled blockchain personnel (55 percent), and that blockchain is too difficult to understand (16 percent). These factors underline why The Pistoia Alliance is calling for the life science and pharmaceutical industries to collaborate over the development and implementation of blockchain.
“We must ensure that the life science industry has access to the right skills and staff to bring their blockchain projects to fruition, particularly looking to the technology industry to fill the blockchain talent gap. This knowledge will be particularly useful for the 18 percent of life science professionals who admitted to knowing nothing about blockchain. The potential to enhance collaboration and, therefore, innovation is huge,” commented Dr Steve Arlington, President of The Pistoia Alliance. “Blockchain provides an additional layer of trust for scientists and their organisations. We hope the security benefits of the technology help to lessen reticence over sharing and transferring data or information, and will facilitate further cross-industry collaboration and knowledge sharing. We believe blockchain will open up new opportunities for the industry to begin sharing data more securely to advance drug discovery, ultimately making patients’ lives better.”
The survey also showed life science and pharmaceutical professionals are becoming more aware of the capabilities of blockchain. Respondents believed the greatest opportunities for using blockchain lie in the medical supply chain (30 percent), electronic medical records (25 percent), clinical trials management (20 percent), and scientific data sharing (15 percent). Of the benefits of blockchain, life science and pharmaceutical professionals believe the most significant is the immutability of data (73 percent). Significantly, for an industry with tight regulations, 39 percent also believe the transparency of the blockchain system is its best feature. However, almost one fifth (18 percent) of professionals believe using blockchain adds no value beyond a traditional database, showing there is some reluctance in the industry to use the technology. The Pistoia Alliance believes that some of the misconceptions about blockchain can be overcome with greater education of those in industry.
“As life science and pharmaceutical organisations are beginning to look at implementing or experimenting with blockchain, The Pistoia Alliance is working hard to inform organisations on how to implement it safely and effectively,” commented Dr Richard Shute, consultant for The Pistoia Alliance. “We are currently focusing on educating scientists and researchers about the potential uses of blockchain technologies outside of the supply chain, particularly in R&D. At The Pistoia Alliance, we want to support our members’ initiatives in blockchain, as well as provide a secure global forum for partnerships and collaboration. I would encourage anyone in the life science industry with an interest to join our Blockchain Bootcamp in October, and Alliance members to get involved in our blockchain community, to share knowledge and best practice.”
The Pistoia Alliance is continuing its drive to educate the life science industry about blockchain. You can join The Pistoia Alliance’s two-day Blockchain Bootcamp on the 8th and 9th October in Boston. The event will consist of an introduction to the Hyperledger platform, as well as a mini hackathon incorporating a range of life science use cases, which will allow participants to code their own blockchain-enabled apps in teams. For more information on the event and to register, see here.
A new World Economic Forum report released today at the Global Climate Action Summit in California identifies more than 65 ways blockchain can be applied to the world’s most-pressing environmental challenges and calls for new global platforms to incubate ‘responsible blockchain ecosystems’ rather than just individual applications or companies.
Produced in collaboration with PwC, Building Block(chain)s for a Better Planet also identifies eight game-changers where the technology can fundamentally transform the way the world manages its natural resources. These range from decentralizing management of natural resources such as energy and water, to creating more transparent supply chains that drive greater sustainability, and providing new mechanisms for raising the trillions of dollars that will be needed to deliver low-carbon and sustainable economic growth.
Blockchain-enabled solutions are currently being explored to improve the sustainability of global supply chains and could help overcome illegal activities by tracking fish from “bait to plate”, or commodities like palm oil, beef and soy from “farm to fork”. Such transparency is vital in influencing consumer decisions, updating supply chain practices and triggering new governance arrangements. Blockchain-enabled smart contracts could, for instance, be used to underpin innovative tenure arrangements that give specific resource rights to communities or fishers.
According to the report, these and other opportunities have been largely untapped by developers, investors and governments, and represent an opportunity to unlock and monetize value that is currently embedded in environmental systems.
“Blockchain’s potential to fundamentally redefine how business, government and society operate has generated considerable hype,” said Sheila Warren, Head of Blockchain and Distributed Ledger Technology at the World Economic Forum’s Centre for the Fourth Industrial Revolution. “Despite this hype, there are many challenges to overcome – it is still a nascent technology undergoing rapid development. Now is the right time for stakeholders to work together to ensure the development of responsible blockchain solutions that are good for people and the planet.”
Celine Herweijer, Partner, PwC UK, added: “It is important for anyone thinking about developing or investing in a blockchain application for the environment to take a step back and ask three essential questions: will blockchain solve the actual problem, can downside risks or unintended consequences be acceptably managed, and has the right ecosystem of stakeholders been built?”
If harnessed in the right way, blockchain has significant potential to enable the transition to cleaner and more resource-preserving decentralized solutions, unlock natural capital and empower communities. This is particularly important for the environment, where the tragedy of the commons and challenges related to non-financial value are prevalent.
“If history has taught us anything, it is that these transformative changes will not happen automatically,” said Dominic Waughray, Head of the World Economic Forum’s Centre for Global Public Goods. “They will require deliberate collaboration between diverse stakeholders ranging from technology industries through to environmental policy-makers, and will need to be underpinned by new platforms that can support these stakeholders to advance not just a technology application, but the systems shift that will enable it to truly take hold.”
How to win the IoT race
By Anthony Sayers, Internet of Things Evangelist at Software AG.
It’s not uncommon for businesses to fall at the first hurdle when implementing IoT projects. This is why it’s important to remember that a successful IoT project is like a marathon. It doesn’t just happen overnight and it requires a lot of hard work, dedication and training. Businesses need to put their training plans into action for the race ahead. By understanding the best techniques and processes to cross the finish line, they can avoid that first roadblock.
In fact, research from Cisco shows that as many as three quarters of all IoT projects are failing. This can be attributed to the fact that many have been designed to solve individual problems, meaning they become siloed. With Gartner predicting that, by 2020, more than 65 per cent of enterprises will adopt IoT products, it is crucial that businesses stop trying to cram IoT adoption into a short space of time.
So, how do organisations take the lead and win the IoT race?
There are three core principles that businesses must adopt to ensure IoT projects are successful. For most organisations that fail at the first hurdle, it’s usually because they have a ‘single-step vision’ for a long-term project. As organisations gather more data and insights, they will often find that the strategy needs to be adapted to meet changing business needs.
Stage one: Avoid falling at the first hurdle
Many organisations have tried to take shortcuts, sprinting to pass the finish line as quickly as possible. They are rushing to achieve real-time predictive maintenance, whilst skipping the necessary preparation and processes. It may sound obvious, but it is crucial to start by taking the time to understand your own environment. This should be a data-driven, educational exercise to understand and analyse the historical information that you currently have access to. This means stepping beyond purely the IT domain and into a more operational environment.
A good example of connecting an IoT strategy with a business’s goals is Gardner Denver Air Compressors. The industrial manufacturer chose to start small and scale fast to deliver an IoT monitoring solution to its compressor distributors and service partners. This allows it to offer a higher-quality, real-time monitoring solution to its customers. In doing this, the company was able to redefine its relationship with its partners and customers by ensuring equipment downtime was minimised.
Stage two: The value of real-time
Once the data is understood, then, and only then, can businesses begin to see the value of what they can gain from IoT projects. As businesses move to the second step, they can start reaping the benefits of better time management, gathering information in real-time and receiving alerts when something has gone wrong.
In the case of The Winora Group, the bike manufacturer implemented IoT to provide digital, connected, smart eBikes to its customers. By connecting the bikes to the IoT platform, Winora is able to see and handle service-relevant information. This means it is able to provide customers with online views of available routes, theft alerts, GPS monitoring and crash detection, with emergency notifications sent to friends and relatives.
By gathering insights in real-time, organisations can analyse data on the go. Step one of the process is to connect the data and provide insights to businesses, to help them understand why something broke down or what patterns their existing data shows. Step two is about driving the IoT project to the next level and looking at the additional analytic capabilities. This helps to build out a framework to move the project from a purely reactive one to a more predictive one.
Stage three: Becoming agile and competitive
Typically, businesses begin to see the return on investment following a six-month implementation period. They are able to operate in a much more agile and competitive way and provide more relevant products and services to the market.
Data-driven insights can help businesses make more informed decisions for the future. One of the key benefits of IoT is the ability to connect all the data points to achieve a predictive maintenance strategy.
OCTO Telematics is doing just that. The telematics company analyses contextual data from vehicles, location, crash data, and driver behaviour information. It provides insurance and automotive companies with the unique ability to develop usage-based insurance policies for drivers. The pay-as-you-drive concept means that each customer receives the appropriate coverage package according to their personal needs. This is a great example of how IoT can be used to deliver notable benefits directly to customers.
Winning the IoT race
To be in with a chance of crossing the finish line and winning the IoT race, businesses need to start looking at their IoT projects through an industry lens. This will ensure that they are implementing a project to deal with the business’ true needs. The best piece of advice – albeit a cliché – is to take your time. By understanding the process and learning what works, it then becomes possible to reap the benefits of long-term IoT success.
Blockbuster was a popular video rental store – worth a cool $4.8 billion (£3.65bn) – in the early noughties, but its failure to go digital resulted in bankruptcy by 2010. The fall of this once great transatlantic entertainment company acts as a cautionary tale for even the most-established global businesses: demonstrating that digital transformation is no passing craze, and a robust strategy is vital to ensure survival.
By Lindsay McEwan, Vice President and Managing Director EMEA, Tealium.
This year alone, the IDC predicts businesses will invest $1.1 trillion (£836bn) in technology to avoid the same fate. Yet, while acquiring advanced tech is a crucial element of successfully managing digital change, it isn’t everything. Data and data-driven decisions are the keys to unlocking stability and security in the evolving economy.
This is why companies that have learned to harness data as a means of delivering engaging customer experiences – such as Netflix, Uber and Airbnb – are flourishing. In fact, Airbnb is now estimated to be worth at least $38 billion (£28.9 billion), and research shows it to be more popular than the leading hotel chains.
So, what can we learn from these digital gurus about using smart data strategies?
The customer is always right
It’s well known that Blockbuster’s late arrival at the digital party, slowness in identifying Netflix as a serious threat, and insufficient attempts to respond to customer needs were instrumental in its downfall. Despite some acknowledgement of changes in viewing habits — such as its shift from video to DVD and Blu-Ray, and trials of online rental with in-store collection — the company’s failure to modernise quickly and thoroughly meant the market was left wide open for competitors. That’s not to mention the fact it passed up the chance to acquire Netflix.
Listening and adapting to customers’ cries for flexible, personalised, and fast services early on may have been enough to save Blockbuster from its demise. At the time, LoveFilm — which Amazon acquired in 2011 and morphed into Amazon Video — was quick to identify that customers no longer wanted the inconvenience of making the trip to a store, and offered a DVD-to-door service. Netflix was also tuned in to the demands of the masses, and pipped rivals to the post in transitioning from a DVD postal service to a video streaming service; with Hulu and Amazon Video following soon after. It is therefore no surprise that business has boomed for Netflix.
Just tech is not enough
Gathering customer feedback is a vital step for companies seeking to keep pace with digital innovation. But to put customer demands and ideas into action, firms need resources – both financial and technological. Gathering and acting on feedback can be expensive, but the rise of AI is opening up doors for businesses to connect with consumers more economically, via tools such as chatbots or targeted ads. However, it’s not enough to splurge on the latest tech; to guarantee high-performance, the data fuelling it – or indeed collated by it – needs to be used wisely.
The evolution of Netflix is largely attributed to its smart use of data. By collecting data about individual content consumption and building a granular view of customers’ viewing behaviours, Netflix makes connections between content and opens up new experiences. For example, its use of customer profiles in sync with AI algorithms successfully identifies material consumers have not previously engaged with but might enjoy – even if it is a different genre. In fact, one report suggested that 80% of all viewing choices were made as a result of its recommendation service.
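As a toy illustration of the profile-driven idea (emphatically not Netflix’s actual system), unseen titles can be scored by their overlap with what similar viewers have watched; the data and method here are invented for the example:

```python
# Toy collaborative-filtering sketch: recommend unseen titles by weighting
# them with the viewing overlap between users. Illustrative data only.
viewing = {
    "ana":  {"Dark", "Ozark", "Mindhunter"},
    "ben":  {"Dark", "Ozark", "Narcos"},
    "carl": {"Ozark", "Narcos", "Fargo"},
}

def recommend(user, viewing):
    seen = viewing[user]
    scores = {}
    for other, titles in viewing.items():
        if other == user:
            continue
        overlap = len(seen & titles)      # similarity to the other viewer
        for title in titles - seen:       # only titles the user hasn't seen
            scores[title] = scores.get(title, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana", viewing))   # ['Narcos', 'Fargo']
```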
Each time a Netflix user logs in they are presented with a treasure trove of shows and movies to capture their interest. And with more than 130 million members predicted by the end of the second quarter this year, Netflix has set the bar for customer-centric digital transformation. That being said, Netflix will need to keep evaluating and amending its strategy to stay on top. While revenue for Q2 2018 is up 40% from the same time last year, it didn’t attract the projected number of subscribers, which CEO Reed Hastings blames on a poorly streamlined business approach.
Principal Analyst at eMarketer, Paul Verna, claimed that these results were not surprising. It’s hard to stay on top: maintaining a success story requires unifying data fragments relating to individuals’ actions to create accurate consumer profiles. And, to avoid falling foul of punitive fines, the data must be collected with explicit consent for a legitimate purpose, and protected in line with GDPR.
The secret ingredient
Hyper-connectivity across a plethora of devices has led to an explosion of multi-channel data. And most businesses are aware that when it is collected, processed and harnessed in real-time – and in compliance with data protection regulations – this data can fuel improved customer service experiences and contribute to long-term, successful business. The obstacle most marketers struggle to overcome is data siloed depending on its channel, which fails to give a comprehensive overview of audiences.
To meet expectations of relevant and consistently personalised interactions on every channel, marketers must analyse the customer journey to assess where they are on the path to purchase. This process is complicated if the data required to undertake the task is disparate.
As a disruptor to the travel industry, Airbnb – worth approximately $38 billion (£29.1bn) and handling over 15 billion megabytes of data every single day – is the poster child for data-based tailoring. Its integrated worldwide network uses data to help globetrotters find ideal accommodation no matter what their destination. Akin to Netflix, Airbnb develops granular insight into customers, their behaviours and preferences, to suggest holiday homes and experiences the consumer might enjoy.
Recently, marketers have been increasingly turning to Customer Data Platforms (CDPs) – which manage event-level data streams from multiple channels – to achieve this. Designed to stitch together data fragments from an array of sources, these platforms help build a central hub of insight, which in turn gives marketers the means to build the elusive single view of the customer. Harnessing this technology allows marketing professionals to deliver personalised, targeted messaging at the moment of impact; this has the ability to delight consumers with convenience and relevance. Some CDPs have integrated privacy and consent management tools to assist compliance with data protection regulations, which can ease the burden.
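A minimal sketch of the stitching idea follows, assuming email as the shared identifier and invented field names; real CDPs use far richer identity-resolution rules, but the principle is the same:

```python
# Sketch: stitch event-level fragments from multiple channels into one
# customer profile. Field names and the matching key are assumptions.
from collections import defaultdict

events = [
    {"channel": "web",    "email": "ana@example.com", "action": "viewed_product"},
    {"channel": "mobile", "email": "ana@example.com", "action": "added_to_basket"},
    {"channel": "email",  "email": "ana@example.com", "action": "opened_campaign"},
]

def build_profiles(events):
    """Key every event stream on a shared identifier (here, email) to
    produce one unified profile per customer."""
    profiles = defaultdict(lambda: {"actions": [], "channels": set()})
    for e in events:
        p = profiles[e["email"]]
        p["actions"].append(e["action"])
        p["channels"].add(e["channel"])
    return profiles

print(build_profiles(events)["ana@example.com"])
```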
Netflix and Airbnb are just two household names reaping the many benefits of digital transformation, and major organisations such as the NHS are not far behind — with plans to unify fragmented data across its 9,000 organisations. Success stories such as these provide tangible evidence that digital transformation isn’t just a fad – provided the implementation strategy is founded on smart data use. By considering the way they process data, companies can respond to constantly changing, ever-increasing consumer demands, without simply relying on flashy tech.
Simply put, if digital transformation isn’t working for you – you’re not doing it right!
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweetspots.
Part 2 of this article brings you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
Eleanor Matthews, MD of WorkFutures and founder of Source sees blockchain as having massive potential in the procurement space and certainly moving into the business mainstream:
“To begin with, Blockchain is almost certainly going to release the next wave of efficiencies in procurement, minimising duplication across customers and their supply chains. The old way, where customers and suppliers made private, local transactions across a range of systems by applying their own local processes, will swiftly be replaced.
“A move to distributed ledger technology such as blockchain allows for a single, immutable, secure record of every transaction, eliminating disparate systems and excessive local work within the four walls of an enterprise. Maersk and IBM, to take just two examples, are already predicting that their new blockchain platform will save the global shipping industry billions of dollars a year. And all through more efficient, blockchain-powered, processing, validation and storing of information.
“Secondly, blockchain will enable procurement teams to build greater trust. Even though we might call blockchain a ‘trustless’ technology, what that actually means here is that no record is entrusted to a single party to verify. Once a transaction is added to the blockchain it can never be changed, allowing for greater confidence across the board. For products where provenance is important, this allows companies to make a better consumer offer. For example, Everledger’s Diamond Time-Lapse product can validate the provenance of diamonds, ensuring that they are real, not stolen, and don’t come from mines with a poor human rights record.
“Finally, blockchain will automate fulfilment of certain types of contractual obligation without the need for separate automation technology. So-called ‘smart contracts’ can be set up so that when conditions are met, payments or other value transfer can happen automatically. Goodbye to long waits for contractors or other freelancers to get paid for the work they do.”
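The logic of such a smart contract can be sketched in plain Python. A real deployment would of course run on a blockchain platform rather than as a local class, and the parties and condition names here are invented:

```python
# Illustrative model of a smart contract: payment releases automatically
# once the agreed conditions are met. Not a real on-chain contract.
class DeliveryContract:
    def __init__(self, buyer, supplier, amount):
        self.buyer, self.supplier, self.amount = buyer, supplier, amount
        self.conditions = {"goods_received": False, "quality_checked": False}
        self.paid = False

    def record(self, condition):
        self.conditions[condition] = True
        self._settle()

    def _settle(self):
        # Payment fires automatically once every condition is satisfied.
        if all(self.conditions.values()) and not self.paid:
            self.paid = True
            print(f"Releasing {self.amount} from {self.buyer} to {self.supplier}")

contract = DeliveryContract("Retailer Ltd", "Freelance Co", 5000)
contract.record("goods_received")
contract.record("quality_checked")   # triggers payment automatically
```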
Blockchain and the law
CaseLines has recently applied to patent blockchain for the storage of digital evidence in the process of a trial. The software would be used within its evidence management platform to manage the progress of evidence and transform the current security levels of legal cases. Paul Sachs, founder and CTO of CaseLines, comments:
“The purpose of blockchain is to store a series of transactions in a way that cannot be changed. Because of the unique way blockchain collects and handles data, each transaction is stored so that the details of every stage in the journey can be verified against the document being viewed. So once a piece of evidence is entered into the system, there can be no possibility of records being altered or falsified.
“This is more vital than ever as the UK courts are currently undergoing a £1bn modernisation programme that includes the digitisation of many processes as a means of improving efficiency. When blockchain is used within the CaseLines product, which would be the first of its kind in the world, we believe it would give digital justice systems unparalleled levels of security and trust in the process of handling all types of evidence.
“While blockchain is a public artefact where each transaction in the digital ledger is permanently recorded, inspection of the blockchain doesn’t reveal evidence, as it contains only evidence IDs and hash codes. It will thereby eliminate the possibility that evidential material submitted to court can be repudiated, as the validity of the document presented will be irrefutable. It is then not possible to photoshop a picture or splice a video.
“Courts will be demanding a high bar for the admissibility of electronic evidence. By tying blockchain to digital evidence software, judges can be sure they are looking at something that is irrefutably the same as the original electronic evidence loaded into the system.”
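The verification idea Sachs describes can be sketched in a few lines: the ledger holds only an evidence ID and a hash, and any alteration to the document changes the hash. This is purely illustrative, not CaseLines’ implementation:

```python
# Sketch: the ledger stores only (evidence ID, hash). Tampering with the
# document changes its hash, so the mismatch is immediately detectable.
import hashlib

ledger = {}   # evidence_id -> sha256 hex digest

def register_evidence(evidence_id: str, document: bytes):
    ledger[evidence_id] = hashlib.sha256(document).hexdigest()

def verify_evidence(evidence_id: str, document: bytes) -> bool:
    return ledger.get(evidence_id) == hashlib.sha256(document).hexdigest()

register_evidence("EV-0001", b"witness statement v1")
print(verify_evidence("EV-0001", b"witness statement v1"))   # True
print(verify_evidence("EV-0001", b"witness statement v2"))   # False: altered
```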
Set to become mainstream in the retail sector
Simon Dunleavy, Director of Cloud Services at Enterprise eCommerce company Maginus, believes that blockchain has the potential to become mainstream in the retail sector because it brings a new element of trust and liability that will be embraced by consumers:
“By 2030, Gartner has forecast that blockchain will generate an annual business value of more than £3 trillion, with 10-20% of the world’s economic infrastructure running on blockchain-based systems. One industry that will most probably be completely revolutionised by blockchain is retail. The ledger system traditionally associated with the volatility of cryptocurrencies like Bitcoin will in fact bring consumers new levels of trust and transparency from the retailers they shop with.
“Blockchain cannot be hacked or forged because data is held by many different parties who can account for its validity. This means that if every step of the supply chain was automatically logged in a blockchain (by scanning a QR code, for example), consumers would be able to track where every part of their product came from, what happened to it along the way and how it eventually ended up with them. For example, they would be able to ensure that an item hadn’t been manufactured using child labour or by harming animals; or they would be able to confirm that the Italian designer handbag they invested in was really 100% made in Italy. In the future, blockchain may even replace the receipt as it will prove the provenance of a product and will include a warranty as well.
“In a post-GDPR world, blockchain may provoke a change in how data is treated by marketers and consumers alike. Blockchain would enable the monetisation of consumer data, whereby consumers would be able to publish and sell their data to retailers they like and trust, and revoke access to their data to retailers they don’t, using blockchain to ensure their data is respected. Overall, it is unfair to completely associate blockchain with the inconsistency of cryptocurrency, rather it will act as a force for good in retail, giving consumers more power over their data and what they buy.”
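The ‘every step logged via a QR scan’ flow Dunleavy describes can be sketched as an append-only, hash-linked record of supply chain events. The product IDs, locations and linking scheme here are invented for illustration:

```python
# Sketch: each scan appends a hash-linked record, so a consumer can replay
# a product's journey and trust it hasn't been rewritten. Illustrative only.
import hashlib, json, time

def append_step(chain, product_id, location, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"product": product_id, "location": location,
              "event": event, "ts": time.time(), "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

chain = []
append_step(chain, "BAG-123", "Florence tannery", "leather cut")
append_step(chain, "BAG-123", "Milan workshop", "bag assembled")
append_step(chain, "BAG-123", "London store", "received")
print([r["event"] for r in chain])   # the product's full journey
```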
How blockchain can help defend your business
Richard Menear, CEO of Burning Tree, explains:
“Blockchain has been around since 2009, but its development has created the backbone for a new type of internet. The new use for this original Bitcoin concept is taking the digital world by storm; so how can blockchain technology help defend your business?
“Blockchain creates continuity in the digital world: it’s a digital ledger in which transactions made in bitcoin or another cryptocurrency are recorded chronologically and publicly. This incorruptible digital ledger means that data isn’t stored in a single location and there is no centralised version for a hacker to corrupt, so your records should be safe from corruption.
How does blockchain work?
“Each ‘block’ is connected to all the blocks before and after it, making it difficult to tamper with. In order for a hacker to get to a single piece of information, they would need to change the block containing that specific record as well as those connected to it. While this isn’t impossible, it is hopefully enough of a deterrent for the average cyber criminal.
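A toy version of that linkage makes the point: each block stores its predecessor’s hash, so altering one record invalidates every block after it. This is a minimal sketch, not a production chain:

```python
# Minimal hash-chained blocks: tampering with any block breaks validation
# for every subsequent block.
import hashlib

def block_hash(index, data, prev_hash):
    return hashlib.sha256(f"{index}|{data}|{prev_hash}".encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64
    for i, data in enumerate(records):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def validate(chain):
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash(b["index"], b["data"], prev):
            return False
        prev = b["hash"]
    return True

chain = build_chain(["tx1", "tx2", "tx3"])
print(validate(chain))            # True
chain[1]["data"] = "tx2-tampered"
print(validate(chain))            # False: the chain no longer hangs together
```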
“With GDPR’s 72-hour reporting timeframe, keeping on top of any data breaches is important. With blockchain, if any part is altered, the ‘signature’ required to access the information will become invalid and the network will know that something has happened. This early notification is essential in protecting information and preventing further loss.
How is blockchain safer than others of its kind?
“As copies of the blockchain are kept on a large and widely distributed network, there is no one weak point to attack.
“Some people believe that blockchains can and will eventually replace central banks, and that their services will develop into data sharing, becoming a normal part of daily transactions. The added benefit of sitting outside the services of Facebook and Google – which increasingly find themselves in the limelight for all the wrong reasons – is another incentive for companies to convert to blockchain.
“Transactions are currently limitless, and with no governing body there is no single source or point of reference; this is hopefully another way to assure users of its safety.
“Even the Bank of England and trading company Nasdaq have announced they would be developing services incorporating the use of blockchain.
Blockchain and Identity Management
“A major focus for cyber criminals is to steal data which has value, and most of the companies that we all interact with on a daily basis hold a lot of information about us as individuals which falls into that category. Supplying personal information when setting up an account is pretty much standard these days. But what if you didn’t have to? What if you managed your own identity and had it verified on the blockchain by a wide range of institutions that are proven sources of ID (government bodies, banks, health organisations etc.)?
“We’re excited to be working with Trusted Key in this space and would be happy to share how we see them as a major disruptor to traditional Identity Management models.
What is the future for blockchain?
“Blockchain isn’t going to stop there: research and development is being undertaken to use this idea in energy, healthcare and even transportation. Think of smart cities, where street signs, traffic lights, cars and other moving and static objects are embedded with sensors. This is no longer sci-fi, but real life. This technique could allow emergency vehicles to quickly re-route and avoid traffic, as well as reduce congestion and lower emissions.
“Not all cyber security has developed at this rate, which still leaves businesses and organisations vulnerable to threats. Developing innovative technology solutions with our advanced consulting services could provide peace of mind if you are considering introducing blockchain into your organisation.”
Securing IoT
Ken Munro, security researcher and partner in Pen Test Partners thinks that blockchain could be used to secure the IoT:
“Blockchain could be used within the IoT to authenticate – for instance, by removing the need for a username and password and instead giving the user an identity, which is essentially a public/private key pair. Firstly, the user would sign up for their identity using an application, which could be a Windows or mobile application, for example. This application would then give the user a long passphrase just once, to be used in case the user needs to revoke access to their account. This identity, which would be the user’s public key, is then stored in the Blockchain inside a transaction block.
“When a user attempts to log in to a website, the website would query the Blockchain for the user’s public key block and compare it with the private key that the user is presenting via the Windows/mobile application, logging them in if they match. If the identity (private key) were to be compromised, the user could use the long passphrase given at sign-up to create a new private key and a new public key block in the Blockchain, which would be queried by websites on next sign-in.
“An interesting example of Blockchain security might be securing a business network of IoT gear from a rogue attacker, because as IoT comes into the world, these devices will have their own subnets that need securing. Let’s say we have an Internet of Things controller that creates a network of IoT gear in a business. Each of these IoT devices will have a private key, with its public key set into the Blockchain. The IoT controller will constantly query the Blockchain for which IoT devices should be on the network. If a rogue attacker gets into the IoT network via Wi-Fi or an Ethernet port, the attacker’s machine will not have a public/private key pair in the Blockchain, therefore the IoT controller would block the connection from the attacker’s computer. Sort of like blocking MAC addresses on a switch, but this is actually effective. The attacker would need to compromise 51% of the miner nodes, or add enough nodes to the mining network, in order to add the attacker’s computer key into the Blockchain and circumvent this security – which is too labour-intensive. And, if the IoT devices themselves become the miners, it would require a lot less resource to run the Blockchain network.”
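The key-pair login flow Munro outlines can be sketched as follows, using the third-party cryptography package. Storing the public key in an actual blockchain is out of scope here, so a plain dictionary stands in for the ledger:

```python
# Sketch of key-pair authentication: the service holds only the public key;
# the user proves possession of the private key by signing a challenge.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature
import os

# "Sign-up": the user generates an identity; the public key goes on the
# ledger (a dict here, a blockchain transaction block in Munro's scheme).
private_key = Ed25519PrivateKey.generate()
ledger = {"user-123": private_key.public_key()}

# "Login": the site issues a random challenge, the user signs it, and the
# site verifies the signature against the ledger's public key.
challenge = os.urandom(32)
signature = private_key.sign(challenge)

try:
    ledger["user-123"].verify(signature, challenge)
    print("Login accepted")
except InvalidSignature:
    print("Login rejected")
```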
Serverless has been touted as a game-changing technology that can deliver cost and time savings to businesses in all sectors. However, companies and CTOs that focus on costs and spreadsheet calculations may have a hard time understanding its value, even though leading digital platforms have been using it for years.
Ross Williams – Head of Development, Great State.
Serverless empowers technology workers to bring solutions to production with less specialist knowledge, enabling them to focus more energy on understanding the business domain and their customers.
In travel, margins are low and competition is high. Driving efficiencies can make the difference between flying high and running into the ground.
Ryanair is the ultimate efficiency-driving travel brand, notoriously cutting the frills to keep prices low. It’s no surprise that they’ve seen the benefit of moving their technology infrastructure to the cloud, but unlike Amazon, Google and Expedia, Ryanair have yet to realise the true value of the cloud, serverless technology and the positive impact they can have on teams.
Serverless abstracts developers from server complexity
With serverless technologies, the complexity of servers is removed from the development process so that teams can focus more of their effort on logic to add business value.
This type of technology has existed for some time, but its generalisation has expanded to encompass what was previously accomplished on traditional servers. Where the likes of IFTTT (If This Then That) and Zapier have focused on tying together various cloud-based services to automate workflows, serverless generalises the concept to not only coordinate cloud services, but to actually build the cloud services as well.
Higher infrastructure cost; lower people cost
With serverless, you are charged only for the exact amount of processing and storage you consume. If you need more, you can spin up a thousand servers in seconds, then automatically shut them down again. And because developers need not spend much time worrying about infrastructure and setup, new features can be added and deployed in hours. When cloud costs rise to millions of pounds a month, having independent services with precise billing becomes very important.
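For a sense of what this abstraction looks like in practice, here is a minimal function in the AWS Lambda style. The event fields are assumptions based on a typical API gateway trigger, not code from any company mentioned here:

```python
# Minimal serverless function sketch (AWS Lambda style): the handler is all
# the team writes; provisioning, scaling and patching are the platform's job.
import json

def handler(event, context):
    """Runs once per request; scales to zero between invocations, so you
    pay only for the compute actually used."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```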
IT Operations still have a role to play in a serverless environment; however, they act as advisers to the team rather than critical members. Reducing the need for specialists means the complexity of building and running software has been standardised and commoditised, allowing work to be completed by generalists.
With serverless, product teams can create infrastructure in seconds, but also build and deliver solutions in hours with up to 80% reduction in operation costs.
Of course, not everyone is seeing these benefits. The hourly cost of leasing servers from Amazon or Microsoft can be three times or more that of a financed server from Dell or HP, and serverless costs even more than a cloud server. Nor does the resiliency of cloud servers match the level of some enterprise servers. Designing software from scratch to run on expendable and easily replaceable servers can flip the financial equation in favour of serverless, while attempting a lift-and-shift of legacy apps often fails to realise gains.
The business case on the surface doesn’t stack up. So why have innovators embraced the cloud and serverless, while laggards have been slow to adopt? It comes down to the people and process changes that serverless enables. Those that use technology to enable even greater collaboration and a focus on customers have reaped enormous benefits from serverless, while laggards are still struggling to rationalise investing in the cloud because their organisation is not structured for constant evolution to exploit it.
Building digital products is becoming ever more of a collaborative exercise. Finance, marketing, and user experience professionals all have a place in delivering valuable solutions. The companies that win today understand the value of interactions and cooperation throughout their organisation, including IT.
Ryanair’s IT transformation and team fragmentation
Ryanair appears to run a very efficient operation, yet it was only in May 2018 that they decided to fully embrace the cloud.
Ryanair is on a mission to compete in the wider travel segment with the likes of Google, Amazon, and Expedia. Expedia moved over to the cloud in 2012 and has invested heavily in serverless since 2016. Ryanair has been on a 4-year journey to revamp their IT department, and only now sees the benefit of the cloud.
Ryanair's digital revamp began in 2014 after recognising that customers would no longer tolerate slow and poor quality digital products, and the business needed to move faster to grow their wider travel and leisure offerings. Yet they have neglected the benefits of bringing people across different teams and disciplines together. For Ryanair to succeed in the cloud and embrace serverless, they will first need to invest in bringing their people together. And that means re-doing much of their investment over the past 4 years.
Amazon has been seeing the gains from serverless since 2013 because of their focus on teams and collaboration. By limiting teams to 6 or 7 (“two-pizza” teams), Amazon attempts to bring together close bonds where information flows quickly and efficiently. The team members are also given the same single business metric to be evaluated against. Finally, the team is meant to own the product from inception to operation. Amazon's CTO Werner Vogels' motto is "You build it, you run it", as a way to bring the entire product team closer to its operations, and therefore the users it’s meant to benefit.
Businesses with a traditional IT setup struggle to run small teams. Specialists are needed to operate the servers, database administrators need to control the schema, backend developers are needed to connect services, while frontend developers provide the visible part of the software. Not much room is left in this team structure for finance, marketing or user experience without ballooning the team size. This stretches communication to its limits, frequently leading to walls between the different team functions. At Mobile World Congress 2018, Ryanair’s CTO John Hurley said: “the marketing department is the biggest challenge we have - they want perfection and we want to go quickly, learn, iterate and improve”.
Ryanair began in 2014 to create "Digital Innovation Labs", where IT workers would be segregated into new buildings with a more modern working environment to attract IT talent. This segmentation has moved their IT workers further away from collaborating effectively with other departments. They subsequently hired new talent into the segregated teams and invested in technology that hampered their move to the cloud.
Small, multi-discipline, collaborative teams
Now that Ryanair is buying fully into the cloud, what are the chances they will be able to gain advantages from serverless? Given that serverless can cost 1.5-2x as much as traditional cloud, it’s a hard case to sell.
First, Ryanair needs to bring their business functions together and align people in teams with shared goals and the right mix to bring value to customers.
Second, Ryanair needs a mix of people who excel at communication, team-building, and a drive toward satisfying customers, not people who are solely motivated by technical excellence or being at the cutting edge. With this staff and their close proximity, they can reduce their team sizes to communicate efficiently around shared business goals. Serverless will let them move quickly, bringing business value to market faster and cheaper.
The case for serverless continues to grow
Machine Learning adds weight to the opportunities for serverless technology. It can be computationally expensive but equally rewarding. There are obvious benefits to a business that can crunch millions of customer reviews to categorise hotels or judge room quality by user-submitted photos.
How good does the technology need to be to reach break-even, and when does the cost benefit swing back the other way? That goldilocks level of benefit can be delivered only by involving finance teams, user researchers, developers and fine-grained serverless billing. Teams not working in unison can easily spend millions for little gain and, worst of all, with segregated operations and delivery, they might not even know it.
Analytics and marketing departments, who are collecting increasing volumes of data, can also see large benefits in moving to a serverless infrastructure. With traditional IT, the payoff of such a strategy is murky, with long investment times before reaping benefits, if ever. With serverless, finance teams can understand just how much value there is in processing each extra user interaction, photo or comment. And because of the micro-scaling of serverless and its faster development cycles, teams can experiment to confirm assumptions before engaging in additional investment.
Today, developers can deploy highly resilient databases on their own and frontend developers can implement database schemas. We are approaching the era when a generalist developer can handle most technology challenges. The new frontier may be no developers at all.
Great State is a brand technology agency. They help brands in all sectors accelerate their business performance through technology. For more information or to talk to Great State about how serverless technology can help your business, visit greatstate.co.
Server Technology’s multiple award-winning High Density Outlet Technology (HDOT) has been improved with its Cx outlet. The HDOT Cx PDU welcomes change as data center equipment is replaced. The Cx outlet is a UL-tested hybrid of the C13 and C19 outlets, accommodating both C14 and C20 plugs. This innovative design reduces the complexity of the selection process while lowering end costs.
The days of simple brick-and-mortar stores are long gone. No more cash-only registers or isolated branches communicating with each other via one port. Instead, e-commerce has skyrocketed, and retailers are feeling the pressure to make both the in-store and online experience more rewarding.
By Martin Hodgson, Head of UK & Ireland, Paessler.
Now, with consumers more connected than ever, there is little margin for error when it comes to maintaining the customer experience. With leading retailers such as Amazon demonstrating the high costs associated with IT downtime (between $72 million and $99 million), it’s imperative that retailers understand the role of network monitoring and how it could make or break a brand. In this piece, Martin Hodgson, Head of UK & Ireland at Paessler, explores the importance of network monitoring, highlighting the benefits of preventing networking issues before they lead to consumer complaints and showcasing why retailers such as Pepe Jeans are adopting the technology.
Today’s hyper-connectivity means retailers are now operating in a world where consumers expect the ultimate customer experience. As a result, the age of retail has drastically changed from the isolated store experience to multi-store and online capability. But, with so much connectivity afoot, it’s not enough for retailers to simply link their physical stores and run a website – they must now ensure that the connection remains strong across all entities. From LAN links to WAN links, retailers need to ensure optimum connectivity at all times – or risk losing both customer loyalty and revenue as a result of IT downtime. Events that lead to customers threatening to cancel their subscriptions could be cataclysmic for smaller e-commerce sites so, in a world of rapidly increasing connectivity, how can retailers ensure they remain ‘on the ball’?
Enter network monitoring.
Network monitoring provides IT teams with clarity and insight into what is a hugely complicated entity. Instead of facing website crashes and glitchy servers, a good network monitoring tool helps IT teams take back control by continuously monitoring devices, alerting to potential faults and therefore preventing future downtime. In turn, this insight allows the team to think strategically rather than filling their time trying to resolve simple errors which could ultimately cost the business.
By providing baseline metrics, network monitoring empowers businesses to make informed decisions based on real facts – meaning they can save money by replacing only what truly needs replacing, plan effectively for growth, and purchase accordingly.
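As a rough sketch of what baseline-driven alerting looks like, the fragment below compares the latest response time from a device against its historical baseline; the thresholds and figures are illustrative only.

    import statistics

    def out_of_baseline(history, latest, tolerance=3.0):
        # Alert when the latest reading strays more than `tolerance`
        # standard deviations from the device's historical baseline.
        baseline = statistics.mean(history)
        spread = statistics.stdev(history)
        return abs(latest - baseline) > tolerance * spread

    response_times_ms = [42, 40, 45, 41, 44, 43]   # samples from a store router
    if out_of_baseline(response_times_ms, latest=220):
        print("ALERT: device responding far outside its baseline")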
So how can retailers use network monitoring technology to prevent downtime and optimise the customer experience?
The largest problem for retailers is that many were established before this age of hyper-connectivity and are therefore operating on legacy IT systems that can’t maintain today’s connected pace. Take Pepe Jeans, for instance. Since its establishment in 1973, the clothing retailer has progressively adopted devices to keep up with consumer demand and better the customer experience. But the brand’s legacy infrastructure simply couldn’t maintain the pace at which the business has been growing. With a rising online community and close to 400 stores in remote locations (each store usually has at least three devices, such as an access point, switch or router), the brand realised it needed to evaluate its complex network to ensure continued efficiency. As a result, the team sought software that monitors and maintains the bandwidth and health status of its network.
Senior IT Network Engineer at Pepe Jeans, Xavier Marchador, was searching for a useful solution that could monitor a wide range of devices, with an easy and intuitive interface, and on-hand support – to pre-empt any issues. Xavier says: “The main reasons for implementing monitoring software were to establish a network base line, track the health status of devices, monitor bandwidth usage in communication lines, and create weather maps with live information of remote delegations.”
Pepe Jeans now uses 6,000 sensors to monitor both LAN and WAN, as well as a remote probe to monitor all locations. The company has also replaced devices that were constantly triggering alarms, and requested a review of WAN lines where interface flapping occurred in communication devices. So now, with the help of network monitoring, Pepe Jeans can check the health status of each device, pinpointing the source of network issues before any potential downtime – preventing customer frustration and loss in revenue.
As IoT and e-commerce continue to reshape the retail environment, it will become increasingly important for brands to stay online – all the time. Maintaining a competitive edge will rely on how ready a brand is to receive and process orders, communicate across the network and respond to complaints and issues. Effective network monitoring helps ensure uptime and that customers can always find what they are looking for, leading to better customer support and a stronger brand identity in the long run.
Cryptography experts are already preparing for the advent of quantum computing, explains Aline Gouget, technical advisor and security researcher for Gemalto.
Quantum computing. Whatever your level of technical understanding, it undoubtedly sounds impressive. And in this case, it’s a technology that well and truly lives up to the name. Put simply, quantum computing is set to redefine the limits of data processing power. In doing so, it will offer vast potential to tackle an array of critical scientific challenges.
However, history teaches us that any such ground-breaking advance will also be employed by those with less than pure intentions. Which means we need to prepare for the fact that quantum computing will, sooner or later, offer the means to crack cryptographic codes that have until now been regarded as unbreakable. Significantly, these include the public key infrastructures around which so much of our secure communication is currently built. Fairly obviously, that represents a serious headache. But the good news is that leading industry players have recognized the issue early and are already taking steps to address it. In this article, we’ll review what’s at stake, assess how soon existing cryptographic techniques could be undermined, and consider the measures being taken to ensure that the arrival of quantum computing is something to be welcomed rather than feared.
Quantum computing rewrites the rule book
The strength of quantum computing lies in the radically new way it performs data calculations. Since the 1960s, computing has relied on silicon transistors to store and manipulate data that is encoded as a series of zeros and ones. Quantum computing, in contrast, exploits the ability of sub-atomic particles to exist in more than one state at a time. Consequently, it encodes data differently: in quantum bits or ‘qubits’. In simple terms, the qubit can be likened to a sphere. Using this analogy, a traditional bit can only be at one of the sphere’s two poles – i.e. a zero or a one. A qubit, however, can be in a superposition of states: at any position on the sphere and representing a combination of zeros and ones simultaneously. In practice that means much more data can be stored. Moreover, it can be manipulated far more quickly. Problems well beyond the reach of the traditional computer therefore move into the realms of the eminently solvable.
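In the standard notation, a qubit’s state is a superposition of the two basis states: \( |\psi\rangle = \alpha|0\rangle + \beta|1\rangle \), with \( |\alpha|^2 + |\beta|^2 = 1 \). A register of n qubits holds amplitudes for all \( 2^n \) basis states at once, which is where the exponential growth in capacity comes from.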
Breaking the unbreakable
Within the world of cryptography there is widespread agreement as to which algorithms will be easily challenged by quantum computing. Cryptographic algorithms are classified into different categories, according to characteristics such as the type of underlying mathematical functions they are based on, the type of usage they are designed for (e.g. protecting data exchange or the creation of a secret), or the type of secret management required (i.e. one secret key, or a public and private key pair). Of these, the algorithm families that may be weakened by the deployment of quantum computing have been identified as mainly including public key-based methodologies such as RSA and elliptic-curve cryptography for PKI applications, and key exchange applications such as Diffie-Hellman.
The future’s closer than you think
In terms of how soon all of this will happen, there’s rather less consensus. Some experts predict that, within ten years, quantum computing will start to become available to the most advanced researchers and major investors. Indeed, Michele Mosca, from the Institute for Quantum Computing, recently stated there is: “a one in seven chance that some fundamental public key crypto will be broken by quantum by 2026, and a one in two chance of the same by 2031.” Of course such forecasts are changing regularly, but it is worth noting that some serious resources are being committed to the development of quantum computing. So it’s definitely a case of when, not if, this revolutionary technology will make an impact.
Time to panic?
Fortunately, no one is suggesting we need to hit the panic button. The message from the experts is very much one of reassurance. To start with, even the most optimistic predictions for the speed with which quantum computing becomes a reality mean that products with a lifespan of less than ten years are safe. Furthermore, for products that will be around longer, strategies are already being rolled out to protect them over their entire lifecycle. At Gemalto, for example, we are working on the design of products embedding so-called crypto agility capability. This enables software to be loaded that could replace keys and algorithms, as and when they become deprecated. This powerful mechanism enables a fleet of resistant products to be maintained, even as algorithms are found to be vulnerable.
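One way to picture crypto agility is as a dispatch table mapping algorithm identifiers to implementations, so that a deprecated algorithm can be retired by a software load rather than a hardware swap. This is only a minimal sketch of the idea, with hypothetical names; it is not Gemalto’s implementation.

    import hashlib

    # Hypothetical algorithm registry: a software load replaces an entry
    # when its algorithm is deprecated, without replacing the product.
    ALGORITHMS = {
        "sha2-256": lambda data: hashlib.sha256(data).hexdigest(),
        "sha3-256": lambda data: hashlib.sha3_256(data).hexdigest(),
    }
    active_algorithm = "sha2-256"

    def digest(data):
        # All callers go through the registry, never a hard-coded algorithm.
        return ALGORITHMS[active_algorithm](data)

    print(digest(b"payload"))
    active_algorithm = "sha3-256"   # the crypto-agility moment: swap, don't rebuild
    print(digest(b"payload"))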
The other axis of defense resides in the choice of algorithm family. Broadly speaking, there are three main approaches to ensuring resistant products: relying on symmetric cryptography with longer keys, which is considered far more resilient to quantum attack; adopting the new quantum-resistant public key algorithms currently under evaluation; or deploying hybrid schemes that combine a well-established classical algorithm with a quantum-resistant one.
This last option has the particular virtue of taking a step towards the future, whilst retaining the existing effective crypto that the security industry has well and truly mastered.
A matter of teamwork
A wide range of players are now actively involved in the search for answers. Above all else, protecting the future of public key encryption means finding algorithms that can resist the power of quantum computing yet remain secure when used with a ‘classic’ computer. This is what the sector refers to as ‘quantum-safe’ or ‘post-quantum’ crypto. New public key cryptographic systems that meet the criteria are currently under development and evaluation. NIST (the US National Institute of Standards and Technology) has emerged as a focal point for these efforts, and received over 80 submissions in response to its recent call to research teams. After vetting these proposals, standardization work will be initiated. Solid deliverables are expected in time for NIST’s second post-quantum cryptography standardization conference in 2019.
Keep in touch
Back in the dark days of World War Two, a remarkable international group of Allied codebreakers based at Bletchley Park in England successfully unlocked the ‘unbreakable’ Enigma machine ciphers with which much of their enemy’s communications were secured. To help them do so, they created a landmark piece of electro-mechanical equipment, the ‘bombe’. Over 70 years later, another new generation of technology is poised to undermine supposedly infallible cryptographic techniques. However, the key message here is not just about the willingness of the wider industry to research and implement new forms of protection against this latest threat. Quantum computing – or at least the quantum physics on which it is based – will also open the door to completely new approaches to data security. Of course it’s still very early days, but for anyone with an interest in encrypted communication, these are exciting times; it is well worth staying abreast of developments. In other words, don’t just keep calm and carry on. Stay tuned as well.
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and the pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweet spots.
Part 3 of this article brings you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
Pavel Bains is the Founder and CEO of Bluzelle, where he is developing an open-source protocol for decentralised databases that emphasises speed, security and scalability:
“I think the field where the deployment of blockchain-based solutions may thrive most is the burgeoning data economy. It’s short-sighted to assume that its applications are limited solely to cryptocurrency (and, by extension, the financial sector) when hundreds of companies around the world, both startups and industry veterans, are actively researching and implementing distributed ledgers in their operations. From insurance to supply chain management, there’s a wealth of use cases to which blockchain can lend greater transparency, accountability and censorship resistance.
“I think developments in the space are coinciding with a renewed understanding of the value of data, due to a number of factors: individuals produce it at a seemingly exponential rate, whilst appreciating both the risks of entrusting third-party custodians with it (we’ve seen the results of this in numerous high-profile data breaches over the past year) and the importance of keeping it private and secure.
“This is where blockchain technology can truly shine. Security-wise, data can easily be encrypted, broken up (a process known as sharding) and distributed redundantly across nodes. The topology is no longer characterised by clusters of solitary participants interacting with a large central hub, but rather a peer-to-peer network of equally-powerful nodes, sharing shards of data.
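A toy illustration of the sharding idea Bains describes: the payload is split into pieces and each piece is assigned to several nodes, so the data survives any single node going offline. Shard size, replication factor and node names are invented for the example.

    def shard(data, size=4):
        # Break the payload into fixed-size shards.
        return [data[i:i + size] for i in range(0, len(data), size)]

    def place(shards, nodes, replicas=2):
        # Assign each shard to `replicas` nodes so it stays available
        # even if a peer goes offline.
        return {i: [nodes[(i + r) % len(nodes)] for r in range(replicas)]
                for i in range(len(shards))}

    pieces = shard(b"customer-record-0001")
    print(place(pieces, nodes=["n1", "n2", "n3", "n4"]))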
“Building on top of this idea, swarm computing is gaining traction and could be invaluable in ensuring that the flow of data is both seamless and scalable – by reducing latency (nodes interact with those with whom the transfer of information is fastest) and uploading/downloading information from peers in parallel. Redundancy means that, even if several peers go offline, that data is still readily available from others. As such, users can enjoy a network that is always online, highly secure, and impervious to the congestion faced by traditionally centralised ones.
“I anticipate we’ll see an increase in blockchain-as-a-service providers (joining the ranks of giants like IBM, Microsoft and Oracle) as the scope of the data economy is better understood. Now, more than ever, we need secure and scalable means by which to store information as we continue the transition into the digital age. Blockchain technology is not just hype. It’s a seminal tool in the effort to break away from centralised data silos rife with vulnerabilities, and to ensure individuals are put back in control of their own data.”
Blockchain – more than a buzzword?
Tomislav Matic, CEO and Co-Founder of Crypto Future, comments:
“Blockchain has become a bit of a buzzword around business over the last couple of years, but few actually know what blockchain is - and even fewer are aware of the great potential the technology holds for future use. Blockchain can (and likely will) become part of everyday life, bringing about the biggest technological revolution of our lifetimes, largely as a result of the growing demand for immediacy in today’s society.
“Online banking has already shown us that money can be moved from one account to another in seconds, but the implementation of blockchain offers the chance to bring instant transactions to a vast range of applications - not just limited to the financial world. A blockchain platform could soon facilitate the creation of booking apps for restaurants and hotels - allowing for seamless booking, but also creating a frictionless means for owners to charge a fee if customers don’t show up.
“For an application with more wide-reaching implications, consider the issues we see today with elections. Claims of falsified results and slow counting methods make for a long and drawn-out process, but a blockchain-based application could offer the chance to resolve these problems. It would offer a secure means of getting reliable, transparent results with no human error, and because it’s all online, results would be instant.
“Blockchain-based food tracing apps could help track an agricultural product from the field to the end consumer, allowing consumers to be fully informed about how their wine, vegetables and meat were produced and how they got from farms, fields or vineyards to the store in which they were bought. Blockchain can take sustainable and organic food production to the next level.
“These applications are simply a drop in the ocean of the countless possibilities that blockchain could one day offer. As more people begin to truly understand exactly what blockchain is, and realise that it is not limited to cryptocurrencies and financial applications, we will eventually see it play a critical role in the future of our world.”
Food and health
Chrissa McFarlane, CEO of Patientory Inc, offers the following thoughts:
“While there has been significant coverage of the wild fluctuations of cryptocurrencies’ value, the underlying blockchain technology that makes them possible is here to stay. It’s no exaggeration to say it’s a fairly safe bet that this technology will transform several industries in the next five to seven years. Juniper Research found that nearly six out of 10 corporations are considering applying blockchain or are already in the process of developing corporate blockchain services.
“However, while the possibilities of blockchain are widely reported, the number of practical use cases is limited. Yet they do exist, and more importantly, prove that blockchain technology will become mainstream and not be confined to the world of finance or the cryptocurrency space.
“One such use case can be found in the global food supply chain. Following a number of scandals in food production – most infamously the 2013 European horse meat scandal, in which foods advertised as containing beef were found to contain undeclared or improperly declared horse meat – trust in global conglomerates’ ability to track the source of their food products was at an all-time low. This led IBM to partner with Walmart, Unilever, Nestle and six other large companies in 2016 to release the Food Trust blockchain, which tracks food through supply chains around the world. Having an immutable record of the production cycle of every single item of produce on a blockchain ensures that consumers can trust that they are buying and eating the produce they believe they are.
“A second use case can be found in the global healthcare industry. The increase in cyberattacks, such as the WannaCry attack on NHS England in 2017, has broken trust in healthcare organisations’ ability to protect sensitive medical data. However, through harnessing blockchain-as-a-service, much like in the global food supply chain, medical supplies can be tracked almost immutably from provenance through to patient use. As a patient, knowing without doubt exactly which supplies were used throughout your healthcare over the years, even amid today’s global mobile workforce, gives one peace of mind. Further, it creates a holistic image of a patient’s medical history, removing the need for costly duplicate tests and paperwork - crucial for emergency situations where every second counts.
“Patientory Inc. has successfully completed a pilot of the first iteration of hospital nodes that encrypt and protect medical health records and also allow secure transport of data to the permissioned blockchain. This enables exchange of health information which can be used to connect the siloed EMR systems of different medical institutions. Harnessing blockchain-as-a-service in this way reengineers what is now a suboptimal flow of information through the healthcare system.”
Making ripples in the retail world
Nikki Baird, VP Retail Innovation at Aptos, comments:
“Blockchain is certainly starting to shake up the retail world. Generally, most retailers are riding along on innovations that are being driven by other technology players, however, a few larger retailers, such as Walmart in the US, are filing lots of patents around blockchain in supply chain, loyalty and healthcare.
“There are no common applications of blockchain in the retail world, currently. Retailers tend to be conservative adopters of technology, so they’re not jumping in head-first on early-stage pilots or beta tests.
“Most of the focus within retail is on supply chain traceability and product provenance. IBM and Walmart have taken the greatest lead there, in partnership with a lot of other CPG partners. This is about using blockchain to create a ‘universal record’ of product from farm to fork, basically, for supply chain security, country of origin tracking, and to speed the import process. The most valuable benefit will come from the potential to increase the speed of imports and reduce the amount of paperwork needed to move products around the globe.
“Outside of that, most of the value of Blockchain is in product provenance – proving the product is real and wasn’t diverted or counterfeited. There’s another benefit that is not really cost or revenue based, but speaks to customer demands for better information about the products they buy. Blockchain promises to make it easier and cheaper to deliver the kind of information consumers are looking for, like country of origin, or farm-to-fork data – how long has a product been in the supply chain, where did it come from and where did it pass through to get to my local shop.
“Blockchain in the supply chain is likely to become mainstream, and so it should. Out of all of the opportunities for retail and blockchain, this one will mature the fastest, just because it’s a well-defined need and blockchain isn’t just ‘a way’ of solving the problem, but actually a much better way than before.”
How Blockchain can reduce fraud in the supply chain
Pete Kinder, CTO at Wax Digital, explains:
“Many businesses across the globe are starting to see Blockchain’s potential in tackling fraud within the supply chain and purchasing.
“As it is an incorruptible digital ledger that allows financial transactions to be distributed digitally but not copied, invoice management is one area where Blockchain can help make the supply chain more secure and free of corruption. Using Blockchain, transactions have to be verified by all participants in a procurement transaction (e.g. suppliers and buyers). No individual can change an invoice for their own gain without all other users of the system validating it, ensuring that an invoice doesn’t change between the point a supplier submits it and the point a buyer processes it. And as it’s a decentralised approach, people external to the organisation have no obvious place to target with a cyber-attack, as the data is stored across the PCs of all network participants. Participants know a transaction is valid because it satisfies the network’s consensus algorithm, which takes the form of a shared set of rules.
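The tamper-evidence Kinder describes can be sketched as a hash chain: each ledger entry embeds the hash of its predecessor, so altering an earlier invoice breaks every later link. The sketch below simplifies away the consensus layer a real network adds on top, and all names are invented.

    import hashlib
    import json

    def entry(invoice, prev_hash):
        # Each ledger entry embeds the hash of its predecessor.
        payload = json.dumps({"invoice": invoice, "prev": prev_hash}, sort_keys=True)
        return {"invoice": invoice, "prev": prev_hash,
                "hash": hashlib.sha256(payload.encode()).hexdigest()}

    def verify(chain):
        # Any participant can re-run this check; a retrospectively edited
        # invoice no longer matches the hashes that follow it.
        prev_hash = "0"
        for rec in chain:
            payload = json.dumps({"invoice": rec["invoice"], "prev": rec["prev"]},
                                 sort_keys=True)
            if rec["prev"] != prev_hash or \
               rec["hash"] != hashlib.sha256(payload.encode()).hexdigest():
                return False
            prev_hash = rec["hash"]
        return True

    first = entry({"supplier": "Acme", "amount": 1200}, "0")
    ledger = [first, entry({"supplier": "Acme", "amount": 80}, first["hash"])]
    assert verify(ledger)
    ledger[0]["invoice"]["amount"] = 9999   # attempted fraud
    assert not verify(ledger)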
“Blockchain also has the potential to eliminate double spending. As cryptocurrencies such as Bitcoin are digital payment systems, there is the risk that network users can pay with the currency more than once if executed cunningly. But if the transaction is stored in the Blockchain, another transaction using the same cryptocurrency cannot be carried out. This stops an employee from making a legal payment, while using the same token to pocket that amount of money for themselves.”
How autonomous vehicles can help pave the way for smarter cities
By Dr. Ingo Stuermer, Global Engineering Director Autonomous Driving, APTIV.
Cities are fast becoming the epicentre for mobility innovation – and for good reason. Mobility is the lifeblood of any city. It enables the movement of people and goods, and thus of ideas, social interactions, and resources. Mobility is what makes a city a liveable and attractive place.
Cities today, however, face massive challenges in improving mobility. If you ask any city mayor, planner or transit official, he or she can reel off multiple challenges such as safety, congestion, environmental concerns, and equitable access for all. With urban populations projected to increase to 70 percent of the global population by 2050 (up from about 50 percent currently), these challenges will only continue to grow.
This increased population growth in cities means that by 2050, congestion and thus commute times could increase threefold, costs of transportation could increase fourfold, and emissions could rise to five times the current levels.
To ensure our cities continue to remain liveable, it is clear that a number of steps need to be taken to address these issues. These may involve empowering cities to come up with new solutions, offering financial support at a national level, or embracing new technologies that are helping to evolve the very concept of mobility as we know it.
It is the latter option that I would like to spend some time discussing in this piece – specifically the pivotal role that autonomous vehicles can play in paving the way for a smarter, more connected city. Because, believe it or not, autonomous vehicle technology has an incredible potential to address most of these challenges.
Did you know, for example, that studies show autonomous technologies can reduce urban travel time and bring down emissions by 30 percent and 66 percent respectively? Or that autonomous technologies have the potential to lower the number of required parking spots by 44 percent? While autonomous vehicles may still feel like a distant dream to many (unless you live in Las Vegas where they are already being used on the road today), they are a reality towards which we are rapidly moving. And one day in the not too distant future, they will play a very important role in helping to keep our cities moving.
Autonomous vehicles offer so much more than simply mobility benefits. Perhaps the most striking and valuable impact they can have on cities of the future concerns the safety of their residents. Autonomous technology has the potential to reduce traffic accidents by nearly 90 percent. That is a staggering figure that offers significant value to all cities across the globe. It is for this very reason that progressive cities are planning for connected and autonomous vehicles by deploying upgrades including vehicle-to-infrastructure technologies and smart traffic signals, both of which can be helpful in enhancing road safety.
With urbanisation set to move into overdrive in the coming years, new technologies and new solutions will need to be deployed by city planners and councils in order to keep streets moving and the public safe. These are just a few of the reasons why autonomous stakeholders need to focus on working with cities to understand challenges, use cases, and how autonomous technology can integrate and work with city requirements. The cities that handle these mobility challenges best will provide an enhanced quality of life, and thus attract more residents, capital, jobs, and opportunities. This is why it is critical that cities realign themselves with a mobility focus.
As more and more organisations embrace advanced analytics, we are starting to see several very successful implementations emerging, on both a small and large scale. At the same time, however, many organisations are not getting value from their expensive investments in analytics systems. And while failing may provide useful lessons, it is probably better to learn before failure. By comparing the two groups, it is possible to draw some conclusions about what makes analytics projects more likely to succeed or fail.
By Caroline Hermon, senior account executive, SAS.
There are four main areas that top teams and managers should pay attention to if they want to succeed with their advanced analytics projects.
1. Strategic engagement and understanding
It goes without saying that top managers need to be behind any analytics investment, but it is not enough for them simply to decide that they want the organisation to use advanced analytics or declare that it is now data-driven. Any kind of analytics project must be set within a clear vision for the company’s future, business direction and goals.
Managers must also be clear about the future of the analytics project, and how it will be scaled up and rolled out further. Without this clarity, it is likely to remain a pilot forever, albeit perhaps a useful one in its small context.
Linked to this, and part of strategic understanding, is that businesses also need to be clear about how the new analytics project will fit with existing structures and systems, both organisational and IT – such as legacy systems. This does not mean that it is essential to upgrade all legacy systems before any investment – quite the reverse, in fact. It simply means being clear about what needs to interact with what, and how this will be made to happen. Having one platform to manage the entire analytics life cycle can help with consistency, collaboration and governance.
2. A clear and relentless focus on value generation
Many companies fail to generate value from their analytics investment because they focus on the wrong problems or projects. It is essential that major investments, such as in analytics systems, are focused where they can add most value most quickly. In other words, potential use cases should be carefully assessed to make sure that they are focused on strategic issues and big problems. A focus on value-generation also means measuring outcomes and results to ensure that your investment is delivering.
This may sound theoretical, but you should also do the least possible work to generate value. For example, having clean, high-quality data is essential to generate useful insights with analytics. However, you probably do not need to clean all your data – just what you need at any given time. Clean what you need, when you need it – but make sure that the clean data is available for future work too.
3. Clarity about internal roles and tasks, and the skills needed to deliver
There are several aspects to this, but they all relate to internal issues and management. The first is to have clarity about need, and how analytics will fit within your organisation. It is all very well deciding to recruit data scientists, but are you sure you know what they are going to do? And are you really sure that data scientists are what you need?
Rather than focusing on job titles, it is better to be clear about the tasks that need doing, and the skills required to do those. This will include having people who can “translate” between business users and analysts – whether part of either group or not. Someone needs to be responsible for ensuring that business needs are heard and understood, and that analytics projects are delivering to business requirements.
4. A clear view of the wider context
You might consider the wider context to be part of the overall strategic view, but it is worth mentioning separately. There are an enormous number of things that can be done with advanced analytics, but some of them may not be legal within your regulatory context. The General Data Protection Regulation (GDPR), for example, requires executives to be able to explain the basis of decisions about individuals.
This means that “black box” algorithms are likely to be inadvisable for companies that might be affected. “Computer says no” will not be enough, so you need to be confident that your analytics output is clearer than that. GDPR also means that it is likely to be better to use anonymised data, so that you do not have to change everything each time someone asks to be forgotten. Forgetting the regulatory context could be an expensive mistake.
Smart cities are becoming much more common as more of the world’s population moves into cities. It is estimated that by 2040, 65% of the world’s population will be living in cities and, according to the International Data Corporation (IDC), smart city technology spending is expected to reach $135 billion by 2021. To keep up with this influx of people, APIs are being used to drive smart cities, which will improve the environmental, financial and social aspects of urban life and provide a sustainable future for their inhabitants.
By Manish Jethwa, chief technology officer, Yotta.
In many of the world’s capitals, it is possible to see smart city initiatives in action in everything from timetables at bus stops to street lighting. Street lights, in particular, tend to play a key role as the catalyst for this process, as they are already connected to a power source and their shape and height enable them to perform the role of an antenna for the sensor network.
Other sensors can then piggy-back on the network, connecting with each lamp to send data using low-power communications. This is effectively what enables a smart city network to be developed without the authorities needing to install a new array of powered sensors across the city. The benefits extend far beyond street lights themselves. Enhanced connectivity can also be key in improving drainage systems; ensuring bridges are safe and looking after green spaces for example.
All these processes rely on sensor technology sending out data for analysis. That makes it critical to implement a technology architecture capable of handling mass data flows – and helping collate, order and visualise data such as pollution statistics, details of road surface conditions or drain levels.
But how exactly does this architecture work? On the sensor side, the city authorities generally use specific APIs as the providers will typically be building data models that are specific to the sensor data. On the application side, the API should be more flexible to effectively bring lots of data sources together into a single system. If the application interface is too rigid, new API endpoints will need to be developed to get each new type of data into the system, adding cost to the process of integrating new sensors into a central data hub.
There will, however, always need to be some transformation of the data from device-specific APIs into delivery application ones. Moreover, this transformation needs to be carried out at scale and on demand and that is where a microservices approach, in which large applications are developed from modular components or services, comes in.
One key role of microservices is as a data filter. Microservices can help filter inconsequential data collected by sensors and then transmit significant data to the right places, which allows data analysis to happen at a more general level.
For example, the authorities may have implemented multiple sensors to measure temperature variations across the city. Microservices can provide a valuable service here by reducing multiple measurements into key notifications when predefined thresholds are exceeded.
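A sketch of that filtering role: a microservice that drops routine readings and forwards only threshold breaches to the central application. The threshold and payload shape are illustrative only.

    TEMPERATURE_THRESHOLD_C = 35.0

    def filter_readings(readings):
        # Drop routine measurements; forward only threshold breaches
        # to the central smart city application.
        return [{"sensor": r["sensor"], "value": r["value"], "alert": "threshold exceeded"}
                for r in readings if r["value"] > TEMPERATURE_THRESHOLD_C]

    batch = [{"sensor": "lamp-012", "value": 21.5},
             {"sensor": "lamp-044", "value": 38.2}]   # only this one is forwarded
    print(filter_readings(batch))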
One of the big advantages of microservices is that they can be scaled up quickly, when multiple sensors all send data at the same time, for example. Yet they are relatively inexpensive, because they typically run for only a few seconds at a time and users pay only for the time they are in use.
Once the microservices have completed their work, the data is passed through smart city application APIs for processing. At this point, the application needs to have a strong visual interface that helps the authorities understand and make sense of the data that has been collected.
That’s where the visual element of the interface is so important. Dashboards need to be in place. Strong iconography and colours can be used to differentiate data items or link similar items for example. To drive user engagement, usability elements can be built into the interface, further encouraging users to interact and engage with it, establishing patterns, analysing the results of data enquiries and driving new insights.
Insights alone are, however, of little value in building the smart city unless they result in concrete actions. That means rules need to be put in place that trigger immediate actions, such as a service engineer call if a street light fails, or a maintenance visit being arranged if a drainage gulley is overflowing.
This workflow element is vital to the success of any connected asset management approach within the smart city – and it must never be neglected if the analysis carried out by the system is to result in tangible operational efficiency, environmental and safety benefits across the city.
The potential for smart cities is clear. Internet of Things technology is helping to drive enhanced connectivity between assets. The authorities are using a technological architecture, consisting of microservices, APIs and visual interfaces to make use of this connectivity to collate, order and visualise key data. By analysing patterns and trends in this data they can thereby achieve insight into a range of issues affecting the smart city and take steps to make them safer, more productive and better places to live.
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and the pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweet spots.
Part 4 of this article brings you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
Shan Zhan, global business manager at ABB’s food and beverage business, says:
“There has been a lot of talk in the media recently about Blockchain, particularly around cryptocurrencies, but the technology is starting to have more wide-reaching impacts on other sectors. With multinationals such as IBM and Walmart driving a pilot project using blockchain technology for traceability, the food and beverage industry needs to look at the need for the protection of traceability data.
“Food security is a key priority, especially in developing countries. While most countries must abide by strict traceability regulations, which are particularly strong in the EU, other regions may not have the same standards or the data may be at risk of fraud. While malicious contamination intended to damage public health is a significant concern, a bigger problem is the mislabeling of food for financial gain. For example, lower-cost types of rice such as long-grain are sometimes mixed with a small amount of higher-priced basmati rice and sold as the latter.
“In this case, blockchain technology would help prevent food fraud, since the volume of each ingredient leaving the supply chain cannot exceed the volume that entered it. Any mismatch would flag the product as fraudulent.
“Not only can it help to monitor food ingredients, it can also monitor the conditions at the production facility. These are often very difficult to verify and, even if records are taken, they can be falsified. A photo or digital file can be taken to record the situation, such as a fish being caught, to show that it complies with the MSC’s regulations on sustainably caught seafood.
“The blockchain will then create a secure digital fingerprint for this image, known as a hash, which is recorded in the blockchain. The time and location of the photograph are embedded in this hash, so they cannot be manipulated. The next supplier in the chain will then have a key to this hash and will be able to see that their product has met the regulations.
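A simplified sketch of how such a fingerprint might be built: hashing the image bytes together with the capture time and location, so that changing any of them produces a completely different hash. The field names are invented for the example.

    import hashlib
    import json

    def fingerprint(image_bytes, captured_at, location):
        # Bind the photo to its time and place: altering the image or its
        # metadata produces a completely different hash.
        meta = json.dumps({"time": captured_at, "loc": location}, sort_keys=True)
        return hashlib.sha256(image_bytes + meta.encode()).hexdigest()

    record = fingerprint(b"<raw image bytes>", "2018-10-02T06:14:00Z", [62.47, 6.15])
    # `record` is written to the blockchain; the next supplier recomputes it
    # from the original photo and metadata to confirm nothing has changed.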
“While we are not yet advanced enough with this technology to implement it across all food and beverage supply chains, increased digitalization, and being at the forefront of investment in these technologies, will help plant managers prepare their supply chains against the food fraud threat.”
A victim of its own success?
Jitendra Thethi, Assistant Vice President of Technology and Innovation at Aricent, thinks that:
“Blockchain has become a victim of its own success. Look beyond the hype and blockchain will go mainstream. It has already gone beyond the cryptocurrency space, as it provides answers to many of the security flaws that businesses - and even society - experience today. In a nutshell, blockchain saves each transaction inside a cryptographic block, which is linked to its predecessor to form a chain. It is possible to trace and verify every individual transaction. It is tamperproof.
“Here are some examples of real-world business use cases. Toyota uses blockchains to track thousands of car parts that come from different countries and factories. IBM is collaborating with a consortium of major food suppliers and retailers who will use blockchains to tackle food safety. Ericsson is using blockchain technology in the rail industry to audit railway installation and maintenance as well as field-service work orders. Airbus and Rolls-Royce are looking to deploy blockchain for aircraft equipment provenance.
“Blockchain can help log shadow profiles for IoT devices and even mitigate impacts of ransomware. Blockchains naturally help attain desired compliance – they allow for compatibility and interoperability in a democratized manner.
“On a societal level, blockchains, with their immutability and strong personalised control and consent over social, political and personal information, are the future for digital identities. There are reports that the UK’s Home Office is considering blockchains; this could help to avoid situations such as Windrush. Blockchain’s tamperproof capabilities are also being considered by authorities to secure voting machines.
“Blockchains do come with challenges that businesses must overcome for future deployments to be successful. Blockchain systems will need to be compliant with privacy regulations such as GDPR. Of course some would argue that privacy leads to more trust.
“Businesses should also be aware of how much data to store on a chain. Data volumes can grow quickly, and this might require new means of archiving. That archiving would, of course, need to be robust, accessible and cost efficient. Finally, one of blockchain’s strengths is anonymity. But when using blockchains to improve identity management, anonymity must be avoided to build trust – especially in customer service.”
Blockchain reaching fever pitch?
Travis Biehn, technical strategist at Synopsys, comments:
“Enterprise blockchain adoption has reached a fever pitch internationally in 2018. However, the security community has been late to the game in terms of securing these platforms against attack. While the open source community has been enamoured with the success of Ethereum, the enterprise community has been quietly building the next generation of distributed trust-less applications on permissioned blockchain technologies. Public blockchains allow everyone to participate in the network (with varying degrees of access), while private blockchains regulate access via membership control. While enterprise blockchains have roots in established public blockchains like Bitcoin and Ethereum, they have significant and security-relevant differences.
“Unlike Bitcoin and Ethereum, the private blockchain’s primary use case isn’t to store and transfer value, but to enforce arbitrarily complex business logic. Enterprise practitioners often use the term distributed ledger technology (DLT), referring to the general capabilities that blockchain platforms provide. Promises made by blockchain platforms include immutability, auditability, tuneable trust, and programmability. Immutability is the guarantee that, once data has been written to the blockchain, it will be tamper-proof indefinitely. Auditability is the direct result of the immutability guarantee combined with the historical data kept by all members of a blockchain. Tuneable trust promises that applications built on distributed ledger technology don’t need to trust all members of their network. Programmability means the rules of the network are codified into smart contracts. These blockchain programs describe precisely, in code, how data changes, ultimately comprising an enormous state machine.”
A picture paints a thousand words
Eugene Morozov, Co-Founder of CryptoNumiz, details blockchain’s potential in the art world:
"Blockchain is taking many industries by storm with its ability to fully encrypt and track the movement of digital data while providing a secure and transparent way to view information. We will continue to see increasingly obscure industries adopting the blockchain in the near future.
“The art industry, for example, suffers from the ability of someone to duplicate another’s original work, damaging the real value of countless originals. As digital art grows in popularity, it also becomes easier to simply copy what we see online, claim it and sell it for a profit. The problem has become so serious that many artists are now put off sharing their work online through social channels.
“Now, with the integration of blockchain technology, a digitally produced and presented piece of art can be secured from the moment of its creation, through the auction phase, and beyond that through the ownership and resale phases – ultimately for the piece’s entire existence, well beyond our own lifespans. This helps to ensure that unique, original art remains encrypted to everyone except those for whom it is intended.
“There are blockchain solutions available that will immediately authenticate an original piece of art and establish who it belongs to. For instance, if a piece of digital art is illegally reproduced, the blockchain would be able to immediately identify the original owner, revealing that the reproduction is a forgery.
“The blockchain revolution within the art industry is yet to hit its peak. The problem lies with acceptance and knowledge of what the technology can actually achieve. Infrastructure to support the technology needs to be established, helping to highlight what the possibilities are. Blockchain is ready and waiting; it’s up to the market to realise this potential before we will see the technology become mainstream."
Insurance reassurance
Aaron Wagener, COO of the MXC Foundation, explains:
“Like all truly innovative technologies, the blockchain was initially met with some resistance. However, over the past few years it has proven its worth, both on an economic level and as a basis for a new generation of similar and/or related technologies.
“We’re already seeing how blockchain technology is helping to build trust between consumers and insurers. Blockchain is based on the use of a decentralised public ledger stored on millions of machines around the globe. For a transaction to be made, a portion of connected machines must agree on the validity of this transaction. This reduces the opportunity for human error, and makes it nearly impossible for the system to be tampered with. Public ledgers are transparent, self-policed and incorruptible.
“Some blockchains allow devices to subscribe to data streams using smart contracts. This allows data to be sold between machines, with the agreed payments processed automatically once the service is completed. Of course, these services are paid for using cryptocurrencies.
“There are also cryptos that allow people to share their data with certain insurance and healthcare companies. In turn, they can see who is using their information, where it was sourced from – and are getting paid for the usage of this data. Ultimately, this means each person can collect, use, and resell data, and earn crypto, which they can invest any way they see fit.
“The benefit for consumers is that they would be able to buy insurance with the peace of mind that they will be compensated, without unjust qualifying clauses. For insurance companies, the incentive is to reduce fraudulent claims and to process claims for damages more speedily.
“There is also evidence of the technology’s permanence in the efforts being made to adapt blockchain technology to accounting. The prospect of tracking assets with a long and unalterable blockchain has accountants excited about the possibility of massively reducing the manpower and processing required in day-to-day transactions, and financial regulators equally enthusiastic and worried about the possibility that money may no longer be fungible. Like a blood diamond, you may be able to find out just where future dollars (or their equivalents) actually came from.
“On an individual level, blockchain offers each and every one of us the possibility of seeing who is using our information, where it was sourced from – and of getting paid for the usage of our data. Ultimately, the result will be a decentralised economy where individuals will collect, use, and resell their own data, opening this closed market to anybody who wants to take part.”
Data verification
Adrian Clarke is the Founder and CEO of Evident Proof; he and his team are currently working on data verification and proof services built atop a blockchain network. On the impact blockchain technology could have on the business world, Adrian comments:
“Blockchain technology has opened up a range of possibilities in the business world. We’ve grown accustomed to centrally-owned databases with standard CRUD permissions (create, read, update and delete). This works well for a number of applications, though it’s lacking in many others – databases owned by a single entity run the risk of being altered retrospectively or having falsified data appended. In fields like insurance, supply chain management or the tracking of pharmaceuticals, this is a risk that’s been accepted (for lack of a better offering), but with blockchain technology, it’s one that can be largely mitigated.
“Indeed, the idea of a ledger that isn’t controlled by any single party, but rather by participants of the network, has been warmly received in recent years. Of particular interest are the properties of immutability and transparency that can be leveraged within a blockchain system, in such a way that tamper-proof and permanent records can be created and later used to trace the provenance of a given set of data.
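The tamper-evidence Clarke mentions comes from chaining each record to the hash of the one before it. The following is an illustrative Python sketch under that assumption – the record fields and "genesis" seed are hypothetical – and shows how altering an early record breaks verification of the whole chain:

# Minimal sketch of a tamper-evident, hash-chained record list (illustrative only).
import hashlib, json

def seal(record: dict, prev_hash: str) -> dict:
    """Wrap a record with the previous hash and its own digest."""
    body = json.dumps(record, sort_keys=True) + prev_hash
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain: list) -> bool:
    """Recompute every digest; any retrospective edit is detected."""
    prev = "genesis"
    for block in chain:
        body = json.dumps(block["record"], sort_keys=True) + block["prev"]
        if block["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain, prev = [], "genesis"
for event in [{"event": "created"}, {"event": "sold", "to": "gallery"}]:
    chain.append(seal(event, prev))
    prev = chain[-1]["hash"]

print(verify(chain))                     # True
chain[0]["record"]["event"] = "forged"   # retrospective alteration...
print(verify(chain))                     # False: tampering is detected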
“To draw on a specific example of where this sort of mechanism can shine, consider the art industry. Artwork has been a hugely traded asset for centuries, but it’s a difficult one to manage – for the most part, the markets for artwork are largely unregulated, with value being drained from buyers and sellers due to factors such as dishonest transactions, theft, duplication of works, or the sale of counterfeit pieces.
“What’s sorely needed is a reliable method of observing the course a specific piece of art takes over its lifespan – a means of tracking its provenance without needing to meticulously identify the myriad third parties and previous holders involved. With a distributed ledger, it would be trivial to see where a given asset has come from (and this is by no means limited to the art domain; it extends naturally to supply chain management).
“We’ve only scratched the surface in determining what the technology is capable of. Thousands of developers around the world are working hard to improve the protocols we have in these nascent days. I believe the technology is here to stay – it’s already being rolled out, and will continue to take over inefficient operations in the business world.”
The advantages of DevOps are increasingly recognized in terms of faster deployments, increased productivity and lower downtime. No wonder that 82% of the 700 organizations surveyed in the 2018 Redgate State of Database DevOps survey said that they have either already adopted DevOps or plan to do so in the next two years. They cited benefits including increased speed of delivery and freeing up staff to handle added-value work as key reasons for embracing it.
By Matt Hilbert, Redgate Software.
DevOps involves a fundamental shift in how many businesses work, however, and to achieve lasting results, companies need to focus on four areas.
1. Make the development process end-to-end
Modern companies are driven by data, and access to the latest information is vital in order to create and maintain an agile, responsive and successful business. It’s therefore essential to include the database as an integrated part of your DevOps process, in order to make changes quickly and seamlessly to the database as well as applications.
Over a third of respondents to Redgate’s DevOps research make database changes either daily or more than once a week, demonstrating how important it is to DevOps workflows. Failing to incorporate the database risks it becoming a bottleneck that slows releases and undermines the effectiveness DevOps can otherwise deliver.
Organizations should encourage close collaboration between developers and database administrators and use the same DevOps methodologies to drive consistency and maximize productivity. Including the database in processes like version control and continuous integration, for example, will speed up development and make deployments far easier, particularly if the tools used for database development plug into and integrate with the infrastructure already in place for applications.
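As a purely illustrative sketch of what “the database in version control” can look like in practice, the snippet below applies numbered migration scripts in order, recording which ones a given database has already seen. The table and script contents are hypothetical, and real teams would typically use a dedicated migration tool rather than hand-rolled code:

# Sketch of the versioned-migration idea: each schema change is a numbered
# script kept in version control, and a deploy step applies only those the
# target database has not yet seen. All names here are hypothetical.
import sqlite3

MIGRATIONS = [  # in a real project these live as files under version control
    (1, "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customer ADD COLUMN email TEXT"),
]

def migrate(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (v INTEGER)")
    row = conn.execute("SELECT MAX(v) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:               # apply only the newer scripts
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()

migrate(sqlite3.connect(":memory:"))  # a CI job can run this against a fresh copy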
2. Understand your company culture
Like any big change, DevOps only works if it fits with your business culture. It requires openness, a willingness to collaborate and the ability to break down department barriers and silo-based ways of working.
The first step is to gain a real understanding of the culture that exists within your business, and a good place to start is the three cultures model developed by American sociologist, Ron Westrum. He found that in every business, one of three cultures exists: pathological, bureaucratic or generative.
Pathological cultures are typified by individuals who focus on personal power and the hoarding of information to gain advantage. The management style is typically domineering, and innovation and creativity are discouraged because they threaten the status quo.
Bureaucratic cultures emphasize following rules and defending departmental turf. Existing procedures and practices are not questioned and change is seen as something that must be subjected to intense scrutiny.
Generative cultures come about when everyone is focused on the broader mission of the business. This creates a climate which encourages the solving of problems rather than seeking the cause of them, and supports innovation and co-operation within and across departments.
Clearly, if you work for an organization with a pathological culture, DevOps probably isn’t an option. The co-operation and collaboration needed to bring people together is missing, and any attempt to introduce it will be frowned upon rather than welcomed.
For bureaucratic organizations, you’ll need to take a softer approach to DevOps, with longer lead times and extensive planning meetings before a tentative first step is taken. Bear in mind that while this will take time, once DevOps becomes the new normal, it will then be seen as part of the status quo and something that everyone follows.
The easiest culture for introducing DevOps is the generative one, because it already has the openness and focus on collaboration required for it to thrive.
3. Show what’s in it for each group
We’ve talked about the overall benefits of DevOps, but how these manifest themselves for different groups varies.
The major concern for CEOs, for example, is business outcomes, so you need to demonstrate that DevOps will drive benefits such as a faster time to market, higher quality products, lower costs and higher revenues.
CIOs, on the other hand, are more interested in operational advantages like improved support, faster fixes, increased team agility and engaged, happy employees.
Going down a further level, managers and team leaders are operating in a more detail-oriented environment. They want to see how DevOps can help increase release speed and frequency, lower release costs and the number of defects, and improve application performance.
To ensure widespread backing for DevOps, make sure you sell the different benefits to each group, using the language and metrics that are important for their roles. This will increase understanding, adoption and the overall success of your efforts.
4. Collaborate across teams
Moving to DevOps is a journey, and requires everyone to be involved. Hence the difficulty of introducing it to closed, individualistic, pathological cultures, where people simply don’t see any benefit to collaboration, particularly across departmental boundaries.
At a technology level, DevOps collaboration requires traditional barriers between developers and database administrators to be broken down if information is to flow successfully between them. The good news is that this is happening in an increasing number of organizations – 58% of companies surveyed by Redgate rated the collaboration between these teams as ‘good’ or ‘great’ – a figure that rose to 68% for those which had already adopted DevOps.
The survey also found that over three quarters (76%) of teams have developers who are responsible for both database and application development, showing an increasingly collaborative approach is already being achieved.
Summary
The benefits of adopting DevOps are increasingly clear, whatever the type or size of organization that you work for. However, implementing DevOps is not simply a case of flicking a switch and seeing immediate, positive results.
To drive lasting success you need to understand your company culture, explain the benefits to every group in the organization, adopt an end-to-end approach that integrates the database within DevOps and finally ensure close collaboration that breaks down silos.
Concentrating on these four areas will maximize your chances of successfully adopting DevOps and creating a flexible, agile technology backbone which underpins wider organizational success.
The recent hacking of servers belonging to Professional Golfers’ Association (PGA) of America, targeting files relating to the PGA Championship and Ryder Cup golf tournaments, is an example of the threat posed to organisations’ cyber defences by increasingly sophisticated types of malicious software.
By David Higgins, Director of Customer Development EMEA, CyberArk.
The continued growth of digital technologies, automation and the Internet of Things is creating countless opportunities for businesses; for instance, capturing and using real-time data to gain a competitive edge and boost those all-important margins. Simultaneously, however, this marriage of old and new technologies has introduced unseen forms of cyber risk and provides criminals with additional routes of attack which, if ignored, could put a stop to business altogether.
Recognising the threat
The rapid growth in digitisation and automation has been accompanied by the emergence of a type of cybercrime predicated on the use of ‘ransomware’ to extort funds – often in the form of bitcoin. As seen in the case of PGA, ransomware locks systems and denies access to data until the ransom sum is paid. Following the typical line, the PGA hackers warned that any attempt to crack the hacked file encryptions would lead to the permanent loss of the data they contained.
With increased digitisation, previously unconnected areas of an organisation’s operations can now become part of a broader interconnected IT network. This became evident in the PGA hack: the breached files contained marketing materials, including logos, relating to the two golfing championships. Integration and connectivity undoubtedly bring multiple operational advantages, but teams looking after the security of internal IT networks now find themselves with much larger attack surface areas to protect.
Defending against cyber-attacks is – or at least should now be – a high-level priority for businesses and organisations. An aversion to cybersecurity investment will leave firms increasingly vulnerable to new and emerging types of infiltration. Ransomware attacks, though far from new, are becoming more and more relevant, and in some cases more complicated to defend against.
The repercussions of ransomware
When ransomware is downloaded it rapidly encrypts files and data on the victim’s infrastructure, disabling access and even bringing operations to a halt. This can quickly damage customer relationships and incur huge costs through the loss of intellectual property or essential business data.
Ransomware is usually delivered via a simple phishing email, containing a misleading attachment for the victim to open. Once opened, the attachment encrypts the data in the user’s system and delivers a message with details on the conditions of the ransom and the size of the payment required to access the decryption key.
The damage done by ransomware has historically depended on the particular individual in a target company, and the extent to which they are connected to the wider network. More recently we have seen variants of ransomware that have extended their scope beyond the hard drive of a single PC. Instead, they seek out ‘privileged’ accounts – those which provide advanced administrative access – to move more widely within the network and search for business-critical files to encrypt. In this way, by infiltrating just one account, the ransomware can compromise a much larger part of the network to find and deadlock vital files and data at an even greater cost to businesses.
Bolstering defences
Most anti-malware and anti-ransomware solutions today focus on detecting and blocking malicious software at the point of infection. These solutions are useful when you know what you’re looking for, but ransomware continues to evolve, with new variants emerging every day. Businesses and organisations should therefore adopt a multi-layered approach which employs application controls and removes local privileges (the ability to access more sensitive parts of the network) from regular PCs. This will reduce the surface area for attacks and block their progression.
Steps must also be taken to protect the most sensitive files in the organisation. Employing grey-listing – an approach which denies read, write and modify privileges to unknown applications or applications that aren’t trusted or certified – allows ransomware to execute harmlessly, blocking it from accessing and encrypting business-critical files.
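A minimal sketch of the grey-listing decision might look like the following Python fragment, in which the application lists, file paths and default-deny rule are all hypothetical rather than any vendor’s actual policy engine:

# Illustrative grey-listing decision: trusted apps get full access, known-bad
# apps are blocked, and unknown apps may run but cannot touch protected files.
# All lists and names are hypothetical.
TRUSTED = {"winword.exe", "excel.exe"}
BLOCKED = {"known_malware.exe"}

def file_access_allowed(app: str, path: str, protected_paths: set) -> bool:
    if app in BLOCKED:
        return False
    if app in TRUSTED:
        return True
    # Grey-listed: an unknown app may execute, but is denied read/write/modify
    # on business-critical files, so ransomware "executes harmlessly".
    return path not in protected_paths

protected = {"/finance/ledger.xlsx"}
print(file_access_allowed("unknown.exe", "/finance/ledger.xlsx", protected))  # False
print(file_access_allowed("unknown.exe", "/tmp/scratch.txt", protected))      # True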
Backing up an organisation’s data is a simple but essential defensive method in the fight against ransomware. With multiple generations of backup – taken from automatically backed up data at various intervals – the system can be wiped and restored in an instant, negating the threat of ransom demands.
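One simple way to picture multi-generation retention is the sketch below, which keeps a rolling window of daily snapshots plus a smaller set of weekly ones so that a clean pre-infection copy always exists; the retention counts are arbitrary assumptions for illustration:

# Sketch of multi-generation backup retention. The policy numbers are hypothetical.
from datetime import date, timedelta

def generations_to_keep(snapshots, daily=7, weekly=4):
    """snapshots: list of dates, newest first. Returns the set to retain."""
    keep = set(snapshots[:daily])                          # recent dailies
    weeklies = [d for d in snapshots if d.weekday() == 6]  # Sunday snapshots
    keep.update(weeklies[:weekly])
    return keep

today = date(2018, 10, 1)
snaps = [today - timedelta(days=i) for i in range(30)]  # a month of daily backups
print(sorted(generations_to_keep(snaps)))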
As businesses and organisations embrace digitisation and automation to access the benefits of operational integration, cybersecurity must be a primary consideration. By dedicating equal time and investment to protecting their highest value assets through improved cybersecurity, organisations can limit the impact of fast-growing threats such as ransomware and ensure their business remains securely operational at all times. With high-profile incidents such as the PGA hack this month continuing to occur, it’s essential that businesses look closely at their processes to ensure they won’t succumb to a similar fate.
Research detailed in Schneider Electric White Paper #136, “High Efficiency Indirect Air Economizer-based Cooling for Data Centers”, demonstrates that indirect air cooling systems deliver the greatest long-term energy savings when operating in economiser mode.
By Wendy Torell, Sr. Research Analyst, Data Center Science Center, IT Division, Schneider Electric.
Energy efficiency, sustainability and impacts on the environment are key considerations for modern data centres. As a result, the cooling function is under constant scrutiny. Careful selection of its components and the overall system design can yield significant savings in a facility’s overall power consumption.
Maximising the amount of time the cooling system spends in “economiser mode”, where compressors and chillers can be turned off temporarily and outdoor air used to cool the data centre, is a key objective.
Ultimately, it is the external climatic conditions that determine whether the surrounding air temperature is low enough to permit the cooling system to operate in economiser mode. Increasingly, developments in both product design and systems management allow economiser modes to be engaged across a broader range of external temperature and humidity conditions, and therefore allow the number of hours of “free cooling” to be increased.
In many cases it is now possible for economiser mode to be the primary mode for a cooling system, with high-energy compressors and chillers engaged only rarely, whereas traditionally, the opposite has been the case.
Direct vs Indirect Air Economizers
There are two ways to use outdoor air to cool a data centre: direct air, sometimes referred to as “fresh” air economisation, in which unfiltered air is brought directly into the IT space; or indirect economisation, in which air is passed through one or more heat exchangers before entering the IT space.
The former is potentially far more cost-effective, as one forgoes the need for additional heat-exchange equipment, but it is only possible if the surrounding air is dry and free of pollutants or other contaminants. Inevitably, fresh air economisation means the data centre must run the risk of enduring rapid changes in temperature, humidity and air quality.
By contrast, indirect air economisation not only ensures that air quality in the data centre can be maintained at a high level, but also offers greater control options over the temperature of the air admitted. The downside is the increased capital investment in heat-exchange equipment that is necessary. Nevertheless, if temperature and humidity thresholds are kept within certain limits, indirect air economisers can actually produce greater efficiency than direct fresh air cooling in many geographical locations.
Optimal cooling design
Compared to traditional cooling approaches, a cooling system that adheres to the following five design principles can reduce energy consumption by 50% and give the data centre the flexibility to scale from partial to full load.
1. Economiser mode is the primary mode of operation
2. Inside air is protected from outdoor pollutants and excessive humidity fluctuations
3. Onsite construction and programming times are minimised
4. Cooling capacity is scalable in a live data centre
5. Maintenance does not interrupt IT operations
A system design that follows all these principles is a self-contained cooling system, placed directly outside the data centre, with three modes of operation: air-to-air economisation; supplementary evaporative cooling; and, in worst-case scenarios, Direct Expansion (DX) or chilled-water cooling, which comes into play if either of the two economiser modes proves insufficient for the cooling requirement.
Hot air is pulled into the module and, based on the load, IT inlet temperature set point and outdoor environmental conditions, the system automatically selects the most efficient mode of operation.
Indirect air-to-air economisation modes use an air-to-air heat exchanger to transfer the energy from the hotter data-centre air to the colder outdoor air. If this system is unable to reject the data-centre heat, evaporative cooling is used to spray water over the heat exchanger to reduce its surface temperature, allowing the data centre to continue to benefit from economiser mode. The proportional DX mode provides additional cooling capacity when neither of the two economiser modes can maintain the inlet set point.
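The mode selection described above amounts to a threshold comparison against the IT inlet set point. The sketch below is a simplified illustration only: the assumed heat-exchanger approach temperature and evaporative gain are hypothetical figures, loosely consistent with the 21.1C/12C example given later in this article:

# Illustrative sketch of the three-mode selection logic described above.
# Threshold values are hypothetical and would depend on load and set point.
def select_mode(outdoor_c: float, inlet_setpoint_c: float) -> str:
    approach = 9.0    # assumed air-to-air heat exchanger approach temperature, C
    evap_gain = 6.0   # assumed extra cooling from wetting the exchanger, C
    if outdoor_c + approach <= inlet_setpoint_c:
        return "air-to-air economiser"
    if outdoor_c + approach - evap_gain <= inlet_setpoint_c:
        return "evaporative-assisted economiser"
    return "DX / chilled-water trim"

for t in (5.0, 14.0, 25.0):
    print(t, "->", select_mode(t, inlet_setpoint_c=21.1))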
Such a self-contained indirect air system can be placed outside the data-centre building, either mounted on concrete pads or on the roof, if it has sufficient weight-bearing capability.
A traditional chilled-water cooling architecture typically has three heat exchanges that take place in economiser mode: outdoor air passes through the cooling tower, a plate and frame heat exchanger and then an air handler. As a result, to obtain an IT inlet temperature of 21.1C the traditional design requires an outdoor air temperature of 6.5C or lower before economiser mode can be engaged.
In contrast, the indirect air system has only one air-to-air heat exchange. The single-stage air-to-air exchanger can achieve the same IT inlet temperature with an outdoor temperature of 12C. In other words, the economiser mode can be engaged across a wider range of outdoor temperatures and consequently for a greater proportion of the time.
The case for evaporative cooling
Evaporative cooling increases the use of economisation mode in many environmental regions, especially hot dry climates. Water is sprayed evenly across the outside of the heat exchanger. As ambient air is blown across it, the water evaporates causing a reduction in the outdoor air temperature, increasing the amount of heat energy that can be removed from the heat exchanger.
The wider range of operating temperatures that can be tolerated inside the data centre itself also enables improved efficiencies. Many IT vendors now build greater temperature-range tolerances into their equipment, meaning warmer temperatures can be allowed inside the IT space and consequently the cooling effort can be reduced. This leads to longer periods spent in economiser mode.
Control and prefabrication
Some vendors now ensure their products are more efficient and controllable so that cooling effort can be matched more accurately to IT load. Variable frequency drives on components such as fans, pumps and compressors allow cooling effort to be reduced at lower loads, thereby saving energy.
Controls are also becoming more standardised and pre-integrated into systems before they leave the factory.
Traditionally, such controls were engineered and programmed onsite, leading to control systems that were unique, difficult to replicate and not optimised for energy consumption. In many cases they weren’t fully tested and lacked complete supporting documentation or manuals. They were also very inflexible, making it difficult to accommodate changes to the makeup of the data centre. Standardised, pre-engineered controls allow economiser-based cooling systems to operate efficiently and predictably in all modes of operation as climate and load settings change.
Often these prefabricated designs are modular in construction and offer great advantages in terms of speed of installation, scalability and continuity of operation. Pre-tested modules can be integrated quickly and scaled up cost effectively as load demands increase. Furthermore, they can be placed outside the data centre or on the roof, which minimises the downtime needed to add new cooling capacity or to carry out essential maintenance.
Conclusion
Direct comparisons of energy savings from self-contained indirect-air economiser cooling systems compared with traditional chilled-water economiser systems will depend on several factors, including load, geography and age of facility. The Cooling Economizer Mode PUE Calculator, a TradeOff Tool developed by the Schneider Electric Data Center Science Center, allows you to see this comparison as you vary key parameters to match your data centre scenario.
An indirect air cooler with evaporative-cooling support can typically expect to operate in economiser mode for more than 50% of the year in most geographies, whereas traditional systems often spend less than 25% of the year in economiser mode.
The ability to place the cooling system outside the facility means that space for revenue-generating IT equipment is not wasted. Traditionally, chilled-water systems consume approximately 30sq m for every 100kW of IT load, or about 5% of computer room space.
For data centre managers keen to maximise efficiency, a self-contained indirect-air cooling system can provide the greatest opportunity to maximise the amount of time spent in economiser mode, ensuring electrical energy consumption is kept to a minimum.
More information can be found within Schneider Electric White Paper #136, “High Efficiency Indirect Air Economizer-based Cooling for Data Centers”, which can be downloaded by visiting http://www.apc.com/wp?wp=136.
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and the pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweet spots.
Part 5 of this article brings you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
Alastair Johnson, Founder and CEO of Nuggets, a blockchain-based platform for secure payments and identity verification, offers the following comments:
“Blockchain technology has no shortage of applications. Fundamentally, the offering is an incredibly valuable one. An immutable database owned by no single entity is a vastly more secure alternative to the centralised data silos we’ve witnessed crop up over the past few decades – if the trend in data breaches has taught us anything, it’s that not even the biggest businesses, which store troves of sensitive customer information, are impervious to attack.
“It’s not something that can be overlooked. On one hand, legislation like GDPR is starting to severely penalise companies found to be mishandling their customer data. On the other, directives like PSD2 are opening up the backends of banks, allowing fintech companies to build on top of the security and infrastructure that banks offer.
“I think businesses operating in this changing regulatory environment would greatly benefit from exploring and implementing blockchain protocols in their operations. As far as GDPR compliance is concerned, the combination of zero-knowledge storage and blockchain means that individuals (and not companies) are solely in control of their data – information like card details or identity documents never has to be shared with merchants or service providers. This gives individuals greater peace of mind, whilst also ensuring that businesses integrating such a solution need not expend time and money ensuring that their practices for storing customer data are compliant – notably, because they don’t store it at all.
“I firmly believe that self-sovereign identities are the future of customer-company interactions. Trusting businesses to safeguard information is clearly an unworkable model, and it’s one that needs to be left behind – irreparable damage can be caused in the event that malicious parties gain access to individuals’ sensitive data (whether to the individual, or to the business that failed to adequately protect from the breach in question).
“Blockchain, in the business world and beyond, is a means by which to empower users. We have the technology available to us to break away from archaic databases, which have served only to put consumers and companies at risk by becoming lucrative central points of failure. It’s time to harness decentralisation to put an end to these risks, and to ensure that individuals are put back in control of their own data.”
Security successes
Dr Kevin Curran, senior member of the IEEE and professor of cybersecurity, comments:
“The blockchain could have an important role to play in security, particularly in areas such as securing the Internet of Things (IoT) in the days ahead. Scaling the IoT will prove difficult using traditional centralised models. There are also inherent security risks in the IoT, such as the difficulty of disabling devices should they become compromised and part of botnets, which has already become a serious problem. Using blockchain, with its solid cryptographic foundation, offers a decentralised solution that can guard against data tampering, thus offering greater assurances for the legitimacy of the data. Blockchain technology could potentially allow billions of connected IoT devices to communicate in a secure yet decentralised ecosystem which also allows consumer data to remain private.
“With regard to practical applications in the public sector, one area which has received a lot of attention is online voting via a blockchain. Here the blockchain serves as a public ledger of transactions which cannot be reversed. The all-important consensus on transactions (e.g. legitimate votes) is achieved through ‘miners’ all agreeing to validate new records being added to a database. Whenever a vote is to be made, a new transaction record is created by person A, who adds the details of his vote and uploads it to the network of nodes for them to add the new transaction to the blockchain. Should it be deemed a valid transaction by the majority of nodes, the new vote is added to the end of the blockchain and remains there forever. A majority consensus is all that’s needed to approve the winner. Everyone agrees on the final tally as they can count the votes themselves, and as a result of the blockchain audit trail, anyone can verify that no votes were tampered with and no illegitimate votes were inserted.
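The auditability Curran highlights rests on the tally being recomputable by anyone from the public record. As a toy illustration – not a real e-voting protocol, and with a hypothetical record structure – consider:

# Toy illustration of the audit property: because every accepted vote is a
# visible, append-only record, any observer can recompute the tally themselves.
from collections import Counter

ledger = [  # append-only list of validated vote transactions
    {"voter": "anon-1", "vote": "Candidate A"},
    {"voter": "anon-2", "vote": "Candidate B"},
    {"voter": "anon-3", "vote": "Candidate A"},
]

def independent_tally(chain):
    # every voter id may appear at most once in a valid chain
    assert len({v["voter"] for v in chain}) == len(chain), "duplicate voter!"
    return Counter(v["vote"] for v in chain)

print(independent_tally(ledger))  # every observer reaches the same count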
“Other areas include using blockchain technology to transform key aspects of society by using smart contracts to make micropayments, which in the music industry, for example, could enable data sharing that releases more value along the chain. It can even be used to create online lotteries. A blockchain can assist recruiters in identifying, processing and verifying candidates quickly; it has high levels of trust built in. It can also potentially eliminate qualification fraud: an IT company seeking to hire cybersecurity and AI personnel can have confidence that the ‘badges/certs’ candidates have earned, verifiable through a blockchain, have not been tampered with. Continuing professional development (CPD) in IT is important due to the rapidly changing nature of technology, but it can be very difficult to track. Blockchain potentially offers a decentralised, trusted method to take inputs from trusted providers, and therefore users may be incentivised to do more CPD, with their CPD learning credits securely stored in a reputable system.”
Min Teo, Executive Director at ConsenSys, leading innovators in blockchain and decentralisation, believes that:
“Blockchain has far greater applicative potential for mainstream adoption than its association with cryptocurrency and the financial sector might suggest.
“The reason that financial services is so closely associated with blockchain is because Bitcoin, the very first application of blockchain technology, was designed for the transfer of value over a peer-to-peer network, where actors needn’t trust a third-party intermediary, such as a bank. The network relied on new ways to achieve consensus across many different actors. Ethereum is similar to Bitcoin in that it does not rely on intermediaries to validate transactions, but its potential extends far beyond transferring value. The Ethereum Virtual Machine (EVM) enables developers to write code, create digitally-scarce tokens, and develop sophisticated smart contracts on top of the ledger, allowing users to exchange value securely. In the years to come, there is potential for any industry that relies on third parties to act as trusted intermediaries to be disrupted by blockchain.
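By way of illustration, the core bookkeeping a token contract of the kind Teo describes can be sketched in a few lines of Python: a balances mapping plus a transfer rule enforced by code. This is a hypothetical analogue, not Solidity or actual EVM behaviour:

# A toy version of what a token smart contract does on-chain: a balances
# mapping and a transfer rule enforced by code rather than by a bank.
class TokenContract:
    def __init__(self, issuer: str, supply: int):
        self.balances = {issuer: supply}   # digitally scarce: fixed supply

    def transfer(self, sender: str, recipient: str, amount: int) -> bool:
        if self.balances.get(sender, 0) < amount:
            return False                   # the contract rejects invalid moves
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        return True

token = TokenContract("alice", supply=100)
print(token.transfer("alice", "bob", 30))  # True
print(token.transfer("bob", "carol", 99))  # False: insufficient balance
print(token.balances)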
“Smart contracts allow for the tokenisation of any asset that has value, meaning that there are manifold examples of industries – such as journalism, real estate, supply chain, and entertainment – that are already experimenting and poised to benefit from the technology. For example, Civil is a community of journalists and technologists using blockchain technology to create a new, sustainable model for journalism where readers and journalists can directly exchange value with one another, fund a newsroom, or challenge newsmakers with voting protocols. The inherent security of blockchain also means that the enterprise data security and supply chain sectors are set to be turned on their head. The current state of supply chain management means that you’re trusting that each person doesn’t tamper with their own database, even if there’s an economic incentive for them to do so. Blockchain technology would change this because it fundamentally relies upon a shared, immutable ledger. Viant is an Ethereum blockchain-based supply chain solution that recently partnered with WWF to deploy a solution for provenance tracking of sustainably sourced yellowfin tuna in Fiji.
“As well as impacting specific industries, blockchain also has the potential to be as intrinsic to our everyday lives as the internet, by acting as the foundation of Web 3.0. Built on blockchain platforms like Ethereum, Web 3.0 will enable people to have control of their digital information and permission how it is used. There will be no one company or government controlling the information and the tools that we use to access the internet. In the same way Web 2.0 is considered the internet of information, Web 3.0 will be the internet of value; blockchain is the enabling technology that will bring the internet back to its originally intended open source, transparent form.”
Impact on data storage
David Warburton, Senior Threat Research Evangelist (EMEA), F5 Networks, says:
“Blockchain is best known as the technology behind the Bitcoin cryptocurrency, and uptake to date remains most advanced in the financial services sector. However, a wealth of other use cases are emerging, from traceability through the food supply chain to micropayments for media content. Countries in EMEA are already putting their weight firmly behind it, with the likes of the UAE announcing intentions to base all government transactions on blockchain by 2020.
“Blockchain’s central promise of ultra-secure authenticity, without an expensive middleman, is an incredibly compelling proposition. It could become the biggest disruptor yet for the digital economy. However, like any technology, blockchain is not immune to outside threats. It has already endured frequent and successful attacks resulting in the loss of hundreds of millions of pounds. There’s also no doubt that Bitcoin is still largely regarded by the public as the ransom currency of choice for cybercriminals.
“It’s my belief that the development of blockchain technology points towards a future ecosystem where data storage and analysis is increasingly localised and de-centralised. Such advancements will be driven by growing infrastructure provided by the IoT and edge computing, as well as stringent societal calls for greater security and transparency.
“All businesses need to consider how blockchain technology will impact their business and should start to think about where it best fits to meet customer demands. For now, blockchain remains a breakthrough technology, albeit one that has a long journey to establish itself as a mainstream business platform.”
Blockchain will become mainstream
Simon Tucker, Partner and Head of Energy and Commodities, Infosys Consulting, believes that:
“It's not a question of if blockchain will become mainstream, but rather when organisations will begin deploying the technology at scale. Consider rocket technology, GPS or even computing. All these were developed for military or intelligence applications, and each has been harnessed to bring previously unimagined capabilities to businesses and their customers. It’s exactly the same with blockchain: a technology developed for tracking and authenticating financial transactions can be applied to almost every industry that seeks to authenticate and protect data.
“Businesses don’t care about blockchain’s origins in the shady world of cryptocurrency. Rather, they see the potential to transform their operations by using the technology to reduce the risk of fraud, track items in their supply chain more effectively, and share information securely with partners, to name a few applications.
“There is no doubting blockchain’s potential across industries. It will enable manufacturers to transform their supply chains, bringing much greater accountability and real-time visibility that will help them improve ethical sourcing, fight counterfeiting – and optimise the entire logistics, manufacturing and retail sectors. In the oil and gas sector, blockchain provides a way to manage the relationships between huge and complex data sets, which is crucial for regulatory compliance, authenticating financial transactions, making commodity trading transactions more secure, and improving collaboration and data sharing with partners throughout a complicated value stream.
“Blockchain has potential applications for any industry or activity that relies upon trust. Banking is obviously a key sector that can benefit from blockchain, but we’ll soon see exciting implementations in areas as diverse as entertainment (for example, by automating royalty payments and preventing copyright theft); healthcare, where decentralising information will improve the way we share and protect data; and even in the public sector, through helping government to create a framework of trusted and transparent data records.
“We shouldn’t be surprised that blockchain hasn’t yet made it into the mainstream. You can’t just slot blockchain in: the technology is so transformational that it requires organisations to plan their blockchain strategy very carefully to ensure that these initiatives are successful. Rather than seeing slow adoption as a sign of industries’ ambivalence, we should applaud the fact that businesses aren’t rushing headlong into building new blockchain applications. The best businesses are thinking carefully about delivering new capabilities that will benefit us all.”
Corporate networks have quickly become more and more complex. IT security teams regularly process change requests in the hundreds, which are then applied to company-owned network devices. As a result, underlying network configuration processes increase in size and complexity, impacting the resources needed to manage the required changes.
By Andrew Lintell, Regional Vice President at Tufin.
These changes affect all environments, from multi-vendor firewalls and routers, to SDN and hybrid cloud platforms. The sheer size of the modern network therefore makes it increasingly difficult for companies to manage the complexity that comes with it. Cybercriminals are ideally positioned to take advantage of this confusion, which has left businesses scrambling to safeguard their networks from both targeted and automated attacks that penetrate the network by capitalising on overly permissive access policies.
A popular approach to meeting these initial network security challenges is network segmentation, where applications and infrastructure are divided into segments, so that threats can be contained and prevented from spreading to other areas. In the event that the attack exploits an existing service, monitoring can be prioritised, and vulnerable access rules assessed to direct incident response and mitigation.
Whilst network segmentation is not a new approach, it is by no means outdated. However, defining effective network segmentation, implementing it and maintaining it over the long term is a major challenge for many companies, especially in the face of stringent new privacy regulations and frequent changes to the infrastructure footprint through the adoption of the cloud. So, how can companies guarantee the effective implementation of network segmentation practices while considering all the complexities of a corporate network? And how can they achieve their ideal state of granular, least-privilege access?
Begin with the basics
The first step is to evaluate the actual situation: What do businesses need from their network and how should they choose to divide it? To put it simply, individual departments are often keen to contain their applications within their own subsection or unit, which is entirely logical and a necessary step towards ensuring that sensitive data doesn’t find its way into the wrong hands.
Beyond this, segmentation is a crucial consideration for businesses looking to demonstrate best practice in aligning with the General Data Protection Regulation (GDPR). Under the new regulation, organisations need to track access to data pertaining to residents of the EU. After dividing the corporate network into individual segments or security zones, or tagging applications, IT managers will need to ensure the provisioning of minimal required access to those zones or applications. Above all, highly sensitive areas should be proactively monitored to identify whether unnecessary access can be removed.
Not a one-time job
The often-quoted phrase “Security is a journey, not a destination” certainly applies here. Network segmentation is not a one-time project, but an ongoing process that requires continuous maintenance. Network systems constantly need updating, whether driven by new business requirements, new devices or new software, and segmentation policies must be reviewed alongside each of these changes.
One step further: microsegmentation
Depending on the maturity and complexity of the company, as well as its business requirements, microsegmentation serves as a pragmatic solution to managing network access through a more dynamic and application-specific approach. Using microsegmentation, the individual segments are broken down even further – even down to the application and user levels. In these cases, access to data is only granted to a pre-defined security group of users that is carefully managed by the security team. The group can be easily modified to reflect changes in personnel, and access is provided between the specific security group and the specific application. Rather than treating networks as broader segments of users, microsegmentation allows you to employ security from the start in a manageable way.
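A microsegmentation policy of this kind reduces, conceptually, to a default-deny lookup keyed on (security group, application). The sketch below is illustrative only, with hypothetical group and application names rather than any particular vendor’s policy model:

# Sketch of a microsegmentation rule set: access is granted only between a
# named security group and a specific application, never network-wide.
# Group and application names are hypothetical.
POLICY = {
    ("finance-users", "payroll-app"): "allow",
    ("hr-users", "hr-portal"): "allow",
}
GROUPS = {"finance-users": {"alice", "bob"}, "hr-users": {"carol"}}

def access(user: str, app: str) -> bool:
    """Default deny: allow only if the user's group has an explicit rule."""
    return any(user in GROUPS[g] and POLICY.get((g, app)) == "allow"
               for g in GROUPS)

print(access("alice", "payroll-app"))  # True: in the right security group
print(access("carol", "payroll-app"))  # False: default deny across segments

Updating group membership when personnel change is then a one-line edit to GROUPS, which mirrors how the article describes access being managed by the security team.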
Microsegmentation can be achieved with physical networks, as well as private and public cloud networks using software-defined network technologies for managing advanced cloud infrastructures. This requires comprehensive segmentation solutions that address the hybrid cloud and heterogeneous networks, thereby enabling IT security teams to effectively maintain and visually manage a microsegmentation policy for their organisation.
In a constantly changing business environment, it is imperative to ensure that this volatility does not increase the attack surface, exposing the company to a network breach. The right automation tools can help mitigate significant security risks by fostering a security-first mentality when it comes to meeting change requirements, and by reducing the complexity and time required to manage network changes continuously.
Automation can also help ensure the effective segmentation of networks, although a balance must be struck: businesses must be wary of overcomplicating the management of the different groups and getting too granular with the control.
Maintaining the desired network segmentation can therefore be a difficult task given the complex nature of security policies, and the fact that constant change requests are now the norm in most companies. However, if the network is divided into smaller zones, an attack on one segmented area cannot spread to another, creating a much more secure infrastructure overall and significantly bolstering network security. Ultimately, businesses must avoid over-segmenting the network and maintain a central console to effectively manage a micro-segmented network across multi-vendor, physical and cloud platforms.
Server Technology’s multiple award-winning High Density Outlet Technology (HDOT) has been improved with the Cx outlet. The HDOT Cx PDU welcomes change as data center equipment is replaced. The Cx outlet is a UL-tested hybrid of the C13 and C19 outlets, accommodating both C14 and C20 plugs. This innovative design reduces the complexity of the selection process while lowering end costs.
Generation Z – those born after 1995 – are starting to enter the workforce at pace. Growing up with ubiquitous connectivity and evolving mobile technology has shaped Gen Z’s priorities for the workplace.
By Ian Stone, CEO, Vuealta.
91% of this group say that technological sophistication impacts their interest in working at a company. This is hardly surprising, as they are now often described as the “ultimate technology natives”. Businesses need to not only deploy the technology that this new generation craves, but shout that capability from the rooftops, if they are to recruit the talent they need to succeed.
SMEs in particular – accounting for 99.9% of businesses and 60% of the workforce in the UK private sector – will become increasingly dependent on this demographic to drive growth. Gen Z are the latest group to enter the workforce, but also hold many skills and qualities – creativity and innovation the most highly prized – that will be important to these agile organisations. But competition for this new talent pool is fierce, and this highly skilled younger generation places huge demands on prospective employers.
Agile tech and working practices
Much of the discussion when millennials joined the workforce positioned them as “tech-savvy”. With Gen Z, this goes much further. We now have the first generation born into a world of social media, online gaming, and a smartphone in every pocket. Snapchat; Instagram; WhatsApp: even more so than the millennial generation (whose key influences included Myspace and Facebook), Gen Z expect, receive and digest information instantly.
This demand doesn’t change when they walk into their place of work. Gen Z employees want the latest technology at their fingertips and to be just as connected – in the technology sense – at work as they are in their day-to-day lives. This manifests itself in several guises. Gen Z have always had access to any information or contact, from any location in the world – provided there is 4G or wireless. Increasingly, the same can be said for work – younger generations want the flexibility to be able to work from anywhere in a connected and agile way. Businesses need to make sure that they have the technology in place to facilitate this, as well as exploring more cultural initiatives, like the design of “third spaces” that encourage interactions outside of any rigid departmental boundaries or formalised meeting rooms.
Harnessing the potential of Gen Z
Their highly networked and tech-driven upbringing has fostered a more entrepreneurial generation in Gen Z. In fact, 72% want to start a business of their own in the future. SMEs can harness that motivated and strategic outlook within their organisation if they give this generation the chance. Where possible, promote the freedom to be autonomous while still keeping the appropriate checks and balances in place. Flattening organisational charts and concepts of hierarchy – as well as providing constant opportunities to learn and develop – will all be important to attracting Gen Z. Many organisations are looking into concepts like “scrums” – agile breakout groups and teams – rather than rigid hierarchies.
This equally feeds into the work itself. When it comes to Gen Z, it’s not just about how they work, but what the work actually is. Growing up with technology has placed a premium on their key skills, like creativity, innovative thinking and the ability to understand and process information quickly. Organisations that can use technology effectively, automating laborious tasks like data entry, will better attract and unleash the potential of this new generation in the workforce.
Learning and development
Having grown up during the 2008 recession, Gen Z are also naturally more pragmatic than their millennial predecessors, particularly appreciating the value and efficiencies that technology brings to the workplace. The influence of these more risk-averse times, and familiarity with the rise of new technologies, has also made Gen Z much more conscious of the need to learn new skills to stay relevant and compete.
Gen Z have grown up with the world’s largest ever on-demand how-to video library – YouTube. With that bank of learning just a few clicks away in their personal lives, this new section of the labour force wants equally innovative solutions to satisfy that thirst for knowledge and development. Organisations are responding. The NHS, for example, has begun to train its doctors and nurses with the help of virtual reality. Instead of learning their trade solely in real-life operations and emergencies, VR technology enables them to acquire and practise their skills safely. While not applicable to every business, it does highlight the need for organisations to better embrace new technologies and change workforce practices when looking to attract and engage younger generations.
SME leaders must ensure that they have the technology and organisational flexibility that this new Gen Z workforce craves – and it’s not just about having an iPad on every desk. They want to be able to work from anywhere with agility and access to instant information, while being given the freedom to think creatively, learn and have a real impact on the organisation. Having the right business technology in place sits at the heart of delivering on this and organisations need to take heed of those demands if they are to attract and retain Gen Z talent. In fact, that technology capability is so important, it should sit on top of every job spec.
With recent reports outlining how blockchain is set to make a significant impact in both the environmental and the pharmaceutical and life sciences industry sectors, Digitalisation World sought a range of opinions as to how blockchain will develop outside of its cryptocurrency and gaming sweet spots.
Part 6 of this article brings you the opinions of various industry experts as to how they see this technology developing in mainstream business sectors.
Fady Abdel-Nour, Global Head, M&A and Investments, PayU, comments:
“Hype around blockchain technology seems to be reaching a crescendo. Given the potential for blockchain is huge, this is hardly surprising. However, it’s certainly fraught with complexities, with even those at the core of the industry unclear as to how it will play out.
“In particular, there is much uncertainty surrounding the regulation of blockchain, which is hindering the pace of development and deterring institutional investors from entering the market.
“The issue around blockchain regulation is laced with complexity. For example, when considering bitcoin, it must be taken into account that the technology doesn’t only impact the way that organisations operate, but also has far-reaching implications for government. This means that finding appropriate regulation to encompass all aspects is far from a quick process.
“However, when we look beyond these issues it’s clear to see that there are some areas where blockchain could really revolutionise industries.
“A promising example of this is in the banking sector. Blockchain is on the way to transforming a number of areas, from payments transactions to how money is raised in the private market. It is projected that the world banking sector will save up to $20 billion by 2020 through the strategic implementation of blockchain.
“It's interesting to see the impact that blockchain is having on emerging markets. These markets are developing rapidly and in need of fast solutions. In this instance, blockchain technology is already providing a new way of building infrastructure and, for example, in markets like South Africa people are using blockchain as a store of value.
“Blockchain has the potential to revolutionise traditional operating models in the business world and beyond. In order to achieve this, the bubble of hype surrounding blockchain must be popped, and replaced with strategic plans (and regulations) that facilitate and support the technology. Only then will the full potential of blockchain be achieved.”
All things to all industries?
Agnelo Marques, VP Technology and Head of Blockchain at Mphasis, explains:
“Most cryptocurrencies – Bitcoin being the first – use blockchain as the underlying technology. Beyond cryptocurrencies, and in the enterprise space, the underlying technology looked interesting and seemed like it could solve system challenges for business applications. Ever since, enterprise technologists have been evaluating it, and in the last couple of years have created several high-profile PoC and pilot implementations. For some reason, however, the technology has been perceived to be more suitable for use in the financial services industry – probably because the general idea of the technology was derived from Bitcoin.
“A few examples could prove otherwise. Supply chain management use cases have seen the highest number of trials this year; the primary goal of using blockchain here is to improve efficiency, enhance security, increase trust through transparency and drive down costs. In the healthcare space, blockchain could be used to securely hold patients’ health records, with control resting completely with the individual – and healthcare payments, too, show potential for blockchain technology. Digital media and related industries can benefit from digital rights management and avoid unauthorised distribution. Likewise, businesses in practically every sector can benefit if specific use cases are identified.
“Not only businesses, but governments could benefit from this technology. Imagine an auditable, immutable, traceable blockchain solution for land records, or a self-sovereign identity implemented on blockchain. Several other services that governments provide to their citizens could be made more efficient and secure, and these are currently being evaluated through a number of PoCs.
“This is a technology with far-reaching impact on societies, businesses and geographies. It’s difficult to put an estimate on it, but the impact will be felt in our day-to-day lives as we transact and work across governments and businesses. Imagine the ability to be in charge of your own identity data or your personal health records, or having land records that are fraud-proof. Several countries and their central banks are working on proposals for releasing their own digital currencies, loosely based on the cryptocurrency model; this will impact how we transact with each other. As businesses and governments continue to test, apply and leverage this technology, the demand for re-engineering business processes will have an impact on the current way of doing business. Blockchain-based solutions work best when there exists a large network; the larger the network, the better. This, however, introduces newer issues, challenges and opportunities arising from the network effect. Concerns will arise from information being publicly visible within the blockchain network – though non-members can be isolated, information privacy may take a new form. Finally, the legal and regulatory frameworks that exist today will need to change to accommodate the new models of transacting and communicating.”
Making its mark on marketing
Saleem Khan, Global Data Leader at Dun & Bradstreet, talks about the impact blockchain can have specifically on the advertising and marketing industry, taking in ad fraud, privacy and identity:
“The impact of blockchain will be felt far beyond cryptocurrency. Blockchain will almost certainly impact major business functions like supply chain management as well as marketing and advertising, although the positive impact of this technology will be determined by how well these industries can harness the concepts of decentralisation and distributed ledgers in order to meet business needs.
“For instance, one of the key problems marketers and advertisers face today is ad fraud. With roughly 50-60% of clicks coming from bots, it’s become increasingly hard to determine direct campaign ROI – and a reliable solution for human verification has yet to be found. But this is exactly where marketers can use blockchain and a robust onboarding process to ensure only legitimate individuals are clicking on ads and that ads are only being displayed on legitimate sites. By incentivising individuals, using campaign dollars, to share their interests and intent data, a marketer can target a prospect with much more precision. Using blockchain allows individuals to opt in and share their interest and intent data, and also to revoke access as required.
“Privacy is also a growing concern for many B2B companies trying to reach individuals within a business – and with the GDPR and other national privacy laws coming into effect, it has become difficult for marketers to legally collect information to use in their targeting. Marketers can stay ahead of the ever-changing regulatory curve by using blockchain to geofence data and confirm that they marketed to known marketable entities. Here, blockchain acts like a Digital Rights Management (DRM) system. A smart contract residing on the blockchain will include certain rights and privileges, like where the data is allowed to be used, where it can’t be used, where you can store data, and so on.
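A DRM-style rights record of the sort Khan describes might conceptually reduce to the following check, performed before each use of the data. All field names and regions here are hypothetical illustrations, not a real smart contract:

# Illustrative sketch of a rights record stating where data may be used and
# stored, consulted before each use. Field names are hypothetical.
RIGHTS_LEDGER = {
    "record-123": {"use_regions": {"EU"}, "storage_regions": {"EU"},
                   "revoked": False},
}

def may_use(record_id: str, region: str) -> bool:
    """Allow use only if the record exists, is not revoked, and the region matches."""
    rights = RIGHTS_LEDGER.get(record_id)
    return bool(rights) and not rights["revoked"] and region in rights["use_regions"]

print(may_use("record-123", "EU"))  # True: usage matches the contract terms
print(may_use("record-123", "US"))  # False: outside the geofence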
“Finally, blockchain will provide a more holistic account view for B2B accounts. For B2B marketers, tying an individual to the company they work for can be challenging. However, through blockchain, marketers can begin identifying individuals using a public key. That key can then be used along with a unique business identifier to track the potential prospect and their buying behaviour. This allows B2B companies to track a person in the context of their business so that the actions for that account can be easily traced.”
Innovative and exciting
Jerome Nadel, GM of Payments and CMO, Rambus, observes:
“Whilst blockchain was originally designed as a technology to support cryptocurrency, it is evolving beyond its financial roots to benefit organisations across industries. From smart contracts to IoT security, blockchain is transforming the way businesses do business, and the best is yet to come.
“The decentralised nature of blockchain, and the elimination of third-party processors, is opening new doors for upcoming markets and applications. Some of the world’s most innovative and exciting business projects today are using blockchain, largely for its ability to achieve an unbiased and automated consensus. For businesses looking to improve security and transparency in transactions – both financial and in the movement of other assets – the answer lies on the blockchain.
“Blockchain’s ability to secure supply chain records, for example, will be a game-changer for companies looking to store and record this data. This has particular benefits for retailers, as a blockchain is capable of recording each step in a product’s lifecycle – from product creation, to purchase, to delivery.
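At its simplest, such a lifecycle record is a hash-linked chain: each entry carries the hash of its predecessor, so tampering with an earlier step invalidates every later one. The sketch below strips away consensus and distribution and keeps only that record structure.

import hashlib
import json

def add_record(chain: list, step: str, detail: str) -> None:
    # Each new record embeds the hash of its predecessor before being
    # hashed itself, so altering any earlier step breaks the chain.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"step": step, "detail": detail, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

product_history = []
add_record(product_history, "created", "factory A, batch 42")
add_record(product_history, "purchased", "retailer B, order 9913")
add_record(product_history, "delivered", "customer C, signed for on delivery")

# Verify the chain: every record must reference the hash of the one before it.
for prev, cur in zip(product_history, product_history[1:]):
    assert cur["prev_hash"] == prev["hash"]
print("lifecycle chain intact:", len(product_history), "steps")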
“We also expect to see blockchain and tokenization used in tandem to address future needs for air-tight, secure solutions across industries. When applied together, tokenization can be used to protect credentials, with domain controls determining where and how those credentials may be used. Blockchain can then help to protect the integrity of the data-related records showing the transaction process the token was involved in. This combination could be used to protect the most sensitive of data, including all forms of personal data, from account details to patient IDs and social security numbers.
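A toy illustration of that pairing: the sensitive value is swapped for a random token held in an off-chain vault, while only the token’s usage events go on the (here simulated) ledger. All names are hypothetical.

import secrets

vault = {}       # token -> real value, held off-chain in a secure store
chain_log = []   # on-chain record of where and how each token was used

def tokenize(sensitive_value: str) -> str:
    # The random token reveals nothing about the value it stands in for.
    token = secrets.token_hex(16)
    vault[token] = sensitive_value
    return token

def record_use(token: str, domain: str) -> None:
    # Only the token and the domain it was used in go on the ledger; the
    # underlying account number or patient ID never leaves the vault.
    chain_log.append({"token": token, "domain": domain})

card_token = tokenize("4111-1111-1111-1111")
record_use(card_token, "payments.example.com")
print(card_token in vault, chain_log[0]["domain"])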
“Blockchain technologies will continue to become mainstream in the business world. Banks will continue to adopt blockchain and cryptocurrencies, which will be widely used by customers once they are protected with technologies similar to those used to secure electronic payments today. But beyond the financial, the business potential of blockchain is huge, and as new applications are created, we will see entire industries building operations on the blockchain.”
Wireless mesh networking is definitely not a new concept. Leaving aside the projects prior to the advent of 802.11, developments in mesh technologies have gone hand in hand with the evolution of the standards and the emergence of specific business needs.
By Massimo Mazzeo Ocello, Director of Systems Engineers, Ruckus Networks.
Numerous examples of projects around the world can be cited, starting as early as the mid-1990s, but it was only a decade later that the first commercial solutions were offered by industry vendors. While wireless mesh networks were initially considered an enterprise technology for spanning cityscapes, today the application of wireless mesh is becoming far more widespread – and for good reason.
Wireless mesh networking definition
The word “mesh” nicely captures the picture of access points (APs) connected to each other. If you draw lines from AP to AP to AP to AP (and on and on), the many interconnections look like a woven mesh. This mesh also ends up looking something like a safety net, which is an analogy that we’ll come back to later.
With the overwhelming majority of our devices connecting to the internet wirelessly, it’s easy to forget that behind the scenes, APs are attached to a wired network. Using the same visual example, this safety net must be attached, at its edge, to fixed points. These cable connections are how APs send data to switches for distribution. When the AP is connected by a wire, it’s called a “root AP”. When an AP doesn’t have, or perhaps loses, its wired connection, it can operate as a “mesh AP” by connecting to another wired AP for backhauling.
Whilst it might appear to be an attractive option to just connect all your APs wirelessly and eliminate as much of the cabling as possible, we must remember that in doing so there’s a performance penalty as more and more backhaul is put through the wired APs. Moreover, without going too far into technical details, it should be kept in mind that a performance degradation is seen when too many wireless “hops” are crossed. It’s still the case that you should wire where you can, but when it’s too costly or impractical to do so, mesh networking can certainly offer a lot of benefits.
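A common rule of thumb – not a vendor-specific figure – is that each additional wireless hop on a shared radio roughly halves the usable throughput, because the same channel must both receive and re-transmit the traffic. A quick back-of-the-envelope model:

def effective_throughput(link_rate_mbps: float, hops: int) -> float:
    # Rule-of-thumb model: usable throughput roughly halves per wireless hop.
    return link_rate_mbps / (2 ** hops)

for hops in range(4):
    print(f"{hops} hop(s): ~{effective_throughput(400, hops):.0f} Mbps")
# 0 hop(s): ~400 Mbps  (root AP, wired backhaul)
# 1 hop(s): ~200 Mbps
# 2 hop(s): ~100 Mbps
# 3 hop(s): ~50 Mbps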
Smart Wi-Fi mesh networking is like having a WLAN engineer on the job
Transforming from a root AP to a mesh AP is not something all APs do well. To form a true mesh network, a certain amount of intelligence is required. Wi-Fi environments are so dynamic that you can’t predetermine the best connections. In mesh AP mode, an AP needs to be able to search for and assess every nearby AP to pick the optimal mesh partner; an AP that has lost its cable connection has to determine which AP to mesh with in the blink of an eye.
And here is the magic that differentiates solutions that work from those that do not. The AP can make that determination for itself by applying machine learning to the same data, variables and calculations that an experienced WLAN engineer would use. Thanks to this algorithmic, automated approach, you effectively place a WLAN engineer in each and every one of your APs. This is key to ensuring that, as the Wi-Fi environment changes, APs are making decisions that will help to maintain as high a performance as possible across the network.
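As a much-simplified picture of what such a partner-selection decision weighs up, the sketch below scores candidate APs on signal strength, hops to the wire and load. The weightings are purely illustrative assumptions, not the actual algorithm.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    rssi_dbm: float     # signal strength towards the candidate AP
    hops_to_wire: int   # wireless hops before traffic reaches a wired uplink
    load: float         # 0.0 (idle) to 1.0 (saturated)

def score(c: Candidate) -> float:
    # Prefer a strong signal, few hops to the wire and a lightly loaded AP.
    return c.rssi_dbm - 20 * c.hops_to_wire - 30 * c.load

candidates = [
    Candidate("AP-lobby", rssi_dbm=-55, hops_to_wire=0, load=0.7),
    Candidate("AP-hallway", rssi_dbm=-62, hops_to_wire=0, load=0.2),
    Candidate("AP-annex", rssi_dbm=-48, hops_to_wire=2, load=0.1),
]
best = max(candidates, key=score)
print("mesh with:", best.name)  # AP-hallway: slightly weaker signal, far less loaded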
Wireless mesh networking weaves a great safety net and provides the solution in many scenarios
One of the greatest benefits of mesh networking is the resiliency it can provide. Returning to the safety net analogy, deploying mesh technology offers great backup capability in case of disruptions in the wired network. In a mesh network, such a disruption is mitigated automatically: as soon as an AP’s wired connection fails, it switches to mesh mode and connects to the most appropriate wired AP, ensuring that the connection is maintained and services run as usual while the problem is fixed. As soon as the wired connection is restored, it switches straight back to being a root AP.
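That failover behaviour amounts to a small state machine, sketched below with illustrative names: an AP runs as a root AP while its wired uplink is healthy and as a mesh AP while it is down.

class AccessPoint:
    def __init__(self, name: str):
        self.name = name
        self.mode = "root"   # wired uplink present and healthy

    def wire_lost(self) -> None:
        self.mode = "mesh"   # backhaul via the most appropriate wired AP
        print(f"{self.name}: uplink down -> meshing to a root AP")

    def wire_restored(self) -> None:
        self.mode = "root"   # switch straight back to the wire
        print(f"{self.name}: uplink restored -> root AP again")

ap = AccessPoint("AP-17")
ap.wire_lost()
ap.wire_restored()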
Mesh networks can be successfully applied in many different scenarios, as in many cases they offer some clear advantages over wired networks. Since they don’t need any pre-existing infrastructure, they are ideal for temporary events such as concerts, conferences and meetings, in which the network must be set up and then removed in very little time. Thanks to their self-organising capabilities, they also need little pre-planning, and they are robust and scalable. Mesh networks can be successfully employed in crisis-management applications, where no infrastructure is present and there is no time for network planning – disaster areas and rescue operations are one example. They can also be a good choice in scenarios where running cables is very expensive or forbidden, such as archaeological sites or large open areas like dockyards. And wireless mesh networks can serve as a backbone for smart city and IoT devices, such as sensors and CCTV cameras, thanks to their ease of deployment.
While a wired connection is often the fastest solution, it’s not always the best one, and mesh networks can be a better alternative in many cases.