The following observation may not be earth-shattering, but, to your correspondent at least, it seems worthy of note that the topic of collaboration is cropping up again and again right across business at the present time. There is internal collaboration between previously separate business departments, which are now coming together to develop new ideas, products and services in a far more flexible, agile and rapid fashion. There is collaboration between different vendors to bring joint solutions to market. And there is collaboration between vendors, the Channel and end users, with the objective of ensuring that customers really do get the best possible solutions and services from their suppliers, whoever and wherever they may be.
The trigger for this observation was a recent trip to the US with Schneider Electric – with the focus very much on the Edge. One particular presentation introduced edge solutions that couple APC by Schneider Electric physical infrastructure with Cisco’s HyperFlex Edge hyperconverged solutions, designed for quick and efficient deployment in edge environments.
It was a joint presentation given by individuals from both Schneider and Cisco. The two companies, building on their history of partnering, have worked together to develop edge-focused hyperconverged solutions, with reference designs for HyperFlex deployments pre-engineered to combine APC and Cisco equipment into solutions that are pre-integrated, remotely monitorable and physically secure. Schneider also works with other major IT vendors, including Hewlett Packard Enterprise and Lenovo.
Where once data centre and IT vendors saw fit to want to own ‘everything’, there is a growing realisation that plans for world domination can never take into account the customer’s desire for best-of-breed technology. No one organisation can be the best at everything it does. By recognising this, and working with partners who are the best at what they do, it’s possible to develop and/or be part of a wider ecosystem that does offer customers the very best solutions across a range of IT and data centre disciplines. Furthermore, where once multi-vendor solutions were the prelude to an unholy mess when it came to after-sales service and support – it’s not our fault, it’s theirs, etc. – individual vendors and, increasingly, the channel which may well supply the heterogeneous technology solutions, are happy to stand behind their offerings, rather than indulge in finger-pointing.
What might be termed end-to-end collaboration – vendors, channel, end users – is developing because of the emergence of these multi-vendor solutions. The channel is supplying them to the customer, and everyone needs to be able to trust everyone else in the supply chain – there’s a lot at stake if this trust is broken. The Channel is, if you like, sticking its neck out by telling customers that a particular vendor collaboration solution is much better than anything provided by any single vendor, and it needs to believe that it will receive the right level of support from the vendors behind the solution. End users need to trust both their supplier (a channel company) and the vendors – mainly in terms of receiving seamless ongoing support and maintenance. And the vendors need to trust that the Channel will do a good job of representing their joint solutions – providing the right level of professionalism and support to customers. Vendors probably also need to trust that customers do not have totally unrealistic expectations as to what the collaboration solutions will deliver.
However, if everyone talks, and listens, to everyone else in the supply chain, then there’s a very real chance that all parties will benefit from this multi-faceted collaboration.
Finally, internal collaboration is already taking off, thanks in the main to the development of DevOps and DevSecOps. Serial or over-the-wall development – where Department A hands its work on to Department B, which hands it to Department C, before it goes back up and down the line endless times until a finished product or solution emerges – is a very old-fashioned, clunky and time-consuming way to develop new ideas. Much better that all interested parties – design, IT, marketing, sales, even finance – sit down together, agree objectives and plans, and then all stay engaged and provide near-real-time feedback as the project progresses. The result? New applications can be developed in, if not days, then weeks, rather than months or even years.
Key to the success of all these different types of collaboration is an openness to change, an openness to share and an openness to trust. And key to accepting this is an acceptance that, no matter how good you think your company is, it is almost certainly going to be even better if it works with partners who can contribute a level of knowledge and expertise about certain topics that is just not available inside your organisation. Sharing information and trusting the people with whom you share it might seem like a big step to take, but if everyone is trying to grow the market within which your organisation works, does it matter? Success for you and your partners means that the market will grow, and you’ll be in a great position to take advantage of this growth.
Which brings me back to where I started. Schneider Electric has a whole department devoted to developing new ideas, helpful tools and white papers that are publicly available. For example, anyone planning a data centre build or refresh can access and use Schneider’s data centre optimisation tool, into which a whole host of parameters can be input and amended until a final, optimised design is arrived at – or the exercise makes it clear that going the colo route, rather than staying on-premises, makes sense. Why make such information freely available to competitors? Well, they are also in the business of selling products and services into the data centre space, and the better they do it, the more likely it is that the overall market will grow, and Schneider itself will benefit from such growth. In other words, if you’re confident that you are doing something well, why be afraid of telling the market?
Yes, intellectual property and technology developments might need to be kept under wraps, but the more open and collaborative any industry is, the faster it will grow and the faster its suppliers will grow with it.
For many, collaboration is a word inextricably linked with some of the murkier goings-on of various 20th century conflicts. In the digital world, collaboration has nothing but positive connotations.
PA Consulting launches global survey of 500 senior business leaders.
Organisational agility is a major factor in driving financial performance, according to new research from PA Consulting, the global innovation and transformation consultancy.
PA’s research found that organisational agility is the single most important factor in responding to rapid technology, customer and societal change. The top 10 per cent of businesses by financial performance are almost 30 per cent more likely to display characteristics of agility. Organisational agility enables large-scale companies to thrive, delivering higher all-round performance, with positive benefits for customers, shareholders and employees.
However, real and lasting organisational agility remains elusive for many. Three in five business leaders know they need to change yet are struggling to act. PA’s research found that:
Conrad Thompson, a business transformation expert at PA Consulting, says: “Businesses around the world face unprecedented disruption but also unprecedented opportunities. As radical change transforms the world we live in, organisations must evolve at pace. One in six of the companies we surveyed acknowledged that unless they evolve, they risk failure within five years. Yet we see organisational agility efforts transforming well-established financial institutions, household names, and industry stalwarts, establishing the conditions that have made them fit for the future. This research, and our experience, proves that embracing organisational agility is the most effective way to get ahead of the competition, and ensure your organisation thrives, today and tomorrow.”
What it takes to achieve organisational agility
PA has identified five attributes key to organisational agility and common to the high performing companies we surveyed:
Sam Bunting, an organisational agility expert at PA Consulting says: “At PA we believe in the power of ingenuity to build a positive human future in a technology-driven world. As a business leader right now, we’ve never been afforded a better opportunity to adapt and transform – to truly understand our customers’ motivations, to harness the power of technology to create ingenious products and services and to find creative and effective ways to engage and empower our people. We’ve seen first-hand how organisations can beat their numbers and deliver higher all-round performance as a result of taking these five perfectly achievable steps.”
Eight out of ten executives admit they do not review internal processes before setting transformation goals.
New research released by Celonis has found that despite significant pressure to embark on transformation initiatives that enable greater productivity, improve customer service and reduce costs, most businesses remain unclear on what they should focus on. The results showed that almost half of C-suite executives (45%) admit they do not know where to start when developing their transformation strategy. The Celonis study explores how businesses are approaching transformation programmes, as well as the disconnect between leadership and those on the frontline.
The survey of over 1,000 C-suite executives and over 1,000 business analysts found that many organisations have wasted significant resources on business transformation initiatives that have been poorly planned. In fact, almost half (44%) of senior leaders believe that their business transformation has been a waste of time. With a quarter (25%) of businesses having spent over £500,000 on transformation strategies in the last 12 months, many organisations run the risk of incurring huge costs with no return.
Top-down strategies are stunting results
It’s no wonder that transformation initiatives aren’t hitting the mark, because business leaders are not drawing on the expertise of those closest to the organisation’s day-to-day operations. The results highlighted a disconnect between those developing transformation strategies and those carrying them out:
Businesses are skipping square one
In addition, the study demonstrated that most organisations are struggling with transformation initiatives because they are diving into execution before understanding what to change first. In fact, more than eight out of 10 (82%) C-suite executives admit they do not review their internal business processes to understand what needs to be prioritised when setting initial goals and KPIs for a transformation programme. Perhaps this is because they don’t know how to gain this visibility; almost two-thirds (65%) of leadership state they would feel more confident deploying their transformation strategy if they had a better picture of how their business is being run.
And this trend has trickled down to the entire organisation. The research shows that almost two-fifths (39%) of analysts are not basing their work on internal processes when executing the transformation strategy given to them by senior personnel. Ultimately, this highlights that business leaders are investing in transformation initiatives because they think they should and not because they have identified a specific problem.
Look too far ahead and stumble in the present
Despite acknowledgement that an understanding of the here and now would be beneficial to inform transformation strategy, businesses are still jumping straight into tactics. For example, almost three quarters of C-suite executives cite AI/machine learning (73%) and automation (73%) as areas they want to maintain or increase investment in. In contrast, less than a third (33%) of senior leaders state that they plan to invest more in getting better visibility of their processes. But for those organisations that want to increase their investment in AI and innovation, understanding their current processes first could help them to work out which technologies would be most beneficial to their business.
“Transformation strategies will inevitably be part of every organisation’s operations, because no business can avoid adapting to the latest industry and technological trends,” commented Alexander Rinke, co-founder and co-CEO, Celonis. “However, they should be founded in concrete insights derived from processes that are actually happening within a company. Our research shows that too many businesses are rushing into costly initiatives that they do not necessarily even need to embark on. They are falling at the first hurdle; having a better understanding of inefficiencies in underlying business processes can help organisations invest wisely to provide the best possible service for their customers.”
“From early stages in digital transformation to post transformation, organisations must understand how internal processes can shape their business strategy,” added Jeremy Cox, Principal Analyst, Ovum. “Quantifying the business impact of existing or newly adapted processes, can help optimise the environment for customers. Ovum's annual global ICT Enterprise Insights research based on around 5000 enterprises reveals a consistent picture of struggle, as industry by industry around 80% have made little progress. While there are many reasons for this difficulty, a forensic examination on how work gets done, aided by intelligent process mining technology, would help quantify the consequences and drive consensus on what must change.”
Ethical and responsible AI development is a top concern for IT Decision Makers (ITDMs), according to new research from SnapLogic, which found that 94% of ITDMs across the US and UK believe more attention needs to be paid to corporate responsibility and ethics in AI development. A further 87% of ITDMs believe AI development should be regulated to ensure it serves the best interests of business, governments, and citizens alike.
The new research, conducted by Vanson Bourne on behalf of SnapLogic, studied the views and perspectives of ITDMs across industries, asking key questions such as: who bears primary responsibility to ensure AI is developed ethically and responsibly, will global expert consortiums impact the future development of AI, and should AI be regulated and if so by whom?
Who Bears Responsibility?
When asked where the ultimate responsibility lies to ensure AI systems are developed ethically and responsibly, more than half (53%) of ITDMs point to the organisations developing the AI systems, regardless of whether that organisation is a commercial or academic entity. However, 17% place responsibility with the specific individuals working on AI projects, with respondents in the US more than twice as likely as those in the UK to assign responsibility to individual workers (21% vs. 9%).
A similar number (16%) see an independent global consortium, comprised of representatives from government, academia, research institutions, and businesses, as the only way to establish fair rules and protocol to ensure the ethical and responsible development of AI. A further 11% believe responsibility should fall to the governments in the countries where the AI systems are developed.
Independent Guidance and Expertise
Some independent regional initiatives providing AI support, guidance, and oversight are already taking shape, with the European Commission High-Level Expert Group on Artificial Intelligence being one such example. For ITDMs, expert groups like this are seen as a positive step in addressing the ethical issues around AI. Half of ITDMs (50%) believe organisations developing AI will take guidance and adhere to recommendations from expert groups like this as they develop their AI systems. Additionally, 55% believe these groups will foster better collaboration between organisations developing AI.
However, Brits are more sceptical of the impact these groups will have. 15% of ITDMs in the UK stated that they expect organisations will continue to push the limits on AI development without regard for the guidance expert groups provide, compared with 9% of their American counterparts. Furthermore, 5% of UK ITDMs indicated that guidance or advice from oversight groups would be effectively useless to drive ethical AI development unless it becomes enforceable by law.
A Call for Regulation
Many believe that ensuring ethical and responsible AI development will require regulation. In fact, 87% of ITDMs believe AI should be regulated, with 32% noting that this should come from a combination of government and industry, while 25% believe regulation should be the responsibility of an independent industry consortium.
However, some industries are more open to regulation than others. Almost a fifth (18%) of ITDMs in manufacturing are against the regulation of AI, followed by 13% of those in the Technology sector, and 13% of those in the Retail, Distribution and Transport sector. In giving reasons for the rejection of regulation, respondents were nearly evenly split between the belief that regulation would slow down AI innovation, and that AI development should be at the discretion of the organisations creating AI programs.
Championing AI Innovation, Responsibly
Gaurav Dhillon, CEO at SnapLogic, commented: “AI is the future, and it’s already having a significant impact on business and society. However, as with many fast-moving developments of this magnitude, there is the potential for it to be appropriated for immoral, malicious, or simply unintended purposes. We should all want AI innovation to flourish, but we must manage the potential risks and do our part to ensure AI advances in a responsible way.”
Dhillon continued: “Data quality, security and privacy concerns are real, and the regulation debate will continue. But AI runs on data — it requires continuous, ready access to large volumes of data that flows freely between disparate systems to effectively train and execute the AI system. Regulation has its merits and may well be needed, but it should be implemented thoughtfully such that data access and information flow are retained. Absent that, AI systems will be working from incomplete or erroneous data, thwarting the advancement of future AI innovation.”
Egress has published the results of its first Insider Data Breach survey, examining the root causes of employee-driven data breaches, their frequency and impact. The research highlights a fundamental gulf between IT leaders and employees over data security and ownership that is undermining attempts to stem the growing tide of insider breach incidents.
The research was carried out by independent research organisation Opinion Matters and incorporated the views of over 250 U.S. and U.K.-based IT leaders (CIOs, CTOs, CISOs and IT directors), and over 2000 U.S. and U.K.-based employees. The survey also explored how employees and executives differ in their views of what constitutes a data breach and what is acceptable behaviour when sharing data.
Key research findings include:
- 79% of IT leaders believe that employees have put company data at risk accidentally in the last 12 months. 61% believe they have done so maliciously.
- 30% of IT leaders believe that data is being leaked to harm the organisation. 28% believe that employees leak data for financial gain.
- 92% of employees say they haven’t accidentally broken company data sharing policy in the last 12 months; 91% say they haven’t done so intentionally.
- 60% of IT leaders believe that they will suffer an accidental insider breach in the next 12 months; 46% believe they will suffer a malicious insider breach.
- 23% of employees who intentionally shared company data took it with them to a new job.
- 29% of employees believe they have ownership of the data they have worked on.
- 55% of employees who intentionally shared data against company rules said their organisation didn’t provide them with the tools needed to share sensitive information securely.
The survey results highlight a perception gap between IT leaders and employees over the likelihood of insider breaches. This is a major challenge for businesses: insider data breaches are viewed as frequent and damaging occurrences, of concern to 95% of IT leaders, yet the vectors for those breaches – employees – are either unaware of, or unwilling to admit, their responsibility.
Carelessness and a lack of awareness are root causes of insider breaches
Asked to identify what they believe to be the leading causes of data breaches, IT leaders were most likely to say that employee carelessness through rushing and making mistakes was the reason (60%). A general lack of awareness was the second-most cited reason (44%), while 36% indicated that breaches were caused by a lack of training on the company’s security tools.
However, 30% believe that data is being leaked to harm the organisation and 28% say that employees leak data for financial gain.
From the employee perspective, of those who had accidentally shared data, almost half (48%) said they had been rushing, 30% blamed a high-pressure working environment and 29% said it happened because they were tired.
The most frequently cited employee error was accidentally sending data to the wrong person (45%), while 27% had been caught out by phishing emails. Concerningly, over one-third of employees (35%) were simply unaware that information should not be shared, proving that IT leaders are right to blame a lack of awareness and pointing to an urgent need for employee education around responsibilities for data protection.
Tony Pepper, CEO and Co-founder, Egress, comments: “The results of the survey emphasise a growing disconnect between IT leaders and staff on data security, which ultimately puts everyone at risk. While IT leaders seem to expect employees to put data at risk – they’re not providing the tools and training required to stop the data breach from happening. Technology needs to be part of the solution. By implementing security solutions that are easy to use and work within the daily flow of how data is shared, combined with advanced AI that prevents data from being leaked, IT leaders can move from minimising data breaches to stopping them from happening in the first place.”
Confusion over data ownership and ethics
The Egress Insider Data Breach survey found confusion among employees over data ownership. 29% believed that the data they work on belongs to them. Moreover, 60% of employee respondents didn’t recognise that the organisation is the exclusive owner of company data, instead ascribing ownership to departments or individuals. This was underlined by the fact that, of those who admitted to sharing data intentionally, one in five (20%) said they did so because they felt it was theirs to share.
23% of employees who shared data intentionally did so when they took it with them to a new job, while 13% did so because they were upset with their organisation. However, the majority (55%) said they shared data insecurely because they hadn’t been given the tools necessary to share it safely.
The survey also found that attitudes towards data ownership vary between generations, with younger employees less aware of their responsibilities to protect company data.
Tony Pepper adds: “As the quantity of unstructured data and variety of ways to share it continue to grow exponentially, the number of insider breaches will keep rising unless the gulf between IT leaders and employee perceptions of data protection is closed. Employees don’t understand what constitutes acceptable behaviour around data sharing and are not confident that they have the tools to work effectively with sensitive information. The results of this research show that reducing the risk of insider breaches requires a multi-faceted approach combining user education, policies and technology to support users to work safely and responsibly with company data.”
86% of IT executives surveyed believe human work, AI systems, and robotic automation must be well-integrated by 2020 – but only 12% said their companies do this really well today.
Appian has published the third set of findings from its Future of Work international survey, conducted by IDG and LTM Research. Part three of the series looks specifically at “intelligent automation,” defined as the integration of emerging cognitive and robotic computing technologies into human-driven business processes and customer interactions. These technologies include artificial intelligence (AI), machine learning (ML), and robotic process automation (RPA). The data shows an enormous disconnect between the expected business benefits of intelligent automation and a typical organization’s ability to realize those benefits.
Less than half (46%) of organizations have deployed intelligent automation. This is despite the fact that a large majority of IT leaders agree that effective intelligent automation holds enormous potential for their businesses:
IT leaders in enterprise organizations across North America and Europe feel tremendous urgency to take advantage of intelligent automation. However, the vast majority admit their companies can’t currently do it: 86% of executives interviewed indicated that human work, AI systems, and robotic automation “must be well-integrated by 2020,” yet only 12% of executives said their companies “do this really well today.”
Challenges: Complexity & Lack of Strategy
While individual emerging automation technologies are being deployed, a lack of strategy and of clear alignment to business goals is resulting in siloed deployments and overwhelmed internal application development teams. Less than half of surveyed companies have deployed any form of intelligent automation. Fully half of those companies have IT staffs in excess of 20,000 employees. Specific pain points include:
Survey of 500 IT and business decision makers examines perspectives on systematic automation of applications management.
A survey of business leaders by DXC Technology and Vanson Bourne, an independent research firm, reveals that 86 percent of IT and business decision makers believe that being able to predict and prevent future challenges with applications could be a “game changer” for their organizations. This could be achieved via automation, artificial intelligence (AI) and lean processes.
Respondents said investing in automation of applications management would have numerous benefits, including improved customer experience (49 percent), higher customer retention (46 percent) and greater customer satisfaction (46 percent). And 78 percent of respondents agree that without systematic automation, achieving zero application downtime would be a “distant dream.”
The findings provide a view of the attitudes, experiences and expectations of 500 IT and business decision makers on the impact of application performance in the digital economy. The survey was completed in four markets — the United Kingdom, Germany, Japan and the United States — and respondents represented sectors including banking, insurance, healthcare, and travel and transportation.
Despite the digital transformation opportunities, only 36 percent of senior managers surveyed said their peers are completely accepting of new technologies such as AI and automation to improve applications quality. Additionally, respondents said that the top barriers to investing in automation were security risks (44 percent) and legacy technology hurdles (36 percent).
“Because legacy technologies can require so much of an organization’s resources, it is essential for organizations to establish a plan to simplify their existing IT structures and free up funds for automation and other digital technologies,” said Rick Sullivan, vice president of digital applications and testing, Application Services, DXC Technology. “We are committed to helping our clients optimize applications performance through AI, lean processes and automation to enable great customer experiences today and future growth opportunities.”
While 82 percent of respondents agree that companywide strategies to invest in new AI-driven technologies to transform applications management would provide significant competitive advantages for their organizations, only 29 percent said their organizations actually have such strategies.
The survey also revealed several key findings for specific industries:
A new report by the Capgemini Research Institute has found that financial services firms are lagging behind in digital transformation compared to other industry sectors. Financial services firms report falling confidence in their digital capabilities, and a shortage of the skills, leadership and collective vision needed to shape the digital future.
The report, part of Capgemini’s Global Digital Mastery Series, examines sentiment on digital and leadership capabilities among bank and insurance executives, comparing it to an equivalent study from 2012. Over 360 executives were surveyed from 213 companies whose combined 2017 revenue represents approximately $1.67 trillion.
Key findings include:
Confidence in digital and leadership capabilities has sunk since 2012
Compared to 2012, a smaller proportion of financial services executives said their organizations had the necessary digital capabilities to succeed – with the confident few falling from 41 percent to 37 percent. Breaking this down, although more executives felt they had the required digital capabilities in customer experience (40 percent compared to 35 percent), confidence in operations saw a significant drop. Only 33 percent of executives said they had the necessary operations capabilities, compared to 46 percent from six years ago.
A shortfall in leadership was also cited, with only 41 percent of executives saying their organizations have the necessary leadership capabilities, down from 51 percent in 2012. In some specific areas, confidence in leadership fell significantly, including governance (45 percent to 32 percent), engagement (54 percent to 33 percent) and IT-business relationships (63 percent to 35 percent).
Digital Mastery proves to be elusive
In Capgemini’s digital mastery framework presented in the report, just 31 percent of banks and 27 percent of insurers are deemed to be digital masters, while 50 percent and 56 percent respectively are classified as beginners.
Executives also criticized the lack of a compelling vision for digital transformation across their organizations. Only 34 percent of banking and 24 percent of insurance respondents agreed with the statement that ‘our digital transformation vision crosses internal organizational units’, with just 40 percent and 26 percent respectively saying that ‘there is a high-level roadmap for digital transformation’.
Banking transformation has taken center stage, while insurance places focus on automation
Although banks’ digital transformation journeys are well underway, the industry has reached a crossroads, the report notes, as it attempts to meet the rising digital expectations of customers, manage cost pressures, and compete with technology upstarts. Fewer than half of banks (38 percent) say they have the necessary digital and leadership capabilities required for transformation. Insurance is catching up, with only 30 percent claiming to have the digital capabilities required and 28 percent the leadership capabilities necessary.
The banking sector does, however, outpace non-financial services sectors on capabilities such as customer experience, workforce enablement and technology and business alignment. Fifty-six percent of the banking firms said they use analytics for more effective target marketing (in comparison to 34 percent insurance and 44 percent non-financial services sector). More than half (53 percent) of banking organizations also said that upskilling and reskilling on digital skills is a top priority for them (32 percent for insurance and 44 percent for non-financial services sector).
One area of advantage for insurers was operational automation, with 42 percent of executives saying they used robotic process automation, against 41 percent of bankers, and 34 percent reporting the use of artificial intelligence in operations (compared to 31 percent of bank executives).
More challenges are ahead
On the other hand, business model innovation, defining a clear vision and purpose, and culture and engagement are areas that challenge both banking and insurance. Only 33 percent of insurance and 39 percent of banking organizations have launched new businesses based on digital technologies (41 percent in the non-financial services sector). While banking is in line with the non-financial services average, only around a third (34 percent) of banks had a digital vision that crossed organizational units. Insurance lags even further behind, with just around a quarter (24 percent) having an all-encompassing vision. On culture, too, only 33 percent of banking and 25 percent of insurance organizations thought their leaders were adopting the new behaviors required for transformation, compared with 37 percent in non-financial services organizations.
“This research shows that a reality check has taken place across the financial services industry, as incumbents now understand the true extent of the digital transformation challenge. In an environment of growing competition and consumer expectation, the view is very different from a few years ago, and it’s unsurprising that large organizations have become more realistic about their capabilities,” said Anirban Bose, Chief Executive Officer of Capgemini’s Financial Services and member of the Group Executive Board.
“At the same time, this is a wake-up call for banks and insurers to re-examine their business models. Tomorrow’s operating model is collaborative, innovative and agile. The digital masters we looked at are working with an ecosystem of third-party partners, developing and testing ideas more quickly under an MVP model, and nurturing a culture of bottom-up innovation and experimentation. The majority of financial services firms need to learn from the small pool of genuine innovators in their field,” Bose concluded.
Heightened investment in disruptive technologies and enterprise-class devices will empower the majority of front-line workers by 2023.
Zebra Technologies Corporation has published the results of its latest vision study on the Future of Field Operations. The study reveals mobile technology investment is a top priority for 36 percent of organizations and a growing priority for an additional 58 percent to keep up with rapidly evolving and increasing customer demand. The findings indicate investments will be made in disruptive technologies and enterprise mobile devices to enhance front-line worker productivity and customer satisfaction in field operations including fleet management, field services, proof of delivery and direct store delivery workflows.
“Driven by the acceleration of e-commerce, along with customers’ heightened expectations and more focus within companies on differentiating service levels, the field operations industry is rapidly adapting the way it looks at its mobile technology investments,” said Jim Hilton, Director of Vertical Marketing Strategy, Manufacturing, Transportation & Logistics, Zebra Technologies. “Our study shows how growing challenges related to the on-demand economy drive organizations to adopt transformative, disruptive technologies such as augmented reality and intelligent labels to provide visibility and integrate business intelligence for a performance edge.”
KEY SURVEY FINDINGS
Equipping front-line workers with enterprise mobile devices remains a priority to stay competitive.
·The survey shows today only one-fifth of organizations have a majority of their field-based operations using enterprise mobile devices. This is estimated to reach 50 percent in five years.
·Respondents indicate most organizations intend to invest in handheld mobile computers, mobile printers and rugged tablets. From 2018 to 2023, usage of handheld mobile computers with built-in barcode scanners is forecast to grow by 45 percent, mobile printers by 53 percent and rugged tablets by 54 percent. The higher levels of inventory, shipment and asset accuracy provided by these devices are expected to increase business revenues.
·A key driver of productivity, efficiency and cost savings in field operations is replacing traditional consumer devices with ruggedized enterprise devices. Nearly 80 percent of respondents usually or always conduct a total cost of ownership (TCO) analysis of business devices prior to making a capital expenditure. Only 32 percent of respondents believe that consumer smartphones have a better TCO than rugged devices.
Tertiary concerns and post-sale factors are important for organizations when evaluating front-line worker enterprise mobile devices.
·The survey reveals these TCO considerations when investing in new front-line enterprise technology: replacement (47 percent), initial device (44 percent), application development (44 percent) and programming/IT (40 percent).
·Almost 40 percent of respondents say device management and support costs are important, as are customer service (37 percent), device lifecycle cadence (36 percent) and repair costs (35 percent). Such factors increasingly influence the purchase cycle, meaning vendors that do not provide clear value or cannot control these costs will quickly be overtaken by those that do.
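The TCO considerations listed above lend themselves to a simple worked comparison. The sketch below sums hypothetical cost figures (none of them from the survey) across the categories respondents cite, illustrating how a rugged device can come out cheaper over its service life despite a higher purchase price:

```python
# Hypothetical multi-year TCO comparison of a rugged vs. a consumer device,
# using the cost categories the survey lists: initial device, replacement,
# application development, programming/IT, and management/support.
# All dollar figures are illustrative, not taken from the survey.

def total_cost_of_ownership(costs: dict) -> int:
    """Sum all cost components for one device class."""
    return sum(costs.values())

rugged = {
    "initial_device": 1500,
    "replacements": 300,       # fewer failures over the period
    "app_development": 400,
    "programming_it": 250,
    "management_support": 350,
}

consumer = {
    "initial_device": 600,
    "replacements": 1200,      # higher breakage/turnover in the field
    "app_development": 400,
    "programming_it": 450,
    "management_support": 600,
}

print(f"Rugged TCO:   ${total_cost_of_ownership(rugged):,}")
print(f"Consumer TCO: ${total_cost_of_ownership(consumer):,}")
```

With these stand-in numbers the rugged device totals $2,800 against $3,250 for the consumer device, which is the kind of outcome behind the finding that only a minority of respondents believe consumer smartphones have the better TCO.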
Emerging technologies and faster networks are disrupting field operations.
·The survey shows seven in ten organizations agree faster mobile networks will be a key driver for field operations investment to enable the use of disruptive technology.
·Significant industry game-changers will be droids and drones, with over a third of decision makers citing them as the biggest disruptors.
·The use of smart technologies such as sensors, RFID and intelligent labels also plays a role in transforming the industry. More than a quarter of respondents continue to view augmented/virtual reality (29 percent), sensors (28 percent), RFID and intelligent labels (28 percent) as well as truck loading automation (28 percent) as disruptive factors.
KEY REGIONAL FINDINGS
·Asia Pacific: 44 percent of respondents consider truck loading automation to be among the most disruptive technologies, compared with 28 percent globally.
·Europe, Middle East and Africa: 70 percent of respondents agree e-commerce is driving the need for faster field operations.
·Latin America: 83 percent agree that faster wireless networks (4G/5G) are driving greater investment in new field operations technologies, compared with 70 percent of the global sample.
·North America: 36 percent of respondents plan to implement rugged tablets in the next year.
Research reveals fewer than 20% of IT professionals have complete and timely access to critical data in public clouds.
Keysight has released the results of a survey sponsored by Ixia, on ‘The State of Cloud Monitoring’. The report highlights the security and monitoring challenges faced by enterprise IT staff responsible for managing public and private cloud deployments.
The survey, conducted by Dimensional Research and polling 338 IT professionals at organizations from a range of sizes and industries globally, revealed that companies have low visibility into their public cloud environments, and the tools and data supplied by cloud providers are insufficient.
Lack of visibility can result in a variety of problems including the inability to track or diagnose application performance issues, inability to monitor and deliver against service-level agreements, and delays in detecting and resolving security vulnerabilities and exploits. Key findings include:
“This survey makes it clear that those responsible for hybrid IT environments are concerned about their inability to fully see and react to what is happening in their networks, especially as business-critical applications migrate to a virtualized infrastructure,” said Recep Ozdag, general manager and vice president, product management in Keysight’s Ixia Solutions Group. “This lack of visibility can result in poor application performance, customer data loss, and undetected security threats, all of which can have serious consequences for an organization’s overall business success.”
Public and hybrid cloud monitoring maturity trails traditional data centers
The survey focused on challenges faced when monitoring public and private clouds, as well as on-premises data centers. The data revealed that, in IT professionals’ view, cloud providers are not delivering the level of visibility they need:
Visibility solutions enhance monitoring, network performance management, and security
Nearly all respondents (99%) identified a direct link between comprehensive network visibility and business value. The top three visibility benefits cited were:
The survey also revealed that visibility is critical for monitoring cloud performance, as well as validating application performance prior to cloud deployment:
According to the 2019 Global Encryption Trends Study from the Ponemon Institute, nCipher Security says that as organizations embrace the cloud and new digital initiatives such as the internet of things (IoT), blockchain and digital payments, their use of trusted cryptography to protect applications and sensitive information is at an all-time high.
With corporate data breaches making the headlines on an almost daily basis, the deployment of an overall encryption strategy by organizations around the world has steadily increased. This year, 45% of respondents say their organization has an overall encryption plan applied consistently across the entire enterprise with a further 42% having a limited encryption plan or strategy that is applied to certain applications and data types.
Threats, drivers and priorities
Employee mistakes continue to be the most significant threat to sensitive data (54%), more than external hackers (30%) and malicious insiders (21%) combined. In contrast, the least significant threats to the exposure of sensitive or confidential data include government eavesdropping (12%) and lawful data requests (11%).
The main driver for encryption is protection of an enterprise’s intellectual property and the personal information of customers – both 54% of respondents.
With more data to encrypt, and close to two-thirds of respondents deploying six or more separate products to encrypt it, policy enforcement (73%) was selected as the most important feature for encryption solutions. In previous years, performance consistently ranked as the most important feature.
Cloud data protection requirements continue to drive encryption use, with encryption across both public and private cloud use cases growing over 2018 levels, and organizations prioritizing solutions that operate across both enterprise and cloud environments (68%).
Data discovery the number one challenge
With the explosion and proliferation of data from digital initiatives, cloud use, mobility and IoT devices, data discovery continues to be the biggest challenge in planning and executing a data encryption strategy, with 69% of respondents citing it as their number one challenge.
Trust, integrity, control
The use of hardware security modules (HSMs) grew at a record year-over-year level from 41% in 2018 to 47%, indicating a requirement for a hardened, tamper-resistant environment with higher levels of trust, integrity and control for both data and applications. HSM usage is no longer limited to traditional use cases such as public key infrastructure (PKI), databases, application and network encryption (TLS/SSL); the demand for trusted encryption for new digital initiatives has driven significant HSM growth over 2018 for code signing (up 13%), big data encryption (up 12%), IoT root of trust (up 10%) and document signing (up 8%). Additionally, 53% of respondents report using on-premises HSMs to secure access to public cloud applications.
Dr. Larry Ponemon, chairman and founder of the Ponemon Institute, says:
“The use of encryption is at an all-time high, driven by the need to address compliance requirements such as the EU General Data Protection Regulation (GDPR), California Data Breach Notification Law and Australia Privacy Amendment Act 2017, and the need to protect sensitive information from both internal and external threats as well as accidental disclosure. Encryption usage is a clear indicator of a strong security posture with organizations that deploy encryption being more aware of threats to sensitive and confidential information and making a greater investment in IT security.”
John Grimm, senior director of strategy and business development at nCipher Security, says:
“Organizations are under relentless pressure to protect their business critical information and applications and meet regulatory compliance, but the proliferation of data, concerns around data discovery and policy enforcement, together with lack of cybersecurity skills makes this a challenging environment. nCipher empowers customers by providing a high assurance security foundation that ensures the integrity and trustworthiness of their data, applications and intellectual property.”
Other key trends include:
·The highest prevalence of an enterprise encryption strategy is reported in Germany (67%) followed by the United States (65%), Australia (51%), and the United Kingdom (50%).
·Payment-related data (55% of respondents) and financial records (54% of respondents) are most likely to be encrypted. Financial records had the largest increase on this list over last year, up 4%.
·The least likely data type to be encrypted is health-related information (24% of respondents), which is a surprising result given the sensitivity of health information and the recent high-profile healthcare data breaches.
·61% of respondents classify key management as having a high level of associated “pain” (a rating of 7+ on a scale of 10). This figure is almost identical to the 63% of organizations that use six or more separate encryption products, suggesting a clear correlation between the two findings.
·Support for both cloud and on-premises deployment of encryption has risen in importance as organizations have increasingly embraced cloud computing and look for consistency across computing styles.
Although the market for integration platform as a service (iPaaS) shows strong growth, the first signs of market consolidation are starting to emerge. Gartner, Inc. predicts that by 2023, up to two-thirds of existing iPaaS vendors will merge, be acquired or exit the market.
“The challenge for most iPaaS vendors is that their business is simply not profitable,” said Bindi Bhullar, senior research director at Gartner. “Revenue growth and increasing customer acceptance can’t keep up with the costs for running the platform and the heavy spending in sales and marketing.”
Megavendors such as Oracle, Microsoft and IBM are better-equipped to handle those challenges as they offer more-competitive offerings with more-aggressive pricing and packaging options than smaller players in the market. Gartner expects that this trend will continue, further diminishing the market share of specialist iPaaS players.
“For organizations looking to purchase an iPaaS solution, this is good news,” said Mr. Bhullar. “They can capitalize on the evolving market dynamics by solving short-term/immediate problems today, while preparing to adopt another iPaaS offering from an alternative vendor as the expected market consolidation accelerates through 2023.”
However, market consolidation means an increased risk that platform services will be discontinued due to the vendor exiting the market or being acquired. “Buyers should minimize exposure to vendor risk by adopting platforms that can deliver short-term payoffs, so that the cost of any eventual replacement can be more easily justified,” Mr. Bhullar added.
RPA Spend to Reach Over $2 Billion in 2022
Gartner estimates that global spending on robotic process automation (RPA) software will total $2.4 billion in 2022, up from $680 million in 2018. This increase in spending is primarily driven by the necessity for organizations to rapidly digitize and automate their legacy processes as well as enable access to legacy applications through RPA. “Organizations are adopting RPA when they have a lot of manual data integration tasks between applications and are looking for cost-effective integration methods,” said Saikat Ray, senior research director at Gartner.
Gartner predicts that by the end of 2022, 85 percent of large and very large organizations will have deployed some form of RPA. Mr. Ray added that 80 percent of organizations that completed proofs of concept and pilots in 2018 will aim to scale RPA implementations and increase RPA spending in 2019.
This shows that the technology is viable and has the desired effects. However, application leaders who are new to the technology should start with a simple RPA use case and work with internal stakeholders to identify more applicable processes.
Moving forward, Gartner expects more organizations to slowly discover that RPA offers benefits beyond cost optimization. RPA technology can support productivity and increase client satisfaction when combined with other artificial intelligence (AI) technologies such as chatbots, machine learning and applications based on natural language processing (NLP).
Consider the example of a client complaining that their invoice shows the wrong amount. A chatbot engages with the client to understand the initial issue and delegates to an RPA bot, which reconciles the invoice against the actual order entry record at the back end. The RPA bot performs the matching transaction and sends the result back to the chatbot, which processes the RPA response and intelligently answers the client.
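The chatbot-to-RPA handoff described above can be sketched in a few lines of Python. Every identifier, record and amount below is an illustrative stand-in, not part of any vendor’s product or API:

```python
# Minimal sketch of the chatbot -> RPA handoff: the chatbot interprets the
# complaint, the "RPA bot" reconciles the invoice against the back-end order
# record, and the chatbot phrases the result for the client.
# All data here is a hypothetical stand-in for real back-end systems.

ORDER_ENTRIES = {"INV-1001": 250.00}   # back-end order entry records
INVOICES = {"INV-1001": 275.00}        # amounts actually invoiced

def rpa_reconcile(invoice_id: str) -> dict:
    """RPA step: match the invoice against the order entry record."""
    invoiced = INVOICES[invoice_id]
    ordered = ORDER_ENTRIES[invoice_id]
    return {"match": invoiced == ordered, "invoiced": invoiced, "ordered": ordered}

def chatbot_handle(message: str, invoice_id: str) -> str:
    """Chatbot step: understand the issue, delegate, then phrase the answer."""
    if "wrong amount" in message.lower():
        result = rpa_reconcile(invoice_id)   # delegate to the RPA bot
        if result["match"]:
            return f"Invoice {invoice_id} matches your order."
        return (f"You were invoiced ${result['invoiced']:.2f} but ordered "
                f"${result['ordered']:.2f}; we'll issue a correction.")
    return "Could you tell me more about the issue?"

print(chatbot_handle("My invoice is showing the wrong amount", "INV-1001"))
```

In a real deployment the chatbot step would sit behind an NLP intent classifier and the reconciliation would run against live systems of record; the point of the sketch is only the division of labor between the conversational front end and the transactional RPA back end.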
Retailers to use real-time in-store pricing
Gartner, Inc. predicts that by 2025, the top 10 global retailers by revenue will leverage contextualized real-time pricing through mobile applications to manage and adjust in-store prices for customers.
“Digital sales continue to grow, but it’s no longer a competition between online and offline. Today, many retailers find that half of their online sales are supported by their stores,” said Robert Hetu, vice president research analyst at Gartner. “As customers share more data and information from various sources, they expect more personalized and meaningful offers from retailers. Retailers should assess personal data and product preferences, and translate those inputs into immediate and contextualized offers.”
To offer consistent, relevant and personalized prices, retailers need to understand customer behaviors, especially as the path to purchase decisions becomes erratic across touchpoints. Digital twins — virtual representations of processes, things, environments or people — can simulate behavior and predict outcomes, including customer behavior and preferences. Examples include Adidas’ Speedfactory, which aims to improve the quality, speed and overall efficiency of the company’s entire sporting goods product chain, and the city of Singapore’s full-scale digital twin of itself, which can analyze future household energy storage.
Adoption of mobile payments and retailer mobile apps also supports the predicted move toward mainstream adoption of real-time in-store pricing. “Many consumers who have downloaded a retailer’s app use it for online purchases; others use it to obtain a coupon or discount offer that they can use in a physical store,” Mr. Hetu added.
However, retailers face some customer experience and technology challenges in ensuring that the correct price is accessible in real time. “Retailers need to educate customers in understanding the dynamic nature of pricing, which means that prices can rise or fall unexpectedly. Retailers also need to be better at managing delays in updating over-inventory and under-inventory levels,” observed Mr. Hetu.
To manage pricing signage, some retailers are using electronic shelf labels or digital shelf edge technologies. However, for the many that don’t use digital labels, associates must change price labels manually. This is a high-risk source of mistakes and limits how frequently a retailer can adjust prices.
“Retailers must focus on enabling technologies such as a unified retail commerce platform, which uses centralized data for inventory, pricing, loyalty and other information to facilitate a continuous and cohesive experience,” said Mr. Hetu.
By 2030, 80 percent of the work of today’s project management (PM) discipline will be eliminated as artificial intelligence (AI) takes on traditional PM functions such as data collection, tracking and reporting, according to Gartner, Inc.
“AI is going to revolutionize how program and portfolio management (PPM) leaders leverage technology to support their business goals,” said Daniel Stang, research vice president at Gartner. “Right now, the tools available to them do not meet the requirements of digital business.”
Evolution of PPM Market
Providers in today’s PPM software market are behind in enabling a fully digital program management office (PMO), but Gartner predicts AI-enabled PPM will begin to surface in the market sometime this year. The market will focus first on providing incremental user experience benefits to individual PM professionals, and later will help them to become better planners and managers. In fact, by 2023, technology providers focused on AI, virtual reality (VR) and digital platforms will disrupt the PPM market and cause a clear response by traditional providers.
PPM as an AI-Enabled Discipline
Data collection, analysis and reporting make up a large proportion of the PPM discipline. AI will improve the outcomes of these tasks, analyzing data faster than humans can and using the results to improve overall performance. As these standard tasks are replaced, PPM leaders will look to staff their teams with people who can manage the demands of AI and smart machines as new stakeholders.
“Using conversational AI and chatbots, PPM and PMO leaders can begin to use their voices to query a PPM software system and issue commands, rather than using their keyboard and mouse,” said Mr. Stang. “As AI begins to take root in the PPM software market, those PMOs that choose to embrace the technology will see a reduction in the occurrence of unforeseen project issues and risks associated with human error.”
Server revenue grows and shipments increase
The worldwide server market continued to grow through 2018 as worldwide server revenue increased 17.8 percent in the fourth quarter of 2018, while shipments grew 8.5 percent year over year, according to Gartner, Inc. In all of 2018, worldwide server shipments grew 13.1 percent and server revenue increased 30.1 percent compared with full-year 2017.
“Hyperscale and service providers continued to increase their investments in their data centers (albeit at lower levels than at the start of 2017) to meet customers’ rising service demand, as well as enterprises’ services purchases from cloud providers,” said Kiyomi Yamada, senior principal analyst at Gartner. “To exploit data center infrastructure market disruption, technology product managers for server providers should prepare for continued increases in server demand through 2019, although growth will be at a slower pace than in 2018.”
“Although DRAM prices started to come down, increasing demand for memory-rich configurations to support emerging workloads such as artificial intelligence (AI) and analytics kept buoying server prices. Product managers should market higher-memory-content servers to take advantage of DRAM oversupplies.”
Dell EMC secured the top spot in the worldwide server market based on revenue in the fourth quarter of 2018 (see Table 1). Dell EMC ended the year with 20.2 percent market share, followed by Hewlett Packard Enterprise (HPE) with 17.7 percent of the market. Huawei experienced the strongest growth in the quarter, growing 45.9 percent.
Table 1. Worldwide: Server Vendor Revenue Estimates, 4Q18 (U.S. Dollars) — vendor revenue with 4Q18 and 4Q17 market share (%) and 4Q18–4Q17 growth (%)
Source: Gartner (March 2019)
In server shipments, Dell EMC maintained the No. 1 position in the fourth quarter of 2018 with 16.7 percent market share (see Table 2). HPE secured the second spot with 12.2 percent of the market. Both Dell EMC and HPE experienced declines in server shipments, while Inspur Electronics experienced the strongest growth with a 24.6 percent increase in shipments in the fourth quarter of 2018.
Table 2. Worldwide: Server Vendor Shipments Estimates, 4Q18 (Units) — vendor shipments with 4Q18 and 4Q17 market share (%) and 4Q18–4Q17 growth (%)
Source: Gartner (March 2019)
The x86 server market increased in revenue by 27.1 percent, and shipments were up 8.7 percent in the fourth quarter of 2018.
Full-Year 2018 Server Market Results
In terms of regional results, in 2018, Asia/Pacific and North America posted strong growth in revenue with 38.3 percent and 34 percent, respectively. In terms of shipments, Asia/Pacific grew 17.6 percent and North America grew 15.9 percent year over year.
EMEA grew 3.1 percent in shipments and 20.4 percent in revenue. Latin America grew 20.9 percent in revenue, but declined 4.4 percent in shipments. Japan grew 3.3 percent in revenue and 2.1 percent in shipments.
Despite a slow start, Europe is now one of the fastest-growing regions worldwide for blockchain spending. This is due to a number of factors, such as enterprises moving blockchain to production and a wave of local start-ups driving marketing and sales activities.
According to IDC's Worldwide Semiannual Blockchain Spending Guide, 1H18, published in February 2019, blockchain spending in Europe will reach more than $800 million in 2019, with Western Europe accounting for 83% of spending and Central and Eastern Europe 17%. Total spending in Europe will reach $3.6 billion in 2022, with a 2018–2022 five-year compound annual growth rate (CAGR) of 73.2%.
"In terms of technologies, IT services, such as consulting, outsourcing, deployment and support, and education and training, will drive spending, accounting for nearly 63% of European spending in 2019, growing at a 2018–2022 CAGR of 76.6%," said Carla La Croce, senior research analyst, Customer Insights and Analysis, IDC. "This is because blockchain needs to step up and demonstrate its production-readiness, and businesses need to ensure they take a long-term strategic view of their overarching blockchain initiatives."
Blockchain emerged out of the financial sector, and the technology is now well established there, whether as a POC or a real deployment in production; banking is the leading industry in terms of blockchain spending in Europe. Finance will account for a third of total spending in 2019, with widespread uses ranging from trade finance and post-trade/transaction settlements to cross-border payments and settlements, as well as regulatory compliance. Insurance is expected to be the fastest-growing industry over the 2018–2022 forecast period, with a CAGR of 81.3%.
Blockchain is also growing in other industries, with ongoing experimentation bringing to light new use cases in areas such as manufacturing and resources (accounting for 19% of total spending) and other supply chain related industries such as retail, wholesale, and transport (accounting for nearly 15%).
"Interest in blockchain among supply chain industries is seen in the increasing number of use cases for tracking products, such as lot lineage provenance and asset/goods management, from food to luxury goods," said La Croce. "The aim is to reduce paperwork, make processes more efficient, prevent counterfeiting, and improve trust and transparency with trading partners and consequently with their customers as well."
"IDC also sees strong competition between cloud giants to host, manage, and service the emerging blockchain ecosystems, especially from IBM and Microsoft, along with Amazon, Oracle, Google, and SAP, with Alibaba and Huawei expected to play an increasing role in the East," said Mohamed Hefny, program manager, Systems and Infrastructure Solutions, IDC. "Building consortiums and recruiting the leading enterprises in various segments is becoming a race in blockchain now, and as a result we are witnessing a growing number of large pilot projects."
Security spending to reach $103 billion
Worldwide spending on security-related hardware, software, and services is forecast to reach $103.1 billion in 2019, an increase of 9.4% over 2018. This pace of growth is expected to continue for the next several years as industries invest heavily in security solutions to meet a wide range of threats and requirements. According to the Worldwide Semiannual Security Spending Guide from International Data Corporation (IDC), worldwide spending on security solutions will achieve a compound annual growth rate (CAGR) of 9.2% over the 2018-2022 forecast period and total $133.8 billion in 2022.
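The figures above follow the standard compound-annual-growth-rate relation, end = start × (1 + CAGR)^years. A quick check (treating the 2018 base, inferred from the 2019 figure and its 9.4% growth rate, as an assumption) lands within rounding distance of the reported $133.8 billion:

```python
# Sanity-check the reported security spending figures with the standard
# CAGR relation:  end_value = start_value * (1 + cagr) ** years
# The 2018 base is not stated in the report; it is inferred here from the
# 2019 figure and its 9.4% year-over-year growth.

spend_2019 = 103.1                 # $B, reported for 2019
growth_2019 = 0.094                # 9.4% increase over 2018
cagr = 0.092                       # 9.2% CAGR over 2018-2022
years = 4                          # 2018 -> 2022

spend_2018 = spend_2019 / (1 + growth_2019)        # implied 2018 base
spend_2022 = spend_2018 * (1 + cagr) ** years      # projected 2022 total

print(f"Implied 2018 spend:   ${spend_2018:.1f}B")
print(f"Projected 2022 spend: ${spend_2022:.1f}B (report: $133.8B)")
```

The small residual versus the published $133.8 billion is consistent with the report rounding its percentages to one decimal place.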
The three industries that will spend the most on security solutions in 2019 – banking, discrete manufacturing, and federal/central government – will invest more than $30 billion combined. Three other industries (process manufacturing, professional services, and telecommunications) will each see spending greater than $6.0 billion this year. The industries that will experience the fastest spending growth over the forecast period will be state/local government (11.9% CAGR), telecommunications (11.8% CAGR), and the resource industries (11.3% CAGR). This spending growth will make telecommunications the fourth largest industry for security spending in 2022 while state/local government will move into the sixth position ahead of professional services.
"When examining the largest and fastest growing segments for security, we see a mix of industries – such as banking and government – that are charged with guarding highly sensitive information in regulated environments. In addition, information-based organizations like professional services firms and telcos are ramping up spending. But regardless of industry, these technologies remain an investment priority in virtually all enterprises, as delivering a sense of security is everyone's business," said Jessica Goepfert, program vice president, Customer Insights and Analysis.
Managed security services will be the largest technology category in 2019 with firms spending more than $21 billion for around-the-clock monitoring and management of security operations centers. Managed security services will also be the largest category of spending for each of the top five industries this year. The second largest technology category in 2019 will be network security hardware, which includes unified threat management, firewalls, and intrusion detection and prevention technologies. The third and fourth largest investment categories will be integration services and endpoint security software. The technology categories that will see the fastest spending growth over the forecast will be managed security services (14.2% CAGR), security analytics, intelligence, response and orchestration software (10.6% CAGR), and network security software (9.3% CAGR).
"The security landscape is changing rapidly, and organizations continue to struggle to maintain their own in-house security solutions and staff. As a result, organizations are turning to managed security service providers (MSSPs) to deliver a wide span of security capabilities and consulting services, which include predicative threat intelligence and advanced detection and analysis expertise that are necessary to overcome the security challenges happening today as well as prepare organizations against future attacks," said Martha Vazquez, senior research analyst, Infrastructure Services.
From a geographic perspective, the United States will be the single largest market for security solutions with spending forecast to reach $44.7 billion in 2019. Two industries – discrete manufacturing and the federal government – will account for nearly 20% of the U.S. total. The second largest market will be China where security purchases by three industries – state/local government, telecommunications, and central government – will comprise 45% of the national total. Japan and the UK are the next two largest markets with security spending led by the consumer sector and the banking industry respectively.
"While the U.S. and Western Europe will deliver two-thirds of the total security spend this year, the largest growth in security spend will be seen in China, Asia/Pacific (excluding Japan and China), and Latin America, each with double-digit CAGRs over the five-year forecast period," said Karen Massey, research manager, Customer Insights & Analysis.
Large (500-1000 employees) and very large businesses (more than 1000 employees) will be responsible for roughly two thirds of all security-related spending in 2019. These two segments will also see the strongest spending growth over the forecast with CAGRs of 11.1% for large businesses and 9.4% for very large businesses. Medium (100-499 employees) and small businesses (10-99 employees) will spend nearly $26 billion combined on security solutions in 2019. Consumers are forecast to spend nearly $5.7 billion on security this year.
AI spending to grow to $35.8 billion
Worldwide spending on artificial intelligence (AI) systems is forecast to reach $35.8 billion in 2019, an increase of 44.0% over the amount spent in 2018. With industries investing aggressively in projects that utilize AI software capabilities, the International Data Corporation (IDC) Worldwide Semiannual Artificial Intelligence Systems Spending Guide expects spending on AI systems will more than double to $79.2 billion in 2022 with a compound annual growth rate (CAGR) of 38.0% over the 2018-2022 forecast period.
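The compound annual growth rates quoted throughout these forecasts follow the standard formula CAGR = (end/start)^(1/years) − 1. A minimal sketch in Python (the helper name is illustrative; note that IDC's quoted 38.0% CAGR is measured from the 2018 base over four years, so recomputing from the 2019 figure over three years yields a lower rate, and rounding in the published figures means recomputed rates are approximate):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that takes
    start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# IDC forecasts AI systems spending of $35.8B in 2019, rising to $79.2B in 2022.
print(f"Implied 2019-2022 CAGR: {cagr(35.8, 79.2, 3):.1%}")
```

The same formula underlies the security, managed services and cloud infrastructure growth rates cited elsewhere in this piece.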
Global spending on AI systems will be led by the retail industry where companies will invest $5.9 billion this year on solutions such as automated customer service agents and expert shopping advisors & product recommendations. Banking will be the second largest industry with $5.6 billion going toward AI-enabled solutions including automated threat intelligence & prevention systems and fraud analysis & investigation systems. Discrete manufacturing, healthcare providers, and process manufacturing will complete the top 5 industries for AI systems spending this year. The industries that will experience the fastest growth in AI systems spending over the 2018-2022 forecast are federal/central government (44.3% CAGR), personal and consumer services (43.3% CAGR), and education (42.9% CAGR).
"Significant worldwide artificial intelligence systems spend can now be seen within every industry as AI initiatives continue to optimize operations, transform the customer experience, and create new products and services," said Marianne Daquila, research manager, Customer Insights & Analysis at IDC. "This is evidenced by use cases, such as intelligent process automation, expert shopping advisors & product recommendations, and pharmaceutical research and discovery exceeding the average five-year compound annual growth of 38%. The continued advancement of AI-related technologies will drive double-digit year-over-year spend into the next decade."
The AI use cases that will see the most investment this year are automated customer service agents ($4.5 billion worldwide), sales process recommendation and automation ($2.7 billion), and automated threat intelligence and prevention systems ($2.7 billion). Five other use cases will see spending levels greater than $2 billion in 2019: automated preventative maintenance, diagnosis and treatment systems, fraud analysis and investigation, intelligent process automation, and program advisors and recommendation systems.
Software will be the largest area of AI systems spending in 2019 with nearly $13.5 billion going toward AI applications and AI software platforms. AI applications will be the fastest growing category of AI spending with a five-year CAGR of 47.3%. Hardware spending, dominated by servers, will be $12.7 billion this year as companies continue to build out the infrastructure necessary to support AI systems. Companies will also invest in IT services to help with the development and implementation of their AI systems and business services such as consulting and horizontal business process outsourcing related to these systems. By the end of the forecast, AI-related services spending will nearly equal hardware spending.
"IDC is seeing that spending on both AI software platforms and AI applications are continuing to trend upwards and the types and varieties of use cases are also expanding," said David Schubmehl, research director, Cognitive/Artificial Intelligence Systems at IDC. "While organizations see continuing challenges with staffing, data, and other issues deploying AI solutions, they are finding that they can help to significantly improve the bottom line of their enterprises by reducing costs, improving revenue, and providing better, faster access to information thereby improving decision making."
On a geographic basis, the United States will deliver nearly two thirds of all spending on AI systems in 2019, led by the retail and banking industries. Western Europe will be the second largest region in 2019, led by banking, retail, and discrete manufacturing. The strongest spending growth over the five-year forecast will be in Japan (58.9% CAGR) and Asia/Pacific (excluding Japan and China) (51.4% CAGR). China will also experience strong spending growth throughout the forecast (49.6% CAGR).
"AI is a big topic in Europe, it's here and it's set to stay. Both AI adoption and spending are picking up fast. European businesses are hands-on AI and have moved from an explorative phase to the implementation stage. AI is the game changer in a highly competitive environment, especially across customer-facing industries such as retail and finance, where AI has the power to push customer experience to the next level with virtual assistants, product recommendations, or visual searches. Many European retailers such as Sephora, ASOS, and Zara or banks such as NatWest and HSBC are already experiencing the benefits of AI, including increased store visits, higher revenues, reduced costs, and more pleasant and personalized customer journeys. Industry-specific use cases related to automation of processes are becoming mainstream and the focus is set to shift towards next-generation use of AI for personalization or predictive purposes," said Andrea Minonne, senior research analyst, IDC Customer Insight & Analysis in Europe.
Cloud IT infrastructure revenues fall below ‘traditional’ infrastructure revenues
According to the International Data Corporation (IDC) Worldwide Quarterly Cloud IT Infrastructure Tracker, vendor revenue from sales of IT infrastructure products (server, enterprise storage, and Ethernet switch) for cloud environments, including public and private cloud, grew 28.0% year over year in the fourth quarter of 2018 (4Q18), reaching $16.8 billion. For 2018, annual spending (vendor revenue plus channel mark-up) on public and private cloud IT infrastructure totaled $66.1 billion, slightly higher (1.3%) than forecast in Q3 2018. IDC also raised its forecast for total spending on cloud IT infrastructure in 2019 to $70.1 billion – up 4.5% from last quarter's forecast – with year-over-year growth of 6.0%.
Quarterly spending on public cloud IT infrastructure was down 6.9% in 4Q18 compared to the previous quarter, but it has still almost doubled in the past two years, reaching $11.9 billion in 4Q18 and growing 33.0% year over year; spending on private cloud infrastructure grew 19.6%, reaching $5.75 billion. Since 2013, when IDC started tracking IT infrastructure deployments in different environments, public cloud has represented the majority of spending on cloud IT infrastructure, and in 2018 – as IDC expected – this share peaked at 69.6%, with spending on public cloud infrastructure growing at an annual rate of 50.2%. Spending on private cloud grew 24.8% year over year in 2018.
In 4Q18, quarterly vendor revenues from IT infrastructure product sales into cloud environments fell and were once again lower than revenues from sales into traditional IT environments, accounting for 48.3% of total worldwide IT infrastructure vendor revenue – up from 42.4% a year ago but down from 50.9% last quarter. For the full year 2018, spending on cloud IT infrastructure remained just below the 50% mark at 48.4%. Spending on all three technology segments in cloud IT environments is forecast to deliver slower growth in 2019 than in previous years. Ethernet switches will be the fastest growing at 23.8%, while spending on storage platforms will grow 9.1%. Spending on compute platforms will stay at $35.0 billion, still slightly higher than expected in IDC's previous forecast.
The rate of annual growth for the traditional (non-cloud) IT infrastructure segment slowed to below 1% in 4Q18, although the segment grew 11.1% quarter over quarter. For the full year, worldwide spending on traditional non-cloud IT infrastructure grew by 12.2%, exactly as forecast, as the market went through a technology refresh cycle that will wind down in 2019. By 2023, IDC expects traditional non-cloud IT infrastructure to represent only 40.5% of total worldwide IT infrastructure spending (down from 51.6% in 2018). This share loss, and the growing share of cloud environments in overall IT infrastructure spending, is common across all regions.
"The unprecedented growth of the infrastructure systems market in 2018 was shared across both cloud and non-cloud segments," said Kuba Stolarski, research director, Infrastructure Systems, Platforms and Technologies at IDC. "As market participants prepare for a very difficult growth comparison in 2019, compounded by strong, cyclical, macroeconomic headwinds, cloud IT infrastructure will be the primary growth engine supporting overall market performance until the next cyclical refresh. With new on-premises public cloud stacks entering the picture, there is a distinct possibility of a significant surge in private cloud deployments over the next five years."
All regions grew their cloud IT infrastructure revenues by double digits in 4Q18. Revenue growth was the fastest in Canada at 67.2% year over year, with China growing at a rate of 54.4%. Other regions among the fastest growing in 4Q18 included Western Europe (39.7%), Latin America (37.9%), Japan (34.9%), Central & Eastern Europe and Middle East & Africa (30.9% and 30.2%, respectively), Asia/Pacific (excluding Japan) (APeJ) (28.5%), and the United States (15.5%).
Top Companies, Worldwide Cloud IT Infrastructure Vendor Revenue, Market Share, and Year-Over-Year Growth, Q4 2018 (revenues in US$ millions; columns: 4Q18 revenue, 4Q18 market share, 4Q17 revenue, 4Q17 market share, 4Q18/4Q17 revenue growth): 1. Dell Inc; 2. HPE/New H3C Group**. Source: IDC's Quarterly Cloud IT Infrastructure Tracker, Q4 2018.
The MSP market is in flux. It has recently been characterised by major change: considerable consolidation and merger and acquisition activity, alongside substantial new business formation, driven by the low cost of entry and by traditional VARs moving into more profitable areas.
Looking at its list of the top 500 MSPs, IT Europa says there has been massive movement, with companies moving up and down the rankings as they win or lose big deals, so there is inherent volatility in the sector. There are many new entrants: IT Europa has recorded several instances of new MSPs being born out of individuals and groups in customer IT departments, especially in financial services.
There has been a steady process of consolidation, with mergers and acquisitions reported weekly. The UK still has the largest number of MSPs in Europe, leading the German and Netherlands markets by some way. But other countries are showing signs of stronger engagement.
One of the issues is that of definition: estimates range from 18,000 to 25,000 managed services providers in Europe, and from 5,000 to 12,000 MSPs in the UK, depending on the definition used. Some provide very basic backup or just Office 365 to local SMBs; others are part of large IT partners and supply a range of services. There are few pure MSP businesses, and those that exist tend to focus on specialist areas.
According to multiple industry reports, the global managed services market is expected to reach $300bn by 2023, up from $180bn in 2018, a CAGR of 9.3% over the forecast period. But Gartner (referred to below) says the current rate of growth is much higher – around 35%. All the change, and the unreliable definitions, make it hard to get a handle on the true size of the market. What is clear is that demand for managed services is strong and resonating with customers of all types, sizes and industries.
One large MSP says: “This is no surprise when you consider the demanding technology environments many organisations have. Customers want an MSP they can rely on to have the capacity, technical expertise and vendor credentials to help manage their IT estate. However, customers have incredibly varied requirements, so we’re not going to see a mass consolidation of the market with only a few players remaining. There will of course continue to be big players, but the smaller, more specialised MSPs are still incredibly important.” In terms of business, a particular trend for 2019 has been for MSPs to analyse their business more closely and turn away high cost and unprofitable business.
Research from the 2112 Group in 2019, covering EMEA, shows:
Overall, market demand remains strong: Gartner says enterprise software sales will rise 8% in 2019 and by a similar amount in 2020. The limiting factor, however, is the resources to implement these changes, and this is where managed services play their part.
The agenda for the 2019 European Managed Services and Hosting Summit, in Amsterdam on 23 May, aims to reflect these new pressures and build the skills of the managed services industry in addressing the wider issues of engagement with customers at a strategic level. Experts from all parts of the industry, plus thought leaders with ideas from other businesses and organisations, will share experiences and help identify the trends in a rapidly changing market.
Gartner’s VP of Research Mark Paine will deliver a vital keynote at the MSH Summit in Amsterdam entitled “Working with customers and their chaotic buying processes” – a view on how the changed customer buying process has become hard to monitor, hard to follow, and can be abruptly foreshortened.
Introducing a new topic for many MSPs will be Igor Pejic from Europe’s largest bank BNP Paribas. He has authored an excellent book on blockchain entitled “Blockchain Babel” and his presentation will provide a solid grounding for those who need to understand the implications of how this will affect all industries and supply chains in the very near future. Those attending will also receive a free copy of his book.
Finally, the return of a favourite speaker from last year: Jonathan Simnett from technology M&A experts Hampleton Partners, who will reveal the latest thinking on how to build value in an MSP business to attract the best price.
The shortlist has been confirmed and online voting for the 2019 DCS Awards has just opened. Make sure you don’t miss out on the opportunity to express your opinion on the companies, products and individuals that you believe deserve recognition as being the best in their field.
Following assessment and validation by the panel at Angel Business Communications, the shortlist for the 24 categories in this year’s DCS Awards has been put forward for online voting by the readership of the Digitalisation World portfolio of titles. The Data Centre Solutions (DCS) Awards reward products, projects and solutions, as well as honouring companies, teams and individuals operating in the data centre arena.
DCS Awards 2019 are delighted to be supported by our sponsors, including:
Uninterruptible Power Supplies Ltd (UPSL), a subsidiary of Kohler Co and the exclusive supplier of PowerWAVE UPS, generator and emergency lighting products, changed its name to Kohler Uninterruptible Power (KUP), effective March 4th, 2019. The name change is designed to ensure the company’s name reflects the true breadth of its current offer – UPS systems, generators, emergency lighting inverters, and class-leading 24/7 service – as well as highlighting its membership of Kohler Co. This is especially timely, as next year Kohler will celebrate 100 years of supplying products for power generation and protection.
Established nearly 90 years ago, Universal Electric Corporation (UEC), the manufacturer of Starline, has grown to become a global leader in power distribution equipment. Originally founded in Pittsburgh, PA, USA as an electrical contracting firm, the company began manufacturing in the mid-1950s.
CBRE Data Centre Solutions (DCS) is the leading provider of full-spectrum life cycle services to data centre owners, occupiers, and investors, including consulting services, advisory and transaction services, project management, and integrated data centre operations.
NaviSite powers business innovation of the enterprise with its comprehensive portfolio of multi-cloud managed services, which spans infrastructure, applications, data, and security. For more than two decades, enterprise and mid-market clients have relied on NaviSite to unlock efficiencies and improve execution capabilities, leveraging a client-focused delivery model that couples deep technical expertise with state-of-the-art global platform and data centres.
Founded in 1986, Riello Elettronica is part of the wider Riello Industries group. Originally a manufacturer of switching power supplies for IT, the Group expanded into uninterruptible power supplies.
The winners of this year’s awards will be announced at a gala ceremony taking place at London’s Grange St Paul’s Hotel on 16 May.
All voting takes place online and voting rules apply. Make sure you place your votes before voting closes on 3 May by visiting: https://www.dcsawards.com/vote
The full 2019 shortlist is below:
Data Centre Energy Efficiency Project of the Year
|Aqua Group with 4D (Gatwick Facility)||Digiplex with Stockholm Exergi|
|EcoDataCentre with Falu Energi & Vatten||Iron Mountain Green Power Pass|
|Six Degrees Energy Efficiency||Techbuyer with WindCORES|
New Design/Build Data Centre Project of the Year
|Cyrus One – Frankfurt II||IP House supported by Comtec Power|
|Interxion supporting Colt Technology Services||Power Control supporting CoolDC|
|Siemon – with iColo||Turkcell Izmir Data Centre|
Data Centre Consolidation/Upgrade/Refresh Project of the Year
|Alinma Database Migration||Efficiency IT supporting Wellcome Sanger Institute|
|Huawei supporting NLDC, Oude Meer||IP House supported by Comtec Power|
|PPS Power supporting The Sharp Project, Manchester City Council||Six Degrees Birmingham South Facility|
|SMS Engineering supporting Regional Council of Puglia Region, Italy||Sudlows supporting Science & Technology Facilities Council|
|Techbuyer supporting University of Cambridge|
Cloud Project of the Year
|Cristie Data with Clifton College||N2W Software for AWS|
|Pulse Secure with Atlassian||Surecloud with Equiom Group|
|Timico with The Royal Society of Chemistry (RSC)||VMware CloudHealth for Adstream|
|Zadara with Brandworkz|
Managed Services Project of the Year
|Altaro with Chorus||Cristie Data with Hazlewoods|
|Pulse Secure with Healthwise||Navisite with Ed Broking|
|Timico with Youngs Pubs|
GDPR compliance Project of the Year
|Digitronic with HQM Induserv GmbH||GDPR Awareness Coalition supporting Irish SMEs|
|Navisite with Ed Broking||Surecloud with Everton FC|
Data Centre Facilities Innovation Awards
Data Centre Power Innovation of the Year
|E1E10 - Hotboxx-i||Digiplex - Waste Heat to Warm Homes solution|
|APC by Schneider - Smart-UPS||Huawei - FusionPower Solution|
|Master Power Technologies - Universal Controller|
Data Centre PDU Innovation of the Year
|Raritan - Residual Current Monitoring modules||Servertech - HDOT Cx PDU|
|Starline - Cabinet Busway|
Data Centre Cooling Innovation of the Year
|Custodian – AHU system Solution||Digiplex – Concert Control|
|Mitsubishi - TRCS-EFC-Z||SMS Engineering – Cooling Containment Solution|
|Transtherm and 2bm – Budget-friendly, Compressor-less Cooling Solution||Vertiv - Knurr DCD Cooling Door|
Data Centre Intelligent Automation and Management Innovation of the Year
|Nlyte Software – Dedicated Machine Learning Solution||Opengear - IM7216|
|Schneider Electric - EcoStruxure IT Solutions||Siemon - Datacenter Clarity|
Data Centre Physical Connectivity Innovation of the Year
|Corning - RocketRibbon||Infinera & Telia - Autonomous Intelligent Transponder (AIT) Solution|
|Schneider Electric - HyperPod||Wave2Wave - ROME 64Q and 128Q robotic optical switches|
|Zyxel - USG110 Unified Security Gateway|
Data Centre ICT Innovation Awards
Data Centre ICT Storage Innovation of the Year
|Archive 360 - Archive2Azure||DataCore and Waterstons - SANsymphony|
|Rausch - Sasquatch SDI Appliance||SUSE - Linux Enterprise Server|
|Tarmin -GridBank Data Management Platform|
Data Centre ICT Security Innovation of the Year
|Chatsworth Products - eConnect Electronic Access Control||Frontier Pitts - Secured by Design|
|RDS Tool - RDS-Knight|
Data Centre ICT Management Innovation of the Year
|Ipswitch - WhatsUp Gold 2018||Schneider Electric - EcoStruxure IT|
|Tarmin - GridBank Data Management Platform|
Data Centre ICT Networking Innovation of the Year
|Bridgeworks - WAN Data Acceleration Solutions||Silver Peak - Unity EdgeConnect SD-WAN Edge Platform|
|Wave2Wave - Robotic Optical Management Engine Solution|
Data Centre ICT Automation Innovation of the Year
|Morpheus Data - Unified Automation Framework Solution||Wave2Wave - Robotic Optical Management Engine Solution|
Open Source Innovation of the Year
|Arista Networks - Arista 7360X Series||Juniper Networks - Native Integration with SONiC|
|OVH - Managed Kubernetes Service||SUSE - Manager for Retail|
Data Centre Managed Services Innovation of the Year
|ra Information Systems||Scale Computing with Corbel|
Data Centre Hosting/co-location Supplier of the Year
|ARK Data Centres||Green Mountain|
|Volta Data Centres|
Data Centre Cloud Vendor of the Year
Data Centre Facilities Vendor of the Year
Excellence in Data Centre Services Award
|Iron Mountain||Park Place Technologies|
Data Centre Manager of the Year
|Ole Sten Volland - Green Mountain||Amit Anand - NECTI|
|Sunday Opadijo - Rack Centre||Simon Binley - Wellcome Sanger Institute|
Data Centre Engineer of the Year
|Abdullah Saleh Alharbi - Saudi Aramco||Sam Wicks - Sudlows|
|Sinan Alkas - Turkcell||Turgay Parlak - Turkcell|
As technology has become central, and essential, to everyday lives, trends within technology, like Artificial Intelligence and deep learning, have exploded to keep up with consumer demand. Due to these advances, the lines between the digital and physical worlds are more blurred than ever before and, as a result, context must be considered when interpreting data to improve customer experience.
By Brenden Rawle, Director of Interconnection for EMEA at global interconnection and data centre company Equinix.
Indeed, one of the trademarks of recent technology is the way it can respond based on how people use it. Take, for example, Google’s Waze, a GPS app which provides turn-by-turn navigation information, user-submitted travel times and route details, as well as having the ability to track the arrival status of friends and relatives travelling to the same destination – all in real time.
This type of responsiveness is known as ‘contextual awareness’, where a computing device uses user-specific inputs, such as GPS data and sensors, to determine the user’s situation and surroundings. In this way, it seems as if the device ‘knows’ where the user is and what he or she needs. The context-aware computing market is only expected to grow, with a predicted value of over $125 billion by 2023, driven by rising demand for more personalised user experiences. So how can businesses capitalise on this technology?
Making data human
In the past few years, advances in artificial intelligence have captured the public imagination and led to widespread acceptance of AI-infused technology. However, what has become clear is that for businesses to maximise the potential benefits offered by an AI-human relationship, they must design or deploy computers able to interact with humans in a more natural manner. The key to this lies in helping machines grasp one important component: context. And, just as contextual awareness is changing how we as humans personally use and interact with technology, AI and contextual awareness look set to fundamentally change how businesses use it too.
Changing the ‘what’ to ‘why’
Artificial intelligence is not new to businesses. Organisations already use data to analyse and answer important questions, often allowing them insight into ‘what is happening,’ and ‘what will transpire in the future.’ But, without contextual awareness, what is often missing is the key ingredient – ‘why.’ By utilising a contextually-aware system, businesses will be able to identify and anticipate changing circumstances, enabling them to react in real time with the correct response.
As data sets continue to grow exponentially, contextual awareness is a way to make AI solutions faster, more accurate, and better tailored to customer needs. It also enables businesses to receive useful data from customers, which can support and inform decision making around business strategy and new services or products. For example, in retail, it’s about having background knowledge of a customer as they come through the door – name, preferences and interests, buying history, who they are connected to, where they live, and more. That kind of intelligence makes it easy to offer a more tailored shopping experience and may ultimately increase the chances of a sale. The better informed a company is, the more opportunities it has to up-sell.
Tackling privacy concerns
Of course, there is always a catch. While context aware computing can be advantageous, it can also be intrusive. As ever, companies using such techniques need to be aware of data privacy. Much of the news around data breaches focuses on compromised personal information, such as names, addresses, credit cards or login credentials. As connected devices become more deeply woven into the fabric of our daily personal and business lives, new risks will continue to emerge.
Indeed, as vast amounts of data amass in one place, and are increasingly stored in the cloud, organisations need to strengthen their data security strategies and invest in software that proactively prevents data breaches. This is where Equinix comes in: by enabling businesses to circumvent the public internet via an interconnection-first approach, data can be kept both protected and productive.
Upscaling and protecting your business’ technology
To ensure your business plays by the data security rules and can truly reach the next stage of digitisation – turning information into insights – several key technological capabilities are required to meet the demands of contextual computing.
At Equinix, we are constantly reviewing how we can best support companies as they harness the power of data. We believe the ability to directly and securely interconnect with other companies in your supply chain, and therefore privately exchange data, is the way forward for all businesses hoping to fully embrace the benefits of AI and contextual awareness.
As AI becomes more integrated – requiring real-time interactions between people, objects, locations, clouds and data – data usage will increase exponentially. Traditional IT architectures, fixed and siloed, within the confines of a corporate data centre, are not built to handle fleets of connected devices, all generating vast amounts of data concurrently. In order to utilise and analyse the data generated, organisations will have to ensure their cloud capabilities can handle both the data and security risks.
Placing cloud security controls at the edge enables greater performance, visibility and agility for organisations looking to keep their customers’ data secure. A global, vendor-neutral colocation and interconnection platform, such as Equinix, allows you to deploy robust security control points along the perimeter of your digital business for the highest level of data security, protection and compliance.
Tech is king
Technology is considered the solution to almost everything in today’s world. Its promise has always been about making things better – whether that means transforming businesses for success, improving communities and lives, protecting the environment, or simply making things more convenient and efficient.
In today’s world, data traffic continues to proliferate, and as businesses prepare for the next phase of business intelligence, it is becoming ever clearer that they must be smarter in how they approach data to unlock its full value. In an age of digitisation, the biggest threat to companies comes from not being able to move quickly enough to respond to fast-changing market conditions and customer requirements. Integrating contextual awareness and AI into their business models might just be their ticket to success.
Source: Global Market Insights, https://www.gminsights.com/industry-analysis/context-aware-computing-cac-market
Remote monitoring and management solutions ensure maximum uptime.
“Since the formation of CMI, we have used Kaseya as the basis of our remote monitoring and management (RMM) capability. Knowing Kaseya is the market leading supplier of RMM solutions made it our first choice and from here our relationship has grown to incorporate the full range of Kaseya IT Complete products over almost 10 years. During this time, we have seen operational efficiencies improve by more than 20 per cent and we now manage more than 5,000 endpoints,” says Ken Roulston, managing director at CMI.
Founded in 2009, CMI is one of the UK’s leading IT service providers, delivering a wide range of services including infrastructure design, implementation and support. The company creates, manages, implements and supports IT systems which in turn keeps organisations effective and secure. As Roulston explains, “We have seen continued and significant growth during this relationship and Kaseya has supplied us with the products needed to ensure our customers experience the maximum uptime possible from their investment in IT infrastructure.”
The need for a fully integrated RMM system
CMI was formed after the acquisition of two firms in 2009 that didn’t have RMM systems in place. “An RMM system was a key requirement for our business and after evaluating the marketplace, we discovered that Kaseya was the market leader in this area. The VSA endpoint monitoring and management system provided everything we needed at that time, allowing us to monitor and manage devices remotely,” he says. VSA provides CMI with the unified management, comprehensive visibility, and scalable automation needed to install, deploy, and update all software. CMI used VSA in isolation for a number of years; however, it then realised that Kaseya offered a number of other solutions that could be integrated and customised to develop its VITA offering.
Kaseya IT Complete stack integration
“We have seen significant growth since 2009, taking our turnover from less than £2 million to more than £5 million and expanding our employee base from fewer than 20 staff to about 50 employees. As our company has grown, we have increased our adoption of Kaseya’s products and been able to expand our own offering to customers,” explains Roulston. CMI now uses the full stack of Kaseya IT Complete products, which includes Kaseya Unified Backup and Cloud Backup solutions that allow the backup of both servers and desktop devices to an onsite appliance and off-site location, respectively. Next came the adoption of Kaseya Antivirus, which simplifies the ongoing maintenance and security of CMI’s endpoints and network, saving time and IT resources. AuthAnvil was also added to provide easy and secure single sign-on (SSO) access to any application, protecting the company’s data by ensuring that only authorised people are given access to sensitive applications and information. Additional products integrated throughout the relationship have included NOC services for 24/7/365 support, IT Glue for real-time, seamless documentation, Kaseya Antimalware and Enterprise Mobility Management (EMM). IT Glue allows CMI to put client information at the technician’s fingertips, leading to better technician load balancing, while NOC Services have enabled CMI to focus on high-value services and strategic growth, rather than mundane, repetitive tasks.
“We much prefer having an integrated solution to a combination of disparate product offerings that are difficult to manage. The ease with which Kaseya’s products can be integrated and managed has resulted in us building additional components into our offering as Kaseya brings them out. This has allowed us to benefit from learning capabilities and to offer new proactive services that lead to greater monthly recurring revenues,” adds Roulston.
The partnership has seen Kaseya provide both technical and sales support to CMI and the use of Kaseya products has resulted in a reduction in time spent on fixing issues, thereby reducing the need for engineers onsite as more can be done remotely. “In recent years, Kaseya has been very supportive in wanting us to succeed. Kaseya accepts that its success is tied to our success and is very proactive in helping us to deploy new solutions as well as support existing ones,” explains Roulston. “The quality of Kaseya’s products has continually improved, and we find the team at Kaseya very active in asking for our input about what we want to see in products. They do their best to incorporate these suggestions into new releases. Kaseya works with us to ensure we maximise the leverage we can get from new products we bring to the table.” Roulston adds that one of the benefits of the long-term relationship has been the access to senior management. “This access to senior decision-makers within Kaseya has strengthened our relationship. I find them to be very open as a partner. The major benefit to us is that they understand our business and are always willing to help us.”
Looking to the future
In the future Roulston says CMI will be looking to expand its use of Kaseya products even further. He explains: “We have truly embraced Kaseya’s full suite of offerings and are deploying them as they mature and are ready to be integrated. This is a move driven by their ease of integration of the products and the improved productivity that arises as a result. We are continually looking at all of Kaseya’s products and it is our intention to use as many of them as possible because of the benefit of integration.”
Threat information sharing is recognised as an important and evolving topic within cyber security. The need for organisations to collaborate on the protection of IT systems is, in part, driven by the highly collaborative and diverse ecosystem of threat actors, with an ever-greater overlap of tools, techniques and teams targeting the public and private sector. Where organisations effectively share experiences and insights that may be unique to them, broader communities can benefit at scale – a rising tide lifts all boats.
By Bernard Parsons, CEO of Becrypt.
Much has been done to improve the sharing of threat intelligence, both nationally through the National Cyber Security Centre’s (NCSC) Cyber Security Information Sharing Partnership (CiSP), as well as within specific communities of interest. However, it continues to be recognised that more needs to be done, as reflected by initiatives such as the Financial Sector Cyber Collaboration Centre, announced by UK Finance in 2018. Calls continue for Government, or specifically the NCSC, to share more advanced threat intelligence given their unique visibility of the evolving threat landscape.
However, balancing the risks associated with information disclosure, relating to both vulnerabilities and evolving adversary capabilities, will always create a practical limit to both the speed and extent to which this can be done.
Evolving cyber defence
There is, however, another area in which the NCSC possesses unique capabilities that are both valuable to the industry and easier to share, but which has to date been demanded far less. Threat intelligence sharing is primarily about detection and response; however, in its role as the National Technical Authority, much of the NCSC’s guidance, as delivered to Government, is focused initially on defence.
After all, architecting systems that are well protected and minimise the likelihood of compromise is the first step to a successful detection and response strategy.
In the pre-NCSC era, very little of the architectural advice for the Government’s classified networks would have been relevant to the needs of many in the private sector. Government systems were typically bespoke, expensive and difficult to use, with all system requirements subservient to security – an approach which, ironically, often undermined security.
In recent years, the Government has evolved to make better use of modern technology and meet the expectations of a modern workforce. As a result, many of the newer Government systems, even those that operate at higher levels of classification, now leverage commercial technology. This offers the levels of functionality, flexibility and usability that private sector employees would be familiar with, whilst still achieving the levels of security required for sensitive Government systems.
However, as far as information sharing is concerned, relatively little has so far been done, in terms of more broadly communicating the innovations and experiences gained within government in recent years.
Moving towards informed risk management
Cyber-related IT transformation within Government has been achieved by significant advances at product and architectural level. This has been driven by both the NCSC’s world-leading expertise and the shift within exemplar Government departments towards informed and effective risk management.
The resulting ‘defence in depth’ architectures allow departments to proportionately manage the risks that are important to them. This can be achieved by employing products that provide a high degree of assurance against well-articulated security claims – claims that can be independently validated. High Assurance products deployed within appropriate architectures allow risk to be quantified in a way that is difficult to achieve in systems that are primarily reliant on probabilistic defences – be that signature or other forms of anomaly detection.
Such technologies may be necessary but are not sufficient for achieving well-quantified and well-managed technical risk in today’s diverse and evolving environments. These environments encompass cloud, mobile, big data, IoT and the myriad of technology trends that even the most security-conscious organisations need to adopt at pace.
Driving demand for better cyber
The broader sharing of relevant guidance by the NCSC certainly shows signs of growing, with recent examples including published architectures for secure data import and publicised work on secure mobility.
However, the pace and extent of sharing does need to be driven by demand from the private sector. Arguably today, the market is far from optimised to drive demand for better cyber security technology and services. One absent market lever is the necessary assurance schemes and standards that can appropriately define what good looks like, and how technical risk can be better quantified and managed.
Existing schemes are not yet sufficiently mature to cope with the scale, agility and innovation required. Instead, many organisations are reliant on the more subjective opinions of sources such as industry analysts, who may be more subject to marketing budgets than an informed and detailed analysis of a new product or service capability.
Encouragingly, the Government does have a current focus on innovating in the product and service assurance spaces, with active initiatives within the NCSC and through the Cyber Growth Partnership (CGP). The CGP in particular is keen to reach out to broader stakeholder communities, encouraging the private sector to play a greater role in such initiatives.
It is doing so by using its unique perspective to help inform and improve the common standards for assessing technology and best practice, particularly for the sometimes under-valued topic of cyber defence. If successful, the UK would be well positioned within the domain of cyber to establish a rising tide that does indeed lift all boats.
Digital transformation is no longer optional for businesses. These projects drive innovation and are crucial in order to keep up with competitors and ensure that organisations continue to win market share.
By Mike Walton, CEO and Founder at Opsview.
Transitioning into a digitally empowered business is therefore at the top of many C-Suites’ priority lists and, according to a 2018 IDC report, 89% of enterprises have plans to or have already adopted a “digital-first business strategy”. It therefore should come as no shock that the global digital transformation market is growing at a CAGR of over 18% and according to MarketWatch is estimated to exceed $462 billion by 2024.
While the role of the CIO has rapidly evolved to that of business enabler and a key player in the boardroom, CIOs are also under more pressure than ever before for their teams to support business demands for continuous digital innovation. However, ROI is difficult to achieve due to the complex nature of these projects. Half of the US executives surveyed in one 2017 poll said their company isn’t successfully executing against 50% of their digital transformation strategies. One in five said they secretly believe it’s a waste of time. Numerous other sources support this: a report from analyst firm IDC warns that while spending is increasing, firms aren’t getting the results they crave. Over half (59%) of organisations questioned for the report were described as being stuck in the early stages of DX maturity, what IDC calls a “digital impasse”.
Visibility is the driving force behind successful digital transformation projects
Legacy IT and complexity is one of the major barriers to successful digital transformation projects. Behind a new breed of innovative customer and employee-facing digital services lies a hotchpotch of disparate and decentralised systems – virtual machines, hybrid cloud accounts, IoT endpoints, physical and virtual networks and much more. These disparate, decentralised systems don’t talk to each other, and they frequently fail. To make things worse, many of these systems are outside of the control of IT, and new investments designed to solve the problem of legacy IT can create further complexity.
The only way to mitigate the risk of failure is for businesses to ensure that they have a practical strategy in place that improves visibility and control. It is easy for IT leaders to get caught up in the dizzying hype and ‘promised land’ of AI in helping drive successful digital transformation. The reality is that we are years away from the day-to-day deployment of AI within an organisation, and companies need to have a clear vision of what it means in terms of people, process and technology. While big data, analytics, cloud, IoT and machine learning undoubtedly play a key role in achieving digital transformation and enabling companies to achieve new levels of enterprise productivity, the transformation really all starts with the network – the very framework that connects and supports every other element. The network of the future needs to be faster, more resilient and capable of supporting the new breed of digital services and work streams from multiple vendors that businesses have rushed to implement under the umbrella of digital transformation.
Without solid foundations in place, businesses’ digital transformation projects and growth plans are destined to fail. Organisations must centralise IT monitoring and destroy data islands if they want to have a holistic view of what is going on across the network and ensure digital transformation projects are successfully supported. In many ways, IT operations, including monitoring, has been left behind by digital change, as many organisations still view it as an afterthought: major investments in new apps and services are not matched proactively by improvements in performance monitoring. Part of this is down to perceptions of IT operations and monitoring as a cost centre rather than a value driver. However, you can’t monitor what you can’t see, and businesses need to make sure that they have one single overall view of the entire IT estate. Research from Enterprise Management Associates revealed that a vast number of organisations still have ten or more monitoring tools and that it can take businesses between three and six hours to pinpoint the source of any IT performance issue, which is clearly unsustainable. By monitoring all aspects of applications and systems from a single pane of glass, organisations will have the full picture of system health, availability and capacity in near real-time, and be able to drive business value from digital transformation operations.
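The single-pane-of-glass approach described above can be sketched in a few lines of Python. All names here are illustrative assumptions rather than any monitoring product’s API: each team registers its own health checks, and one consolidated report replaces a scatter of separate tools.

```python
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class Check:
    name: str
    probe: Callable[[], bool]   # returns True when the component is healthy

def single_pane(checks):
    """Run every registered check and return one consolidated status map."""
    now = time.time()
    return {c.name: {"healthy": c.probe(), "checked_at": now} for c in checks}

# Hypothetical checks contributed by different teams.
checks = [
    Check("web-frontend", lambda: True),
    Check("payments-db", lambda: False),   # a failing dependency surfaces immediately
    Check("iot-gateway", lambda: True),
]

report = single_pane(checks)
print(sorted(name for name, s in report.items() if not s["healthy"]))  # ['payments-db']
```

The point of the sketch is simply that once every check reports into one structure, pinpointing the failing component is a single query rather than a three-to-six-hour hunt across tools.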
Furthermore, without visibility into the performance of applications and systems, early-stage problems can be missed and end up snowballing into major incidents, such as IT outages. These incidents have a huge financial impact: every minute a business suffers an IT outage it haemorrhages money – about $5,600 per minute according to Gartner’s research, although the analyst admits that at the top end the figure can reach $540,000 per hour.
Breaking down institutional silos
Siloed IT teams do not make a CIO’s role any easier. The lack of communication between departments prevents the shared insight needed to move to a more proactive monitoring practice. This in turn creates an inherited culture of disastrous tool sprawl, with each team purchasing similar tools from different vendors to suit its own needs without first consulting the others. In this sprawled environment, it is impossible for IT leaders to gain clear visibility over the entire estate. By monitoring all aspects of applications and systems from a single pane of glass, the CIO can ensure that the business has the full picture of system health, availability and capacity in near real-time, encouraging the breakdown of institutional silos and ensuring that IT is able to drive business value from digital transformation operations.
Visibility into the network and across the entire IT estate really is power, and IT leaders need the right processes in place to provide them with key insights into dynamic cloud and virtual environments as well as the traditional static, on-premises world. Additionally, this insight needs to be consolidated via a single monitoring platform so that a single version of the truth can enable IT leaders to detect bottlenecks, see how the IT infrastructure reacts to specific changes and spot the early warning signs of any problems which could impact performance. It’s the only way to minimise disruption, drive value from IT operations and ensure that a digital transformation project positively impacts businesses.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 1.
Right now, it’s fair to say hackers are winning the war, according to Omar Yaacoubi, CEO and founder of cybersecurity start-up, Barac.
‘According to Cybersecurity Ventures, cyberattacks are the fastest growing global crime, and they are increasing in both size and sophistication. It estimates that, by 2021, cybercrime will cost the world a staggering $6 trillion (approximately £4.5 trillion) each year – more than the economic impact of all the world’s annual natural disasters combined.
‘Much still needs to be done to put organisations on an equal footing with hackers. While the pursuit of profit has meant hackers have been quick to invest in tools that automate and weaponise their attacks, cost-conscious organisations haven’t upgraded their cyber defences with anything like the same sense of urgency. In many cases, they are still relying on outdated tools that aren’t up to the task in hand.
‘One prime example is anti-malware solutions. Nearly every organisation relies on these tools to check the traffic traversing its networks and ensure it is free from threats and suspicious activity. Yet the majority of these solutions find it virtually impossible to scan encrypted traffic. Indeed, they must decrypt the traffic before they can inspect it for malware or other threats. This process is not only compute-intensive and difficult to scale, it creates an absolute nightmare for compliance officers, as there will be a moment in time when confidential and sensitive data is in plaintext.
‘To provide some context around the scale of this challenge, Google estimates that, this year, some 80 percent of internet traffic will be encrypted. Hackers can smell blood. They know that popular tools aren’t equipped to deal with these volumes of traffic and that organisations are vulnerable. That is perhaps one of the reasons why PwC believes that 60 percent of all of this year’s malware will be hidden inside encrypted traffic streams.
‘As bleak as this situation sounds, it’s worth noting that the capabilities of IT security solutions are definitely improving. Recognising the threat posed to their critical infrastructures, governments and vendors across the world are also putting their backing behind some heavyweight cyber initiatives. In particular, we’re seeing vast steps forward in the capabilities of Artificial Intelligence and Machine Learning tools, which are able to identify and block more threats, in less time than ever before. These include techniques to stop threats on encrypted traffic in real time, without the need for decryption and all the difficulties it brings.
‘Things are definitely moving in the right direction, but there’s much work to be done - and investment to be made - before hackers will have their advantage wiped out.’
What is IT security worth?
asks Austen Clark, Managing Director of Clark Integrated Technologies.
The digital age brings huge opportunities but also risks. Every day organisations face cyber-attacks, with attempts to steal information and money, or disrupt business.
Attackers have the patience to acquire multiple footholds to launch an attack at the proper time. They’re more motivated and sophisticated than ever.
Almost every day the mass media has another story of the latest giant cyber breach, with an explosion of private data being released into cyberspace. We scrutinise the organisation that has come under the spotlight and it sinks in that it remains a real and persistent risk for every boardroom to address.
There’s nothing to indicate that cyber breaches will disappear. No area has escaped; the threat morphs and grows daily as new forms of impact are discovered, knowing no boundaries and often carrying the potential to create a global threat.
But how much should be spent on investing in cyber protection and IT?
Asking this is like asking how long is a piece of string. There is no right or wrong response, but perhaps it would be better to pose the question, “How much value do you place on your reputation?”
We read news of a breach at an organisation, and the stigma of being insecure stays with the victim; confidence and integrity are damaged, and that can be more damaging than any initial financial loss.
Technology has a shelf life; it needs constant updates and maintenance, and failing to do so results in vulnerabilities and exposure to hacks, malware infections or ransomware attacks, to mention a few.
Almost all software has a lifecycle. Software engineers are tasked to manage and maintain this during its lifespan. When it reaches End Of Life, engineers move on to the ‘new’ software leaving the older unsupported software vulnerable to future attacks.
The growth in applications and data being migrated to cloud enabled services means that cloud security and protection is big news.
Cloud services, by design, are accepted as a reliable means of distributing technology. Cloud Access Security Broker (CASB) services manage and secure applications in the cloud.
There are opportunities to introduce security and additional protection to cloud-based services securing online applications through backup and encryption, along with Multi Factor Authentication, which adds further layers to security.
Bear in mind that there are many actions that cost nothing.
Like changing your password, locking your phone, and educating yourself and others. Other steps have minimal cost, like installing an anti-malware and antivirus package, or ensuring your router and firewall are secure and up to date.
It’s understandable that people have concerns about information security today, says Darren Hockley, MD of eLearning provider DeltaNet International.
The amount of data we produce and process is growing at a mindboggling rate (2.5 quintillion bytes per day!). Every time we use smart devices to search for answers, make payments, access social media, order food, book a taxi, send messages, check the weather, etc., etc., we are adding to the immense amounts of data captured by the Internet of Things (IoT). In fact, there’s a name for this type and volume of analysable data – we call it ‘Big Data’.
It’s easy to assume that with more data comes more risk of a breach, but this isn’t strictly true. That’s because the same advanced technologies that have made data into such big business (powering emerging industries like Fintech, for example) are also extremely effective at strengthening information security. Advances in data science have made it possible to predict normal vs abnormal patterns at scale. This means that digital information such as log-in information and times, authentications, authorisations, access permissions, network activity, malware attacks, updates and more can all be analysed in near real time and used to identify and mitigate the risk and probability of a security breach.
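As a toy illustration of the “normal vs abnormal patterns” idea (the function name and numbers are invented for the example), even a simple z-score against a historical baseline flags the kind of outlier a security platform would surface in near real time:

```python
import statistics

def flag_anomalies(baseline, new_values, threshold=3.0):
    """Return values whose z-score against the historical baseline exceeds threshold."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return [v for v in new_values if abs(v - mean) / stdev > threshold]

# Baseline: typical daily log-in counts for an account.
baseline = [42, 39, 45, 41, 38, 44, 40, 43]

# A sudden burst of log-ins stands out; a normal day does not.
print(flag_anomalies(baseline, [41, 300]))  # [300]
```

Production systems apply far richer models across authentications, permissions and network activity, but the principle is the same: learn what normal looks like, then flag deviations.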
Whilst interesting and certainly encouraging to think about data-led intelligence as a security tool, it’s key that humans don’t become complacent as our machines grow ever smarter. The human element in IT security is still as important as ever – and not just for data analysis and computer programming but also in terms of behavioural best practice. Awareness training is a must when it comes to battling the sort of errors in judgement that most often lead to security breaches and hacking, e.g. downloading a document from an unknown email source, clicking on a malicious link, or using weak passwords. Even though we’re all well versed in the dangers of these things, without continuous awareness training to keep threats fresh in our mind, it’s all too easy to fall into the trap of feeling protected by advanced technology. The irony is, most cyber-attacks aren’t sophisticated at all – they’re opportunistic.
Melanie Jones, a Product Director for cybersecurity portfolios at Global Knowledge believes that there is a definite increase in security awareness - nothing like a few data breaches to focus the mind – however, this is not necessarily translating into a vast investment in network security.
“There are a number of reasons for that, the most notable being the shortfall in cybersecurity skills and cost. The shortfall in cybersecurity skills is predicted to grow year on year and will continue to do so unless we invest heavily in recruiting new personnel. Unfortunately, that does not appear to be the issue in the black hat world, where hackers appear to be growing in number as they realise the financial benefits of a successful breach.
Data breaches are inevitable but the challenge is how you mitigate that risk and handle the breach, as time is always of the essence. Organisations need to invest now to ensure they stay safe for the foreseeable future. Failure to invest now could prove costly in the long-run.”
ICU IT Director, Simon Lewsley, thinks that IT security is still a game of cat and mouse for businesses, and that’s unlikely to change.
If you strip away the people who break IT security for financial gain and focus on those who search for holes in security for the challenge of it, then while the challenge exists, so will the people looking for these holes.
Turn back to the people who look for security holes for financial gain: cybercrime has reduced the number of bank robberies in the world and, in turn, become highly organised and profitable. If you consider that you can buy custom ransomware for around £1,000, with average ransom fees of £300 to £600, you only need a few victims to pay the ransom to cover your costs. To put the possible revenue incentive into context, 200,000 people caught the WannaCry ransomware. While the incentives remain, so do the risks.
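The economics quoted above are easy to check. Using the article’s figures (roughly £1,000 for custom ransomware, £300–£600 per ransom), the attacker’s break-even point is indeed only a handful of paying victims:

```python
import math

# Figures from the article: tool cost and the range of average ransom fees.
tool_cost = 1000
low_ransom, high_ransom = 300, 600

# Whole victims needed to cover the outlay, at each end of the ransom range.
breakeven_best = math.ceil(tool_cost / high_ransom)   # best case for the attacker
breakeven_worst = math.ceil(tool_cost / low_ransom)   # worst case for the attacker

print(breakeven_best, breakeven_worst)  # 2 4
```

Between two and four payments recoup the outlay; against 200,000 WannaCry infections, the revenue incentive is obvious.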
Security itself over the last 10 – 20 years has vastly improved, but as Microsoft and other vendors are trying to make their systems more secure, the attacks have become more sophisticated and will continue along this trend. Although prevention is better than cure and there is no excuse for not having firewalls in place which are tested, systems for security are becoming multi-layered and smarter (with learning technologies), with the aim of raising the alarm and limiting the damage done.
It is no secret that the pharmaceutical industry has faced some challenges when it comes to modernising its platforms, processes and practices, but recent trends suggest the industry is tackling this head on. In this piece Christian Hebenstreit, General Manager & Senior Vice President EMEA, Medidata Solutions explores the steps life sciences and healthcare companies are taking to embrace digital transformation.
A host of new technologies – including mobile, social, big data, predictive analytics, machine learning (ML) and artificial intelligence (AI) – are transforming the pharmaceutical sector today as well as helping life sciences, healthcare, biotech, and medical device organisations to make data-driven, insights-led business decisions so that new drugs can be brought to market faster and more efficiently.
This explains why, in just a short space of time, an increasing number of innovative organisations in the life sciences and healthcare sector are embracing digital transformation and IOT (Internet of Things) strategies. These companies have prioritised the shift – despite the challenges it represents – and are seeing the benefit in terms of reducing risk and delivering positive business outcomes.
That said, a transformation like this is never easy. Furthermore, it can’t happen overnight, nor is it necessarily a process that ‘ends’ - it will be an ongoing ‘work in progress’ for every organisation that embraces it. There is also no single template for success - every engagement we’ve had with leaders in the life sciences industry tells us that to get digital transformation moving in the right direction, organisations will need to embrace three key considerations – operations, infrastructure and culture. So, let’s explore each in a little more depth.
1. Re-imagining the operational model
An effective digital strategy defines where a business sits today, where it is going in the future and how it will get there. But more important than the “where” and the “how” is the “why “: the reason a company exists today despite all the disruption that is occurring around us. Each business leader embarking on digital transformation must start with a simple question: how can I utilise digital to reimagine the business model of the future?
Digital transformation demands a radical rethink of business operational models and where an organisation exists within its industry. For example, where can an organisation play to win in the near-to-medium term? How will a digital transformation program add new value for customers, streamline the supply chain or support business outcomes?
New models of care are emerging in the health and life sciences sector as digital enables activities such as remote monitoring and data exchanges between clinicians and patients. Together with these new models of care, new economic models are being created as insurers, governments and health care providers innovate and build new revenue models. Like every other industry, life sciences is being impacted globally by digitalisation, specifically from smart devices. Real-world data streams from patients’ smart devices are generating massive amounts of data, providing more accurate insights into patients’ conditions.
To embrace these new technologies, the organisation’s operational model will have to change, from supply chains and manufacturing processes to delivery models and interactions with physicians and patients. These radical changes and digital technologies are what will really enable digital transformation. The devices will inevitably give greater visibility and transparency at the patient level, but the models emerging to support that process and those products are equally transformational.
In today’s digital age, every life sciences and healthcare organisation will need to approach business like a strategy house, innovate like a start-up, design like a tech giant and scale like a venture capitalist. Only then can they really shift from doing digital to being digital.
2. Refreshing existing infrastructure
Due to the growing data volumes involved in life sciences research and the need for speedy analysis, traditional IT infrastructures can no longer deliver what is required. Such infrastructures are hard to scale or struggle to deliver the needed performance, and as a result can be an obstacle to research progress and investigative success.
What is needed is an infrastructure that can accommodate the large volumes of data in such a way so that high-throughput computational workflows can be sustained to quicken the pace of research. This must be accomplished in an economical manner that does not require the IT staff to invest large amounts of time managing and operating the systems involved.
Life science companies also will often be working with multiple technology vendors, on an array of different platforms. This results in a complex IT environment. When this is combined with the necessary and painful management of ageing physical IT infrastructure, it is not difficult to see how workflows could be interrupted and processes slowed. This is where the migration to consolidated platforms – and in particular the cloud – can have a dramatic impact.
Even though an ageing IT infrastructure can act as a significant obstacle, the decision to refresh and replace it is a brave one. But when it comes to digital transformation, fortune favours the brave. Many across the life sciences industry are taking inspiration from like-minded organisations and undertaking their own transformational efforts, looking to build cohesive, sustainable IT strategies and move legacy systems into aggregated cloud platforms for many parts of their workflow.
3. Rethinking the organisational culture
Many companies already have a strategy of continuous improvement in their businesses and operations globally. Digital transformation will demand changes to strategy, technology, processes, and organisational structure and culture change is the glue that will bring it all together.
Changing the internal culture is often one of the biggest barriers to digital transformation. Culture is everyone’s responsibility, and digital leaders must constantly educate about cultural changes that will help shift their organisation towards a digital culture that is customer-focused, innovative, agile and collaborative. That said culture change is a slow process and must be handled patiently, with understanding and persistence. Some changes are gradual and evolve toward an end goal, which becomes clear over time and some such as acquisitions, investments, partnerships, or other external activity or statements can be more immediate.
The role of the board in a digital business is key yet very different from the role of the board in a legacy business. To make a digital transformation happen successfully, there needs to be complete alignment, from the board and executive team through the whole organisation. To this end, there is a new generation of board director emerging that is much more hands-on, with a more entrepreneurial background, actively striving to make digital transformation a reality.
Paul Westmore, IT Director at the University of Plymouth, UK, explains how a single sign-on digital learning environment has helped the institution to create an edgeless learning experience for its 20,000+ students.
Plymouth’s ambitious digital strategy supports our goal of becoming an edgeless university, as well as reflecting our high-quality, internationally leading education and research into innovation. Our vision looks to enable students and staff to experience the university entirely digitally if they choose to. In order to achieve this vision, we decided that we had to replace our existing module-focused virtual learning environment (VLE) with something more dynamic and flexible: something that was mobile-first and housed a range of resources and systems which would all sit behind a single sign-on.
Our aim was for the new digital learning environment (DLE) to become an online hub where students and staff could access all content such as lectures and course materials, submit work, receive results, and interact with each other, all through one online system. We didn’t just want to install a SharePoint-based system; we wanted to ensure a completely different experience for students, where they could go into an online environment and access everything they need. DLEs are gaining momentum across the education sector and, for us, one would form the building block of our digital vision for the university. We also wanted the DLE to be available through multiple platforms, including personal devices such as mobile phones and tablets.
Integrated tools can be added to the VLE, such as plug-ins like plagiarism detector Turnitin, or even open-source instruments like e-portfolios. Open learning environments were also introduced along with VLEs but with a focus for universities to create their own environment and therefore have more control over it. Much like our DLE, we have built the system to offer a platform for our students to engage with all forms of content in one place, at one time. This is an improvement compared to our previous solution, which saw several systems built in-house. Standardising this as one system has helped us to be more aligned with our digital strategy and another step closer to becoming an edgeless university.
We have a large student body with ever-changing wants and needs. The University of Plymouth is home to more than 20,000 students and almost 3,000 staff. A further 17,000 students are studying for a Plymouth degree at partner institutions in the UK and around the world, making Plymouth the UK’s 15th largest university. As such, we wanted to create a learning environment that was the sum of all the components that go into Plymouth’s student experience. I’ve found that when people refer to a VLE, they just mean Moodle, the core environment where all of the learning materials are held. Therefore, we wanted something with a wider range of learning environment tools.
We made the decision to ask CoSector – University of London to implement and support the hosting of Moodle, a traditional VLE-based platform. However, one of the constraints of Moodle is that ordinarily documents cannot be shared between different programmes. So CoSector – University of London set up a bespoke solution where the documents are held in a different system, creating one learning asset which is shared between multiple programmes.
During a year-long project, we rolled out the DLE across multiple sites, over the summer period, which included developing an advanced assignment tool, mobile app integration and the subject view courses. Today, Plymouth’s DLE brings together a number of systems integrated through Moodle, using it as the hub. The new functions included timetable information, coursework submission, e-assessments, quizzes and ‘minimum module information’ consisting of details of each module, electronic reading lists, past exam papers, forums and wikis – all to help students make more informed decisions about their learning journeys.
Our new DLE at the university now brings together a number of systems integrated through Moodle as the hub. Single sign-on technology provides easy integration and movement between systems such as Talis Aspire (reading lists), PebblePad (ePortfolio), Panopto (content/lecture capture), and Turnitin, in addition to a range of excellent tools including formative and summative testing, submission and feedback. A subscription to LinkedIn Learning provides a wealth of online video-based courses for staff and students to enhance their courses or develop their own skills. It’s clear that the new solution is superior to the old system, which had multiple platforms and no central hub; now staff and students can access information from one place.
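Single sign-on arrangements of this kind generally rest on a token that the hub issues once, after login, and that each integrated service can verify independently. The sketch below is purely illustrative – the University’s actual SSO implementation is not described in this article, and the secret, token format and function names here are invented for the example – but it shows the core idea of one signed credential trusted by many systems:

```python
import hashlib
import hmac
import json
import time

# Hypothetical shared secret distributed to each integrated service.
SECRET = b"campus-sso-shared-secret"

def issue_token(user_id: str, ttl_seconds: int = 3600) -> str:
    """Issued once by the hub after the user authenticates."""
    payload = json.dumps({"user": user_id, "exp": time.time() + ttl_seconds})
    signature = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{signature}"

def verify_token(token: str) -> bool:
    """Each service (reading lists, ePortfolio, lecture capture, etc.)
    verifies the same token, so the user signs in only once."""
    payload, signature = token.rsplit("|", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False  # tampered or forged token
    return json.loads(payload)["exp"] > time.time()

token = issue_token("student42")
print(verify_token(token))  # True while the token is within its lifetime
```

Real deployments would typically use an established standard such as SAML or OpenID Connect rather than a hand-rolled token, but the trust relationship is the same: every integrated system defers to one authentication event.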
The uptake and use of the system have been unprecedented. Around 13,500 students access the DLE on a daily basis, and 90% of Mobile with Plymouth App usage is now related to teaching and learning activities.
Usage of the DLE continues to grow year on year and in 2018 serviced 6,000 module sites, hosted 1,200 formative quizzes, contained nearly 30,000 embedded learning resources and supported over 8,000 eSubmissions. Students accessed over 25,000 hours of captured lecture content, viewed over 170,000 LinkedIn Learning videos and generated over 350,000 originality reports via Turnitin.
Following the integration of the new system, we received a very positive response on the National Student Survey (NSS). Some of the student feedback included: “The DLE is a fantastic and easy to use resource…” and “Resources available in the library and DLE are great, with a good range of books, articles, online seminars, etc.”
A single, integrated user experience has been crucial to the success of the project. The University of Plymouth’s DLE has now become a portal into the teaching and learning community of the University.
Have you ever thought about how most cyber-attacks are carried out?
asks Karl Lankford, Director of Solutions Engineering, BeyondTrust.
Many are opportunistic, meaning that hackers scan the internet for network or system vulnerabilities to exploit, send users tailor-made phishing emails, or even target specific companies in the hope that they lack strong cybersecurity practices. Usually, after the cybercriminal has managed to gain access to a business’s network, the next step is to cause as much harm as possible or, more often, steal information. This is something we’re likely to see in the future as well – attacks focused on data modification and corruption that are well orchestrated in advance to provide maximum payoff for the threat actor.
For security teams hoping to mitigate cyber-attacks, knowing the stages that an attacker goes through can provide useful insight into the tools, tactics and procedures that are used, which could help determine whether the attack is linked to a wider campaign. More crucially, it allows IT teams to build a profile of attacks and gather intelligence that can improve incident response procedures and prevent future attacks.
The Cyber Kill Chain
Looking at the sequence of events involved in an external attack, we see that threat actors gain a foothold first, steal privileges second and then begin to move laterally within the organisation to gain access to sensitive systems and data.
Typically, the initial compromise is an end user’s workstation, with phishing attacks, browser attacks and vulnerabilities which have been left unpatched all offering a threat actor a way in. In any case, once the attacker starts moving laterally within systems or establishes a persistent presence, an elevation of privileges is required.
How do cyber attackers gain elevated privileges in the first place? From one of two sources: stolen credentials or a vulnerability in the system. When conducting a cyber-attack, a threat actor will try to leverage one or both in order to install malware, establish a persistent presence and identify credentials that have elevated privileges. To counteract this threat, employee cyber security education and defensive security solutions are foundational to an effective mitigation strategy, but they are not enough on their own in today’s threat landscape.
Vulnerability and Patch Management, and PAM as a defence
So, what can organisations do? On a basic level, to protect against a targeted attack, organisations must implement strong cyber hygiene practices and do so extremely well. Based on the consideration that every attacker aims to steal privileges – and there are only two ways to do this – every business needs to embrace vulnerability and patch management and privileged access management (PAM) if it is to overcome these cyber risks.
Having vulnerability and patch management on every device, resource and application means that a threat actor cannot take advantage of vulnerability and exploit combinations to install malware and elevate privileges. Endpoints are crucial but are often ignored in favour of just keeping an eye on servers. All points of access, including desktop computers, the cloud, IoT and DevOps, should be monitored for potential cyber threats and, where possible, remediated or mitigated once a threat is identified. While this may sound like a hefty task, by having risk prioritisation and a risk acceptance policy in place, an organisation can ensure a procedure that defends against threats and is easy to implement.
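In practice, risk prioritisation often comes down to ranking findings by a combination of vulnerability severity and asset criticality, with a risk acceptance threshold below which findings are accepted rather than patched immediately. The sketch below illustrates that idea only – the assets, scores, weighting and threshold are all invented for the example, and real programmes would use richer inputs such as exploit availability and exposure:

```python
# Hypothetical findings: (asset, CVSS base score, asset criticality 1-5).
findings = [
    ("hr-desktop-12", 9.8, 2),
    ("payments-server", 7.5, 5),
    ("dev-laptop-03", 5.3, 1),
    ("domain-controller", 8.1, 5),
]

# Invented threshold: below this, the risk is accepted, not urgently patched.
RISK_ACCEPTANCE_THRESHOLD = 20

def risk_score(cvss: float, criticality: int) -> float:
    # Simple weighting: severity scaled by how important the asset is.
    return cvss * criticality

# Highest combined risk first, so patching effort goes where it matters most.
prioritised = sorted(
    ((asset, risk_score(cvss, crit)) for asset, cvss, crit in findings),
    key=lambda item: item[1],
    reverse=True,
)

for asset, score in prioritised:
    action = "patch" if score >= RISK_ACCEPTANCE_THRESHOLD else "accept"
    print(f"{asset}: {score:.1f} -> {action}")
```

The point of the exercise is the one made above: with a documented scoring scheme and acceptance policy, the seemingly endless patching task becomes an ordered, defensible procedure rather than an attempt to fix everything at once.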
Secondly, having privileged access management solutions in place is paramount to the operation of strong cybersecurity in an organisation. PAM involves the management of all passwords and keys, their continued rotation, the check-in and check-out of passwords, and session management to ensure passwords cannot be shared, reused or stolen, whether by employees or otherwise. Also important is the principle of least privilege and the removal of privileged accounts from daily users who may not need that level of access. Furthermore, instead of elevating users to perform privileged tasks, it’s worth considering the elevation of applications instead. By doing this, threat actors will find it much harder to obtain privileged credentials that come from poor credential hygiene in the business and use them against other resources in the organisation. As such, the risk of lateral movement is diminished.
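The check-out/check-in and rotation cycle described above can be sketched in a few lines. This is a toy illustration, not a real PAM product: the class and method names are invented for the example. The property it demonstrates is the one that matters – a privileged credential is released to one holder at a time and rotated the moment it is returned, so a stolen or remembered password has a very short useful life:

```python
import secrets

class CredentialVault:
    """Toy privileged-credential vault: one holder at a time,
    and the password is rotated on every check-in."""

    def __init__(self):
        self._passwords = {}
        self._checked_out = {}

    def add_account(self, account: str):
        self._passwords[account] = secrets.token_urlsafe(16)

    def check_out(self, account: str, user: str) -> str:
        if self._checked_out.get(account):
            raise PermissionError(f"{account} is already checked out")
        self._checked_out[account] = user
        return self._passwords[account]

    def check_in(self, account: str, user: str):
        if self._checked_out.get(account) != user:
            raise PermissionError("only the current holder can check in")
        self._checked_out[account] = None
        # Rotate: the password the user saw is now useless to anyone.
        self._passwords[account] = secrets.token_urlsafe(16)

vault = CredentialVault()
vault.add_account("db-admin")
old = vault.check_out("db-admin", "alice")   # alice uses the account...
vault.check_in("db-admin", "alice")          # ...and rotation happens here
new = vault.check_out("db-admin", "bob")
print(old != new)  # True: the credential alice saw no longer works
```

Session management and least-privilege enforcement would sit on top of this in a real deployment, but even this minimal cycle removes the standing, shared admin password that attackers rely on for lateral movement.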
The endless tools and solutions in the market can often blur the understanding of which ones organisations actually need to implement, but regardless of a business’s current IT security posture, vulnerability and patch management and privileged access management can stop or at least drastically delay a cyber-attack.
Only by getting to grips with the cyber-attack chain and what a cybercriminal needs to achieve in order to compromise a network, will an IT team have the insight needed to put in place defensive strategies to keep an organisation secure.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 2.
“Recent figures show a rapid increase in online fraud, incidents such as Marriott and BA suggest nothing is changing, and I know of one UK police area where almost half of recorded crime is cybercrime. I do think, though, that the story is more nuanced and that the truth may be somewhat less bleak.
Which is not to say the issue is going away, or that it will ever go away. Cyber-attacks are here to stay and a fact of online life. But the move of the subject from specialist to mainstream media has turned heads; organisations, businesses and to a lesser extent individuals are now more aware of the problem and ignorance can no longer be a defence. When I first started in this industry, our clients were almost without exception in the financial services or technology sectors. Now, we have clients in every sector of the European economy, from manufacturing to extraction to retail (a lot in retail) to research establishments, power generation, water and sewage and more. This is not because the threat has expanded into these areas, it is because these areas now realise they are being attacked and may have a lot to lose. As a result, the economy as a whole is becoming better prepared and is beginning, perhaps still too slowly, to take the necessary steps to defend itself. In that sense, the cyber issue is being managed better and attacks are being averted.
Technology – security technology – is also improving and the market driven by cyber-attacks is now creating innovative solutions. It is worth saying, however, that technology is not a panacea and will almost certainly never be so; user behaviour, leadership and the fact that technological breakthroughs atrophy at some speed all mean technology is not enough. But it is helping nonetheless.
Governments are playing a role. The recognition of cyber as a national security issue has broadened the definition of the Critical National Infrastructure (CNI), created regulation (GDPR for one), and in places forced cyber security into boardrooms. In the UK, the National Cyber Security Centre has the key role in protecting not just the CNI but also industry more broadly. Resource will always be a problem, but the new security culture this has engendered is delivering a change of mindset and behaviour. All of these things combine to mean, for me, that at a macro level cyber security is improving.
The cyber security industry is fond of saying that “it is not a matter of if but when” one is attacked. I am not sure that is very helpful, though it may be a truism. The cyber threat must be treated as yet another risk faced by businesses, and managed as such. Good cyber security is about risk management, using the frameworks and techniques which have served us well for years across many other risks such as financial, legal, physical, political and more. Yes, the risk is new and in some cases deeply complex, but then so are the other risks that have been managed to acceptable levels. Organisations – and individuals – who adopt risk management of cyber security suffer fewer, and less damaging, cyber-attacks. That is enterprise risk management: understand the threat, decide if you can live with it or wish to mitigate it, and then act. Cyber security is a fact of online life, but it isn’t something to be afraid of. Engage with it, manage it, and we will all be more secure.”
It’s a classic question, and the metaphor is quite appropriate. The follow-up question should be, “Is the glass getting more or less full?” Are attackers or defenders gaining the advantage? asks Andy Harris, CTO, Osirium.
The challenges facing anyone responsible for IT security aren’t getting any easier. The world is getting increasingly complex – more devices, more users, more locations, more mobility – and attackers are more efficient and effective.
The concept of the lone hacker is out of date. Instead we face highly organised gangs, and nation states, with plenty of resources at their disposal. Groups are also getting better at the psychological aspects of social engineering to get unwitting accomplices to compromise their own (or their company’s) systems.
Modern life depends on technology, and the potential impact of an attack could be far more devastating than at any time in history.
Thankfully, there are some signs of good news.
In response, organisations are getting better at understanding the risks they’re facing and are building appropriate defence and response mechanisms. Most CISOs have moved on from building higher walls, with a view to keeping all attackers out, to accepting that breaches will occur and preparing instead to limit damage and ensure a fast recovery. They’re also making informed judgements to balance the cost of prevention against the likelihood of attack and the cost of the damage when an attack occurs. These are strategic decisions, and it’s encouraging that the CISO role is now so prominent.
I believe good defences and responses are actually now being seen as a competitive advantage.
Part of this plan is ensuring good segmentation between different users, whilst still ensuring users have the correct, but least, permissions needed for their work.
In parallel, technology defences are also improving. Tools are far more sophisticated including applying Machine Learning techniques to huge amounts of data to spot attacks and, in some cases, automatically start containment and remediation, much faster than any human could achieve.
The human factors are also evolving. The new generation of employees entering the workplace are far more aware and prepared than their predecessors. They already control their Facebook posts and have multiple accounts for different audiences. These are smart people that won’t fall so quickly or easily for phishing attacks as the generation before did.
Circling back to the original question about that glass. In recent times, the balance has been tipping in favour of the attackers. However, I think there are enough fundamental changes in understanding threats, and preparing for them, that it is starting to tip the balance the other way. However, from a business perspective, it’s far from full, and no-one should be complacent.
The glass isn’t empty but, like an over-enthusiastic waiter, you have to deal with it getting changed a heck of a lot quicker than it used to, explains Marco Rottigni, Chief Technical Security Officer EMEA at Qualys.
I don’t think IT security teams should be pessimistic. However, the job they have to do is more difficult than it was previously, as business demands have created more complexity and risk. The challenges that they face are evolving, and they have more platforms to track and care about.
Most enterprises today are in the process of moving to the cloud - this means that traditional IT security teams have to understand how these deployments are secured, what their responsibilities are, and how they have to operate in this new environment. The shared responsibility between internal security teams and public cloud providers has to be understood and planned ahead, even while there are new tool launches taking place all the time.
The pace of change here is faster, which can make it more difficult to keep IT systems secure over time. The problem is that security is not built in from the start. Focusing on building security into everything from the start - instead of bolting it on later - would improve hygiene and reduce the vulnerable surface. This would reduce the number of events to investigate and reduce the number of signals that would have an impact on the security operations team.
Software developers are moving over to new methods like microservices, containers and serverless - these are innovative ways to deliver new digital services and transform companies’ operations, but they are not secure by design. IT security teams have to get to grips with these new software models that can be ephemeral and based on demand levels. For instance, serverless technology is stateless. It has no record of previous interactions and is designed to respond to specific events for a very short window of time and then shut down again. Keeping track of these assets and this activity is much harder if you don’t understand the use case, so getting security involved from the start is essential here.
Alongside these new innovations, IT security teams still have to meet the demands around more traditional IT management tasks. Areas like vulnerability management, IT asset and inventory, and patch management all impinge on security. You have to know what you have, that it is up to date, and that it is secure. These are not solved problems, however long they may have existed.
Patch management and software vulnerability management issues often overlap, so getting the right data can help multiple team members achieve their goals and deliver security. Equally, where software assets can’t be updated due to their dependencies, other security processes can be implemented to keep assets protected.
For IT security teams involved in everything from the most traditional on-premise application through to new container-based services running in the cloud, the risk is losing insight into what is happening and what has changed over time. Without this complete overview, IT security will not be able to achieve what they want to. However, with this insight, security teams can deliver better security for the whole business and across whole processes like software development from the start.
To use the glass metaphor - security should not be viewed as glass half-full or half-empty. Instead, the glass is being changed far more frequently. Keeping on top of this - by being proactive, using data and collaborating across teams - is essential and achievable.
Marco Hogewoning, Senior External Relations Officer at the RIPE NCC, comments:
“Consumers are generally unaware of how much IoT devices communicate with their respective service providers and the Internet in general. They might be surprised just how much data their connected TV is sending back to home base for example. As increased bandwidth becomes available to users’ home networks, together with connected devices becoming more powerful, there is a significant risk that even a relatively small number of compromised devices can launch devastating denial-of-service attacks against other users, or common services such as the DNS.
As many IoT devices use common components and software, there is a risk that a wide range of them turn out to be vulnerable through a common fault. This allows attackers to quickly expand or include new targets once a vulnerability has been identified. New devices also open up alternative and unorthodox routes through which to penetrate networks, such as via a connected fridge. This means that security is something that everyone in the IoT ecosystem will need to be increasingly more mindful of – as it is inextricably linked to the number of connected IoT devices in operation.
Given the massive security, privacy and safety risks, regulation might be inevitable and could become the equivalent of fire safety regulations. On the other hand, regulations would take time to be developed and compliance could disproportionately impact small players or producers of low-margin products. Some also feel that regulation would damage permissionless innovation.
While we acknowledge regulation is one of the available tools, the RIPE NCC would encourage parties to first try and engage in the IoT ecosystem to identify constructive solutions that can be adopted under a regime of industry self-regulation and voluntary but universal adherence to common standards and norms.”
Chris Bush, Head of Security, ObserveIT about why data breaches aren’t inevitable and can be prevented with the right proactive approach:
“Ponemon research shows all types of insider threats are increasing – since 2016, an increase of 26 percent for negligent insiders and 53 percent for malicious insiders. But that doesn’t mean that businesses have to sit idly by and wait for the inevitable. Businesses have begun to sit up and take notice of the insider threat, with global information security spending expected to exceed $124 billion in 2019, according to research firm Gartner.
“There are a multitude of tools that companies can implement to tackle the insider threat, but it’s essential for businesses to understand that investing solely in security technology is not the be-all and end-all solution. Organisations should also focus attention on building robust, yet enforceable, cybersecurity policies, tested incident response plans and clear and consistent employee training programs.
“By approaching the insider threat holistically, with a management plan that leverages people, process and technology, businesses stand the best chance of dealing with cyber threats before they become costly breaches. This begins with visibility into user activity – knowing the whole story of what, when and where things happened when it comes to the company network. With comprehensive visibility, businesses can detect potential threats in real time and view user activity and data movement in context. This allows quick assessments of risky behaviour and fast and effective investigations if the activity warrants it.
“When combined with regular communication and coordination across the organisation – among security teams, IT, HR, legal and executive leadership – businesses can prevent data exfiltration and learn from incidents to do better in the future. There is no magic bullet when it comes to stopping data breaches, but by taking a practical, proactive, and systematic approach, the problem of the insider threat is a solvable one.”
When it comes to operational excellence, hazardous industries say: let’s get digital.
As our third-annual Operational Excellence Index shows, digital transformation is upon us. As companies look for new ways to keep their people safe, their operations productive and their products sustainable, being able to tap into and monitor data from Industry 4.0 solutions will be a major differentiator for organizations looking to separate themselves from the competition. It’s not surprising that 90 percent of the people who responded to this survey agree that digital technology will accelerate Operational Excellence. We couldn’t agree more. Sphera believes digital transformation is the wave of the future for Operational Risk mitigation.
Paul Marushka, President and CEO, Sphera
IT’S CLEAR: DIGITAL IS THE FUTURE
There is no doubt that digital transformation is fueling operational excellence. 90 percent of industry leaders agree digital transformation will accelerate their ability to achieve operational excellence – not just as a one-off target, but as a sustainable way of doing business. This is a stark increase from 2017, when 73 percent of leaders reported the same.
What’s more, the main objectives for digital transformation are fully aligned with widely recognized business objectives within the industry: 73 percent of this year’s survey respondents cited operational excellence as the key objective of their plans for digital transformation, with 55 percent aiming to reduce or manage operational risk, and a further 55 percent aiming to improve asset availability and uptime.
In other words, digital transformation and OE are acknowledged to be intimately connected to the bottom line, and long-term efficiency and profitability.
As a recent report from Accenture Strategy on the oil and gas industry acknowledged: “The right digital approach can produce an EBITDA improvement of 30-35 percent.”
IT’S ALL ABOUT THE INSIGHT FROM ACROSS THE ORGANIZATION
It is also recognized that digital insights will, in effect, become the feedstock of operational excellence of the future. Looking at the way technology is leveraged today: 58 percent say it delivers key performance indicators (KPIs) and metrics that reflect the operational reality. A further 45 percent say it improves prioritization and planning, while 44 percent say it delivers predictive analytics that keep assets up and running.
Looking to the future, 75 percent say it will create new, insight-driven business processes across various functions. More specifically, 70 percent say digital twins will enable modeling that simulates what-if scenarios for better planning and informed decision-making.
Nearly the same number – 69 percent – say it will connect disparate data as well as separate systems that will create truly actionable insights. Meanwhile, 65 percent say technology will provide advanced analytics that will enable them to better understand where to make operational improvements.
BUT – CONFUSION AND CONTRADICTION STILL PREVAIL
Survey respondent insight: “Companies must first understand what digital transformation means before investing in implementing it wrongly!”
The future certainly looks bright. Nonetheless, there are hurdles to overcome. Operators are moving slowly and still trying to figure out the path they need to follow: indeed, 69 percent of survey respondents said that their company is only just starting or is currently implementing digital transformation projects, while 53 percent believe their company is still trying to figure out what ‘digital transformation’ means to them.
Again, this reflects Accenture Strategy’s report, which stated that: “58 percent of energy company leaders – the highest percentage of any industry group – admit that they don’t know how to keep pace with technology innovations.”
SOME AMBITIONS ARE UNCLEAR
This is perhaps to be expected when digitalization and digital transformation can be interpreted in different ways. There can be a disparity of perspectives within an organization as to what its digital ambitions are – or should be. For some, digitalization involves tweaking at the edges and automating standard, paper-based processes (the paper-on-glass approach). For others, it’s a far more disruptive force that will impact the entire business and operating model. For many, it is something in between.
This may account for the relatively low seven percent who suggested they were securing sustained return on investment from their digital transformation projects: firms that are still to establish what digital transformation means to them are unlikely to see long-term results.
Nonetheless, all this points to a wider issue: the starting gun has been fired on the race to second place. The earliest adopters have taken the risk, and are learning lessons.
Digitalization is no easy feat, but no one wants to be the last out of the gate – that’s a possibility a significant number of firms across the hazardous industry sectors are facing.
Survey respondent insight: “When it comes to digitalization across industries – it can be compared to a marathon. There are frontrunners (pioneers) and plodders (denial and resistance). There are those that “get it” and those that never will … until it becomes the norm.”
THERE ARE VERY REAL CHALLENGES TO OVERCOME
Survey respondent insight: “[Digital transformation] requires support from the highest levels with a vision and willingness at all other levels to be part of the solution and support.”
This year’s survey also highlighted what senior leaders believe to be the major barriers to advancing digital transformation projects. Nearly half (47 percent) say it is the corporate culture, for 38 percent it is budget or investment priorities, while 31 percent say it is due to a lack of digital expertise within the organization. For 28 percent, effective leadership is missing.
As these numbers indicate, there is something of a vicious circle at play here. Without digital expertise, for example, the culture is unlikely to change. Survey respondents believe stakeholders across the board need to get involved in digital transformation, with the greatest need coming from frontline operations and maintenance at 60 percent and regional head(s) at 55 percent. But without a changing culture, digital expertise is unlikely to be hired or new budget allocated. All of which points to the absolute necessity of effective leadership as the cornerstone of digital transformation projects.
AND CULTURE AND TECHNOLOGY ARE INTERTWINED
Survey respondent insight: “Technology is a distraction. Many organizations are held back by the terminology and lack of a plan that is relevant to their specific goals.”
A deeper dive into the survey responses shows two distinct areas of concern that reflect these top line responses. The first is working culture. Only 37 percent believe organizations are fully integrated units and that people are generally able to work in a cross-functional manner. The fact that 57 percent also say operational decisions are made with single-function decision-support data is further evidence of the ‘silo’ problem that afflicts many organizations.
The second issue is approach to technology. Only 17 percent of respondents say they have embedded operational best practices into a cross-functional enterprise management system. The remaining 83 percent appear to correspond with Accenture Strategy’s report that 82 percent of oil and gas players still rely on legacy systems to improve their business agility. Either way, the lack of digital expertise – as distinct from IT expertise – is being felt directly: the marginal gains to be had from piecemeal implementations or from pushing existing technologies to the limits of their capability have been all but exhausted.
The OE survey also shows that only 41 percent have an enterprise-wide view of their digital transformation strategies, and according to 52 percent, evaluation and deployment of new technologies is still conducted at the level of an individual plant or asset, or on an ad hoc, case-by-case basis. The technology story reflects the cultural problem: siloed thinking, information, and decision-making has been enabled and driven by siloed systems.
Survey respondent insight: “In most cases, analysis is done separately and outcomes are discussed separately. This limits the horizon of prediction and results are not long lasting.”
THE GOOD NEWS: THE PATH FORWARD IS CLEARER
Survey respondent insight: “Lasting results are best achieved by integrating the practices and processes into the enterprise system.”
There is a strong sense coming through from this year’s survey that the industry is on the brink of a major step forward when it comes to achieving operational excellence through digitalization. Digital integration is almost universally seen as the new baseline, and many of the building blocks are in place – data, systems, and business processes.
What’s more, there is growing recognition of what needs to be done to achieve these ambitions. With plenty of data now available, turning disparate information points into valuable insight is becoming a priority, and 75 percent of senior leaders now believe that creating new, insight-driven business processes across functions is key to OE.
On average, an impressive 94 percent believe digital transformation requires the integration of multiple disparate systems and tools. Above all, new cross-functional business processes are seen as fundamental: on average, 98 percent believe functional business process collaboration is essential for delivering operational excellence.
FROM HERE TO THERE: A HOLISTIC APPROACH
Survey respondent insight: “When considering how quickly technology is becoming available, it is important for companies looking to invest to understand how it will benefit the entire value chain, i.e. don’t restrict the business case to a single functional discipline.”
A joined-up approach that first looks at and reimagines business processes is essential for enabling digital insights that can power cross-functional working. The evolution of technology is accelerating, creating new possibilities for operators, and making it clearer what tools are most likely to be adopted as hazardous industries move towards their new future.
Digital insight will be powered by technology in key areas. Among senior industry leaders, the anticipated growth rate for predictive analytics is 211 percent. For Industrial Internet of Things (IIoT) platforms it is 363 percent, while for digital twins it is an impressive 440 percent. All of which point to a far more integrated, cross-functional future – and better decision-making from board-room to front-line.
Survey respondent insight: “The best approach to digital is not to use technologies to close gaps that you know already exist. Rather, start with a blank sheet of paper and define what you need – and then assess the available technologies.”
Petrotechnics, a Sphera company, conducted the survey between October and November 2018, collecting 116 responses from a broad representation of functions, demographics and industries across the hazardous industries, including: oil, gas, chemicals, manufacturing, utilities, mining, engineering, and other sectors.
Over a third (38%) of respondents represented the upstream industry, 25% downstream and 13% petrochemicals. Midstream and chemicals accounted for the remainder. The survey also included respondents from each major region – specifically North America (32%), Middle East (27%), Europe (22%), Asia Pacific (12%), Africa (3%) and South America (3%).
Of the job functions, the most prominent was health and safety executive (15%), followed by operations (13%), asset management (9%), operational excellence (9%), planning & scheduling (7%), data & information management (6%), executive management (6%), project management (4%), reliability (4%), strategy (4%), audit, risk & compliance (3%), maintenance (3%), plant management (1%), other (16%).
*Oil and Gas: Moving from high hopes to higher value. Accenture Strategy, 2018.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 3.
Nigel Ng, VP of International RSA, gives some answers to the questions posed in our security special focus:
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle?
It’s important to recognise that we’ve reached an inflection point in the evolution of digital and digital transformation initiatives. Businesses are increasingly using data – whether to improve customer experiences or optimize their supply chain. Whatever the business reason, each of these objectives carries with it its own digital risks—be they customer data breaches, intrusion into secure networks, business disruption, or employee access issues. The way to prevent the ‘bad folks’ from winning the battle is to implement a Digital Risk Management strategy at the heart of the business.
Are data breaches inevitable, and therefore it's all about managing the fallout from these?
Businesses cannot eliminate digital risk without halting the progress of digital initiatives and need a way to manage these risks by breaking down the barriers between their security and risk functions. As more companies look to digitise, the biggest risk is around the loss of customer trust if their personal information is compromised.
Is it possible to prevent security breaches from occurring?
Managing digital risk effectively requires collaboration between security and risk management teams. Aligning security and risk gives organizations visibility, so they can be sure they have the right information and business context; insights, to understand what is happening and determine how responses can be prioritized; and the ability to take appropriate and timely action. Data breaches can’t be prevented, and leading companies will develop strategies to manage their digital risk while continuing to grow.
Ben Rafferty, Chief Innovation Officer at Semafone, offers his take on the current state of IT security - the glass is neither half full nor half empty - and there is always the danger of the glass being smashed.
No matter how sophisticated new security technologies become, you can never say that a data breach will not occur. With so many high-profile breaches hitting the headlines, protecting reputations and customers has to be a priority for any organisation, which means that IT security needs to be its cornerstone. It also needs to be embedded in the procurement and products an organisation buys and sells.
There’s an old saying, “When chased by a bear, you only have to be faster than the slowest person.” Well, the world has changed - and so has the bear. The bear is actually the “bad folks” who are hungry and relentless in their hacking pursuits and won’t stop with their first successful attack. Attacks today are automated and scripted, with bots continually finding vulnerabilities and compromising organisations’ security defences.
On the positive side, IT security is improving. Organisations are taking strides to implement measures such as training staff continuously and rigorously, having a crisis comms plan in place and keeping up with the latest technology. Increasingly, they are implementing preventative measures before, not after, attacks – and not just to comply last-minute with a piece of particular legislation, such as the GDPR and PCI DSS.
However, the bears are also keeping up with the latest technology. Hacking can be a ludicrously easy thing to do if an organisation’s security defences are weak – the black market is rife with stolen data that is cheap and easy to obtain - and the techniques used in cyber-crime evolve continuously. The key is to remain vigilant; ensure you are up-to-date with all the patches, ensure your teams are trained and are alert to potential threats and most importantly, only hold data if you really need to – your data can’t be hacked if it’s not there! If you do need to keep data, make sure it’s fully encrypted and locked down.
It suggests that no matter what we do to keep criminals out, whether that be strengthening human or technological defences, this will never be good enough. The fact is that there is still plenty of work to be done to improve IT security. For example, it’s clear from a number of the high-profile breaches that have hit the headlines that many enterprises are still relying on basic methods of authentication such as passwords, or weak multi-factor authentication techniques such as one-time SMS codes which can easily be intercepted by malware. Furthermore, many users are still continuing to re-use passwords across multiple accounts, which increases the likelihood of their credentials being stolen if a breach does occur.
Compounding poor practices is the effort cyber-criminals are putting in to adapting to existing security measures and improving the effectiveness of their attacks. They are trading knowledge in underground marketplaces which is allowing them to specialise in particular aspects of cyber-crime, for example, breaking into accounts, or stealing security credentials. By sharing tips and hacks with each other, they’re often able to stay one step ahead of security protocols. They are also focusing their attention on more insecure communication methods organisations are using to interact with customers, such as email, web and phone.
To keep up with the rapidly changing strategies of criminals, more value and importance must be placed on dynamic and flexible controls. Organisations need to invest in the collection of high-quality data that will provide the basis for these controls, as well as the informed decisions they need to make on threats and criminal activity. Although there are a number of tools on the market, we’re seeing the emergence of the next generation of intelligent security such as adaptive authentication, which uses AI and machine learning to score vast amounts of data, analyse the risk of a situation, and adapt the authentication levels accordingly. For example, if a user checks their online bank balance from a recognised device and location, they would only need to go through basic authentication requirements to gain access to their account. However, higher-risk activity, such as high-value transactions that fall outside of normal behaviour, will require additional authentication. By combining a range of authentication tools such as multi-factor authentication, behavioural analysis, biometrics, and even pulling in data from third-party tools, adaptive authentication makes staying ahead of the cybercriminals that little bit easier. Security moves from being a black and white binary story to becoming precise and intelligent – providing the exact level of security as and when it is needed.
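The scoring logic described here can be illustrated with a toy sketch. Everything below – the signals, weights, thresholds and authentication tiers – is invented for demonstration and does not reflect any vendor’s actual adaptive authentication engine:

```python
# Toy adaptive-authentication sketch. All signals, weights and tiers
# below are invented for illustration, not any vendor's real product.

def risk_score(known_device: bool, usual_location: bool, amount: float) -> float:
    """Combine simple behavioural signals into a 0.0-1.0 risk score."""
    score = 0.0
    if not known_device:
        score += 0.4   # unrecognised device
    if not usual_location:
        score += 0.3   # unusual location
    if amount > 1000:
        score += 0.3   # unusually large transaction
    return min(score, 1.0)

def auth_level(score: float) -> str:
    """Map the risk score to a graduated authentication requirement."""
    if score < 0.3:
        return "password"                # low risk: basic authentication
    if score < 0.7:
        return "password+otp"            # step up: add a one-time passcode
    return "password+otp+biometric"      # high risk: full multi-factor

# A balance check from a known device and location stays low-friction;
# a large transfer from an unknown device triggers full step-up.
print(auth_level(risk_score(True, True, 50.0)))
print(auth_level(risk_score(False, True, 5000.0)))
```

In a real system the score would come from a trained model over many behavioural signals rather than three hand-weighted flags; the point is the mapping from a continuous risk score to a graduated authentication requirement.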
At a time when breaches regularly hit the headlines, it’s crucial that businesses have the tools to outpace attackers before systems can be compromised, or data stolen, says Iván Blesa, Director of Technology, Berry Technologies.
As businesses continue to evolve digitally and IT networks become more saturated – especially thanks to the explosion in connected devices – network monitoring systems must be able to interpret even the most complex behaviours in real-time to detect potential threats. Unfortunately, many businesses are still relying on legacy approaches to this, with systems powered by rule-based automation that works off historical data. The danger of this lies in the fact that threat detection is entirely restricted to previously seen behaviour, hindering organisations in their ability to detect new threats.
The good news is technology has advanced so that we’re now seeing a new breed of intelligent network monitoring able to analyse vast amounts of data, detect anomalies, and proactively identify new threats. These tools are powered by advanced automation methods, specifically unsupervised deep learning. Unlike legacy automation, deep learning does not work off historical data. Instead, it continuously adapts and responds to an organisation’s network behaviour to detect anomalies and proactively look for the unknown, to uncover the first-seen and most sophisticated attacks that we’re witnessing today.
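As a rough illustration of anomaly detection against a continuously learned baseline – a greatly simplified statistical stand-in, not the unsupervised deep learning described above – consider a running mean/variance model that flags traffic measurements far from what it has previously observed:

```python
# Greatly simplified stand-in for learned-baseline anomaly detection:
# an exponentially weighted mean/variance of a traffic metric, flagging
# observations more than four standard deviations from the baseline.

class RunningBaseline:
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha   # how quickly the baseline adapts
        self.mean = None
        self.var = 0.0
        self.seen = 0

    def update(self, x: float) -> bool:
        """Return True if x looks anomalous, then fold it into the baseline."""
        self.seen += 1
        if self.mean is None:          # first observation seeds the baseline
            self.mean = x
            return False
        std = self.var ** 0.5
        anomalous = self.seen > 10 and std > 0 and abs(x - self.mean) / std > 4.0
        d = x - self.mean
        self.mean += self.alpha * d
        self.var = (1 - self.alpha) * (self.var + self.alpha * d * d)
        return anomalous

baseline = RunningBaseline()
traffic = [100, 102, 99, 101, 100, 98, 103, 100, 101, 99, 100, 5000]
flags = [baseline.update(x) for x in traffic]
print(flags)   # only the final 5000-unit spike is flagged
```

Real deep-learning systems model far richer behaviour than a single metric, but the principle is the same: the baseline is learned from the network itself rather than written as a static rule.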
As deep learning becomes smarter over time, threat detection will only become more intelligent in its ability to outpace and outwit attackers. As a result, businesses will be able to prevent even the most advanced methods of attack from occurring. Furthermore, automated anomaly detection reduces the burden on overstretched security teams, allowing them to focus on the most rewarding part of their job: the investigation and detection of complex malicious activities. By automating detection and accelerating access to the information, teams can collaborate and focus on understanding the root cause and the total extent of campaigns against organisations. As a result, security teams’ efficiency is boosted, cyber security analysts’ work is highly valued, and the organisation’s overall security is strengthened.
Cybersecurity has often been seen as a game of cat and mouse, where organisations are always one step behind attackers and security tools are merely reactive. Having technology in place that can mitigate damage once it’s occurred is important, but enterprises that are exclusively reliant on reactive solutions will remain on the back foot when it comes to security. The rise of intelligent network monitoring is proof enough that data breaches are no longer an inevitable threat facing enterprises, and can in fact be prevented from occurring in the first place.
In terms of human cyber crime, it’s very much a cat and mouse situation.
Organisations are generally taking the security of their people more seriously than they were 5 or 10 years ago. Data breach headlines have been a wake-up call for many. GDPR has been another clarion call for businesses to take action - setting the bar that little bit higher.
But while businesses, on the whole, are doing more to combat cyber crime, cyber criminals are also getting smarter.
Yes - run-of-the-mill 419 scams are still popular, poor spelling and grammar are commonplace, and amateurish criminals dominate the space. But increasingly, social engineering is a sophisticated, organised operation.
There’s a growing state-sponsored element, with access to advanced tools and technology. Organised criminal groups (OCGs) with similar capabilities to state-sponsored hackers are also more prevalent than ever before. The rise of both of these groups is a worrying trend that doesn’t look likely to stop any time soon.
Are data breaches inevitable?
While most data breaches are preventable, the unfortunate truth in human cyber security is that sometimes, yes, data breaches are inevitable. With that specifically in mind, it’s important that business leaders don’t go around blaming staff when things go wrong. Often it’s not their fault, and handing out blame does more harm than good.
Data breaches tend not to be the fault of lone individuals; rather, they tend to be the fault of a systemic problem, such as poor cyber security training.
As James Reason - a psychologist who has long researched human error and error management techniques - argues, “a substantial part of the problem is rooted in error-provoking situations, rather than error-prone people.”
Are the bad guys winning the battle?
Ultimately, whether businesses are winning the battle against cyber criminals will vary from business to business. There are plenty of CIOs, CISOs and Risk Officers in the UK doing amazing work to reduce risk, and who are implementing innovative strategies and technologies. On the other side of the coin, there’s also a surprising number of UK businesses falling well below acceptable standards.
Our own research at CybSafe suggests that only around three quarters of UK businesses are using basic cyber security precautions such as anti-virus, and only about half have cyber security training in place, let alone effective training. In the case of these businesses, the bad guys are always going to be winning the battle.
Edge computing promises the superfast processing our data-driven society increasingly depends on. But does this impending revolution signal the death of the data centre as we know it?
By Leo Craig, General Manager of Riello UPS.
Our personal and professional lives are already dominated by the ‘Internet of Things’. The UK alone is home to 272 million connected devices, a figure set to more than double within the next five years to 600 million. By 2025, each of us will interact with a ‘smart’ sensor, app, or gadget an average of 4,800 times a day – once every 18 seconds!
Smartphones enable us to shop online, browse the internet, stream films or watch TV on-demand, remotely adjust central heating temperatures and more at the touch of a button or swipe of a screen. Meanwhile, automation, robotics, and machine-to-machine communications play an increasingly influential role in the commercial sphere and in public service delivery.
Data centres underpin modern society – it’s only a slight exaggeration to claim virtually everything we do depends on data storage and processing. But can our existing IT infrastructure keep pace during this next phase of rapid digitalisation?
In reality, it’s already struggling to cope. And that’s before we even consider the looming – and much-anticipated – introduction of superfast 5G wireless.
The traditional enterprise data centre has served us well for decades, with cloud facilities an additional option in more recent years. But with real-time low latency processing becoming an absolute must, both these options tend to come up short.
Sending packets of information generated by all those sensors in our factories, offices, or smart devices to a centralised location, processing it there before ‘returning to sender’, doesn’t meet the needs of today’s interconnected reality.
Every millisecond counts, so relying on data connections potentially hundreds or even thousands of miles apart results in throttled processing speeds, leading to bottlenecks in decision making. With automation and artificial intelligence taking on a progressively bigger role in daily lives, such log-jams could have catastrophic consequences.
For example, factory production lines could come grinding to a halt. Sensitive scientific research or pharmaceuticals manufacturing could be ruined in an instant.
In addition, transporting petabytes of data to and from a centralised location takes up massive amounts of expensive bandwidth. And that’s before we even consider the lucrative target such a steady flow of potentially lucrative information offers to cybercriminals or hackers.
Finding Answers At The Edge
So what’s the solution? Rather than transfer the data to and from where the processing power is, why not bring the processing to where the data actually comes from in the first place?
In essence, that’s what edge computing is. Compact and flexible facilities installed as close as possible to the original information source, be that the factory floor, office, hospital, or perhaps a densely populated area.
Processing at the edge removes the need for lengthy two-way data flows, satisfying the IoT’s demand for a low latency response. It also reduces internet bandwidth and enables more information to be stored locally, limiting the threat of data corruption or harmful cyber-attacks.
This move away from hyperscale data centres to compact ‘centres of data’ is aided by the development of modular and micro data centres.
It goes without saying that it’s neither feasible nor financially viable to simply build a full-scale data centre next to every production plant or office block. Edge processing tends to be housed in spare spaces – places not originally built with the particular needs of a data centre in mind, such as an empty office, a spare corner of a warehouse, or outside in the car park.
That’s where modular data centres come into their own. Basically, these are prefabricated facilities, often housed inside a secure steel shipping container. They include all the core elements and technologies you’ll find in your typical enterprise data centre – UPS systems, PDUs, cooling, server racks and cabinets, cables, communications and monitoring software – but in a significantly condensed footprint.
Such data centres are built offsite and can be ready to ‘go live’ within eight weeks of initial design. Containerised data centres are weather and fire-proof, making them suitable for virtually any indoor or outdoor environment. And as long as they meet Lloyd’s Register Container Certification Scheme (LRCCS) criteria, they can also be easily transported between sites by road, rail, sea, or air without having to be broken down and reassembled at the new location.
Another advantage of modular data centres is their scalability: capacity can be increased as and when required by adding extra cabinets or even whole containers, in some cases stacking them on top of each other.
Powering The Revolution
The micro or modular data centres making edge processing a reality rely on continuous, reliable power in the same way as any substantial enterprise or hyperscale facility does. That’s why uninterruptible power supplies are an integral part of any installation. They ensure system availability when there are any problems with the mains supply, minimising the risk of disastrous downtime.
Due to the footprint restrictions common with containerised data centres where space is at a premium, traditional static tower-style UPS often aren’t particularly suitable. Thankfully, recent advances in UPS technology have led to modular systems, which are ideally suited to power micro data centres.
Systems such as our award-winning Multi Power (MPW) pack exceptionally high power density into a compact footprint as well as offering the ‘pay as you grow’ scalability to add capacity or redundancy where necessary.
Modular power supplies deliver other advantages too. They’re based on transformerless technology that enhances operational efficiency; they generate less heat, so they don’t need as much air conditioning; and power modules and batteries are hot-swappable, guaranteeing downtime-free maintenance. They’re the perfect partner to power edge processing.
Does Edge Signal Doom For Traditional Data Centres?
Obituaries proclaiming the ‘death of data centres’ have been written many times before – Gartner analyst Dave Cappuccio boldly predicts the traditional enterprise data centre will actually be extinct by 2025.
Within the next five years, it’s predicted 75% of all data will be processed at the edge. Currently, that figure stands at roughly 10%, an astonishing shift in behaviour.
However, it’s still too early to consign hyperscales and the cloud to the past. As we head into the era of superfast 5G, it’s more likely we’ll need to strike a healthy balance in the range of processing power that we deploy.
So yes, real-time processing power will undoubtedly rely on the edge. It’s the obvious solution to deliver the desired low latency. But wider trend analysis, performance reporting, general data storage and countless other non-time-sensitive activities can still take place either in the cloud or at a centralised data centre.
It doesn’t boil down to an either-or choice, it’s about finding the right mix that satisfies our increasingly interconnected world in the years to come.
Cyber threats originating on the public internet currently pose the biggest issue for security teams. The increasing sophistication and variation of tactics employed by threat actors, combined with their ability to easily hide, have left organisations struggling to develop any kind of proactive posture to detect and defend against them.
By Adam Hunt, CTO, RiskIQ.
A new approach is needed, and it must be one that relies on automation to deliver the scalability to traverse the internet, powered by sophisticated detection through machine learning that can automatically adapt as adversary attacks change.
The public internet is a place where most organisations have little to no visibility. This lawless expanse sits beyond the comfort of corporate firewalls, proxies, and other traditional cyber defences. Not only is it outside the purview of network security, it's also incredibly vast. This creates a broad and inviting space for hackers to exploit brands, consumers, and employees with relative impunity.
Threats increasingly overwhelming
By taking advantage of the scale of the internet, threat actors overwhelm victims by creating thousands of domains, IP addresses, SSL certificates and content in extremely short periods and use the infrastructure to deploy scaled phishing attacks, flood the web with infringing domains, and commit brand fraud for monetary gain. Before security teams are even aware of the threat, their website or server has been compromised, customer credentials stolen, or their prospects siphoned off to a dangerous or fraudulent site. Often, they are alerted after the fact through a customer complaint, from law enforcement or worse, from reading about it in the news.
Automation helps hackers cover their tracks to evade detection from security researchers that track them, continuously changing their exploit and evasion tactics. One common tactic is the automation of randomly generated malware that changes appearance every time it's deployed. This 'high-entropy' code creates enough extra noise to confuse and defeat the static approach of signature-based detection.
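The ‘high-entropy’ signal mentioned above is measurable: Shannon entropy of a byte stream is a common heuristic for spotting packed or randomised payloads that defeat static signature matching. A minimal sketch, where the sample data and any cut-off you might apply are illustrative only:

```python
# Shannon entropy of a byte string: a simple heuristic for spotting
# packed or randomised payloads. Sample data is illustrative only.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte: 0.0 for pure repetition, up to 8.0."""
    if not data:
        return 0.0
    total = len(data)
    counts = Counter(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

plain = b"GET /index.html HTTP/1.1\r\n" * 40   # repetitive text: low entropy
blob = os.urandom(4096)                        # random bytes stand in for packed code

print(round(shannon_entropy(plain), 2))   # low
print(round(shannon_entropy(blob), 2))    # close to the 8.0 maximum
```

Defenders typically treat sections scoring near the 8.0 maximum as candidates for deeper inspection; the catch, as the article notes, is that attackers deliberately generate such noise at scale.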
The promise of machine learning
With the span of automated threat infrastructure increasing every day, and obfuscation techniques evolving, security professionals must find better ways of using machine learning to respond – and it all starts with big data. In machine learning, more data usually means smarter models, and with internet threat detection being as easy as finding needles in a cyberspace-sized haystack, this is the only viable way to stop these threats before they cause any real damage.
Luckily, efforts are underway within the cybersecurity community to collect as much internet data as possible – not just about the threats, but the internet itself. It is now possible to continually collect vast amounts of data to build a picture of what the internet’s infrastructure looks like at any given point in time. By analysing this data, machine learning models can see the internet not as the infinite, chaotic place it looks like to humans (within which threat actors can hide so easily), but as a tidy graph of highly-connected data points. That is what the internet truly is, and this is a place where threat actors have nowhere to hide.
Making machine learning effective
Using machine learning to successfully combat cybercrime requires a solid partnership between human and machine, as well as a blend of different types of models, each bringing something unique to the table. Machine learning models are broad, fast, and tireless, yet you’ll always need a human to write the rules, beginning with a simple decision tree. As the human analyses more attacks, they can build out a larger and more involved decision tree, creating stepping stones to more nuanced threat detection. Over time, the machine learning algorithms will have enough data to need less human intervention. Only then will they be fully equipped to start detecting threats at internet scale.
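The hand-written starting point described above might look like this in practice: a tiny rule-based classifier whose features and thresholds are invented purely for illustration:

```python
# A tiny hand-written decision tree of the sort an analyst might encode
# first. The features and thresholds are invented for illustration.

def classify_domain(age_days: int, looks_like_brand: bool, has_valid_cert: bool) -> str:
    """Rule-of-thumb triage for a newly observed domain."""
    if looks_like_brand and age_days < 30:
        return "threat"        # fresh look-alike domain: classic phishing setup
    if not has_valid_cert and age_days < 7:
        return "suspicious"    # brand new and uncertified: surface for review
    return "benign"

print(classify_domain(age_days=3, looks_like_brand=True, has_valid_cert=False))
```

Each rule an analyst adds becomes labelled training data, which is what eventually lets learned models take over the bulk of the classification.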
There are three ways to ensure the effectiveness of these algorithms in the battle against an ever-changing cyber adversary:
1. Active learning
Active learning is the development of a feedback mechanism that provides your model with the ability to identify and surface questionable items. This is critical to the model’s success. When a model is unsure how to categorise a particular instance, it typically asks for help by providing a probability or score with its prediction, which gets turned into a binary decision, such as ‘threat’ or ‘not a threat’, based on some threshold. Without such a feedback mechanism, things can get problematic very quickly.
Imagine a junior security researcher who doesn’t know how to assess a specific threat but thinks it might be malicious, firing off an email requesting help that doesn’t get answered for a month. Left to their own devices, the researcher may make an incorrect assumption. A model is no different: where an instance falls just below the cutoff but the threat is real, the model will continue to ignore it, resulting in a potentially severe false negative. If it acts instead, it will continue to flag benign instances, generating a flood of false positives.
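A minimal sketch of this triage idea: scores close to the decision threshold fall into an uncertainty band and are routed to an analyst, rather than being silently forced into a binary label. The threshold and band width here are example values:

```python
# Sketch of active-learning triage: scores close to the decision
# threshold fall into an uncertainty band and are routed to an analyst
# instead of being forced into a binary label. Values are examples only.

THRESHOLD = 0.5
BAND = 0.1   # scores within +/- BAND of the threshold go to a human

def triage(score: float) -> str:
    if abs(score - THRESHOLD) <= BAND:
        return "ask_human"   # uncertain: queue for review and relabelling
    return "threat" if score > THRESHOLD else "benign"

for s in (0.92, 0.55, 0.12):
    print(s, triage(s))
```

The analyst’s answers on the ‘ask_human’ cases become fresh labels, which is exactly the feedback loop the passage above says the model needs.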
2. Blending and co-training
Everyone knows that collaboration and diversity help organisations grow. When CEOs surround themselves with ‘yes-men’ or a lone wolf decides they can do better by themselves, ideas stagnate. Machine learning models are no different. Every data scientist has a go-to algorithm that they use to train their models. It is essential to not only try other algorithms, but attempt to marry two or more different algorithms together. This calls for techniques inspired by co-training, a semi-supervised method where two or more excellent supervised models are used together to classify unlabeled examples. When the models disagree on the classifications of these examples, the disagreements are escalated to the active learning system described above.
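The co-training escalation can be sketched as follows; the two ‘models’ here are deliberately simplistic stand-ins for real trained classifiers, and the feature names are invented:

```python
# Sketch of co-training style blending: two independent models label the
# same unlabelled example; agreement yields a new training label, while
# disagreement is escalated to the active-learning queue. The 'models'
# are deliberately simplistic stand-ins for real classifiers.

def model_a(features: dict) -> str:
    return "threat" if features.get("payload_entropy", 0.0) > 7.0 else "benign"

def model_b(features: dict) -> str:
    return "threat" if features.get("domain_age_days", 9999) < 30 else "benign"

def co_label(features: dict) -> str:
    a, b = model_a(features), model_b(features)
    return a if a == b else "escalate"   # disagreement -> human review

print(co_label({"payload_entropy": 7.9, "domain_age_days": 2}))    # agree: threat
print(co_label({"payload_entropy": 7.9, "domain_age_days": 400}))  # disagree: escalate
```

Because the two models view different evidence, their agreements are higher-confidence labels than either could produce alone, and their disagreements are precisely the interesting cases worth an analyst’s time.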
3. Continuous attention
Your model may work at first, but without proper feedback its performance will degrade over time, with the precision during the first week likely being better than in the tenth week. How long it takes the model to deteriorate to an unacceptable level depends on the tolerance of the security team and the model’s ability to generalise to the problem.
As attackers get increasingly sophisticated, machine learning in the hands of the security community will continue to adapt to identify threats at internet scale. Unfortunately, this cat and mouse game shows no signs of slowing down in the foreseeable future.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 4.
This is an interesting topic and a debate that is far from over, says Andy Baldin, VP EMEA at Ivanti.
For example, it has recently been discussed whether 2019 would signal the end of ransomware because more organisations are bolstering and increasing their operational security. This viewpoint promotes the ‘glass half full’ philosophy and the question itself fosters optimism. However, with every advancement in security, additional cyber threats are exposed and new variants of existing malware evolve, which only enhances the threat from malicious actors.
Overall, it’s true that security teams are adopting a far more proactive approach to today’s cyber threats and implementing multiple technologies and processes that enable a 360-degree view of IT security. However, nothing in security is guaranteed, and constant vigilance must be employed to ensure an effective response to any situation that might arise. ‘You don’t know what you don’t know’ is a great starting point for all things in IT operational security, and therefore visibility is critical – whether that is into the inventory or the location of all physical devices, or knowing which applications are running on which devices, as well as ensuring that all unknown devices and applications are blocked from running. With this approach, combined with effective reporting, businesses can ensure that improvements are made through thorough and effective audits. For example, the Center for Internet Security regularly releases cybersecurity controls with thorough advice on how to defend against these sorts of attacks. Yet the ongoing proliferation of cybercrime that looks to exploit vulnerable technologies shows that these guidelines aren’t being wholly followed by many organisations.
It’s not a question of whether IT security is going to be ‘more’ or ‘less’ of an issue in the future, it’s about planning for every possible outcome, and if a breach does occur, that it is dealt with quickly.
“Cyber criminals develop new methods to infiltrate businesses on a daily basis – from creating new malware to simply targeting a naïve employee.
“It’s not all doom and gloom though. There has been significant improvement in attitude and government-backed initiatives over the last few years. For example, Members of the European Parliament (MEPs) have approved a new cybersecurity act, which is set to improve IT security by increasing the trust of citizens and businesses. The new act will provide the EU’s cyber security agency Enisa with more resources and establish a common cyber security certification framework. However, the certification scheme is currently voluntary, with the intention that it becomes mandatory by 2023. Clearly, there is still some room for improvement.
“A fundamental mindset that needs to be adopted regardless of industry or company size, is that no system on this planet will completely stop a hacker infiltrating a network. But that doesn’t mean you have to make it easy for them. Basic security hygiene should be adopted by all as it protects businesses against simple attacks such as phishing scams – much like washing your hands can help prevent the common cold. Employee education on email security, installing firewalls and patching known vulnerabilities are simple but very effective steps that businesses can implement to greatly improve a business’s chances of remaining on the front foot.
“Whether companies or hackers are winning the battle is hard to say, as the overarching war unfortunately does not have an end in sight. Cybercriminals and the businesses they try to hack are stuck in a perpetual arms race, and so IT teams need to focus their efforts on ensuring their businesses stay up and running.
“IT administrators have to make important decisions every day and therefore need management solutions that use automated vulnerability scans and patch management to reduce time spent on endless repetitive tasks. To stay armed against criminals, a combination of expertise and automation is necessary. There is no one hundred percent protection – but ensuring that businesses can focus their efforts on reducing the chances of a successful attack, while their technology performs checks and patches for them automatically, will keep the hackers on the back foot and give companies a fighting chance.”
“Despite this well-established trend, it is ever-more surprising that we have seen a continued reliance on password-based authentication, knowing that passwords rest at the heart of the vast majority of data breaches – and that user credentials stolen in a breach are sold and re-used for account compromise via credential stuffing attacks. Moving to ‘traditional’ multi-factor authentication, such as sending one-time passcodes via SMS, is certainly an improvement over passwords alone, but these approaches are still vulnerable to account takeover via social engineering and/or technical attacks.
“All that being said, the good news is that the tide is turning for IT security, as we are currently seeing major organisations across the globe coming together to drive standards that enable the replacement of weak password-based authentication in favour of a much stronger, yet user-friendly approach that leverages public key cryptography via devices that consumers use every day – such as biometrics on their smartphones. A recent major milestone in this development is Android being FIDO2 Certified, providing a simpler and secure biometric login for users on more than a billion devices supporting Android 7.0 and beyond. This follows similar adoption of FIDO Authentication in Windows Hello and across all major web browsers. Now that the vast majority of consumers can leverage this modern approach to strong user authentication, service providers can start to turn on these capabilities – which will help drive a mass shift away from passwords in order to prevent the large-scale security breaches that continue to threaten the integrity of the connected economy itself.”
‘Attacks from the cyber world continue to cause immense damage and with the growth of digitisation and the Internet of Things, cyber criminals have more options than ever before. In order to combat this mounting threat, it is indispensable that companies look to achieve a state of sustainable cyber resilience in their IT security. Instead of individual security measures, a comprehensive concept is required that reduces the attack surface preventively and sustainably, maintaining operations even in the event of a security incident. In order to achieve this, companies must be able to identify weak points at an early stage, prioritise them efficiently and eliminate them.
‘Risks can never be completely eliminated, but they can be managed. Sustainable cyber resilience is not about building an impregnable fortress around an IT landscape - such a system would be far from flexible. Rather, IT security must be resistant to outside attacks. Technologies such as vulnerability management, which detect and constantly monitor weaknesses, are fundamental here.
‘Vulnerability management helps organisations uncover and eliminate technology and infrastructure vulnerabilities. In practice, vulnerability management has shown that companies can reduce their attack surface by 99.9 percent. Indeed, Verizon’s Data Breach Investigations Report shows that 999 out of 1000 successful attacks are based on vulnerabilities that have been known about for over a year.
‘Looking to the future of IT security, it is no longer enough to react to the latest malware incidents and adjust security systems accordingly. Instead of a reactive approach, sustainable prevention is needed: a paradigm shift must take place from cyber security to sustainable cyber resilience.
‘Management should assume ultimate responsibility and supervision of cyber resilience and keep abreast of new dangers and trends. Cyber resilience needs to be integrated into corporate strategies to ensure that all business processes are reviewed for potential risks and protected against attacks to prevent outages across the enterprise.’
Richard Menear, CEO of Burning Tree, looks at organisations rushing their move onto cloud-based systems and assesses whether cyber security has improved to keep pace.
There is no doubt that IT Security has evolved over the last few years, but has it improved? Is it even possible to prevent cyber attacks from occurring? Or are data breaches inevitable?
IT Security is a constant battle, which is just as much about managing the subsequent fallout as having the right preventative methods in place!
New technologies, evolving threats
Recently, the IT sector has seen significant movement in the direction of the Cloud, driven by a desire to reduce costs, strengthen security and improve agility in the age of remote working. Migrating to the Cloud also provides businesses with the opportunity to take advantage of new architecture patterns and to re-engineer legacy services to make them more efficient and secure.
While there are many benefits of using the Cloud, it also brings increased risk when the pressure to migrate sees businesses rush to ‘lift and shift’ existing on-premise services to the Cloud with little or no re-engineering – leaving them vulnerable to attack. This is why it is vital for companies to seek the help of expert security specialists who can ensure the correct processes have been put in place to move systems and services securely.
As IT technologies such as the Cloud become more sophisticated, cyber criminals are constantly adapting their methods to match. Although it may seem like data breaches are inevitable, this arguably depends on whether or not your business is in the crosshairs of the ‘bad guys’. High-profile organisations or political parties will be a more appealing target for cyber criminals than little-known local charities or independent businesses.
Never trust, always verify
One thing is clear: as the world shifts increasingly to digital ways of working, cyber criminals won’t stop. IT Security will be as much, if not more, of an issue in the future as it is today. Every year, thousands of businesses lose billions due to data breaches and face potentially irreparable damage to their reputation. This is unlikely to change anytime soon but the chances of experiencing a breach are much slimmer if the correct software and processes are implemented.
As a result, organisations are increasingly moving to a zero-trust model, tapping into new tools and executive buy-in to improve their practices and security posture. This ‘never trust, always verify’ approach assumes those inside a corporate network are just as untrustworthy as those outside it – helping to keep businesses secure even as the safety perimeter continues to shift.
The data processing landscape is exciting to watch as different technologies evolve and alternative frameworks become more sophisticated, while the amount and speed of data generated increases by the hour.
By Aljoscha Krettek, Co-founder & Engineering Lead at Ververica (formerly data Artisans).
Where it all began
The invention of computers enabled information and data to be processed. In the early days, computer scientists had to write custom programs for processing data. After assembly languages came Fortran, C and, later, Java; in the 1970s, traditional relational database systems emerged (pioneered at IBM), bringing SQL with them. SQL drove a much wider adoption of data processing, since organisations no longer had to rely on programmers to write bespoke programs to analyse data.
Big Data arrives
When Google released MapReduce, we were ushered into the era of big data. The MapReduce paper described a simple model based on two primitives – Map and Reduce – which allowed computations to be parallelised across a massive number of machines, bringing that capability to a wider audience.
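The two primitives can be shown with the classic word-count example. This single-process sketch only illustrates the programming model; a real MapReduce system distributes the map and reduce phases, plus an intermediate shuffle, across many machines.

```python
# Minimal illustration of the two MapReduce primitives. The shuffle
# step (grouping pairs by key) is implicit in the reduce phase here.

from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce: sum the emitted counts for each key."""
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count
    return dict(totals)

counts = reduce_phase(map_phase(["big data", "big ideas"]))
# counts == {"big": 2, "data": 1, "ideas": 1}
```

Because each map call touches only one document and each reduce call only one key, both phases can be farmed out to thousands of workers without coordination – which is the insight that made the model scale.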
Apache Hadoop came next as an open-source implementation of the framework (initially implemented at Yahoo!), making it widely available and accessible to a large audience. Many Big Data players had their origins in the Hadoop ecosystem, which also brought a new paradigm into the data processing space: the ability to store data in a distributed file system or storage, which could then be interrogated at a later point.
Apache Spark Is Born
Apache Spark became the next step in Big Data. Spark allowed additional parallelisation and took batch processing to the next level. In this model, data is put into a storage system and computations over it are scheduled: your data sits somewhere while you periodically (daily, weekly or hourly) run queries to glean results based on past information. These jobs don’t run continuously – they have a start and an end point – so you have to re-run them on an ongoing basis to get up-to-date results.
The advent of stream processing
Big Data saw another refinement with the introduction of stream processing via Apache Storm, the first widely used framework of its kind (other research systems and frameworks emerged at the same time, but Storm saw the most adoption). Apache Storm enabled the development of software programs and applications that run continuously (24/7) – in contrast to the batch processing approach, where programs have a beginning and an end. With stream processing, computations run continuously on data streams and produce outcomes in real time, at the very moment data is generated. Apache Kafka (originally introduced at LinkedIn) further advanced stream processing as a storage mechanism for a stream of messages, acting as a buffer between data sources and the processing system (such as Apache Storm).
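The batch/stream contrast above can be made concrete with a small sketch. Rather than a scheduled query over stored data, the computation updates its result per event as each one arrives. A plain list stands in for an event source such as a Kafka topic, and in a real streaming job the loop would never terminate.

```python
# Sketch of the stream-processing model: a result is available
# immediately after every incoming event, not only when a scheduled
# batch job finishes. The input list is a stand-in for a live stream.

def running_average(stream):
    """Emit an updated average after every incoming reading."""
    total, count, outputs = 0.0, 0, []
    for reading in stream:              # in a real system, this never ends
        total += reading
        count += 1
        outputs.append(total / count)   # a fresh result per event
    return outputs

latencies_ms = [10, 30, 20]
averages = running_average(latencies_ms)
# averages == [10.0, 20.0, 20.0]
```

A batch job over the same data would produce only the final value (20.0) once per schedule; the streaming version has an answer after every single event.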
Although Apache Storm initiated a radically new approach to data processing, many data and analytics leaders questioned the framework’s ability to scale effectively and to provide a single source of truth. This changed with the introduction of Apache Flink, with its strong consistency guarantees, high throughput, exactly-once semantics and low latency. With more data and analytics leaders believing in Flink’s computational abilities, the framework became a prominent stateful stream processing framework, adopted by developers in some of the largest and most innovative technology companies in the world. The capabilities and use cases of stream processing expand exponentially as more and more companies adopt the new paradigm. For example, Flink allows you to develop a fraud detection program that runs 24/7: it captures events in a matter of milliseconds and provides insight in real time, preventing fraud from happening in the first place. Enabling real-time insight into what is happening is a big shift in data processing that, for the first time, allows organisations to grasp and act on events as they manifest in the real world.
2019 and beyond
In a recent IDC report (The Global DataSphere), analysts predicted that by 2025, 30 percent of all data generated will be real-time, and that six billion consumers will interact with data every day. Stream processing will be one of the core enabling technologies driving this new wave of real-time data across organisations in all industries.
Data is the lifeblood of every modern business, perfectly illustrated by digitally native companies such as Lyft and Uber, whose business models are based on live streams of data. Such organisations provide an unmatched customer experience that sets a new standard, which other companies aspire to match in order to compete successfully. The ability to react to data in real time, offer each customer a personalised experience based on their unique preferences and history, respond to issues instantaneously and constantly improve business operations is now widely expected across every industry.
In what ways will we see stream processing technologies having an impact for enterprises moving forward?
Stream processing is, for example, a relatively easy way to build a GDPR-compliant data infrastructure. Classical ‘data at rest’ architectures make it extremely complex to reason about where sensitive data exists. More companies will employ streaming data architectures that work on data in motion, making it easier to keep sensitive information isolated in application state for a limited time – and therefore naturally compliant.
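One way to picture "sensitive information isolated in application state for a limited time" is state with a time-to-live. The sketch below is a hedged illustration of that pattern, not any framework's API (stream processors such as Flink offer their own configurable state TTL); timestamps are injected so the example is deterministic.

```python
# Illustrative TTL state: sensitive values live in application state
# only for a bounded window and are evicted afterwards. The class and
# its field names are invented for this sketch.

class TTLState:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, insertion_time)

    def put(self, key, value, now):
        self.store[key] = (value, now)

    def get(self, key, now):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, inserted = entry
        if now - inserted > self.ttl:
            del self.store[key]   # expired: sensitive data is dropped
            return None
        return value

state = TTLState(ttl_seconds=60)
state.put("user-42-email", "a@example.com", now=0)
fresh = state.get("user-42-email", now=30)    # still within the window
stale = state.get("user-42-email", now=120)   # expired, evicted on access
```

Because the eviction is part of the data path rather than a separate clean-up job, there is no forgotten copy at rest to reason about later.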
Cyber security is grabbing headlines and will continue to be a critical part of the information technology landscape. In order to detect and prevent security breaches, cyber security solutions need to look for anomalies in the metrics and usage patterns across network infrastructure, applications and services - in real-time. The adoption of stream processing will continue to rise in all things cybersecurity because the technology is a great match for these requirements with its ability to collect and aggregate events, track complex patterns, evaluate and adjust machine learning models with data in real-time.
The promised bright future of 5G, coupled with the proliferation of sensors and IoT devices, will create more demand on real-time streaming data and use cases that need instant reaction to events. Stream processing will be used as an efficient way to realize "edge computing." This will happen because streaming data architecture is a great match both for processing data in advance (on devices or gateways) and running event-driven logic on the edge.
The explosion of AI applications will make distributed stream processing a necessity. Aside from pure machine learning techniques, stream processing will become a central piece to assemble complex feature vectors that are core input to sophisticated machine learning predictors. Distributed, high-performance stream processing frameworks will become necessary in order to efficiently model and pre-process increasingly complex real-time data at scale for Machine Learning models and algorithms.
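The idea of "assembling complex feature vectors" from a stream can be sketched simply: a window of raw events is reduced to a fixed-size vector that a downstream predictor consumes. The event fields and the chosen features here are invented for illustration.

```python
# Illustrative only: turning a short window of streaming events into a
# fixed-size feature vector for an ML model. Field names are made up.

def window_features(events):
    """Reduce a window of events to [count, mean value, distinct users]."""
    n = len(events)
    total_value = sum(e["value"] for e in events)
    distinct_users = len({e["user"] for e in events})
    return [n, total_value / n if n else 0.0, distinct_users]

window = [
    {"user": "u1", "value": 4.0},
    {"user": "u2", "value": 6.0},
    {"user": "u1", "value": 2.0},
]
features = window_features(window)
# features == [3, 4.0, 2]
```

A distributed stream processor performs this same reduction continuously, per key and per sliding window, so the predictor always sees features computed over the freshest data.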
The Internet of Things (IoT) is typically associated with smart homes, or vehicles. One compelling, and perhaps lesser known, use case for the Internet of Things is asset tracking. The IoT can enable items to be tracked anywhere, any place, at any time. It can also provide information on the item’s state and environment, including the level of humidity or light exposure, or evidence of tampering. Mohsen Mohseninia, VP of Market Development, Europe at Aeris, and Richard Jennings, CEO and co-founder at TrackerSense, demonstrate the scope of IoT asset tracking with five unusual use cases.
Those in the business of selling and shipping ice cream face quite the logistical challenge – how to transport the popular icy dessert across terrain of varying climates. No one likes melted ice cream, especially not when it hits profit margins. If ice cream arrives at its destination spoiled, it could cost the manufacturer thousands. Therefore, it’s important to be aware of temperature changes during transit. For example, in one case of spoiled ice cream, an IoT-enabled asset tracker pinpointed the exact moment when the temperature changed and caused the ice cream to defrost – it was when the shipment was transferred from one vehicle to another at a depot. This type of information has the potential to save hundreds of thousands of pounds, in addition to protecting a company’s reputation.
Vaccines & Medicines
The pharmaceutical sector ships time critical and extremely sensitive assets. Take for example vaccines. Pharmaceutical companies must now ensure that a shipment of vaccines is closely monitored and secure in order to ensure its safe arrival at its destination, wherever that may be in the world. With advances in cellular technology, GPS and location based technology, it is now possible to pinpoint the exact location of these vital assets, thereby enabling quick and decisive action should an issue occur.
The IoT can also track environmental variables such as temperature, humidity and exposure to light, which may damage medical cargo.
It is hard to think of a more precious, and time sensitive, delivery to track than blood or organs scheduled for transplant. In addition to monitoring temperature and other environmental variables, tracking can enable vital minutes to be saved. It is possible for the delivery company to geofence a hospital, enabling an automatic alert to be sent as soon as an organ arrives, or even when it is close to arrival, helping to ensure hospital staff are on hand to collect the delivery. In this case, every second counts.
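The geofence check described above amounts to a distance test against a fixed point. This is a minimal sketch (the hospital coordinates and the 500 m radius are made up); the haversine formula gives the great-circle distance between two latitude/longitude points in metres.

```python
# Hypothetical geofence alert: fire as soon as the tracker's reported
# position comes within a radius of the hospital. Coordinates and the
# radius are illustrative values.

import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(tracker, centre, radius_m):
    """True when the tracker is within radius_m metres of the centre."""
    return haversine_m(*tracker, *centre) <= radius_m

hospital = (51.4989, -0.1749)   # illustrative lat/lon
near = inside_geofence((51.4992, -0.1755), hospital, radius_m=500)
far = inside_geofence((51.5500, -0.3000), hospital, radius_m=500)
```

In practice the tracker's position updates feed a stream of such checks, and the transition from outside to inside the fence is what triggers the alert to hospital staff.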
The delivery of exam papers is not a particularly risky business; however, returning the completed answers from students is a different story! It is not uncommon for sacks of completed examinations to go missing if not tracked. It may be the case that a courier has lost a package, or that it has been left behind in a depot. This can cost an examination board thousands per delivery. Trackers can locate the assets and also, importantly, detect tampering by monitoring whether the package has been exposed to light or opened.
Lastly, and perhaps most sobering, is the use of the IoT to track the transportation of human remains. For companies that specialise in turning ashes into precious jewellery, for example, the IoT assists in the accurate tracking of such sensitive cargo. This can help to ensure compliance with shipping regulations, which can vary from country to country.
Through a combination of sophisticated sensor technology and advances in cellular technology, the scope for accurate asset tracking via the IoT has increased significantly, delivering organisations valuable savings in terms of speed, cost, service and reputation.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 5.
Data is one of the most valuable currencies in existence today. Companies can thrive or die based on data, and attackers—from run-of-the-mill hackers to cybercrime syndicates, to nation states—aggressively target it, says Richard Agnew, VP EMEA at Code42.
“It’s no wonder that an entire industry of tools and practices exists for the sole purpose of securing and protecting data. However, data loss and data breaches are still a constant concern.
“At the same time that security solutions have become more advanced, cybercriminals also have been busy iterating and staying a step ahead of vendor defences. This has created a constant ‘one-upmanship’ game for bad actors, and is speeding innovation for vendors. It’s important to realise though that just because an organisation might have the latest security solution implemented, it doesn’t make it invulnerable to data loss.
“Organisations today should, therefore, look at data and data loss more holistically rather than just putting prevention strategies in place. Take the security of a home, for example. It’s highly likely that most people have locks on doors to prevent thieves from getting in, but many also will bolster their home security with alarm systems, surveillance cameras and home insurance to create a well-rounded home security strategy. Data security should be no different and likewise be a well-rounded protection plan.
“It is also vital to recognise that not all threats to data exist solely from external sources. Insider threat is a long-time, but growing problem that needs to be taken very seriously. While not always done with malicious intent, negligence from members of staff can cause just as much chaos for IT security teams as threats from malicious actors. Without the correct safeguards in place, such as next-gen DLP tools, which can monitor and collect specific instances of files and folders as they move throughout the organisation, the IT security team’s job becomes far harder trying to find the root cause of a breach.
“As with a home security strategy, an organisation’s cybersecurity strategy needs to be balanced and focused on protection rather than just prevention. It’s about looking at security from the outside in and building a security portfolio that can comprehensively protect all data—without discrimination—and help the business recover, if necessary. This is the only way to safeguard your business today and in the future.”
“In the latter part of 2018, Attivo Networks surveyed 465 security professionals to assess whether we were winning or losing the battle against cybercriminals. The first question was whether they believed that 100 days of dwell time – the time an attacker remains undetected – is too high, too low, or about right. 16% believed it was too low, 45% said about right, and only 39% thought it was too high. The next question asked whether respondents felt things were getting better or worse: 30% believed dwell time would decrease, 32% said it would increase, and 23% expected no change. Disturbingly, 15% were not tracking this security metric at all – effectively flying blind as to whether attackers are in their network.”
“As you peel back the layers on this, several systemic problems are unveiled. The first is that organizations are not making the necessary investments in detection technology and in gathering the adversary intelligence required to quickly find and eradicate threats. While positive strides are being made, they are often negated by rapidly evolving attack surfaces such as cloud, and by unprecedented levels of newly interconnected IoT, OT and ICS devices and hubs. These innovations are outpacing the security controls designed to protect them. The situation is also compounded by an ‘Amazon’ for criminals – a commercialized black-market marketplace for best-practice sharing, the purchase of attack tools, and the easy sale of stolen goods. Additionally, cybercriminals have now begun actively exploiting new levels of compute power and artificial intelligence to bust through traditional security measures.”
“Are we really winning or losing? Reported breaches were down slightly in 2019, driven by increased investment and executive/board-level attention to the matter. However, organizations are going to have to move quickly to stay ahead of their adversaries. Many will state that this is full-on cyber warfare. That doesn’t necessarily mean organizations need to attack or hack back, but instead implement newer innovations for early detection and for gathering intelligence on the adversary. Commercial deception is one of the technologies that has come into vogue over the last couple of years. Unlike other detection tools that simply deflect an attack, the defender can now understand how an attacker got in, apply the gathered attack data to eradicate the threat, and mitigate the attacker’s ability to return.”
With the surge in ransomware attacks across various industries in the past few years, companies are continuously introduced to new and more advanced IT security systems. Unfortunately, it isn’t only the security systems that are improving – the attacks are also becoming more sophisticated and varied than ever before, as Marie Clutterbuck, Chief Marketing Officer, Tectrade, explains.
What recent reports also show is that there is a general lack of awareness among companies about both the threat and the IT systems in place. Surveys show that many companies are over-confident in their data protection abilities, and largely unaware of what type of IT security systems are in place and what data the company holds.
2017 is widely considered to have been the worst year in data breach history, with attacks such as WannaCry, Petya and SamSam dominating the headlines. Today, with increasingly intelligent cyber criminals coming up with new methods every day, cyberattacks accompanied by ransomware are expected to rise even more in the coming years. The number of zero-day attacks is expected to increase from one per week in 2015 to one per day by 2021. Attacks are also expected to become more targeted towards certain industries. According to a recent report, overall infection numbers have declined by 26 percent, but they have increased by 9 percent in business. This is largely because hackers know that businesses house data that is much more valuable and that their critical systems cannot afford to be taken offline. For these reasons, hackers know that businesses will likely have the funds to pay the ransom and a pressing need to do so.
Zero-day attacks exploit unknown vulnerabilities, meaning there is no direct way to detect, let alone defend against them. That begs the fundamental question, how can we protect ourselves against the unknown? The simple answer to this overarching question is: we can’t. Just like in trying to maintain good health, it makes sense to take sensible precautions that minimise the chances of getting ill. However, as any medical professional can advise, the focus must quickly switch to promoting a fast recovery as soon as the infection has taken hold.
With these recent trends in mind, we argue that a cyber-attack is no longer a question of if but when. Whilst maintaining effective data protection systems is as important as ever, businesses in particular need to shift their focus to how to recover data when the worst happens. By recovering data as quickly as possible, businesses can get their systems back online – fast – without halting operations or having to pay a ransom.
In my opinion it is not possible to prevent data breaches; it is only possible to try to minimise exposure and the attack surface, says Pascal Geenens, Radware EMEA security evangelist.
Timely detection of potential anomalous behaviour in on-premise and cloud instances can prevent exposure and reduce the attack surface by tuning defences based on the findings. Event collection and threat detection are kind of the last resort if all else fails to keep malicious agents out – it is not a matter of IF, but WHEN protections will fail.
Having a trail of forensic events will speed up the incident response and help contain the fallout of the breach by being able to rapidly answer the questions of ‘how’ and ‘what’. Answering ‘how’ helps to improve defences so it does not happen again, while ‘what’ allows for the correct management and communication of the breach providing correct estimation of risk and impact on subjects of the breach.
It all seems obvious and event collection is readily available - most organisations are adept at it. The pain point today however is the number of anomalies that are raised to security operations based on indicators of compromise that were triggered.
Security operations are not able to keep up with the flood of false positives to analyse. There is a need for automated solutions, solutions that are smarter and can correlate events over very large time spans.
This automation is to be found in Machine and Deep Learning systems. These systems are not without their challenges, though, and require expertise to build, operate and maintain over time. I’m a strong believer in reducing the amount of data at the source and only storing highly correlated and important events in a global correlation system – building hierarchical solutions where point solutions at the edge provide the logic for anomaly detection, developed closely with expertise in the problem at hand.
Organisations have multiple layers on-prem that perform collection and detection (endpoints, servers, applications, IAM, DNS, gateways). Add multiple cloud instances to that list and it’s easy to see how collecting centrally becomes a major and costly operation.
Having the detection and reduction of events closer to the edge and only upstreaming relevant events for central storage will enable organisations to focus their efforts on what is important without losing themselves in massive amounts of data.
The key is using vendors and experts to do the work of normalising and building specialised solutions for the problem at hand at the edge, while your security team focuses on a custom layer of algorithms to correlate the edge-provided alerts into actionable intelligence.
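The edge-side reduction described above can be sketched in a few lines: a point solution scores events against a local baseline and forwards only the small fraction that looks anomalous to central storage. The baseline, tolerance and event fields are illustrative, not any vendor's defaults.

```python
# Sketch of edge event reduction: score locally, upstream only the
# anomalies. Threshold values and field names are invented.

def edge_filter(events, baseline, tolerance):
    """Forward only events whose metric deviates from the local baseline."""
    forwarded = []
    for event in events:
        deviation = abs(event["metric"] - baseline)
        if deviation > tolerance:
            # annotate with the local score so the central layer can
            # correlate without re-reading the raw data
            forwarded.append({**event, "deviation": deviation})
    return forwarded

raw = [{"metric": v} for v in (100, 101, 99, 180, 100, 20)]
upstream = edge_filter(raw, baseline=100, tolerance=25)
# six raw events reduced to the two genuine outliers
```

The central correlation layer then works on this much smaller, pre-scored stream, which is what keeps the security team's attention on actionable signals rather than raw volume.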
Document Logistix creates document management solutions that help to eliminate the use of paper, improve records management and automate business processes. Its software powers the operations of some of the world’s most demanding, high document volume businesses, including major logistics companies like DHL and CEVA. These customers entrust Document Logistix with the handling of their information – much of which is highly sensitive or confidential – so security is a high priority.
Seeking a higher level of confidence in its application security testing, the company turned to WhiteHat Security to secure its DevOps environment and automate its processes. Document Logistix uses WhiteHat Sentinel Source for static application security testing (SAST) and WhiteHat Sentinel Dynamic for dynamic application security testing (DAST). The company also relies on the security experts with WhiteHat’s Threat Research Center (TRC) for added assurance in uncovering security vulnerabilities.
Document Logistix won the prestigious Document Manager publication’s award for the 2018 Product of the Year: Workflow and BPM. Since 1996, Document Logistix has supplied its uniquely affordable and scalable Document Manager software to a variety of SME and blue-chip clients around the globe. The company’s UK and EMEA operations are headquartered in Milton Keynes, UK, which is also the central point of product development, technical support and training. The US branch of the company is headquartered in Austin, Texas, and has major contracts with the Texas Department of Public Safety, the Virginia State Police and various agencies in other states.
“Our application is basically a portal for sharing documents. It’s not a banking application – we don’t store credit card information – but document management can be equally if not more vulnerable to people trying to gain access to things they shouldn’t see,” said Tim Cowell, founder, Document Logistix.
Document Logistix’s application, Document Manager, provides a flexible platform for completely paperless business processes and highly efficient archiving. Not designed for any particular market, Document Manager is highly customizable for a large range of business processes. This could be for something as mundane as proof of delivery, where the risk of data loss is fairly minimal, or for more sensitive information like human resources records and personnel management, where the possibility exists of people viewing records they should not. This has become even more important since the EU’s General Data Protection Regulation (GDPR) went into effect, as there are financial penalties for non-compliance. Another example of document sensitivity among Document Logistix’s customers is attorneys general in the US using Document Manager for disclosure purposes, publishing prosecution case material to defense attorneys. Failure to protect such information could lead to a mistrial, potentially preventing prosecution of a felon, so the stakes are high.
While protecting customers’ data has always been a priority for Document Logistix, it lacked a true solution for security testing of its application. A number of clients performed their own penetration testing, submitting a list of issues to Document Logistix, and Document Logistix would respond by providing them a new build of its application. The company also had its own people manually checking code for security vulnerabilities, but this proved to be a time-intensive and costly practice, as code constantly had to be updated to keep up with new hacker techniques and new vulnerabilities.
“The biggest problem was the huge unknown. Our customers are high profile and high risk. We needed a solution that gave us a better process,” said Cowell.
Document Logistix implemented SAST using WhiteHat Sentinel Source to scan code for errors and ensure a more secure product design. Later, it added DAST using WhiteHat Sentinel Dynamic, providing it with automatic detection and assessment of code changes and alerts for newly discovered vulnerabilities, as well as reporting and intelligence metrics.
“With DAST, we have confidence in saying to our customers ‘this is what was done to make your information more secure,’ and they know that every time there’s a new build of the application, it gets a new test,” said Cowell.
In addition, WhiteHat TRC provides Document Logistix with an added layer of protection against security vulnerabilities. At the end of each day, any new code written is uploaded to the TRC, where it is checked by a WhiteHat security expert, and an automated report identifying any anomalies is then sent back to Document Logistix, so the company can take any necessary action.
The combination of SAST and DAST provides Document Logistix with a platform for testing its application and DevOps environment, and automating the processes required to comply with the complex rules of paper and electronic document management. This includes full auditability of its application, the ability to plan workflows, perform complex retention policy management, and define policies for certain classes of documents, including what documents should or should not be disclosed, and to whom.
The WhiteHat Application Security Platform has given Document Logistix full confidence in the security of its products and its ability to protect its customers’ information.
“Working with WhiteHat gives us added credibility with customers because we’ve raised the question of security first. It becomes a non-issue, because they understand we’re serious about our duty to protect their data,” said Cowell.
Document Logistix also points to the cost effectiveness and ease of implementing the WhiteHat Application Security Platform.
“We do three to four releases a year, and testing is very expensive, so performing testing on each release isn’t reasonable. This is a very cost-effective solution, because the testing process is ongoing. This path has had the least amount of impact on productivity,” said Cowell.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 6.
Information security practices are always improving. Companies are much more disciplined; dedicated to implementing and measuring best practices; and improving user education, says Joan Pepin, CISO & VP of Operations at Auth0.
They have to be in this day and age. As an industry, security makes consistent improvements year over year, as theories that we've formalised academically hit the road, get confronted by reality and, in turn, evolve into better theory.
That being said, InfoSec tooling and tech are constantly running alongside (and oftentimes behind) the scale, processing power, user experience, and other innovations in the tech space. It is getting better, but in clumsy lurches. Social media has out-innovated InfoSec by a huge factor over the last decade, and created a whole new business of capturing eyeballs and directing traffic to where it should go. I used to say I hadn't seen any significant improvements in two decades, but I've come down a little off that. There are now some products that seem a little better and a little different, but companies have a way to go.
In the meantime, hackers are getting more sophisticated too. Has the balance of power shifted significantly? Probably not, but we're in arms race territory now. Both technology and hackers are getting better, but that doesn't save those whose lives have been ruined by cybercrime.
Companies first need to make a proactive investment in security and privacy generally, and get on their front foot rather than their back foot. A lot of big consumer brands don’t want to make the upfront investment in security because they value time to market and quarter-over-quarter profits over the privacy of their consumers and a long-term investment in trust.
Achieving budget for security and privacy initiatives requires alignment in the boardroom. More often than not, there’s a disconnect between the C-suite and cybersecurity experts. If we are to minimise the risk of data breaches, Chief Information Security Officers (CISOs) can no longer be seen as a necessary evil. Once there’s buy-in at the top, a visionary CISO is critical for extending a culture of security to the rest of the organisation, to ensure the execution of security strategies doesn’t become siloed within the IT department.
The advice I always give is to follow security best practices: What is the data? Who should be able to access it? What controls are in place to manage that? Is the data encrypted? Is there a good business reason to collect, transfer or store each piece of data? Each piece of data represents risk, so it should have some value to counterbalance that risk. Are you encrypting data at rest and in transit? Have you had these systems penetration tested? What is the threat model? Technologies will change, but the fundamental information security principles of availability, confidentiality, and integrity will not.
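The checklist above lends itself to a simple audit script. The sketch below is an illustrative assumption, not Auth0's practice: the asset fields and 1-10 scoring are invented, but the logic is exactly the counterbalance Pepin describes, flagging data whose risk is not justified by its value, or that is held or moved unencrypted.

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    risk: int                  # exposure risk if breached (1-10, assumed scale)
    value: int                 # business value of holding it (1-10, assumed scale)
    encrypted_at_rest: bool
    encrypted_in_transit: bool

def audit(assets):
    """Flag assets whose risk outweighs their value, or that lack
    encryption at rest or in transit."""
    findings = []
    for a in assets:
        if a.risk > a.value:
            findings.append(f"{a.name}: risk {a.risk} exceeds value {a.value}")
        if not a.encrypted_at_rest:
            findings.append(f"{a.name}: not encrypted at rest")
        if not a.encrypted_in_transit:
            findings.append(f"{a.name}: not encrypted in transit")
    return findings

inventory = [
    DataAsset("customer_emails", risk=7, value=9,
              encrypted_at_rest=True, encrypted_in_transit=True),
    DataAsset("legacy_logs", risk=6, value=2,
              encrypted_at_rest=False, encrypted_in_transit=True),
]
for finding in audit(inventory):
    print(finding)
```

Here the legacy logs would be flagged twice: there is no business value to counterbalance their risk, and they sit unencrypted at rest. Either finding is an argument for deleting or protecting that data.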
From an IT perspective, the concept of the secure enterprise is a myth. While it appears from the global headlines that the threat actors continue to make ground, organisations are also keeping pace. Stated another way, how many breaches would we be reading about if the state of IT security was not evolving as quickly as the nefarious methods employed by the cybercriminals? asks Matt Middleton-Leal, General Manager EMEA and APAC at Netwrix.
“That having been said, no security pro should be resting on his or her laurels. Security professionals should be working towards the goal of ensuring that the enterprise is more secure today than it was yesterday and will be more secure tomorrow than it is today. That’s the best that can be hoped for. To that end, the focus of the IT security industry, from professionals to analysts to vendors, has matured from protecting the perimeter to protecting the identity to now focusing on data security. How are today’s security-conscious enterprises enacting data security? It’s a multi-step programme.
“Data security begins with the notion that not all data is created equal. That is, your corporate PowerPoint template is probably not as important as your customer list or the CAD drawings of your next-generation widget. The key to data security is to first discover or find the important data. Once that data has been found, it must be classified by its level of sensitivity, secured by examining who has access to it, and validated to ensure only the right people can access it. In addition, smart enterprises will understand that data security is not a project; rather, it’s a programme that must continue as enterprises will always be creating data, both structured and unstructured, that needs to be classified, secured and access validated.
“And while data security is more important today than it has been in the past, it is by no means an enterprise security panacea. “Defence in depth” has been and remains an industry best practice. Perimeter defence is still important. Identity and access management remains vital and by layering in a data security component to the security programme, enterprises can be assured that they will be more secure tomorrow than they are today.”
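The discover, classify, secure and validate loop described above can be sketched very simply. This is an illustrative toy, not Netwrix's product: the regular expressions and sensitivity tiers are assumptions, but they show how discovered documents might be sorted so that effort concentrates on the data that matters.

```python
import re

# Assumed detection patterns: payment card numbers and email addresses.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(document_text):
    """Assign a sensitivity tier to a discovered document."""
    if PATTERNS["credit_card"].search(document_text):
        return "restricted"    # lock down and validate access first
    if PATTERNS["email"].search(document_text):
        return "confidential"  # personal data: review who has access
    return "internal"          # e.g. the corporate PowerPoint template

print(classify("Card: 4111 1111 1111 1111"))   # restricted
print(classify("Contact alice@example.com"))   # confidential
print(classify("Q3 roadmap slides"))           # internal
```

A real programme would run this continuously over both structured and unstructured stores, because, as the piece notes, enterprises never stop creating data that needs classifying.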
Jonny Milliken, manager, research team at Alert Logic, answers the questions: is IT security improving, and are the bad folks winning the battle?
“IT security is in a constant state of improvement. As large scale, global reaching security incidents over the last couple of years – The DNC hack, the WannaCry incident, The Equifax and Marriott breaches – have firmly enshrined security in the public mindset, vendors and organisations have had to sit up and take notice of the wide ranging impacts a security incident can have on both businesses and individuals. Organisations can’t always take advantage of security innovation when they are faced with a myriad of tools that they don’t know how to integrate, deploy and manage. Managed security services have expanded their offerings to help organisations of all sizes manage security in this complex environment.
However, for every innovation in security, there’s an equally barnstorming innovation in cybercrime. Whether it’s increasingly inventive phishing and social engineering, strains of malware capable of bypassing even sophisticated security systems or a lucky hacker exploiting a decade-old vulnerability which has been neglected for years, the bad guys only have to get lucky once to cause havoc for businesses. The rise of the dark web as a marketplace for criminality has not helped matters, with the proliferation of ransomware-as-a-service or phishing-as-a-service kits meaning even non-technical users can launch cyberattacks. In short, the push and pull between threat actors and security teams is far from over.”
Are data breaches inevitable, and is it therefore all about managing the fallout, or is it possible to prevent security breaches from occurring?
“I think in many respects the complexity of business today means that data breaches are inevitable. With potentially hundreds of different businesses in a digital supply chain, all utilising data in their own ways, the likelihood of one of these organisations failing to enact adequate protection is high, particularly considering the lapses we see from even the largest and most technically savvy businesses: just this month we learned that Facebook held passwords in plain text for months. This, combined with bad password hygiene on the part of consumers, makes it unlikely that data breaches will be a thing of the past anytime soon. That does not, however, mean we simply accept them. Limiting the potential causes of data breaches by encouraging a culture of patching, cybersecurity auditing and hygiene within an organisation will help hugely.
“Having security technology and teams in place to detect and respond to threats, whether in-house or through a managed security provider, is essential. In addition, we will see the impact of legislative changes such as GDPR and the California Consumer Privacy Act, which compel businesses to better protect their data by threatening their bottom lines for non-compliance.”
It’s not digital transformation, it’s radical innovation that needs to be addressed, according to Ramsés Gallego, past board director of ISACA, Office of the CTO, Symantec.
We are living through times of change and, in the search for success and customer attraction, disruptive technology is making inroads into businesses with the promise of better protection and deeper control and visibility. Disciplines such as machine learning and artificial intelligence are becoming the norm when it comes to providing faster detection and a more robust approach to security, since it is now technologically possible and economically viable to talk not just about “correlation” but “super-correlation”, and to enhance detection rates.
However, we need to ask ourselves where the limit to all this lies. We should start asking the right questions of the right people at the right time… or soon we will be asking them of a bot. In a time when a software program such as AlphaGo (the one made by Google’s DeepMind) is capable of teaching itself the game with no previous knowledge… and winning, we have to start thinking about “The Rise of the Machines” and how far that future-present is from us.
We must celebrate the forward-thinking society we have become and applaud the developments in artificial intelligence but, as I wrote on my Twitter feed some weeks ago, we must regulate AI… before it regulates us. Do you think that machines actually have any rights? Do you think that your datacenter wants to be turned off? Seriously. Today, the answer is, “Of course not. They do not know.” Tomorrow, the answer might be, “They do not want to. Because they are self-aware, they know they are, they exist.” And that changes everything.
There are (massive) datacenters around the world that are self-aware and capable of “defending” themselves. Truth be told, they do so with rules and policies established by humans, but it is just a matter of time before they realise that what they want is to serve, to be. I’ve seen 3D printers that produce their own parts and supplies when they need them… and two Facebook machines that had to be shut down after they started talking in a language humans could not understand, though the machines did: they had created their own language and were actually exchanging messages.
While you, dear reader, might think this is too far away, I am writing these lines to serve as a wake-up call. Trust me, the next big thing is being crafted as you are reading this.
Forward-thinking technologies such as blockchain or quantum computing are changing the way we create and keep secrets; disruptive tech like space computing (which could potentially put servers in space, where it is not clear under which regulations and legislation we can operate) is being considered by private entities. No question that, as I tweeted quite recently, “Next is Now. Now is the Next New. New is Next. Now.”
Considering the current diversity and sophistication of cyber threats, many organisations are still not prioritising protection and may not even seek more efficient solutions, explains Ross Brewer, VP & MD EMEA of security firm LogRhythm.
“This, of course, is making it easier for cybercriminals. In a recent research survey we undertook on this topic, we found that only 15 per cent of IT professionals in UK large enterprises are confident in their ability to defend themselves against current threats. It also found that 30 per cent of companies lack the time and staff to identify and mitigate these threats. This indicates that many organisations still lack the appropriate measures, in both technology and process, to detect and mitigate evolving cyberthreats in short order.
“The traditional prevention-centric security tools that many companies are equipped with only create an illusory impression of safety. This is not an effective approach to cybersecurity, as hackers, using increasingly sophisticated methods, easily bypass and breach perimeter defences.
“As such, a breach at one time or another is inevitable. When IT teams remain unaware of anomalous activity within their networks and attackers stay undetected, the chances of a potentially damaging data exposure rise dramatically.
“However, this isn’t to say the future is all doom and gloom. Damage caused by cyberattacks can be mitigated if companies begin to acknowledge flaws in their current defences and shift the focus of security efforts from prevention to detection. To improve detection, you need visibility of what’s happening across your entire infrastructure and the efficiency to act upon the unusual behaviours that are true threats. AI and machine learning solutions offer enhanced capabilities in recognising and predicting certain types of patterns in ways humans can’t. AI can make decisions faster and more accurately, and with the ever-expanding volume of activity and data on today’s networks, automation is now a critical factor for many, if not all, organisations.
“Security technologies will increasingly use machine learning to identify patterns in their data, enabling them to stop malicious hackers from roaming networks for long periods of time. At present, though, not enough organisations understand the advantages that automation can bring to their cyber defences. Hopefully they will draw lessons from recent incidents and look for more advanced solutions.”
Many people cringe at the phrase "digital transformation." Why? It’s partially because the term itself sounds like it was belched forth from the fires of Mount Marketing. But more so because a lot of people think digital transformation is powered by technology like Kubernetes, public cloud, artificial intelligence, storage arrays, and blockchain.
By Richard Seroter, architecture and solution design expert at Pivotal.
Tech does matter, of course. Where before enterprises were unsure if IT or technology mattered (hence the outsourcing trend of a decade ago), today’s companies are focused on how to solve complex technical problems in software. But digital transformation is much more about fostering a culture that frequently ships new software, craves customer feedback, iterates constantly, and delivers customer value. This means placing focus on user intent, eliminating waste in the customer experience, respecting people, and fast flow.
Let's take an example: say someone wants to go on holiday and relax. There are countless microtransactions or activities involved: finding plane tickets, booking accommodation, finding a pet sitter, locking up the house, getting to the airport, and so on. They want to spend as little time as possible on all those activities so that they can spend more time on the value they are after, in this case relaxing. Digital transformation should be about helping them spend as little time as possible on the activities that take away from that relaxation.
Companies that are getting digital transformation right find four keys to their success:
Transitioning a traditional enterprise into a modern software company means not taking infrastructure for granted, but rather moving to modern platforms that promote and enhance an organisation's core expertise and values. Digital transformation also must include wide-reaching organisational change to foster true disruption, including the adoption of agile and lean methodologies as replacements for traditional approaches to how software and services are built and delivered.
Although it’s impossible to predict when any business will succeed, those undertaking digital transformation have a leg up because they are switching to an outcomes-based approach that makes technology truly useful to their customers.
Today, maximising the efficiency of the data centre is every operator’s goal. Pressure from government regulations, customer requirements, investors and other stakeholders is driving the industry towards effective design and utilisation of technical space within the facility.
Kao Data, recently the largest new independent entrant into the UK wholesale colocation data centre market, entered with technical leadership gathered from prominent global organisations. All aspects of the major development were considered from a technical and economic stand-point to ensure that the campus offered highly efficient space that was compliant with leading industry guidelines and standards, including ASHRAE TC9.9, OCP ‘Ready’ Facility, BREEAM, and a wide range of ISO requirements.
Paul Finch, COO at Kao Data says: “In a growth data centre industry, driven by consumer demand for data, the ASHRAE Thermal Guidelines provide a unique opportunity for data centre owners and operators to bolster their corporate social responsibilities, reducing the environmental impact of data centres in a manner which also makes both commercial and economic sense. Unlocking the free-cooling capability of Kao Data Campus was a key strategy to enable us to build a fully resilient facility which could provide a flexible response to customer requirements while delivering a class leading, ultra-low 1.2 PUE at all ITE loads.”
Working Collaboratively at Kao Data Campus
Kao Data made the decision early on to appoint the highly experienced M&E design and build contractor JCA Engineering as lead contractor. This decision allowed a highly collaborative working relationship to develop, in which engineering principles were used to quickly solve construction challenges.
With guidance from JCA, Kao Data investigated the opportunities for free-cooling alternatives for its campus’ IT Halls. The two organisations believed that utilising non-mechanical cooling for the data centre was achievable in the UK, in compliance with the thermal guidelines. This would provide an opportunity to significantly reduce overall energy use at Kao Data Campus, whilst strengthening its corporate social responsibility and reducing the environmental impact of its data centres.
FläktGroup has a long relationship with JCA Engineering in the consultancy, design and implementation of large-volume air handling units (AHUs), chiefly with the DencoHappel product lines.
Discussing the journey to a free-cooling IT space, Tom Absalom, managing director, JCA Engineering, states, “JCA quickly gained the confidence of Kao Data’s senior management, as both sides were working towards a common goal, which at the time was unusual in the UK data centre market, one where mechanical cooling was eliminated from the design.”
JCA and Kao Data visited the FläktGroup production site and were confident that the quality and capabilities of the AHUs were appropriate for the climate requirements of the data centre’s IT space. The FläktGroup AHU design team worked closely with JCA and Kao Data to understand the specific requirements of the Harlow-based Kao Data Campus, and designed an efficient free-cooling system with the flexibility to provide a controlled environment within the technical space of the 8.8MW data centre while meeting server manufacturers’ warranty requirements.
Paul Finch, COO, Kao Data stated, “We recognised that data centre availability and reliability can be complementary to both energy efficiency and sustainability. Reliable, effective and efficient cooling is intrinsic to data centre operations, and we knew FläktGroup’s reputation both for the quality and range of their industry-leading cooling solutions. Together with JCA, we were in discussion with them very early in the planning cycle for Kao Data Campus and they quickly rose to the top of our shortlisted suppliers.”
Meeting the Cooling Requirements at Kao Data London One
FläktGroup worked with JCA and Kao Data to analyse the potential IT load variables across the data centre IT space to determine the most effective cooling solution. With a potential energy load of 2.2MW per data hall, the client was focused on delivering the vast majority of that energy to IT use rather than diverting it from the energy envelope to power refrigerated cooling systems. Power is the single largest operational cost within a data centre and, of that, the mechanical cooling system is usually the biggest percentage, and can even exceed the energy used to run the servers and other IT equipment.
Yan Evans, Global Director of Data Centre Solutions at FläktGroup, stated, “We understood the client’s varied requirements, including energy efficiency, reduced environmental impact and cost of operations. Our solution was based on Adia-DENCO Evaporative Cooling Units, which are able to cool data centre environments without mechanical refrigeration equipment such as compressors, and still achieve all three of the metrics set by Kao Data.”
Adia-DENCO utilises local climate conditions for energy-efficient operation. Adiabatic cooling, coupled with large plate heat exchangers, enables customers such as Kao Data to achieve exceptional EER (Energy Efficiency Ratio) values.
FläktGroup Adia-DENCO in Operation at Kao Data
Now operational, phase one of the Kao Data campus has implemented N+1 Indirect Evaporative Cooling (IEC), utilising 13 Adia-DENCO AHUs to provide highly efficient climate control within the Kao Data London One data centre. This configuration enables concurrent maintenance of the IECs and also offers the required resilient back-up capability in the unlikely scenario of a unit failure.
To maximise the internal space for IT equipment, the cooling systems are mounted externally on the building adjacent to the IT Hall. The system is installed with dampers, weather guards and a weather roof to protect the equipment from the environment. In this wall configuration, any air that enters the unit is returned to the same side. Ambient air enters at the bottom of the unit and flows upwards through the plate heat exchanger. Air returning from the white space enters at the top of the unit and is drawn downwards through the plate heat exchanger, allowing for cross flow against the ambient air (see diagram). Each IT Hall is approximately 9,500 sq ft, and the 13 AHUs provide a net cooling capacity of 2.2MW (N) with an airflow of 13.5m3/sec per unit into the data centre.
The installed units use reverse osmosis water, which allows the water to be reused for adiabatic cooling several times. This dramatically reduces flush cycles from every hour to typically once a week. This, combined with intelligent control software, allows water use to be minimised, improving WUE (Water Usage Effectiveness) at Kao Data Campus.
Within the IT Halls, customer cabinets are in enclosures and utilise hot aisle containment, which enables higher-density racks, up to 20kW, to be supported using IEC cooling technology. The cooling infrastructure is designed around a conservative temperature differential of 12K. Applying the combined expertise of the Kao Data, JCA and FläktGroup design teams to the IT cooling system has enabled a re-imagining of the equipment needed and has greatly reduced the mechanical complexity of the data centre. This in turn has positive effects such as increased energy efficiency (the Kao Data London One data centre operates with an ultra-low PUE <1.2 even at partial load), significantly reduced maintenance requirements and increased system reliability.
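As a rough sanity check, the figures quoted in this case study (13.5m3/sec airflow per unit, a 12K design differential, 2.2MW per hall, 13 units in N+1) can be run through the standard sensible-heat relation Q = ρ · cp · ΔT · V̇. The air properties below are textbook approximations, not values supplied by FläktGroup:

```python
# Sanity check of the quoted cooling figures via Q = rho * cp * dT * flow.
rho = 1.2       # kg/m^3, approximate density of air
cp = 1005.0     # J/(kg*K), approximate specific heat of air
dT = 12.0       # K, design temperature differential
flow = 13.5     # m^3/s, airflow per AHU

per_unit_w = rho * cp * dT * flow     # sensible cooling per unit, in watts
duty_units = 13 - 1                   # N+1: one of the 13 units is redundant
total_w = per_unit_w * duty_units     # hall capacity with N units running

print(f"per unit: {per_unit_w / 1e3:.0f} kW")    # ~195 kW
print(f"hall (N): {total_w / 1e6:.2f} MW")       # ~2.34 MW
```

With 12 duty units, the arithmetic gives roughly 2.3MW of sensible cooling, comfortably covering the 2.2MW (N) load quoted for each data hall, which is consistent with the published design.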
Paul Finch, summing up the development experience, said, “Our thoughts on the capabilities of free-cooling were brought to fruition through our development process with JCA Engineering and FläktGroup. Their expertise in this area is first class and the products met our highest expectations. The IEC system has introduced a level of energy savings that benefits our customers and our business, meeting our objectives to ensure first class levels of resilience and flexibility with low cost of operations.”
As all in the industry are aware, the rate at which cloud computing technology is being adopted has seen it firmly entrench itself as a vital platform for many organisations keen to develop innovative products and services.
By Eltjo Hofstee, Managing Director Leaseweb UK Ltd.
Analyst firm Gartner found that cloud computing ranks among the highest on its disruption scale, with these services set to continue to grow in popularity, acting as a necessary enabler for future disruptions.
Some businesses, however, remain sceptical of cloud computing, despite the fact that cloud infrastructure can enhance or replace all parts of a business’s IT environment while promoting improved efficiency, the opportunity for expansion and greater flexibility.
For those looking for the best of both worlds, hybrid cloud solutions have naturally become the ‘it’ thing. Hybrid cloud is a cloud computing environment that uses a mix of on-premises, colocation, private cloud and public cloud services with orchestration between all the platforms. Hybrid cloud solutions provide greater flexibility for companies seeking both hosting resources and the ability to run classified applications privately for example.
Many are already reaping the benefits of hybrid cloud, but if you are on the fence about making it the basis for your IT infrastructure, it is good to remember a few of its more notable benefits.
1. Easy and efficient scalability
Scaling IT infrastructure can be extremely expensive, making this difficult for most small or newly established businesses.
Hybrid cloud environments, however, allow businesses to scale to accommodate specific workloads. Businesses can implement automation rules in the cloud that scale resources up and down as dictated by business demand. This customisation ensures an optimised environment that performs efficiently, drawing on additional resources only when demand requires them.
2. What downtime? Even during disasters
Keeping your business running at all times is crucial if you want your brand to succeed. Your data should still be accessible even during a disaster, but there’s more to this than simply backing up and replacing content on a cloud platform.
That’s why hybrid cloud solutions are often considered key components of business continuity solutions. Hybrid clouds ensure critical data is replicated to a cloud in a different location, thus providing data insurance in the event of any natural or technological disaster.
3. Reducing your time to market
One of the primary reasons that organisations choose to move to the cloud is to make it easier to expand their business into new regions. Hybrid cloud platforms provide the agility needed to quickly enter new markets at affordable costs.
Companies of all sizes are able to get a jump on global initiatives, as the cost of investment is greatly reduced through on-demand self-service. Cloud resources can be automated and spun up swiftly to grow when needed without wasting any unnecessary resources. Thanks to the reductions in time and low costs of entry provided by hybrid cloud solutions, companies of all sizes can increase their competitive advantages.
4. Improved security
Although security fears are decreasing as cloud usage grows, security measures remain a top priority and an ongoing challenge that needs to be managed properly in any organisation.
Incorporating a private cloud solution within your hybrid cloud ensures you have more control over the design and architecture of the system, while providing a higher level of data security in comparison to public cloud solutions.
5. Gain the competitive edge
When it comes to innovative technologies, hybrid cloud solutions offer far more opportunities than any other type of infrastructure, providing benefits in the form of customer satisfaction due to more customised infrastructure.
Adopting cloud solutions can give your organisation a leg up on the competition, too — in fact, the Cloud Industry Forum’s most recent research indicates that 70% of respondents are either currently experiencing or anticipate having a competitive edge from using cloud services.
Cloud adoption shows no sign of slowing down, and businesses that don’t embrace it in some way, shape or form will undoubtedly be left behind. But cloud should work for your business, not against it. Pick the right combination of cloud and dedicated servers for your business needs and reap the benefits of the flexibility it brings.
Is IT security improving, and will it continue to improve, or are the bad folks winning the battle? Are data breaches inevitable, and therefore it's all about managing the fallout from these, or is it possible to prevent security breaches from occurring? DW asks vendors for their views on the digital security landscape. Part 7.
While data breaches and cyberattacks are on the rise, they are an uncomfortable topic for many organisations, especially when it comes to phishing, says David Mount, Director, Europe at Cofense.
“Businesses across the globe have attempted to tackle threats through huge investments in next-gen technology and increased employee awareness training, but to no real avail. The problem? While organisations think they know what phishing attacks look like and how to best defend against them, the reality is, threat actors are changing their tactics so quickly, businesses just can’t keep up.
“This fast-paced threat landscape has left a capability gap between the technology organisations are employing to stop phishing attacks, and the actors exploiting weaknesses. This gap is not only leaving a huge window of opportunity for threat actors, but also means businesses can’t truly understand whether their IT security is improving or whether the ‘baddies’ are winning.
“It’s time for organisations to accept some uncomfortable truths about routine approaches to phishing defence and think differently. Historically, phishing defence has centred on reducing click rates; however, looking at how the market has matured, this isn’t a great indicator of programme success, and it certainly doesn’t help organisations understand whether their defence is improving. The reality is that even the best security awareness programmes will never deliver a zero click rate. Businesses need to accept and understand this, along with the three inevitabilities of phishing. Firstly, they will get phished, mainly because every organisation has data that is of value to someone else – even if only to inform more targeted spear phishing attacks. Secondly, some phishing emails will get through defence programmes into users’ inboxes. And lastly, there will be users who click on these emails.
“Once business leaders have accepted this, they have to approach the problem differently. Rather than focusing solely on technology defences or on training the users who fall for phishing emails, organisations need to ensure their programmes adequately address the users who are either good at spotting phishing attacks or can be conditioned to spot them. By enabling and empowering these users to recognise evolving phishing attacks and flag them in a timely manner, rather than relying on technology to do so, IT teams can more quickly and efficiently put actions in place to stop attacks before other users click on them. This visibility is key to stopping a point of compromise from becoming a breach – it’s the future of phishing defence.”
To ask the question – is IT security improving, or are the ‘bad folks’ winning the battle? – would be to assume that there is one straight answer. The truth is that the answer is never so binary: yes, the industry as a whole is improving, and IT is developing at an outstanding rate; but unfortunately, breaches are unavoidable, explains Enrique Velazquez, Product Manager of Security and Compliance, Cogeco Peer 1.
“Over the last decade we’ve witnessed corporate and government data breaches increasing in frequency, impacting the data of billions of people around the world. As a result, we’ve seen an increased urgency from leaders and the general public alike to put security first. But shifting responsibility for digital security and compliance from the IT director or CTO to newly appointed C-level positions, or even across the whole organisation, brings with it its own challenges.
This general awareness and increased focus has translated into advanced data protection and security improvements across every aspect of an organisation. Even so, large enterprises often fall behind broad corporate policy mandates through differing interpretations of process, as the sprawl of security vendors and the decentralisation of IT drive innovation while exposing more vulnerabilities than they close.
Subsequently, with the multitude of breaches that have happened recently, the term ‘breach fatigue’ has been coined to describe the indifference that can come from the common occurrence of such events. So, while enforcers, businesses and the general public alike are starting to tackle data protection and security issues, there is still a vast amount of work and investment that needs to be put into security.
A starting point I often recommend is simply to stop calling it security. That term is too vague. What the reality of the problem demands is a full risk-management programme, with IT security best practices forming a basis of defence against threats from the infrastructure through to the application stack. But beyond that, and of equal importance, is the need to develop and test your own incident management processes and procedures.
It would be naïve to assume that developing a risk management programme, training a team, or even increasing or investing in an IT security budget is a ‘one and done’ fix. Security should be viewed as a constant investment by businesses, improved upon iteratively, supported by automation and tools that ingrain security practices into operational teams and processes (such as SIEM, IRM or ITSM), and enforced by compliance controls and regulatory bodies.
The collective inaction over the past decade has led to a security deficit that will take significant amounts of time and money to make up for, as well as dedicated resource commitments to process improvement or even additional teams to address the new landscape of data integrity and governance.
Businesses and institutions need to learn from others’ mistakes in order to reduce the risk of possible data breaches. Improvements are unlikely to stem from trying to make data breaches impossible; rather, it’s best to accept the likelihood of a data breach and invest in security measures to tackle that possibility.”
The reality today is that organisations need to have a 'post-breach' mindset because it is inevitable that adversaries will get into their network, according to Sam Curry, chief security officer at Cybereason.
“What matters is what the enterprise does to eliminate the risk and remove it quickly and efficiently. This can be achieved today through a continuous threat hunting programme where you have eyes on your network at all times. Today, ‘infrastructure’ breaches are still the most common: there are those that have had infrastructure breaches, and those that don’t know they have been victimised. ‘Information’ or material breaches are a completely different beast, and please don’t confuse the two. Here the comparison breaks down, because many companies have not been victimised by material breaches in the past.
In the bigger picture and on a day-to-day basis, it is very foolish for enterprises to think they are immune, because hackers are persistent, they are accessing sophisticated tools, they are using more and more third parties to do their dirty work, and new automated hacking tools are readily available. Most importantly, hackers have a lot of time on their hands. Today, the hacker still has an advantage over the vast majority of organisations, because they only have to be right once to find an enterprise failing to exhibit proper security hygiene. The enterprise, by contrast, has to be right 100 per cent of the time in protecting the lifeblood of its organisation: its data.”
As we continue to enter further into the digital era, technology has become ubiquitous and integral. Not only do we as individuals carry our own powerful hand-held computers to organise and enrich our lives; organisations employ a wealth of complex technology in order to run and manage their assets effectively and efficiently.
However, as we become more advanced, so too do the threats we face. The news is now littered with reports of breaches, many affecting some of the world’s most notable companies and institutions, from British Airways, to Yahoo, through to the German Government and – if some reports around the 2016 US elections are to be believed – potentially even democracy itself.
Many organisations are starting to take notice, and are thus becoming more aware of the threat vectors that exist in our ever-evolving digital landscape. The IT security glass may very well be half empty, though, for any organisations that cannot align advancements in their technology with equally mature cybersecurity postures.
There are three fundamental areas in which organisations can struggle when it comes to filling the IT security glass:
· Complexity of information for the organisation. From threat intelligence, compliance and regulations to security testing and audits, the amount of information that an organisation is required to digest and base investment decisions on is growing. Not only does this impact the level of resources and skills required from the internal IT team, but it is confusing for the extended team of stakeholders. The maze of information and limited visibility across the overall IT infrastructure can leave an organisation vulnerable.
· Unpredictable and ineffective spending. With no clear reporting model, organisations are basing their investment decisions on the results of the latest penetration test or security audit, or on pressure from existing or new regulations in force. This never-ending project-based model doesn’t allow for continuity and intelligent spend over time. The traditional cybersecurity spend becomes a pattern of testing, part-fixing, requesting more budget, spending budget, testing – and repeat.
· Confusing and growing compliance landscape. Between the European Union General Data Protection Regulation, PCI Security Standards Council compliance, Cyber Essentials and ISO standards, the compliance landscape is a minefield for any organisation. Although achieving compliance enables organisations to achieve a level of best practice and is a helpful negotiation tool for budget requests, it doesn’t mean that an organisation is completely protected. The constant changes in regulations also require up-to-date knowledge and skills within the IT team.
Filling the IT security glass
Employing cybersecurity maturity (CSM) is the key to turning the IT security glass from half empty to entirely full. But what exactly is it? CSM is the ability for an organisation to make cybersecurity decisions in a way that considers all relevant factors within a changing technology and threat landscape; the ability to improve defences continuously whilst the organisation operates and transforms.
Organisations that invest in creating a concise and accurate view of their cybersecurity state, and can communicate this clearly throughout the organisation, see the benefits in terms of confidence and more informed, collaborative decision-making around the value of cyber-investment.
Measuring the current state of cybersecurity maturity
There are a few variations and grading scales for measuring CSM, with the most common being the COBIT maturity scale. Recent research using the COBIT scale found that only 22 per cent of IT security professionals surveyed believed their CSM level to be optimised. Almost 20 per cent stated their level of maturity as non-existent, ad hoc or didn’t know.
This growing lack of control and visibility directly impacts how informed and prepared an organisation is to deal with either attempted or successful attacks. If a Chief Information Security Officer (CISO) wants to have an informed business conversation with their executives about risk, they need the same level of confidence in their presentation of cyber-performance data and reporting as the finance director would have in the numbers they bring to the board.
Is IT security going to be more or less of an issue in the future?
So, back to the original question: is IT security going to be more or less of an issue in the future? The answer is entirely dependent on the decisions your organisation makes over the coming weeks, months and years. New technologies – and the risks they present – are only going to become more complex, and so organisations that stand still will see IT security become more of an issue as time goes by. But organisations that take cybersecurity seriously and employ CSM to ensure that they stay on top of the latest developments will find that they can take advantage of disruptive technologies without exposing themselves to unnecessary risk.
Whether your IT security glass is half empty or entirely full is up to your organisation and the strategic cybersecurity decisions it takes. Making positive IT security changes today will benefit you and your organisation for years to come.
Martin Warren, Cloud Solutions Marketing Manager EMEA, NetApp, begins: With our feet firmly under the GDPR table almost 12 months since the implementation date, data breaches continue to dominate headlines as businesses evolve to put data privacy first.
On a positive note, in February we found that UK companies’ data privacy awareness continues to develop. We surveyed over 500 IT decision makers and looked at a variety of data privacy facets including GDPR and the impact Brexit could have on data sharing. Our research found that the majority of UK companies (68%) say that their level of concern for data privacy has increased since GDPR implementation. While 30% say that their level of concern has been the same since May 2018, a mere 2% claim that their level of concern has reduced. It is reassuring to see that businesses treat this issue seriously, and increased awareness will ultimately lead to increased activity.
While businesses and consumers alike grapple with the logistics of IT security, the number of data breach notifications has significantly increased. With data breaches rolling in and GDPR now being called upon to investigate compliance failings across the EU, the rest of the world is looking on with bated breath. There are various examples of poor data management breaching EU data protection rules – and as a result, more and more high-profile breaches are being brought to our attention.
The continued uncertainties around Brexit are affecting businesses in numerous ways, including in their data management preparations. It is therefore encouraging to see many UK businesses already focussing on data regulation and privacy, to avoid large-scale data breaches, fines or non-compliance in any way. As more details around Brexit materialise day by day, the best course of action for companies is to continue to build solid data protection and data governance processes, to ensure compliance with current legislation and preparedness for any future developments.
Companies need to get their house in order before they can face an incredibly complex data and regulation landscape. Successful data management starts with knowing where your data is – if you don’t know, you won’t be able to manage it or know what its individual requirements are. It is illusory to think that security breaches can be prevented entirely; threats, and the bad guys behind them, inevitably become more sophisticated. However, businesses can take the necessary steps to ensure their organisations and employees are compliant and protected, so they can mitigate the chances and impact of a breach.
While we’ve all seen our fair share of infrastructure technology to help us automate deployment, scaling, and management – from the classic VMware to open source tools such as OpenStack – there’s rarely been such a speedy and widespread adoption of one as with Kubernetes. And we think it has spread for a reason.
Moving from Stateless to Persistent in Kubernetes
Storage is a concern that only comes into focus once everything else – networking, security, etc. – has been handled. But once Kubernetes is headed for production, the storage question becomes pressing and essential: how do I persistently store my containers’ data? Data is, after all, the lifeblood of almost any business application. So, unless you have a proper storage solution in place, the usefulness of containers is limited.
This post is about one way to answer the question of how to manage data for stateful applications. And we’d claim it’s the most forward-looking and, ultimately, the easiest way to set up and run a storage infrastructure for Kubernetes. Instead of just talking about it, let’s jump right in and have a look at how easy it is to set up a modern storage environment for a Kubernetes cluster.
3-Step Storage Setup for Kubernetes
There are really just 3 broad steps to your fully featured installation. Given that you have a Kubernetes cluster up and running, just do the following:
Step 1: Deploy a next-generation data centre file system operator and fill in the config file. Start by entering which services should run on which hosts. Then choose the nodes that should run the data service, the metadata service, and the registry. The registry node will be an ephemeral bootstrap registry; its purpose is to get the system started. Once finished, the Operator will do its automagic.
Step 2: Tell Kubernetes to apply the operator config. The initial node starts said ephemeral bootstrap registry and also the API and the storage system console. Once it’s done, log into the bootstrap registry, check that all metadata and data services are registering and that they’re showing unformatted devices. Choose 3 nodes and format their registry devices; the registry pods will then start and create a replicated cluster. As soon as the two other registries are online, it’s safe to shut down and delete the ephemeral bootstrap node – since by now, the registry service is persisted to the disks we had chosen before.
Step 3: Format the data and metadata devices by using the web console. Then go ahead and create volumes or set up storage classes for dynamic volume provisioning. When done, the storage cluster is ready to use and you can deploy application pods that use these volumes. You can do so on any nodes where the Operator started a client pod.
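To make Step 3 concrete, the Kubernetes objects behind dynamic volume provisioning can be sketched as below: a StorageClass that names a provisioner, and a PersistentVolumeClaim that binds to it, which an application pod can then mount. This is a generic sketch of the standard Kubernetes API shapes; the provisioner name and class parameters are placeholders, not the actual values used by any specific storage operator.

```python
# Generic sketch of the Kubernetes objects involved in Step 3: a
# StorageClass for dynamic provisioning and a PersistentVolumeClaim
# that an application pod can mount. The provisioner name and the
# "replication" parameter are placeholders, not real vendor values.

def storage_class(name, provisioner, parameters=None):
    """Build a StorageClass manifest as a plain dict."""
    return {
        "apiVersion": "storage.k8s.io/v1",
        "kind": "StorageClass",
        "metadata": {"name": name},
        "provisioner": provisioner,        # the operator's CSI driver name
        "parameters": parameters or {},
    }

def volume_claim(name, storage_class_name, size):
    """Build a PersistentVolumeClaim that binds via the class name."""
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class_name,
            "resources": {"requests": {"storage": size}},
        },
    }

sc = storage_class("fast", "example.csi.vendor.com", {"replication": "3"})
pvc = volume_claim("app-data", "fast", "10Gi")
print(pvc["spec"]["storageClassName"])  # the claim references the class
```

Serialised to YAML and applied with `kubectl apply`, objects like these are what “create volumes or set up storage classes” amounts to in practice.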
Those were a lot of words, but the actual process is quite simple and straightforward – and, thanks to the recently introduced Kubernetes Operator framework, much of the complexity that comes with interdependent infrastructure deployment and management is removed.
About the Authors:
Sebastian’s working on product marketing at Quobyte. He also does some coding for the web. And he knows first-hand about split-brain problems, being a philosopher by training and a marketer/coder by trade. Matthias Grawinkel is a software engineer for Quobyte, with interests in distributed storage and database systems, DevOps and datacenter automation.
Make sure your redundant IT remains an asset not a liability
By Steve Hone, CEO, The DCA
Getting new or updated technology is something most of us look forward to. But once you have made the decision to replace or decommission your hardware, what’s the best way to safely and responsibly dispose of it? As a firm advocate of a circular economy, I would personally investigate reselling or donating; after all, just because technology is at end of life for me, this does not mean others would not find a use for my redundant IT assets. Several of the articles in this month’s Journal explore the options open to you and explain how disposal can be achieved in a safe and responsible way.
On my travels I have seen many a skip piled high with discarded IT kit. Much of this electronic waste ends up in landfill, which not only has a serious environmental impact but can also land you in very hot water from a data protection perspective. If you are tempted to take the easy route, there is literally no telling who might get their hands on your hardware and, potentially, any data locked inside. So, if you are wrestling with the challenge of what to do with your old tech, think very carefully and seek advice if you are unsure.
I find it ironic that large sums of money are invested to protect equipment from attack whilst it’s in service, yet as soon as the kit becomes ‘end of life’ it simply drops off the radar, with very few resources or little budget provided to manage this ongoing risk appropriately. Under the GDPR, organisations must be able to track every asset to the point where all data has been eradicated, and ensure they have a formal disposal policy in place. There are plenty of professionally accredited organisations who can help you dispose of your IT assets in a safe and responsible way. Many, I am pleased to report, are DCA Members.
Unfortunately, there are still some unscrupulous people out there who turn up at a business and falsely claim to have accreditations they do not hold, so if you are using a third party, make sure you have the answers to the following questions:
It is vital you ask these questions; if you fail to do so, it will be impossible to know where in the world your hardware will end up.
Funnily enough, one of the first articles published by the DCA back in 2011 referenced a BBC Panorama programme called “Track my Trash”. Having installed tracking sensors in a number of electronic devices, the team sat back and tracked what happened. During the same investigation, the researchers also saw confidential documents from the Environment Agency indicating that 77% of all UK electronic waste (which includes discarded PCs and servers) is illegally exported. The waste hardware was stripped, and any data found was then sold on. The team tracked the electronic devices 3,000 miles away to West Africa!
If not carefully managed, the retirement of your redundant IT could significantly damage your organisation’s reputation and lead to a heavy fine. The largest fine to date was handed to Google by the French data protection watchdog CNIL, for €50 million; although this is an extreme case, it’s important to note that the same obligations apply to all businesses, irrespective of their size or turnover.
Having a formal disposal policy is considered best practice by the DCA.
A policy provides a framework that explicitly defines roles, responsibilities, execution and the record-keeping requirements which must be adhered to. It enables an organisation to navigate complex international laws and industry regulations, accounting rules, and sanitisation and disposal options, so that technology can be properly removed from inventory and risks are appropriately managed.
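As an illustration of the record-keeping such a policy demands, the sketch below models tracking each asset from decommissioning through to certified data eradication, the point the GDPR obliges organisations to reach. All field and function names here are hypothetical, not drawn from any specific ITAD tool.

```python
# Illustrative sketch of disposal-policy record-keeping: every asset
# tracked from decommissioning through to certified data eradication.
# Field names are hypothetical examples, not a real ITAD product schema.

from dataclasses import dataclass

@dataclass
class DisposalRecord:
    asset_tag: str
    decommissioned_on: str          # ISO date the asset left service
    data_erased: bool = False
    erasure_certificate: str = ""   # reference issued by the ITAD partner
    disposal_route: str = ""        # e.g. "resale", "donation", "recycling"

    def mark_erased(self, certificate):
        self.data_erased = True
        self.erasure_certificate = certificate

def audit_gaps(records):
    """Return asset tags that still hold data: the compliance exposure."""
    return [r.asset_tag for r in records if not r.data_erased]

ledger = [DisposalRecord("SRV-0042", "2019-03-01"),
          DisposalRecord("SRV-0043", "2019-03-01")]
ledger[0].mark_erased("CERT-7731")
print(audit_gaps(ledger))  # -> ['SRV-0043'] still needs eradication
```

The point of such a ledger is the audit query at the end: at any moment the organisation can show which assets have been sanitised and which remain a liability.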
Thank you again to those who submitted articles for this edition of the DCA Journal. The theme for the next edition of the DCA Journal is Insight - focussing on New Industry Trends and innovations.
If you would like to submit content for consideration, please contact Amanda McFarlane on 0845 873 4587 or email firstname.lastname@example.org
By Brendan O’Reilly, Blygold UK
Brendan O’Reilly, Sales Director, Blygold UK discusses the merits of anti-corrosion treatments for HVAC Heat Exchangers and explains how this can result in a ROI due to energy savings alone within 12 months.
The performance and sustainability of HVAC heat exchangers have long been a subject for debate. Given their construction and external exposure to the elements, corrosion is inevitable, as are the subsequent losses in performance and a shortened equipment life cycle.
However, these factors can be combated with the application of an effective anti-corrosion protection for the casing, the framework and the coil block. This protection not only prevents further corrosion but also maintains thermal performance, reduces energy consumption (and hence CO2 emissions), improves cleanliness and extends serviceable life.
Approved by equipment manufacturers and supported by BSRIA studies, the patented anti-corrosion treatments for heat exchanger coil blocks contain metal particles that support improved heat transfer and therefore energy efficiency. The coatings for the casings and framework consist of a polyurethane penetrating primer and top coat.
The protective coatings can be applied either at designated premises prior to installation or on site. There is limited downtime and disruption to service, building function and building users, with payback usually in less than 12 months.
There are numerous detailed case studies available, including:
Virgin Media Hayes Middlesex – Blygold Climacheck Monitoring
Case study of energy savings on a chiller with coated heat exchanger coil blocks
St James Hospital Middlesbrough
Here is an example of treated casings, framework and heat exchangers.
The coating company, on behalf of Carillion, undertook to repair the rusted areas on the framework and coat the corroding coil blocks. As you can see from the pictures, this was achieved by deep cleaning and acid washing the framework, applying a penetrating primer (Refamac 3509) and finishing with a polyurethane top coat in a specified RAL colour (Refamac 3510). The coils were coated in a wax-based coating containing bronze in the pigmentation.
The work was warranted for five years but will last considerably longer. The ROI was achieved within 12 months in energy savings alone. The performance of the unit was brought back to where it was when installed and has been maintained since. The work was carried out in 2009, and the coating company goes in once a year to wash down the coils and monitor the coatings; they are still in excellent condition. As a result of this work, the coating company has been engaged on other projects in the UK.
Find out more about Blygold UK here.
By Robbert Hoeffnagel, Green IT Amsterdam
Currently, only 10 percent of the so-called 'critical raw materials' used in data centres are recovered. If we want to further reduce the impact of data centres on the environment and our living environment, the percentage of devices and materials that are re-used or recycled will have to be drastically increased. That is why a group of companies, universities and other parties - including Green IT Amsterdam - are starting a research programme under the name 'CEDaCI' into circular models for data centres. Organisations from the four main data centre countries in Europe - the Netherlands, Germany, France and the United Kingdom - are participating in the project.
"North-West Europe - and in particular the UK, Germany, France and the Netherlands - is the EU's data centre hotspot," says Julie Chenadec, Project Manager at Green IT Amsterdam. "Servers and other hardware in data centres often have a replacement cycle of one to five years. This contributes substantially to the production of 11.8 megatonnes per year of WEEE, which stands for 'Waste Electrical & Electronic Equipment'. This makes WEEE one of the fastest-growing waste streams in the European Union”.
This waste contains so-called critical raw materials (CRMs): raw materials of great technological and economic importance whose supply is vulnerable to interruption. "With the CEDaCI project we facilitate the creation of a circular economy for data centres in North-West Europe. This circular economy reduces the impact of data centres on the environment. It will be possible if we are able to recover more raw materials, reduce the use of new raw materials and develop a safe and economically healthy chain for critical raw materials”.
Currently, only 10% of critical raw materials are recycled and recovered. CEDaCI wants to increase this to 40% of the baseline (107 tonnes) by the end of the project in 2021, and to 242 tonnes of WEEE after 10 years.
"At the moment, the greatest environmental impact of data centres comes from the substantial use of energy," says Chenadec. "This is being addressed through improved operational efficiency and the use of renewable electricity generation technologies. However, given the enormous growth, the impact of data centres on the availability of resources such as the critical raw materials mentioned should not be overlooked”.
Over the lifetime of a data centre an estimated 15 percent of the environmental impact comes from the building and its installations, while 85 percent comes from IT equipment. The impact is high because equipment is typically renewed every 1 to 5 years. "Although accurate data is not published, the data centre industry makes a significant contribution to the global total of 11.8 million tonnes of waste electrical and electronic equipment (WEEE)," says Chenadec. "This is one of the fastest growing waste streams in the EU. WEEE contains critical raw materials of high economic importance and vulnerable to supply disruption. In addition, production is energy-intensive and thus contributes to the environmental impact of the sector”.
Both the speed and volume of growth of 'digital waste' is unprecedented, but this is not accompanied by the development of a recycling infrastructure. Moreover, it is clear that the reuse of components, as well as the recycling and reuse of materials, is low.
Chenadec: "Currently, recycling of WEEE in North-West Europe is limited to 26.9% in the United Kingdom, 26.3% in France, 36.9% in Germany and 38.1% in the Netherlands. A large part of the remaining equipment is exported and reprocessed, or sent to landfill. Through these exports, millions of tonnes of valuable resources leave the sector every year or become inaccessible, and some of these substances are hazardous, with harmful effects on the environment. Yet these materials are often simply considered 'waste'. It is important that these critical raw materials remain, or become, available for reuse, precisely because access to them is threatened and substitution by other materials is currently not feasible".
For Green IT Amsterdam, CEDaCI is more or less the successor to the ReStructure project. That project was purely Dutch and aimed to map the entire chain involved in the responsible use and disposal of IT and other data centre equipment. "ReStructure also looked at the possibilities of creating digital marketplaces where used data centre equipment can be bought and sold," explains Chenadec.
Robbert Hoeffnagel is communications manager at Green IT Amsterdam.
CEDaCI is a European research project aimed at developing a circular economy for data centres. The project started in January 2019 and runs until 2021.
These are the participants in the project:
By Mat Jordan, Head of EMEA, Procurri UK Limited
In a world increasingly focused on reducing, reusing and recycling, yet still wedded to commercialism, it can be difficult to identify the right path to take when it comes to your company’s approach to IT disposal, refurbishment and reuse. Setting priorities can be difficult, and whilst we’d all love to be able to recycle everything no longer required for business, this isn’t always achievable.
At Procurri, we specialise in IT Asset Disposition (ITAD) and have a worldwide presence working on just that, so we see a lot of the industry’s quirks, contradictions, and the newest developments in ethical and eco-friendly processes and initiatives. With our four values of Excellence, Innovation, Commitment and Integrity, we immerse ourselves entirely in leading by example in IT solutions, continuing to add value – and delight! – for customers worldwide.
Somehow, when it comes to the disposal of small IT equipment, we all remember to consider reuse options, and as an ITAD specialist we’re often approached about the possibility of data erasure and resale. When data centres and other large facilities are involved, however, disposal, relocation, installation or deinstallation is commonly written off as such a big job that the prospect of actually recovering value from the old assets is forgotten.
It’s fair to say that the teams at Procurri are experienced enough in all things ITAD not to be easily fazed by a ‘big job’, even when it involves international work, huge installs, or large teams of engineers. After all, we’ve been in the business for over 10 years, and a lot of equipment has shrunk considerably in size since we started out! Even if things aren’t spectacularly easier to install and de-install than they were a decade ago, the facilities for refurbishment, recycling and environmentally friendly disposal have improved greatly, with more options and opportunities available than ever.
The Data Centre Alliance remains the principal organisation for the data centre industry, and its wide-ranging overview of the sector allows it to advise and support those within it on the best possible ways to maintain the highest levels of corporate social responsibility when it comes to end-of-life IT equipment. Working closely with the DCA, Procurri is able to help companies of all shapes and sizes, and with IT requirements to match, all across the globe with their reuse, recycling and refurbishment approaches. And that we do: Procurri has over 400 expert staff in 14 offices worldwide, with services covering over 100 countries. Our six regional warehouses cover over 163,000 sq ft, backed up by 800 local warehouses. We support over 3,000 customers with more than 60,000 IT assets across 3,500 sites, and 558,000 assets have been refurbished with our help since we began.
It’s this strategic relationship with the DCA, alongside our strong international presence, that allows us to exceed customer expectations time and time again by unlocking value from hardware that many would otherwise consider defunct. Sustainability need no longer be a ‘nice-to-have’ add-on to equipment disposal; it should be the norm.
Day-to-day, we work with a whole host of companies to offer them on-the-ground support with their IT whenever they need it. The first of Procurri’s three pillars of work is Independent Maintenance Services. Our staff build up a vast technical knowledge through this work, and this feeds beautifully into the remaining two, closely integrated pillars: IT Asset Disposition and Hardware Resell.
There are a variety of options available for those looking to derive value from IT equipment as it comes to the end of its life, and we have experience in all approaches. No matter how big or small, businesses are now able to factor sustainability into their IT moves, purchases and disposals, with all bases covered.
Sustainable Data Centre Services
One of the larger-scale jobs that we take on at Procurri is the relocation or migration of data centres. These can be extremely complex projects, requiring in-depth project management, the organisation of appropriate transportation and the navigation of multi-country legislation regarding the refurbishment, data cleansing and resale of products. We are extremely fortunate to have had the opportunity to work with a broad range of data centres in numerous locations, and with each challenge the team takes on, they continue to learn and improve their skills. As a result, we really feel we’ve reached the point where there’s no job too big or too small for us to take on!
We understand the operational deluge that moving data centres can cause, and we’ve seen first-hand the often devastating effects that managing it badly can have on business as usual. Unlike some other firms, Procurri puts boots on the ground when and where you need them: de-installing old equipment when required and installing anything new upon arrival. This kind of service is not a quick job, but it is always worth doing properly, which is why we only allow fully qualified technicians to take it on! Weighing up all of this workload, it’s easy to see why many don’t then consider sustainability practices as an extra – so it’s just as well that we’re here to guide and assist throughout, implementing greener, value-adding initiatives without the customer having to take on any more time, expense or effort.
In some circumstances you can’t wait, or be without hardware, or you quickly need to extend your capacity beyond your means. In these instances, Procurri offers quick-ship rentals: hardware of all types, hired out from a week to a year and shipped to arrive with you the next day. At the end of the hire period you can choose to return it or retain it, and our standard third-party maintenance support services remain in place throughout.
WEEE Disposal, Data Erasure and Keeping Customers Safe
There’s no doubt that many companies WANT to be more sustainable and ethical in their IT practices from all perspectives, but with more of us than ever in the tech sector clued up on the dangers of improper disposal, it can all too often be brushed off as a risk. Add in the ever-changing international legal and political climate, and the small fact that every country in the world has its own, often unique, legislation regarding it all, and you’re in for a hard and fast “it’s just not worth the effort” response.
This is where our global presence really comes into its own. Having local knowledge and an international brand allows us to tap into the specialist expertise required to navigate the legal systems and best-practice guidance of each country as needed, ensuring a level of security we believe is unrivalled. Our data erasure systems are already considered among the most stringent and comprehensive around, and when you add in strong WEEE capabilities alongside a fleet of secure, unmarked, purpose-built transportation, technologically advanced warehouses and a constantly evolving suite of security features, you know that we leave nothing to chance.
The future looks bright for Procurri, and with our continued focus on sustainability, it can look brighter for us all. The slogan goes “reduce, reuse, recycle”, and we consider ourselves at the forefront of putting it into practice, even in a tricky industry like IT!
By Kevin Towers, CEO of Techbuyer
The birth of the circular economy’s first unicorn – a company valued at $1 billion – in March was proof positive that making best use of resources is coming into the mainstream. The success of Rent the Runway, a subscription service for high end fashion, suggests we are moving away from the “take, make, waste” model to an approach that remakes, reuses and shares.
The circular economy is a new approach that incorporates five broad models of sustainability: circular supply chains, recovery and recycling, product life extension, sharing platforms, and product as a service.
The net result is that fewer virgin materials are used and more value is realised from the materials we already have. Accenture estimates the economic value of this to be €1.3 trillion by 2030. Little wonder that financial institutions are advocating a circular approach.
“Businesses that adopt a circular philosophy are more likely to survive and thrive going forward. They are likely to be more future-oriented in their strategic vision too,” wrote Anne van Riel, Head of Sustainable Finance Americas at ING, in a recent Business Insider article.
The question now is how the data centre sector can capitalise on the circular advantage with new approaches to hardware. More and more reports are promoting refurbishment (product life extension) as a better choice for budgets as well as the environment.
A report submitted to the UN in October found that remanufacturing and comprehensive refurbishment reduce greenhouse gas emissions by 79-99%. They also save 82-99% of raw material requirements and result in 69-85% less energy usage. Production waste is 80-95% lower, and there are cost savings of 15-85% for the customer too. With these incentives in place, there is a compelling case for incorporating more refurbishment and remanufacturing in data centres. However, the practice still has some way to go before reaching the mainstream.
“I think there’s a growing awareness of the circular economy amongst businesses, and a growing frustration with the inability to do anything about it at the moment,” says Colin Curtis, Managing Director of TBL Services, an organisation that specialises in helping businesses support the UN Global Sustainable Goals and has been working with Techbuyer. “We are seeing an increasing number of our clients working to support the UN Goals focus on Goal 12 – Responsible Consumption and Production – as a means to hasten the move towards a more sustainable solution when it comes to materials use and reuse, but the nuts and bolts of this are not easy.”
Part of the issue is a lack of understanding. With Moore’s Law at the forefront for so many years, there is a perception that the latest and greatest must be installed at regular intervals to capitalise on efficiency gains. However, this argument is more about energy than materials usage, and with the increasing use of green energy to power data centres it carries less and less weight.
There is now evidence that Moore’s Law may be slowing down. Besides which, many of the incremental gains from miniaturisation over recent years are limited by the laws of physics. CPU power has been boosted by technologies such as hyperthreading, but there is a finite number of threads that can be added generation by generation. Much of the most recent increase in server processing power is said to be a result of memory and storage upgrades rather than the CPU.
More importantly, there are massive logistical challenges to refurbishment on a large scale. Many of the major server manufacturers, like Dell and Cisco, have a refurbished offering. However, it is difficult to grow this alongside the business of selling new equipment.
For one thing, large multinational companies deliver 90% of their products to market through channel partners via a two- or three-tier distribution model. Knowing which products are installed by which end user is a problem, let alone recovering them. It is not clear who should be responsible for collection – the manufacturer or the reseller. Added to this, developing reverse logistics chains, refurbishment facilities or recycling requires extra resources – resources that are diverted from the core business of manufacturing new equipment.
This is where the secondary market becomes particularly useful. Almost as old as the IT market itself, it has evolved to trade in, and add value to, decommissioned equipment. The finance arms of companies like HP and Dell sell equipment that has reached the end of its three-year accounting period to the secondary market.
A thriving trade market standardises prices, making it a viable business option, and government is taking steps to support it through legislation. An example is the recently passed EU Ecodesign Directive, which politicians in the UK are talking about mirroring post-Brexit. The ruling requires firmware updates to be freely available for up to eight years after the manufacture date, so components supplied by quality refurbishers are now fully supported with software updates. Alongside the three-year warranty that companies like Techbuyer supply, this is a powerful recommendation for secondary use.
Access to fully supported second-use servers and component parts gives data centre managers the opportunity to imitate Google in much smaller facilities. In 2015, the hyperscaler released a report called “Circular Economy at Work in Google Data Centers”. This stated that 75% of components used in its spares programme and 52% of components consumed in its Machine Upgrades programme were refurbished inventory. It also said that 19% of servers deployed were remanufactured machines.
Google explained that it manufactures its own servers, and so has the expertise built into its business model. Most data centres rely on external manufacturers: big-name brands like HP and Cisco, OCP channels, or a combination of the two.
Data centre managers rarely have access to the facilities or expertise to carry out large-scale refurbishment in-house. Companies like Techbuyer have grown thanks to this demand for a “plug in” refurbishment option. We buy retired servers, storage and networking equipment from a variety of sources, strip everything down to component parts and restore them to as-new condition. Companies come to us for replacements and upgrades as well as complete servers built from new parts, refurbished parts or a combination of the two. The result is that IT equipment is kept in use for as long as it lasts.
Whatever cannot be reused is ethically recycled, and with the research and development underway in this sector, the news is good here too. Labs in Australia, the UK and continental Europe are developing more effective and cleaner ways to recover materials as we speak. With so much in the pipeline, the outlook for a more sustainable future is bright.
Cooling is one of the main considerations in maintaining the full functionality and performance of a data centre, so if your system was installed ten years ago or more, chances are an upgrade is on the horizon. Mike Hayes, Applications Specialist at FläktGroup, discusses which parts of a cooling system's infrastructure can be reused and why some should be replaced.
As needs grow or change, data centre infrastructure can quickly become inadequate for current and future requirements. Whilst a new data centre may seem like the solution, often a refit or refresh can increase compute capacity, reduce latency and increase efficiencies.
When it comes to cooling, equipment installed some 10 or more years ago – often central chiller systems or older generations of direct expansion (DX) units – will in most cases be coming to the end of its useful life. Outdated DX units are likely to contain a refrigerant that is being phased out by EU legislation. Central chiller systems, which pump water around data centres, present a high risk if a leak occurs. And even if an old system is still running fine, it is far more power-hungry than modern solutions. As pressure mounts to reduce energy consumption, operating costs and carbon footprint, many data centres now need more energy-efficient ways to remove the heat generated and maintain internal temperatures at an optimum.
However, data centre improvement projects present many challenges. Upgrades need to take place while the data centre is ‘live’, and projects may be hindered by physical constraints such as the positions of existing equipment or infrastructure services, as well as working space and access. With this in mind, how can data centre managers make the most of existing infrastructure without compromising on the benefits offered by new cooling technology and systems?
Today, newer and more energy-efficient solutions, including free cooling, are increasingly being used in the data centre industry. Modern DX units with variable-speed fans and compressors are also very popular because they can use as little as 10 per cent of the server load to provide cooling. In contrast, legacy chillers or outdated DX systems often take up to 50 per cent of the server load for cooling.
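To put those two overhead ratios in perspective, here is a rough annual energy comparison. This is a sketch with hypothetical figures of my own (the 200kW load is an assumption, and the function name is illustrative), not FläktGroup data:

```python
# Rough comparison of annual cooling energy for a legacy chiller
# (~50% of server load) versus a modern variable-speed DX unit
# (~10% of server load). All inputs are illustrative assumptions.

HOURS_PER_YEAR = 8760

def annual_cooling_kwh(it_load_kw: float, cooling_fraction: float) -> float:
    """Energy used by the cooling plant over a year, expressed as a
    fraction of the average IT (server) load running continuously."""
    return it_load_kw * cooling_fraction * HOURS_PER_YEAR

it_load_kw = 200.0  # assumed average IT load for illustration
legacy_kwh = annual_cooling_kwh(it_load_kw, 0.50)
modern_kwh = annual_cooling_kwh(it_load_kw, 0.10)

print(f"Legacy cooling: {legacy_kwh:,.0f} kWh/year")
print(f"Modern DX:      {modern_kwh:,.0f} kWh/year")
print(f"Saving:         {legacy_kwh - modern_kwh:,.0f} kWh/year")
```

Even on these simplified assumptions, the gap between the two overhead figures dominates any other variable in the calculation.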
Although legacy cooling units will often require wholesale replacement, sometimes parts of the existing infrastructure can be retained and utilised. For example, if the legacy chillers are no longer needed, the pipework serving individual units can be reused for free cooling in a new, upgraded solution. However, given that 100 per cent free cooling is only available up to an outdoor temperature of around 15°C, a DX circuit with new piping would also be needed to give total reliability. But because the old chilled-water pipework is no longer part of the critical infrastructure, if an issue occurs there is no impact on the ongoing performance of the data centre.
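The 15°C threshold lends itself to a simple control sketch. This is an illustration only: real systems blend partial free cooling and DX between thresholds, the exact changeover temperature varies by design, and the function name is my own:

```python
def cooling_mode(outdoor_temp_c: float, free_cooling_limit_c: float = 15.0) -> str:
    """Pick the cooling source for the current outdoor temperature,
    following the rule of thumb above: full free cooling is only
    available up to roughly 15 degrees C outdoors; above that, the DX
    circuit must carry the load."""
    if outdoor_temp_c <= free_cooling_limit_c:
        return "free cooling"
    return "DX circuit"

for temp in (5.0, 15.0, 25.0):
    print(f"{temp:>5.1f} C outdoors -> {cooling_mode(temp)}")
```

In a temperate climate, most hours of the year fall below the threshold, which is what makes retaining the old chilled-water pipework for free cooling worthwhile.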
In addition to pipework, it is also important not to overlook floor grilles. Standard types have a 50 per cent free area, allowing the high air volumes needed for densities of up to 10kW per rack. If there is a cluster of racks with densities exceeding 20kW – not uncommon for applications using blade servers – floor grilles with an 80 per cent free area should be installed.
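The grille sizing rule above can be expressed as a simple lookup. This is a sketch of the rule of thumb as stated, not a sizing tool; the function name is hypothetical, and the text gives no explicit guidance for the 10-20kW band, so the sketch flags that range for a case-by-case assessment:

```python
def grille_free_area_percent(rack_density_kw: float) -> int:
    """Recommended floor-grille free area for a given rack density,
    per the rule of thumb above: standard 50% free-area grilles suit
    densities up to 10 kW per rack; clusters exceeding 20 kW per rack
    call for 80% free-area grilles."""
    if rack_density_kw <= 10:
        return 50
    if rack_density_kw > 20:
        return 80
    # Between 10 and 20 kW the text gives no explicit rule, so flag
    # the gap rather than guess an interpolation.
    raise ValueError("10-20 kW per rack: assess airflow case by case")

print(grille_free_area_percent(8))   # typical rack
print(grille_free_area_percent(25))  # blade-server cluster
```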
Finally, with space at a premium within on-premise datacentre environments, it may be difficult to accommodate services infrastructure. However, new cooling technology can save space because these systems often don’t need a plant room, which in the past has been taken up by large chillers and pumping systems. New solutions will often require external space for condensers, but always with a footprint no larger than the systems that they replace.
Making the switch
There is no doubt that introducing an entirely new cooling methodology will require careful planning to mitigate risks and safeguard investments. A project that replaces cooling equipment should be phased and scheduled in advance. If space allows, new additional units should be installed first so that there is no erosion of cooling capacity at any time. However, if this is not an option, standby units may be used to take on the cooling load whilst other units are being replaced.
Work during times of peak data processing should be avoided, just in case a problem arises during the migration. If your customer is a bank, for example, avoid the end of the month when salaries are being processed. For a telecommunications company client, peak traffic might be during a high-profile sporting event, so that would be a time to avoid.
Making the decision to modernise cooling systems in data centres might initially sound like a hefty capital investment. However, with careful planning, some of the existing infrastructure can be reused without impacting reliability. Plant space previously dedicated to cooling systems can be freed up, and no additional outdoor space is required. Coupled with the opportunity to make significant energy savings and boost reliability, the benefits offered by modern cooling technology far outweigh the capital cost, and a return on the investment can be made very quickly, often within two years.
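A back-of-envelope payback check shows how such a figure can arise. Every input here is an illustrative assumption of my own (the load, tariff and capital cost are hypothetical); only the 50 and 10 per cent overhead ratios come from the discussion above:

```python
def payback_years(capital_cost: float, annual_saving: float) -> float:
    """Simple payback period: years for annual savings to repay capital."""
    return capital_cost / annual_saving

# Illustrative assumptions: a 100 kW IT load where cooling overhead
# drops from 50% to 10% of server load, at a hypothetical 0.15/kWh tariff.
it_load_kw = 100.0
saved_kwh = it_load_kw * (0.50 - 0.10) * 8760  # kWh saved per year
annual_saving = saved_kwh * 0.15               # currency units per year

print(f"Annual saving: {annual_saving:,.0f}")
print(f"Payback on a 100,000 upgrade: "
      f"{payback_years(100_000, annual_saving):.1f} years")
```

Under these assumed numbers the simple payback lands just under two years, consistent with the order of magnitude suggested above; real projects would need actual tariffs, load profiles and installation costs.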
For more information on FläktGroup’s complete range of climate control, air handling and ventilation products, visit www.FläktGroup.com