• IBM Consulting

    DBA Consulting can help you with IBM BI and web-related work. IBM Linux solutions are also part of our portfolio.

  • Oracle Consulting

    For Oracle-related consulting, database work, support, and migration, call DBA Consulting.

  • Novell/RedHat Consulting

    For all Novell SUSE Linux and SAP on SUSE Linux questions related to OS and BI solutions. And of course also for the great Red Hat products, such as Red Hat Enterprise Linux Server, JBoss middleware, and BI on Red Hat.

  • Microsoft Consulting

    For consulting services related to Microsoft Server 2012 onwards, Microsoft Windows 7 clients and higher, and Microsoft cloud services (Azure, Office 365, etc.).

  • Citrix Consulting

    Citrix VDI-in-a-Box, desktop virtualization, and Citrix NetScaler security.

  • Web Development

    Web development: static websites, CMS websites (Drupal 7/8, WordPress, Joomla), responsive websites, and adaptive websites.

19 June 2019

Enterprise Linux 8 is here!

How IBM’s Red Hat Acquisition Redefines the Cloud - Enterprise Linux 8 is here!

Four years ago, Red Hat released its flagship product, RHEL 7, which went on to dominate the enterprise Linux world. Now the much-anticipated major release, Red Hat Enterprise Linux 8, promises to be just as popular in today’s cloud- and container-based IT world. Join Red Hat Solutions Architect John Walter as he walks through the game-changing features and benefits of this new release.

Red Hat Enterprise Linux 8 Overview

As a member of Red Hat’s training and certification team, John will discuss concepts covered in RH354, the training course developed to support RHEL 8. This new course is valuable to operators, managers, system administrators and other IT professionals currently working with RHEL 7 and looking to migrate to RHEL 8.

IBM recently announced its $34 billion acquisition of Red Hat. This blockbuster deal signals a sea change in cloud as IT managers broaden their view of what cloud is and how it’s most effectively deployed. With Red Hat, IBM is positioned to lead in the multi cloud world.

Prior to this acquisition, IBM already was a full-featured cloud provider, with both on-prem and cloud-based elements in their portfolio.

Red Hat Enterprise Linux 8: New Features and Benefits Overview

With Red Hat, IBM now adds two new dimensions:

  • Fluid application and data migration: Red Hat brings Linux-based tools, including containers, the OpenShift container platform, and Kubernetes orchestration. Part of the multi-cloud promise is to make clouds interchangeable, with seamless workload and data migration; this helps make that possible.
  • Multi-cloud interoperability: Orchestration, of course, is only half of the puzzle. The other half is broad platform interoperability, something the open source Red Hat platform was built to deliver.
Why Enabling the Multi Cloud Matters

Technical Introduction to RHEL 8 - Ron Marshall, Senior Solutions Architect, February 2019

Red Hat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced the general availability of Red Hat Enterprise Linux 8, the operating system designed to span the breadth of deployments across enterprise IT. For any workload running on any environment, Red Hat Enterprise Linux 8 delivers one enterprise Linux experience to meet the unique technology needs of evolving enterprises. From deploying new Linux workloads into production to launching digital transformation strategies, the next-generation enterprise is built on top of the world’s leading enterprise Linux platform.

Spanning the entirety of the hybrid cloud, the world’s leading enterprise Linux platform provides a catalyst for IT organizations to do more than simply meet today’s challenges; it gives them the foundation and tools to launch their own future, wherever they want it to be.
Stefanie Chiras, Vice President and General Manager, Red Hat Enterprise Linux, Red Hat

Red Hat Enterprise Linux 8 is the operating system redesigned for the hybrid cloud era and built to support the workloads and operations that stretch from enterprise datacenters to multiple public clouds. Red Hat understands that the operating system should do more than simply exist as part of a technology stack; it should be the catalyst for innovation.

What's New In Red Hat Enterprise Linux 8

5 new features of RHEL8 in 12 minutes

RHEL 8  - The intelligent OS for Hybrid Cloud
Any Cloud Any Workload One OS

In this demo, presented by Michele Naldini, Senior Solution Architect, we'll briefly review 5 new features of RHEL 8:

1) the new Cockpit web console interface
2) using podman to start rootless containers (managed by an unprivileged user)
3) using buildah to build a custom image starting from UBI (Universal Base Images): https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/8/html-single/building_running_and_managing_containers/index#how_are_ubi_images_different
4) building a blueprint and custom RHEL images from Cockpit, to use any app anywhere
5) terminal recording with tlog, to enhance security and review activities performed on your RHEL system
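
As an illustration of feature 2, a rootless container runs entirely as an unprivileged user; the UBI image used here is real, but the exact command sequence is a sketch rather than part of the demo:

```shell
# All of this runs as a regular user -- no root privileges, no container daemon.
podman pull registry.access.redhat.com/ubi8/ubi    # fetch the RHEL 8 Universal Base Image
podman run --rm registry.access.redhat.com/ubi8/ubi cat /etc/os-release
podman ps --all                                    # lists only this user's containers
```

Because there is no central daemon, the container processes are owned by the invoking user, which shrinks the attack surface compared with a root-owned container engine.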

From Linux containers and hybrid cloud to DevOps and artificial intelligence (AI), Red Hat Enterprise Linux 8 is built to not just support enterprise IT in the hybrid cloud, but to help these new technology strategies thrive.

As the importance of hybrid cloud and multicloud deployments grows, the operating system must evolve as well. According to IDC, 70 percent of customers already deploy multicloud environments and 64 percent of applications in a typical IT portfolio today are based in a cloud environment, whether public or private. Red Hat views the operating system as the keystone to this IT innovation and more, especially as Red Hat Enterprise Linux is poised to impact more than $10 trillion in global business revenues in 2019, according to a Red Hat-sponsored IDC report.

Agile Integration with APIs and Containers Workshop

Red Hat Enterprise Linux 8: Intelligent Linux for the hybrid cloud

For more than 15 years, Red Hat has helped enterprises innovate on Linux, first in their datacenters and now across the hybrid cloud. As datacenters grow in scale and scope and workload complexity builds, the skills required to deploy and maintain Linux-based production systems become increasingly critical. With the announcement of Red Hat Enterprise Linux 8, this intelligence and expertise is now built into Red Hat Enterprise Linux subscriptions by default with Red Hat Insights, delivering Red Hat’s Linux expertise as a service.

See how Red Hat’s newest tools and technologies help customers conquer their own audacious goals—the same way they’ve helped us attain ours. 

Hear Red Hat's executive vice president and president, Products and Technologies, discuss Red Hat's three bold goals, the objectives our customers have set out to accomplish, and what results they've achieved so far. Plus, we recognize the individual efforts of this year's Red Hat Certified Professional of the Year. 

Red Hat Enterprise Linux 8 - May 8 - Red Hat Summit 2019

Red Hat Insights helps proactively identify and remediate IT issues, from security vulnerabilities to stability problems. It uses predictive analytics based on Red Hat’s vast knowledge of open technologies to help administrators avoid problems and unplanned downtime in production environments.

Managing systems dispersed across a variety of on-premise and cloud-based infrastructure can present a significant challenge to IT organizations. Red Hat Smart Management, a layered add-on for Red Hat Enterprise Linux, helps IT teams gain the benefits of hybrid cloud computing while minimizing its inherent management complexities. Combining Red Hat Satellite for on-premise systems management and cloud management services for distributed Red Hat Enterprise Linux deployments, Red Hat Smart Management provides rich capabilities to manage, patch, configure and provision Red Hat Enterprise Linux deployments across the hybrid cloud.

Red Hat Enterprise Linux 8: Blazing a faster path to modern applications

To meet evolving business demands, IT organizations are looking to new workloads, from artificial intelligence (AI) to the Internet-of-Things (IoT), to drive competitive advantages in crowded marketplaces. Linux provides the innovative muscle to power these differentiated services, but only Red Hat Enterprise Linux 8 delivers this innovation along with a hardened code base, extensive security updates, award-winning support and a vast ecosystem of tested and validated supporting technologies.

Best practices for optimizing Red Hat platforms for large scale datacenter deployments on DGX systems

Red Hat Enterprise Linux has always been known as the most stable and secure foundation for applications. However, in the past it was hard to get the most up-to-date languages and frameworks that developers wanted without compromising that stability. Red Hat Enterprise Linux 8 introduces Application Streams - fast-moving languages, frameworks and developer tools are updated frequently in this stream without impacting the core resources that have made Red Hat Enterprise Linux an enterprise benchmark. This melds faster developer innovation with production stability in a single, enterprise-class operating system.

Red Hat Enterprise Linux 8: Introducing a world of opportunity for everyone

Linux continues to be the number one operating system for developers building the next generation of enterprise applications. As these applications move into production, stability, enhanced security and testing/certification on existing hardware and environments become paramount needs. This shifts the onus from developers to operations teams and, paired with the trend of Linux being looked to as a primary platform for production applications, makes Linux administration and management skills critical for modern datacenters. Red Hat Enterprise Linux 8 is designed to lower the barrier to entry for Linux, enabling greater accessibility for Windows administrators, Linux beginners and new systems administrators without fear of the command line.

Using Leapp and Boom to Upgrade to RHEL 8

Red Hat Enterprise Linux 8 abstracts away many of the deep complexities of granular sysadmin tasks behind the Red Hat Enterprise Linux web console. The console provides an intuitive, consistent graphical interface for managing and monitoring Red Hat Enterprise Linux systems, from the health of virtual machines to overall system performance. To further improve ease of use, Red Hat Enterprise Linux supports in-place upgrades, providing a more streamlined, efficient and timely path for users to convert Red Hat Enterprise Linux 7 instances to Red Hat Enterprise Linux 8 systems.
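
The in-place upgrade path is driven by the Leapp utility; a typical session on a registered RHEL 7 host looks roughly like the sketch below (an illustration, not the exhaustive procedure):

```shell
# On a fully updated, registered RHEL 7.x system (illustrative sketch):
yum install -y leapp leapp-repository   # install the upgrade tooling
leapp preupgrade                        # dry run; blockers are reported under /var/log/leapp/
leapp upgrade                           # resolve and download the RHEL 8 package set
reboot                                  # boots into the upgrade environment, then into RHEL 8
```

The preupgrade report is the important step: it flags removed packages and unsupported configurations before any change is made to the running system.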

Triangle Kubernetes Meetup - Performance Sensitive Apps in OpenShift

Red Hat Enterprise Linux 8 also includes Red Hat Enterprise Linux System Roles, which automate many of the more complex tasks around managing and configuring Linux in production. Powered by Red Hat Ansible Automation, System Roles are pre-configured Ansible modules that enable ready-made automated workflows for handling common, complex sysadmin tasks. This automation makes it easier for new systems administrators to adopt Linux protocols and helps to eliminate human error as the cause of common configuration issues.
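
A minimal sketch of what using a System Role looks like, here the timesync role from the rhel-system-roles package; the host name and NTP server below are illustrative:

```shell
dnf install -y rhel-system-roles ansible   # the roles ship as an RPM
cat > timesync.yml <<'EOF'
- hosts: webserver01                       # illustrative inventory host
  vars:
    timesync_ntp_servers:
      - hostname: 0.rhel.pool.ntp.org      # example NTP server
        iburst: yes
  roles:
    - rhel-system-roles.timesync
EOF
ansible-playbook timesync.yml
```

The point of the role interface is that the same playbook keeps working across major RHEL releases, even if the underlying time service changes.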

Red Hat Enterprise Linux: Enabling the world of possibilities without sacrificing security

IT innovation is rooted in open source, with Linux often serving as the catalyst for major advancements in enterprise technology, from Linux containers and Kubernetes to serverless and AI. Backed by a more secure, hardened open source supply chain, Red Hat Enterprise Linux 8 helps pave the way for IT organizations to adopt production-ready innovation by deploying only the necessary packages for specific workloads. This enhances the adoption of emerging technologies while helping to minimize potential risk.

To enhance security, Red Hat Enterprise Linux 8 supports the OpenSSL 1.1.1 and TLS 1.3 cryptographic standards. This provides access to the strongest, latest standards in cryptographic protection that can be implemented system-wide via a single command, limiting the need for application-specific policies and tuning.

With cloud-native applications and services frequently driving digital transformation, Red Hat Enterprise Linux 8 delivers full support for the Red Hat container toolkit. Based on open standards, the toolkit provides technologies for creating, running and sharing containerized applications. It helps to streamline container development and eliminates the need for bulky, less secure container daemons.

Every datacenter. Every cloud. Every application.

Red Hat Enterprise Linux 8 drives a thriving partner ecosystem, as is expected of Red Hat Enterprise Linux, encompassing thousands of certified applications, Linux container images, hardware configurations and cloud providers. Building on the deep partnerships forged by Red Hat with other IT leaders and through extensive testing, Red Hat Enterprise Linux 8 drives added value for specific hardware configurations and workloads, including the Arm and POWER architectures as well as real-time applications and SAP solutions.

NVIDIA GTC 2019: Red Hat and the NVIDIA DGX: Tried, Tested, Trusted

Red Hat Enterprise Linux 8 forms the foundation for Red Hat’s entire hybrid cloud portfolio, starting with Red Hat OpenShift 4 and the upcoming Red Hat OpenStack Platform 15. Also built on Red Hat Enterprise Linux 8 is the forthcoming Red Hat Enterprise Linux CoreOS, a minimal footprint operating system designed to host Red Hat OpenShift deployments.

Red Hat Enterprise Linux 8 is also broadly supported as a guest operating system on Red Hat hybrid cloud infrastructure, including Red Hat OpenShift 4, Red Hat OpenStack Platform 15 and Red Hat Virtualization 4.3.

Enabling the multi-cloud may seem like a curious move for a cloud provider. After all, it makes it easier for your customers to go to the competition! But it’s a savvy move for two reasons:

Users are going to the multi-cloud anyway

In a recent IDC survey, IT managers report that 75% of workloads would ideally run in a diverse cloud world, not just on a single public cloud. The customers will go to the provider that enables their preferred model.

Multi Cloud combines public cloud and private cloud

Enabling the multi cloud drives dominance

Being open actually drives customers to you. Gartner stated this about the multi cloud:
“Most organizations will pursue a multi cloud strategy, although most will also designate a primary cloud provider for a particular purpose, and are likely to have 80% or more of those types of workloads in their primary provider.”
By this thinking, the vendor who best enables the multi-cloud will also reap the preponderance of the revenue.

Cloudian Delivers on the Multi Cloud Vision Today

Cloudian has been promoting the multi cloud vision since January 2018. With the launch of HyperStore 7, Cloudian began supporting multi cloud deployments across private cloud and public clouds including AWS, GCP, and Azure.

Cloudian links divergent environments with:

  • A single view of data, combining private + public clouds
  • Common API across clouds
  • Single-point management
IBM, in fact, mirrored these same values in their Red Hat announcement. Here’s a deal summary, annotated with Cloudian points:

IBM Red Hat multi cloud benefits mirror Cloudian object storage benefits

HyperStore 7 Converges the Clouds

HyperStore 7 is a scale-out object storage platform and multi-cloud controller in a single software image. Deployed on prem or in the cloud, it enables all cloud types:

  • Private cloud: Deploy on-prem or as a hosted private cloud for scalable storage
  • Hybrid cloud: Link to any cloud (AWS, GCP, Azure) and replicate or migrate data using policy-based tools — without middleware or 3rd party software
  • Multi cloud: Deploy in multiple clouds to provide single-API connectivity and a common management framework
Combine these capabilities to create whatever management model your use case demands.

Multicloud architecture combines object storage and cloud interoperability

The Multi-Cloud Takes Shape

With Red Hat, IBM has advanced the multi-cloud conversation, further validating an important market direction. Ultimately, both consumers and cloud providers will benefit as open solutions expand the possibilities for everyone.

In the early days of cloud, the providers were walled gardens with unique APIs and proprietary management tools. The web also started as a walled garden (anyone remember Prodigy and AOL?). While web fortunes were made in those early days, the fastest part of the web’s growth curve started after the walls came down. The same could well happen here.

Learn more about Cloudian HyperStore Multi-Cloud at https://cloudian.com.

RHEL 8 (Red Hat Enterprise Linux 8) was released in beta on November 14, 2018, with new features and improvements compared to its predecessor, RHEL 7.

Newly introduced cool features of RHEL 8

Going Atomic with your Container Infrastructure

Improved System Performance

  • Red Hat includes many container tools in RHEL 8, bringing support for Buildah, Podman, and Skopeo.
  • System management gets a boost with the Composer feature, which lets organizations build and deploy custom RHEL images.
  • RHEL 8 brings support for the Stratis filesystem, file system snapshots, and LUKSv2 disk encryption with Network-Bound Disk Encryption (NBDE).
  • The new Red Hat Enterprise Linux web console also enhances the management of RHEL, enabling administrators to deal with bare-metal, virtual, local, and remote Linux servers.
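
Enabling the web console is a one-liner on a stock RHEL 8 install (the firewall step assumes firewalld is running):

```shell
dnf install -y cockpit                     # usually already present on RHEL 8
systemctl enable --now cockpit.socket      # serves the console on https://<host>:9090
firewall-cmd --permanent --add-service=cockpit && firewall-cmd --reload
```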


Security is also a key element of RHEL 8. The addition of support for the OpenSSL 1.1.1 and TLS 1.3 cryptographic standards makes RHEL 8 remarkable.
By integrating these new features, Red Hat makes management easier for the system administrator, who can switch between modes (default, legacy, future, and fips) using the new update-crypto-policies command.
System-wide cryptographic policies are in effect by default.
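
In practice, the system-wide policy is inspected and switched with the update-crypto-policies tool (policy names are upper-case on the command line); a brief sketch:

```shell
update-crypto-policies --show            # prints the active policy; DEFAULT on a fresh install
update-crypto-policies --set FUTURE      # tighten system-wide crypto with one command
update-crypto-policies --set DEFAULT     # revert; changes apply to newly started services
```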
Application Streams
With the idea of Application Streams, RHEL 8 is following Fedora's Modularity lead.
With the release of Fedora 28 earlier this year, the Fedora Linux distribution (Red Hat's community project) introduced the concept of modularity.
User-space components can now be updated more quickly than core operating system packages, without waiting for the next version of the operating system.
Installing multiple versions of the same package (such as an interpreted language or a database) is also possible through Application Streams.
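
Application Streams are consumed through DNF's module commands; PostgreSQL, which shipped with more than one stream in RHEL 8, makes a convenient example (a sketch of the workflow):

```shell
dnf module list postgresql         # show the available streams (e.g. 9.6 and 10)
dnf module install postgresql:10   # install a specific stream
dnf module reset postgresql        # later: unlock the module to switch streams
```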

Red Hat Enterprise Linux 8 - Develop and Deploy faster | DevNation Live


The biggest single change in RHEL 8 system performance is the new upper limit on physical memory capacity.
RHEL 8 has an upper limit of 4PB of physical memory capacity, much higher than RHEL 7's upper limit of 64TB of system memory per server.

Focused Features of RHEL 8

  • For desktop users, Wayland is the default display server, replacing the X.org server; X.org is still available.
  • RHEL 8 supports PHP 7.2
  • In RHEL8, Nginx 1.14 is available in the core repository
  • Shared copy-on-write data extents are supported by XFS.
  • iptables is replaced by nftables as the default network filtering framework.
  • RHEL 8 comes with the new YUM v4, which is based on DNF.
  • It is compatible with YUM v3 (present in RHEL 7).
  • It provides faster performance and fewer installed dependencies.
  • To meet specific workload requirements, it provides more choice of package versions.
  • RPM v4.14 is available in RHEL 8. Before starting the installation, RPM validates the whole package contents.
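
On a RHEL 8 box the familiar yum command is simply a front end for DNF, so existing muscle memory keeps working; a quick sketch:

```shell
ls -l /usr/bin/yum        # a symlink to DNF (dnf-3)
yum install -y nginx      # YUM v3 syntax still works...
dnf install -y nginx      # ...and is equivalent to calling dnf directly
```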

Along with the addition of new technologies, this new release removes some older technologies.
  • Python is not installed by default; the default implementation is Python 3.6.
  • Limited support for Python 2.7.
  • KDE support has been deprecated.
  • Upgrading from KDE on RHEL 7 to GNOME on RHEL 8 is unsupported.
  • Removal of Btrfs support.
Major difference between RHEL 7 and RHEL 8

Red Hat Enterprise Linux 8 architecture

To simplify your development experience, Red Hat Enterprise Linux 8 has three pre-enabled repositories:

  • BaseOS —“mostly” has operating system content
  • Application Streams (AppStream) — most developer tools will be here
  • CodeReady Builder — additional libraries and developer tools

Content in BaseOS is intended to provide the core set of underlying operating system functionality that provides the foundation for all installations. This content is available in the traditional RPM format. For a list of BaseOS packages, see the RHEL 8 Package Manifest.

Application Streams, essentially the next generation of Software Collections, are intended to provide additional functionality beyond what is available in BaseOS. This content set includes additional user space applications, runtime languages, databases, web servers, etc. that support a variety of workloads and use cases. The net for you is to simply use the component and version that you want. Once there’s market demand, newer stable versions of components will be added.
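
On a subscribed system, BaseOS and AppStream are enabled out of the box, while CodeReady Builder must be switched on explicitly; the repository id below is the x86_64 one:

```shell
subscription-manager repos --list-enabled      # shows the -baseos and -appstream repos
subscription-manager repos \
    --enable codeready-builder-for-rhel-8-x86_64-rpms   # extra developer libraries
```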

Linux containers

Linux containers are a critical component of cloud-native development and microservices, so Red Hat’s lightweight, open standards-based container toolkit is now fully supported and included with Red Hat Enterprise Linux 8. Built with enterprise IT security needs in mind, Buildah (building containers), Podman (running containers), and Skopeo (sharing/finding containers) help developers find, run, build and share containerized applications more quickly and efficiently—thanks to the distributed and, importantly, daemonless nature of the tools.
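
The three tools divide the container lifecycle between them; as a quick sketch (the quay.io target is a hypothetical destination):

```shell
buildah from registry.access.redhat.com/ubi8/ubi              # BUILD: start a working container
skopeo inspect docker://registry.access.redhat.com/ubi8/ubi   # FIND: read metadata without pulling
podman run --rm registry.access.redhat.com/ubi8/ubi echo hi   # RUN: no daemon involved
skopeo copy containers-storage:localhost/myapp \
    docker://quay.io/example/myapp                            # SHARE: push to a registry (hypothetical)
```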

Deploy a modern data platform with SQL Server 2019 on Red Hat Enterprise Linux 8

Introducing Universal Base Image

Derived from Red Hat Enterprise Linux, the Red Hat Universal Base Image (UBI) provides a freely redistributable, enterprise-grade base container image on which developers can build and deliver their applications. This means you can containerize your app in UBI, and deploy it anywhere. Of course, it will be more secure and Red Hat supported when deployed on Red Hat Enterprise Linux, but now you have options. There are separate UBI 7 and UBI 8 versions for Red Hat Enterprise Linux 7 and 8, respectively. Read more about them in the Red Hat Universal Base Image introduction.
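
Containerizing on UBI is just a matter of using it as the base image; the Containerfile below is a hypothetical minimal example (app.py is a stand-in for your application):

```shell
cat > Containerfile <<'EOF'
FROM registry.access.redhat.com/ubi8/ubi
RUN dnf install -y python3 && dnf clean all
COPY app.py /app/app.py
CMD ["python3", "/app/app.py"]
EOF
podman build -t myapp .     # or: buildah bud -t myapp .
```

Because UBI is freely redistributable, the resulting image can be pushed to any registry and run on any host, supported or not.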

Red Hat Enterprise Linux 8 developer resources
Over the past few months, we have produced a number of how-to documents specifically for Red Hat Enterprise Linux 8. Here’s a list in case you missed them:

  • Intro to Application Streams—a primer about how Red Hat Enterprise Linux 8 has been re-architected with developers in mind
  • Red Hat Enterprise Linux 8 Cheat Sheet—your quick reference to new Red Hat Enterprise Linux 8 commands, and a list of the more common developer tools
  • Introduction to Builder Repo—read what it is and why you’ll find it handy
  • Installing Java 8 and 11—no more to say
  • Set up your LAMP stack—with Apache, MySQL, and PHP
  • Building containers without daemons—intro to using Podman, Buildah, and more.
  • XDP part 1 & part 2
  • Network debugging with eBPF
  • Quick install on VirtualBox
  • Quick install on bare metal
  • Python in RHEL 8
  • Quick install: Node.js
  • What, no python in RHEL 8?
  • Quick install: Python
  • Image Builder: Building custom system images
  • Introduction to Red Hat Universal Base Image (UBI)

What's New in Red Hat Satellite

Red Hat Developer Subscriptions

Red Hat Developer members have been enjoying no-cost developer subscriptions for 3+ years now, and RHEL 8 is now automatically part of that. If your company wants developer support, there are several Red Hat Enterprise Linux Developer Subscriptions options with Red Hat support, too.

Red Hat Enterprise Linux 8 was unveiled, the latest and greatest edition of Red Hat's signature operating system. Red Hat is billing it as being "redesigned for the hybrid cloud era and built to support the workloads and operations that stretch from enterprise datacenters to multiple public clouds." 

That's not surprising coming from a company that's been billing itself as a cloud company instead of as a Linux company, which is how it got its start, for a number of years. It was already a long-time proponent of hybrid cloud five years ago when RHEL 7,  the previous major release, was first ready for download, and that was a time when the cloud was just getting into high gear, containers were just starting to show their promise, and "DevOps," "agile," and "microservices" had not yet become the buzzwords of the decade.

These days, the company earns much of its money building tailored hybrid cloud systems for enterprises, so designing RHEL 8 to help users take advantage of cloud native technologies and DevOps workflows was a no-brainer, as it plays into Red Hat's hand. It's also central to IBM, which shelled out $34 billion to buy Red Hat, hoping to buoy its own aspirations for dominance in the hybrid cloud arena.

"Clouds are built on Linux operating systems, by and large, and containers not only require a Linux operating system underneath them, but also most containers actually have a Linux distribution in them," Gunnar Hellekson, Red Hat's senior director of product management, told Data Center Knowledge at Red Hat Summit. "The choices that we make in Red Hat Enterprise Linux 8 are focused not just on the existing Red Hat Linux traditional use cases, but also focusing on these new cloud and container use cases as well."

General Future Ahead with Jim Whitehurst and Ginni Rometty Q&A - May 7 - Red Hat Summit 2019

Keeping "traditional use case" customers happy, those running Linux on-premises or in colocation facilities to support monolithic legacy applications, is also near the top of Red Hat's agenda, since plain vanilla support contracts remain the single largest source of income for the company. The company was quick to reassure traditional enterprise users attending the summit that RHEL 8 remains the rock steady operating system it's always been, and pointed out that it ships with "tens of thousands" of hardware configurations and thousands of ISV applications, both of which are especially important to traditional on-prem users.

But what Red Hat was selling was the new operating system's cloud native prowess, along with its added support for the DevOps workflow.

Helping DevOps

While the emphasis was on RHEL's new cloud and container capabilities, the most useful new features might be the improvements made in the way the OS interoperates with the DevOps model, which seeks to combine development and operations into a single unit. Red Hat's focus is to ease the burden on the ops side, freeing up teams to devote more time and energy to the dev side of the equation, while also addressing the changing face of data center workforces.

"In this latest release we've included a tool called the Web Console, which is a graphical interface to point and click your way through some basic systems management tasks," Hellekson said, "hopefully lowering the barrier of entry for people who are new to Linux."

Features such as Web Console, along with System Roles which supply consistent interfaces to automate routine systems administration tasks, are important to the DevOps model, where team members with little traditional admin experience often need to handle Linux administrative tasks.

Ansible-based System Roles were introduced in the last RHEL release, but have been expanded in RHEL 8, with particular emphasis placed on making sure automated system tasks will survive an upgrade to the next latest-and-greatest RHEL when it comes along.
"In the past, the problem has been when you move to a new version of the operating system you have to redo all your automation, because of new interfaces, things being named differently, and so on," 
Hellekson explained. 
"But with System Roles we're creating stability across the major releases so you don't have to retool when you do a new update."

This should be especially useful to DevOps teams going forward, since Red Hat plans for major versions of RHEL to be released more often, perhaps as often as every three years.

Another added feature to aid DevOps teams is Application Streams, which keeps databases, interpreters, and other third-party software bundled and supported in RHEL updated to the latest version, with control given to deny the update and stick with the version being used, or even to roll back to previous versions.

Red Hat OpenShift 4: The Kubernetes platform for big ideas.

RHEL and the Hybrid Cloud

For cloud and containers, RHEL 8 includes features that normally would have to be installed and managed separately.

"Baked into the operating system we have what we're calling the Container Toolkit, which includes tools like Podman, Buildah, and Skopeo," 

Hellekson said. 

"We make these available in the operating system because we know customers rely on us to provide them that kind of basic fundamental tooling in order to build things like OpenShift, or even OpenStack."

Also important for hybrid cloud deployments, RHEL 8 makes it easy to build gold, also called "master," images for everything from bare metal to virtual machines to public clouds. This is important because even relatively small deployments will now usually need to scale across diverse platforms, at least from on-premises to cloud.

OpenShift 4 Full Serverless Workflow: Knative Eventing, Serving, and Building

"If you're building a gold image, you have to build it one way for a physical server, then you have to build a virtual machine in a different way, and you're going to do it differently for the cloud provider," he said. "We have a tool upstream we call the Composer and a product we call the Image Builder, and this allows the customer to create a blueprint for their gold RHEL image."

The Image Builder can be accessed either through the command line or a GUI. When accessing through the interface, he said that with one button "a customer can make an ISO for physical servers, it'll make a virtual machine image for VMs, it will make an Amazon and Azure image, and so forth."
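
From the command line that workflow runs through composer-cli; the blueprint name below is hypothetical, while the output types (qcow2, ami, and so forth) are the selector he describes:

```shell
composer-cli blueprints push webserver.toml   # upload a blueprint (hypothetical file)
composer-cli compose start webserver qcow2    # VM disk image from the blueprint
composer-cli compose start webserver ami      # same blueprint, Amazon EC2 image
composer-cli compose status                   # track the running builds
```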

Hellekson also stressed the amount of effort that Red Hat has exerted to make sure the user experience is consistent across platforms, with no architecture-specific surprises.

"The hardware world has gotten a lot more fragmented than it was in the past," he said. "You have new architectures, like Power and Arm, that are beginning to ascend. You have the public cloud providers trying to compete against both each other and against the on-premise hardware providers, so they're trying to distinguish themselves with things like GPU acceleration, FPGAs, and things like that. The trick to being an operating system in an environment like that is you have to take all comers. You have to enable all of these different variants of the platforms and still provide that consistent experience."

Road Ahead OpenShift Kubernetes & Beyond with Brian Gracely at OpenShift Commons Gathering 2019

21 May 2019

The Next Decade in Quantum Computing—and How to Play

The Next Decade in Quantum Computing

The experts are convinced that in time they can build a high-performance quantum computer. Given the technical hurdles that quantum computing faces—manipulations at nanoscale, for instance, or operating either in a vacuum environment or at cryogenic temperatures—the progress in recent years is hard to overstate. In the long term, such machines will very likely shape new computing and business paradigms by solving computational problems that are currently out of reach. They could change the game in such fields as cryptography and chemistry (and thus material science, agriculture, and pharmaceuticals) not to mention artificial intelligence (AI) and machine learning (ML). We can expect additional applications in logistics, manufacturing, finance, and energy. Quantum computing has the potential to revolutionize information processing the way quantum science revolutionized physics a century ago.

Quantum Computing

Every company needs to understand how quantum computing discoveries will affect business.

The future of quantum computing

The full impact of quantum computing is probably more than a decade away. But there is a much closer upheaval gathering force, one that has significance now for people in business and that promises big changes in the next five to ten years. Research underway at multiple major technology companies and startups, among them IBM, Google, Rigetti Computing, Alibaba, Microsoft, Intel, and Honeywell, has led to a series of technological breakthroughs in building quantum computer systems. These efforts, complemented by government-funded R&D, make it all but certain that the near to medium term will see the development of medium-sized, if still error-prone, quantum computers that can be used in business and have the power and capability to produce the first experimental discoveries. Already quite a few companies are moving to secure intellectual property (IP) rights and position themselves to be first to market with their particular parts of the quantum computing puzzle. Every company needs to understand how coming discoveries will affect business. Leaders will start to stake out their positions in this emerging technology in the next few years.

Toward a Quantum Revolution for Computing

This report explores essential questions for executives and people with a thirst to be up-to-speed on quantum computing. We will look at where the technology itself currently stands, who is who in the emerging ecosystem, and the potentially interesting applications. We will analyze the leading indicators of investments, patents, and publications; which countries and entities are most active; and the status and prospects for the principal quantum hardware technologies. We will also provide a simple framework for understanding algorithms and assessing their applicability and potential. Finally, our short tour will paint a picture of what can be expected in the next five to ten years, and what companies should be doing—or getting ready for—in response.

How Quantum Computers Are Different, and Why It Matters

The first classical computers were actually analog machines, but these proved to be too error-prone to compete with their digital cousins. Later generations used discrete digital bits, taking the values of zero and one, and some basic gates to perform logical operations. As Moore’s law describes, digital computers got faster, smaller, and more powerful at an accelerating pace. Today a typical computer chip holds about 20×10⁹ bits (or transistors), while the latest smartphone chip holds about 6×10⁹ bits. Digital computers are known to be universal in the sense that they can in principle solve any computational problem (although they possibly require an impractically long time). Digital computers are also truly reliable at the bit level, with fewer than one error in 10²⁴ operations; the far more common sources of error are software and mechanical malfunction.

Qubits can enable quantum computing to achieve an exponentially higher information density than classical computers.

Quantum computing random walks and adiabatic computation

Quantum computers, building on the pioneering ideas of physicists Richard Feynman and David Deutsch in the 1980s, leverage the unique properties of matter at nanoscale. They differ from classical computers in two fundamental ways. First, quantum computing is not built on bits that are either zero or one, but on qubits that can be overlays of zeros and ones (meaning part zero and part one at the same time). Second, qubits do not exist in isolation but instead become entangled and act as a group. These two properties enable qubits to achieve an exponentially higher information density than classical computers.
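The exponential information density claim can be made concrete with a toy calculation (plain Python, a back-of-the-envelope sketch rather than a physics simulation): a classical n-bit register holds one definite bit string at a time, while fully describing an n-qubit state takes 2^n complex amplitudes.

```python
# Classical n-bit register: one definite bit string at any moment.
# n-qubit quantum state: 2**n complex amplitudes, all carried at once.
n = 30
amplitudes = 2 ** n
# Storing them as complex doubles (16 bytes each) already needs ~16 GiB,
# which is why classical simulation of even modest qubit counts is hard.
memory_gib = amplitudes * 16 / 2 ** 30
print(amplitudes, memory_gib)  # 1073741824 16.0
```

Adding one more qubit doubles the storage, which is the crux of the density argument.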

There is a catch, however: qubits are highly susceptible to disturbances by their environment, which makes both qubits and qubit operations (the so-called quantum gates) extremely prone to error. Correcting these errors is possible but it can require a huge overhead of auxiliary calculations, causing quantum computers to be very difficult to scale. In addition, when providing an output, quantum states lose all their richness and can only produce a restricted set of probabilistic answers. Narrowing these probabilities to the “right” answer has its own challenges, and building algorithms in a way that renders these answers useful is an entire engineering field in itself.

Future Decoded Quantum Computing Keynote

That said, scientists are now confident that quantum computers will not suffer the fate of analog computers—that is, being killed off by the challenges of error correction. But the requisite overhead, possibly on the order of 1,000 error-correcting qubits for each calculating qubit, does mean that the next five to ten years of development will probably take place without error correction (unless a major breakthrough on high-quality qubits surfaces). This era, when theory continues to advance and is joined by experiments based on these so-called NISQ (Noisy Intermediate-Scale Quantum) devices, is the focus of this report. (For more on the particular properties of quantum computers, see the sidebar, “The Critical Properties of Quantum Computers.” For a longer-term view of the market potential for, and development of, quantum computers, see “The Coming Quantum Leap in Computing,” BCG article, May 2018. For additional context—and some fun—take the BCG Quantum Computing Test.)

The Emerging Quantum Computing Ecosystem

Quantum computing technology is well-enough developed, and practical uses are in sufficiently close sight, for an ecosystem of hardware and software architects and developers, contributors, investors, potential users, and collateral players to take shape. Here’s a look at the principal participants.


Universities and research institutions, often funded by governments, have been active in quantum computing for decades. More recently, as has occurred with other technologies (big data for example), an increasingly well-defined technology stack is emerging, throughout which a variety of private tech players have positioned themselves.

Efficient Synthesis of Universal Probabilistic Quantum Circuits

An increasingly well-defined technology stack is emerging.

At the base of the stack is quantum hardware, where the arrays of qubits that perform the calculations are built. The next layer is sophisticated control systems, whose core role is to regulate the status of the entire apparatus and to enable the calculations. Control systems are responsible in particular for gate operations, classical and quantum computing integration, and error correction. These two layers continue to be the most technologically challenging.

Next comes a software layer to implement algorithms (and in the future, error codes) and to execute applications. This layer includes a quantum-classical interface that compiles source code into executable programs. At the top of the stack are a wider variety of services dedicated to enabling companies to use quantum computing. In particular, they help assess and translate real-life problems into a problem format that quantum computers can address.

The actual players fall into four broad categories.

End-to-End Providers. These tend to be big tech companies and well-funded startups. Among the former, IBM has been the pioneer in quantum computing and continues at the forefront of the field. The company has now been joined by several other leading-edge organizations that play across the entire stack. Google and more recently Alibaba have drawn a lot of attention. Microsoft is active but has yet to unveil achievements toward actual hardware. Honeywell has just emerged as a new player, adding to the heft of the group. Rigetti is the most advanced among the startups. (See “Chad Rigetti on the Race for Quantum Advantage: An Interview with the Founder and CEO of Rigetti Computing,” BCG interview, November 2018.)

Each company offers its own cloud-based open-source software platform and varying levels of access to hardware, simulators, and partnerships. In 2016 IBM launched Q Experience, arguably still the most extensive platform to date, followed in 2018 by Rigetti’s Forest, Google’s Cirq, and Alibaba’s Aliyun, which has launched a quantum cloud computing service in cooperation with the Chinese Academy of Sciences. Microsoft provides access to a quantum simulator on Azure using its Quantum Development Kit. Finally, D-Wave Systems, the first company ever to sell quantum computers (albeit for a special purpose), launched Leap, its own real-time cloud access to its quantum annealer hardware, in October 2018.

Particle Physics On a Chip the Search for Majorana Fermions -  Leo Kouwenhoven

Hardware and Systems Players. Other entities are focused on developing hardware only, since this is the core bottleneck today.  Again, these include both technology giants, such as Intel, and startups, such as IonQ, Quantum Circuits, and QuTech. Quantum Circuits, a spinoff from Yale University, intends to build a robust quantum computer based on a unique, modular architecture, while QuTech—a joint effort between Delft University of Technology and TNO, the applied scientific research organization, in the Netherlands—offers a variety of partnering options for companies. An example of hardware and systems players extending into software and services, QuTech launched Quantum Inspire, the first European quantum computing platform, with supercomputing access to a quantum simulator. Quantum hardware access is planned to be available in the first half of 2019.

Software and Services Players. Another group of companies is working on enabling applications and translating real-world problems into the quantum world. They include Zapata Computing, QC Ware, QxBranch, and Cambridge Quantum Computing, among others, which provide software and services to users. Such companies see themselves as an important interface between emerging users of quantum computing and the hardware stack. All are partners of one or more of the end-to-end or hardware players within their mini-ecosystems. They have, however, widely varying commitments and approaches to advancing original quantum algorithms.

Specialists. These are mainly startups, often spun off from research institutions, that provide focused solutions to other quantum computing players or to enterprise users. For example, Q-CTRL works on solutions to provide better system control and gate operations, and Quantum Benchmark assesses and predicts errors of hardware and specific algorithms. Both serve hardware companies and users.

The ecosystem is dynamic and the lines between tech layers easily blurred or crossed.

Particle Physics On a Chip the Search for Majorana Fermions

The ecosystem is dynamic and the lines between layers easily blurred or crossed, in particular by maturing hardware players extending into the higher-level application, or even service layers. The end-to-end integrated companies continue to reside at the center of the technology ecosystem for now; vertical integration provides a performance advantage at the current maturity level of the industry. The biggest investments thus far have flowed into the stack’s lower layers, but we have not yet seen a convergence on a single winning architecture. Several architectures may coexist over a longer period and even work hand-in-hand in a hybrid fashion to leverage the advantages of each technology.


For many years, the biggest potential end users for quantum computing capability were national governments. One of the earliest algorithms to demonstrate potential quantum advantage was developed in 1994 by mathematician Peter Shor, now at the Massachusetts Institute of Technology. Shor’s algorithm has famously demonstrated how a quantum computer could crack current cryptography. Such a breach could endanger communications security, possibly undermining the internet and national defense systems, among other things. Significant government funds flowed fast into quantum computing research thereafter. Widespread consensus eventually formed that algorithms such as Shor’s would remain beyond the realm of quantum computers for some years to come, and that even if current cryptographic methods are threatened, other solutions exist and are being assessed by standard-setting institutions. This has allowed the private sector to develop and pursue other applications of quantum computing. (The covert activity of governments continues in the field, but is outside the scope of this report.)

Quantum Algorithm - The Math of Intelligence

Quite a few industries outside the tech sector have taken notice of the developments in, and the potential of, quantum computing, and companies are joining forces with tech players to explore potential uses. The most common categories of use are for simulation, optimization, machine learning, and AI. Not surprisingly, there are plenty of potential applications.

Despite many announcements, though, we have yet to see an application where quantum advantage—that is, performance by a quantum computer that is superior in terms of time, cost, or quality—has been achieved.

However, such a demonstration is deemed imminent, and Rigetti recently offered a $1 million prize to the first group that proves quantum advantage.

Investments, Publications, and Intellectual Property

The activity around quantum computing has sparked a high degree of interest. People have plenty of questions. How much money is behind quantum computing? Who is providing it? Where does the technology stand compared with AI or blockchain? What regions and entities are leading in publications and IP?

With more than 60 separate investments totaling more than $700 million since 2012, quantum computing has come to the attention of venture investors, even if it is still dwarfed by more mature and market-ready technologies such as blockchain (1,500 deals, $12 billion, not including cryptocurrencies) and AI (9,800 deals, $110 billion).

The bulk of the private quantum computing deals over the last several years took place in the US, Canada, the UK, and Australia. Among startups, D-Wave ($205 million, started before 2012), Rigetti ($119 million), PsiQ ($65 million), Silicon Quantum Computing ($60 million), Cambridge Quantum Computing ($50 million), 1Qbit ($35 million), IonQ ($22 million), and Quantum Circuits ($18 million) have led the way.

A regional race is also developing, involving large publicly funded programs that are devoted to quantum technologies more broadly, including quantum communication and sensing as well as computing. China leads the pack with a $10 billion quantum program spanning the next five years, of which $3 billion is reserved for quantum computing. Europe is in the game ($1.1 billion of funding from the European Commission and European member states), as are individual countries in the region, most prominently the UK ($381 million in the UK National Quantum Technologies Programme). The US House of Representatives passed the National Quantum Initiative Act ($1.275 billion, complementing ongoing Department of Energy, Army Research Office, and National Science Foundation initiatives). Many other countries, notably Australia, Canada, and Israel are also very active.

The money has been accompanied by a flurry of patents and publishing. (See Exhibit 4.) North America and East Asia are clearly in the lead; these are also the regions with the most active commercial technology activity. Europe is a distant third, an alarming sign, especially in light of a number of leading European quantum experts joining US-based companies in recent years. Australia, a hotspot for quantum technologies for many years, makes a striking showing given its much smaller population. The country is determined to play in the quantum race; in fact, one of its leading quantum computing researchers, Michelle Simmons, was named Australian of the Year 2018.

Two things are noteworthy about the volume of scientific publishing regarding quantum computing since 2013. (See Exhibit 5.) The first is the rise of China, which has surpassed the US to become the leader in quantity of scientific articles published. The second is the high degree of international collaboration (in which the US remains the primary hub). The cooperation shows that quantum computing is not dominated by national security interests yet, owing in large part to consensus around the view that cryptographic applications are still further in the future and that effective remedies for such applications are in the making. The collaboration activity also reflects the need in the scientific community for active exchange of information and ideas to overcome quantum computing’s technological and engineering challenges.

Simplifying the Quantum Algorithm Zoo

The US National Institute for Standards and Technology (NIST) maintains a webpage entitled Quantum Algorithm Zoo that contains descriptions of more than 60 types of quantum algorithms. It’s an admirable effort to catalog the current state of the art, but it will make nonexperts’ heads spin, as well as those of some experts.

Quantum algorithms are the tools that tell quantum computers what to do. Two of their attributes are especially important in the near term:

Speed-Up. How much faster can a quantum computer running the algorithm solve a particular class of problem than the best-known classical computing counterpart?
Robustness. How resilient is the algorithm to the random “noise,” or other errors, in quantum computing?

There are two classes of algorithm today. (See Exhibit 8.) We call the first purebreds—they are built for speed in noiseless or error-corrected environments. The ones shown in the exhibit have theoretically proven exponential speed-up over conventional computers for specific problems, but they require a long sequence of flawless execution, which in turn necessitates very low-noise operations and error correction. This class includes Peter Shor’s factorization algorithm for cracking cryptography and Trotter-type algorithms used for molecular simulation. Unfortunately, their susceptibility to noise puts them out of the realm of practical application for the next ten years and perhaps longer.

Why the Deutsch-Josza Algorithm?

Although of little practical use, the Deutsch–Jozsa algorithm is one of the first examples of a quantum algorithm that is exponentially faster than any possible deterministic classical algorithm. It is also a deterministic algorithm, meaning that it always produces an answer, and that answer is always correct.

Deutsch-Jozsa Algorithm

The Deutsch-Jozsa algorithm was the first to show a separation between the quantum and classical difficulty of a problem. This algorithm demonstrates the significance of allowing quantum amplitudes to take both positive and negative values, as opposed to classical probabilities that are always non-negative.

The Deutsch-Jozsa problem is defined as follows. Consider a function f(x) that takes as input n-bit strings x and returns 0 or 1. Suppose we are promised that f(x) is either a constant function that takes the same value c ∈ {0,1} on all inputs x, or a balanced function that takes each value 0 and 1 on exactly half of the inputs. The goal is to decide whether f is constant or balanced by making as few function evaluations as possible. Classically, it requires 2^(n−1)+1 function evaluations in the worst case. Using the Deutsch-Jozsa algorithm, the question can be answered with just one function evaluation. In the quantum world the function f is specified by an oracle circuit U_f (see the previous section on Grover’s algorithm) such that U_f |x⟩ = (−1)^(f(x)) |x⟩.
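For intuition on the classical side of this bound, here is a small sketch in plain Python (the function names are mine, not from the text): under the promise, a deterministic classical strategy may need as many as 2^(n−1)+1 oracle calls before it can safely answer "constant", but can answer "balanced" as soon as two outputs differ.

```python
from itertools import product

def classical_decide(f, n):
    """Classically decide constant vs balanced for an n-bit oracle f.

    Under the promise, seeing two different outputs proves "balanced";
    only after 2**(n-1) + 1 identical outputs may we conclude "constant".
    Returns (verdict, number_of_oracle_calls).
    """
    calls = 0
    first = None
    for x in product([0, 1], repeat=n):
        value = f(x)
        calls += 1
        if first is None:
            first = value
        elif value != first:
            return "balanced", calls      # two outputs differ -> balanced
        if calls == 2 ** (n - 1) + 1:
            return "constant", calls      # a strict majority agrees

n = 3
print(classical_decide(lambda x: 1, n))     # ('constant', 5): worst case, 2**(n-1)+1 calls
print(classical_decide(lambda x: x[-1], n)) # ('balanced', 2): lucky early exit
```

The quantum algorithm described next replaces this worst case of five calls (for n = 3) with a single oracle evaluation.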

Quantum Circuits and Algorithms

To understand how the Deutsch-Jozsa algorithm works, let us first consider a typical interference experiment: a particle that behaves like a wave, such as a photon, can travel from the source to an array of detectors by following two or more paths simultaneously. The probability of observing the particle will be concentrated at those detectors where most of the incoming waves arrive with the same phase.

Imagine that we can set up an interference experiment as above, with 2^n detectors and 2^n possible paths from the source to each of the detectors. We shall label the paths and the detectors with n-bit strings x and y respectively. Suppose further that the phase accumulated along a path x to a detector y equals C(−1)^(f(x)+x·y), where

x·y = x₁y₁ + x₂y₂ + ⋯ + xₙyₙ

is the binary inner product and C is a normalizing coefficient. The probability to observe the particle at a detector y can be computed by summing up the amplitudes of all paths x arriving at y and taking the absolute value squared:

Pr(y) = |C ∑_x (−1)^(f(x)+x·y)|²

The normalization condition ∑_y Pr(y) = 1 then gives C = 2^(−n). Let us compute the probability Pr(y = 0^n) of observing the particle at the detector y = 0^n (the all-zeros string). We have

Pr(y = 0^n) = |2^(−n) ∑_x (−1)^(f(x))|²
If f(x) = c is a constant function, we get Pr(y = 0^n) = |(−1)^c|² = 1. However, if f(x) is a balanced function, we get Pr(y = 0^n) = 0, since all the terms in the sum over x cancel each other. We can therefore determine whether f is constant or balanced with certainty by running the experiment just once.
Of course, this experiment is not practical, since it would require an impossibly large optical table! However, we can simulate this experiment on a quantum computer with just n qubits and access to the oracle circuit U_f. Indeed, consider the following algorithm:

Step 1. Initialize n qubits in the all-zeros state |0,…,0⟩.
Step 2. Apply the Hadamard gate H to each qubit.
Step 3. Apply the oracle circuit U_f.
Step 4. Repeat Step 2.
Step 5. Measure each qubit. Let y = (y₁,…,yₙ) be the list of measurement outcomes.
We find that f is a constant function if y is the all-zeros string. Why does this work? Recall that the Hadamard gate H maps |0⟩ to the uniform superposition of |0⟩ and |1⟩. Thus the state reached after Step 2 is 2^(−n/2) ∑_x |x⟩, where the sum runs over all n-bit strings. The oracle circuit maps this state to 2^(−n/2) ∑_x (−1)^(f(x)) |x⟩. Finally, let us apply the layer of Hadamards at Step 4. It maps a basis state |x⟩ to a superposition 2^(−n/2) ∑_y (−1)^(x·y) |y⟩. Thus the state reached after Step 4 is |ψ⟩ = ∑_y ψ(y) |y⟩, where

ψ(y) = 2^(−n) ∑_x (−1)^(f(x)+x·y)
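The five steps can be simulated directly on a vector of 2^n amplitudes. The sketch below (plain Python, no quantum SDK; the helper name is mine) applies the two Hadamard layers and the phase oracle exactly as described, and returns the probability of measuring the all-zeros string, which should be 1 for a constant f and 0 for a balanced f.

```python
from itertools import product

def deutsch_jozsa_prob_all_zeros(f, n):
    """Simulate Steps 1-4 on a vector of 2**n amplitudes and return
    Pr(y = 0...0), the probability of measuring all zeros in Step 5."""
    dim = 2 ** n
    xs = list(product([0, 1], repeat=n))
    # Steps 1-2: |0...0> followed by Hadamards -> uniform superposition
    amp = [dim ** -0.5] * dim
    # Step 3: the oracle U_f flips the sign of |x> whenever f(x) = 1
    amp = [a * (-1) ** f(x) for a, x in zip(amp, xs)]
    # Step 4: second Hadamard layer; H^n maps |x> to 2^(-n/2) sum_y (-1)^(x.y) |y>
    out = [0.0] * dim
    for i, x in enumerate(xs):
        for j, y in enumerate(xs):
            dot = sum(xi * yi for xi, yi in zip(x, y))  # binary inner product
            out[j] += amp[i] * (-1) ** dot * dim ** -0.5
    return out[0] ** 2  # amplitude of y = 0^n, squared

n = 3
print(round(deutsch_jozsa_prob_all_zeros(lambda x: 1, n), 6))     # constant -> 1.0
print(round(deutsch_jozsa_prob_all_zeros(lambda x: x[0], n), 6))  # balanced -> 0.0
```

The single call to f per basis state in Step 3 is the quantum analogue of "one function evaluation": the oracle is applied once to the whole superposition.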

Using QISkit: The SDK for Quantum Computing

This is exactly what we need for the interference experiment described above. The final measurement at Step 5 plays the role of detecting the particle. As was shown above, the probability of measuring y = 0^n at Step 5 is one if f is a constant function and zero if f is a balanced function. Thus we have solved the Deutsch-Jozsa problem with certainty by making just one function evaluation.
Example circuits

Suppose n = 3 and f(x) = x₀ ⊕ x₁x₂. This function is balanced, since flipping the bit x₀ flips the value of f(x) regardless of x₁, x₂. To run the Deutsch-Jozsa algorithm we need an explicit description of the oracle circuit U_f as a sequence of quantum gates. To this end we need a Z₀ gate such that Z₀ |x⟩ = (−1)^(x₀) |x⟩ and a controlled-Z gate CZ₁,₂ such that CZ₁,₂ |x⟩ = (−1)^(x₁x₂) |x⟩. Using basic circuit identities (see the Basic Circuit Identities and Larger Circuits section), one can realize the controlled-Z gate as a CNOT sandwiched between two Hadamard gates.
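As a sanity check on this example, the sketch below (plain Python; the helper names are mine) verifies that the diagonal phases of Z₀ and CZ₁,₂ multiply to (−1)^(f(x)) on every basis state, so applying the two gates in sequence implements the oracle, and that f is indeed balanced.

```python
from itertools import product

def f(x):
    """The example function f(x) = x0 XOR (x1 AND x2)."""
    return x[0] ^ (x[1] & x[2])

def z0_phase(x):
    """Z gate on qubit 0: multiplies |x> by (-1)**x0."""
    return (-1) ** x[0]

def cz12_phase(x):
    """Controlled-Z on qubits 1 and 2: multiplies |x> by (-1)**(x1*x2)."""
    return (-1) ** (x[1] * x[2])

xs = list(product([0, 1], repeat=3))
# U_f = Z0 followed by CZ_{1,2}: diagonal phases multiply, so the exponents
# add mod 2, giving (-1)**(x0 + x1*x2) = (-1)**f(x) on every basis state.
assert all(z0_phase(x) * cz12_phase(x) == (-1) ** f(x) for x in xs)
# Balanced: exactly half of the 8 inputs map to 1.
print(sum(f(x) for x in xs))  # prints 4
```

Because both gates are diagonal in the computational basis, their order does not matter, which is what makes this decomposition of U_f so convenient.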

Quantum information and computation: Why, what, and how

A Potential Quantum Winter, and the Opportunity Therein

Like many theoretical technologies that promise ultimate practical application someday, quantum computing has already been through cycles of excitement and disappointment. The run of progress over past years is tangible, however, and has led to an increasingly high level of interest and investment activity. But the ultimate pace and roadmap are still uncertain because significant hurdles remain. While the NISQ period undoubtedly has a few surprises and breakthroughs in store, the pathway toward a fault-tolerant quantum computer may well turn out to be the key to unearthing the full potential of quantum computing applications.

Some experts thus warn of a potential “quantum winter,” in which some exaggerated excitement cools and the buzz moves to other things. Even if such a chill settles in, for those with a strong vision of their future in both the medium and longer terms, it may pay to remember the banker Baron Rothschild’s admonition during the panic after the Battle of Waterloo: “The time to buy is when there’s blood in the streets.” During periods of disillusionment, companies build the basis of true competitive advantage. Whoever stakes out the most important business beachheads in the emerging quantum computing technologies will very likely do so over the next few years. The question is not whether or when, but how, companies should get involved.

More Information

23 April 2019

Run the SAP solutions you already use on Azure

Run your largest SAP HANA workloads on Azure. Handle transactions and analytics in-memory on a single data copy to accelerate your business processes, gain business intelligence and simplify your IT environment.

Advance to the Cloud and Beyond with SAP S/4HANA

SAP HANA on Azure offers:

  • On-demand M-series virtual machines certified for SAP HANA that scale up to 4 TB.
  • Purpose-built SAP HANA instances that scale up to 20 TB on a single node.
  • Scale-out SAP HANA capabilities up to 60 TB.
  • A 99.99 per cent service-level agreement for large instances in a high-availability pair, and a 99.9 per cent SLA for a single SAP HANA large instance.

The SAP HCM Evolution from On-Premise to the Cloud

Provide self-service visualisation for data in your SAP ERP Central Component (ECC), SAP Business Warehouse (BW) and S/4HANA using the SAP HANA and SAP BW connector in Power BI Desktop. Access tools and services that help you:

  • Copy data from SAP HANA into Azure data services using the Azure Data Factory connector for SAP HANA, SAP ECC and SAP BW.
  • Store data inexpensively using Azure Data Lake Storage.
  • Learn from your data using Azure Databricks and Azure Machine Learning service.

Explore the Real Advantages of a Public Cloud ERP with SAP S/4HANA Cloud

Discover how to integrate SAP application data with Azure, including guidance on Azure Data Factory and SAP, Power BI and SAP, Azure Analysis Services and SAP, Azure Data Catalog and SAP, and more.

Focus will be on the following areas:

  • Using Azure Data Services for Business Insights
  • SAP and Azure: Drivers for the data-driven enterprise
  • Azure and SAP Integration Architecture
  • Azure and SAP Integrations

SAP Analytics 2019 Strategy and Roadmap

You’ve known it for some time—all SAP NetWeaver landscapes will eventually migrate to SAP HANA and the cloud. But many things still need to work side by side until that journey is complete. How many S/4HANA systems will you need to deploy? How many short-lived S/4HANA systems will be needed for project purposes? This guide provides valuable tools and processes for successfully migrating your mission-critical SAP landscapes to the cloud.

SAP S/4HANA 1709 - Highlights with Rudolf Hois

Download this white paper and learn how Azure makes it possible to:
  • Migrate your existing SAP systems running on all SAP-supported databases.
  • Deploy S/4HANA systems on demand and pay only for active infrastructure usage.
  • Scale infrastructure of older NetWeaver systems as your processes move into S/4HANA.
  • Protect your data on all SAP-supported databases, such as Oracle, SQL Server, or IBM DB/2.
  • Ensure high availability for your production SAP instances with support for SQL Server AlwaysOn, HANA System Replication (HSR) and Oracle Dataguard.

Migrating SAP Applications to Azure

As organizations shift from simply extracting and storing data to gaining valuable insights, SAP customers continue to benefit from end-to-end data and analytics capabilities that work together.

SAP offers complete data and analytics solutions that provide everything you need to help your organization make faster, more intelligent decisions. We’re excited to share more product innovation details that will help bring data and analytics even closer together for our customers in 2019.

Gartner recently recognized SAP as a leader in the January 2019 Magic Quadrant for Data Management Solutions for Analytics for the seventh consecutive year. We see this position as supported by a strong multi-cloud strategy, multi-model data processing engines, and machine learning and artificial intelligence (AI) capabilities. The ability to process transactions and analytics on a single platform has given many organizations, including Asian Paints, information at their fingertips to better serve their customers.

The SAP Cloud Platform Open Connectors service by Cloud Elements

SAP is also positioned as a visionary in this year’s Magic Quadrant for Analytics and Business Intelligence Platforms. This is the first time a cloud-based analytics product from SAP has been included as the only SAP product in this Magic Quadrant. We’re enthusiastic for the cloud analytics momentum and wanted to provide a look ahead to some of our development priorities in 2019.


The shift from Data to Intelligence

Now in its fourth year, SAP Analytics Cloud continues to see rising use and adoption as more customers take advantage of new capabilities and as SAP business applications embed its analytics functionality within their offerings to provide customers with automated insights, supporting our customers’ strategy to deliver an intelligent enterprise. The vision of SAP Analytics Cloud is clearly resonating, with survey respondents rating SAP’s viability higher than that of any other vendor.

SAP’s strategy to bring all analytics together in one cloud-based platform for business intelligence (BI), enterprise planning, and augmented analytics is a clear market differentiator. SAP is investing heavily in its smart analytics capabilities and is positioned for where we believe Gartner anticipates the market is heading with augmented analytics. Enhancements in smart search help users find the right content, and algorithms guide data discovery and pattern detection with no data scientist required.

SAP’s business domain expertise for lines of business and industries is unmatched. This includes data models, stories, visualization, templates, agendas, and guidance on using data sources – all packaged with SAP Analytics Cloud. This pre-built content helps customers develop use cases and quickly realize value.

SAP’s strategy to support hybrid customer scenarios for cloud and on-premise analytics with SAP Analytics Hub, part of SAP Analytics Cloud, is an advantage for customers. SAP Analytics Hub offers a single point of access to all analytics content – be it SAP or non-SAP content – no matter where it resides.

SAP Cloud Platform – Data & Storage - Overview

Quality, performance, and customer support are all continued areas of focus for the SAP Analytics Cloud team. SAP has recently added a quarterly product release cycle for SAP Analytics Cloud in addition to the two-week release cycles. This gives customers the choice of receiving new innovations with SAP Analytics Cloud at a pace that works best for their organization. You can find out about all the latest features and enhancements with the product updates for SAP Analytics Cloud.

Our Customer Success team supports customers in their adoption and use of the product. The team continues to expand its operations, helping companies around the world with dedicated support. Customers can meet the team and get started with the Welcome Guide.

Our customers’ feedback is a top priority for us and plays a decisive role in our planning. In 2019, we are also focusing on further modelling improvements, with enhancements coming in the Q1 2019 time frame, and other developments to continue improving the overall user experience of data modelling.

On the topic of data connections, we have recently added 100 data sources via the SAP Analytics Cloud agent, providing the same level of data connectivity SAP BusinessObjects customers have had for years. These enhancements complement recent investment made in significantly increasing data source connectivity for SAP Analytics Cloud customers. In addition to this, the team has enhanced connectivity to the trusted business data sources of SAP S/4HANA, SAP Business Warehouse, and SAP BusinessObjects. Learn more about data connections in SAP Analytics Cloud.

SAP S/4HANA Finance and the Digital Core

We listened when our customers and partners asked us to provide more capabilities to embed and extend analytics. SAP Analytics Cloud is in the early stages of extending its application programming interfaces (APIs) and recently added APIs on the SAP API Business Hub, offering easier access to stories. Furthermore, SAP Analytics Cloud plans to add application design capabilities in the Q2 time frame to enhance this further. This functionality will allow developers and partners to build, embed, and extend intelligent analytics applications, further supporting SAP’s customer strategy for the Intelligent Enterprise.

Azure Data Factory offers SAP HANA and Business Warehouse data integration

Azure Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data, supporting easy and performant copying of data from more than 25 data stores, both on-premises and in the cloud. Today, we are excited to announce that Azure Data Factory now enables loading data from SAP HANA and SAP Business Warehouse (BW) into various Azure data stores for advanced analytics and reporting, including Azure Blob storage, Azure Data Lake, and Azure SQL Data Warehouse.

What’s new

SAP makes some of the most widely used enterprise software in the world. We have heard from customers that it is crucial for Microsoft to empower them to integrate their existing SAP systems with Azure and unblock business insights. Azure Data Factory starts its SAP data integration support with SAP HANA and SAP BW, the most widely used components of the SAP stack among enterprise customers.

SAP HANA Academy - Replicating SAP System data in SAP HANA with SLT

With this release, you can easily ingest data from your existing SAP HANA and SAP BW systems into Azure and build your own intelligent solutions, leveraging Azure’s first-class information management services, big data stores, advanced analytics tools, and intelligence toolkits to turn data into intelligent action. More specifically:

  • The SAP HANA connector supports copying data from HANA information models (such as Analytic and Calculation views) as well as Row and Column tables using SQL queries. To establish connectivity, you need to install the latest Data Management Gateway (version 2.8) and the SAP HANA ODBC driver. Refer to SAP HANA supported versions and installation for more details.
  • The SAP BW connector supports copying data from SAP Business Warehouse version 7.x InfoCubes and QueryCubes (including BEx queries) using MDX queries. To establish connectivity, you need to install the latest Data Management Gateway (version 2.8) and the SAP NetWeaver library. Refer to SAP BW supported versions and installation for more details.
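
As a rough illustration of what these two connectors issue under the hood, here is a minimal Python sketch of the two query shapes: SQL against a HANA information model and MDX against a BW InfoCube. The schema, view, and cube names are hypothetical placeholders, not a real landscape:

```python
# Sketch of the query shapes the two Data Factory connectors issue.
# Schema, view, and cube names below are hypothetical placeholders.

def hana_view_query(schema: str, calc_view: str, columns: list[str]) -> str:
    """SQL issued by the SAP HANA connector against an information model."""
    cols = ", ".join(f'"{c}"' for c in columns)
    return f'SELECT {cols} FROM "{schema}"."{calc_view}"'

def bw_mdx_query(infocube: str, measure: str, dimension: str) -> str:
    """MDX issued by the SAP BW connector against an InfoCube or BEx query."""
    return (
        f"SELECT {{ [Measures].[{measure}] }} ON COLUMNS, "
        f"NON EMPTY {{ [{dimension}].MEMBERS }} ON ROWS "
        f"FROM [{infocube}]"
    )

print(hana_view_query("SALES", "CV_REVENUE", ["REGION", "REVENUE"]))
print(bw_mdx_query("0SALES_C01", "0AMOUNT", "0REGION"))
```

In a real pipeline these strings would be supplied in the copy activity's source definition, with the gateway and the ODBC driver or NetWeaver library handling the actual connection.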

99 Facts on the Future of Business in the Digital Economy

Azure Data Services for Business Insights

Azure provides a number of data services, ranging from cloud storage to SQL/NoSQL databases to big data, along with various services to process that data. In this whitepaper, we introduce how various Azure services can help organizations discover, make sense of, process, and use data to gain competitive advantage and better ROI. Organizational data may be stored in Azure data services, SAP technologies, or other third-party services, and it is critical to bring them together to realize business value.

SAP Cloud Platform - Big Data Services

The Azure services described in this whitepaper, such as Azure Data Catalog, Azure Data Factory, Azure Logic Apps, Azure Analysis Services, and Power BI, play an important role in turning raw data into intelligent, data-driven decision making. A typical workflow involving Azure services is as follows:

  • Azure Data Catalog helps in data source discovery and in adding context to the data.
  • Azure Data Factory, through automated workflows, can combine data from various sources into a single version of the truth. When paired with solutions such as Azure SQL, HDInsight, or Spark, the insights from big data can be made consumable using other tools.
  • Azure Logic Apps can help develop workflows that work with Microsoft Business Applications and third-party data. Azure Functions, the serverless architecture for applications, can help in developing custom logic into the integration process.
  • Using Azure Analysis Services, users can develop complex models on top of the data. When coupled with Microsoft Power BI, a self-service visualization tool, semantic models on top of data from various sources can add efficiencies to the process.
  • Power BI can provide powerful visualization to gain insights including predictive and decision insights. These can also be made available through APIs for developers to build applications or integrate with other applications.
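
The workflow above can be sketched end to end as a toy pipeline. Each function below stands in for one managed service; the names and data are purely illustrative, not Azure SDK calls:

```python
# Conceptual sketch of the Azure workflow above on toy data.
# Each function stands in for a managed service; names are illustrative only.

def catalog_discover():            # Azure Data Catalog: find and annotate sources
    return ["sap_sales", "crm_leads"]

def factory_combine(sources):      # Azure Data Factory: merge into one dataset
    return [{"source": s, "amount": 100 * (i + 1)} for i, s in enumerate(sources)]

def logic_app_enrich(rows):        # Azure Logic Apps + Functions: custom logic
    return [dict(r, taxed=round(r["amount"] * 1.2, 2)) for r in rows]

def analysis_model(rows):          # Azure Analysis Services: semantic aggregate
    return {"total_taxed": sum(r["taxed"] for r in rows)}

def power_bi_render(model):        # Power BI: consumable insight
    return f"Total taxed revenue: {model['total_taxed']}"

report = power_bi_render(analysis_model(logic_app_enrich(factory_combine(catalog_discover()))))
print(report)  # Total taxed revenue: 360.0
```

The point of the sketch is the shape of the flow: discovery feeds ingestion, custom logic enriches the combined data, a semantic model aggregates it, and the visualization layer consumes the model.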

Even though these tools can work independently, together they support the entire lifecycle of data flows as data traverses from raw form to intelligent insights and solutions.

SAP PartnerEdge: ISV Partnership for Growth

SAP on Azure: Drivers for the data-driven enterprise

Microsoft and SAP have been working together to make SAP products deeply integrated with and certified on Azure, making SAP products first-class citizens on Azure. This partnership helps enterprise customers have a more seamless experience developing applications that integrate data from Azure’s data services with the data stored in SAP systems. With SAP on Azure, enterprises can apply machine learning and artificial intelligence to data stored in mission-critical systems and other data sources without high operational costs, as deeper integration leads to cost savings.

Why SAP on Azure?

As enterprises pursue their data-driven transformation, they find value in leveraging data in various Azure data services along with SAP data. A typical manual workflow might involve exporting SAP data to an Excel spreadsheet and then moving it into the Power BI service for visualization and insight. Apart from the operational and cost inefficiencies of this process, it also leads to data silos and missed opportunities. To streamline the process, customers can run SAP on Azure using SAP HANA-certified on-demand virtual machines such as the Azure M-series, or purpose-built SAP HANA on Azure Large Instances. They can also tap into various Azure data services to make the process efficient and remove data silos. Azure provides a robust, enterprise-grade platform that offers the agility, scale, and security needed for integrating with SAP technologies.

SAP Cloud Platform Integration Suite Q3/Q4 Webinar

Some benefits of running SAP applications on Azure are:

  • On-Demand and scalable: Azure provides compute resources on-demand with a pay per use model. Enterprises can scale the resources up and down based on their usage needs. Additionally, Azure also offers purpose-built HANA large instances that can support large memory implementations. 
  • Intelligent solutions: By taking advantage of Azure data services, analytics service and tools like Power BI, users gain insights from their data. 
  • Security and Compliance: Azure makes it easy to protect your data and solutions using encryption, Azure Security Center and Azure Active Directory, which provides single sign on across multiple services, both on-premises and cloud. With a variety of industry compliance and trust certifications, Azure provides a complete compliance solution. 
  • Global scale: Azure has the largest footprint of any cloud provider, with 54 regions to support any scale, enabling organizations to optimize for the best user experience and meet local data residency requirements. 
  • Enterprise-grade Resilience: Azure provides enterprise grade resilience with redundancy and geo-replication with 99.99% SLA. 
  • Lower TCO: Azure can provide a 60% cost reduction compared with traditional, on-premises storage systems. By using SAP technologies on Azure, customers can realize 40-75% TCO savings in Dev and Test environments.
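
As a quick worked example of the savings figures quoted above (the dollar amounts are hypothetical; only the percentages come from the text):

```python
# Worked example of the savings percentages quoted above.
# The dollar amounts are hypothetical inputs, not published figures.

on_prem_storage_cost = 10_000                             # assumed annual on-premises storage spend
azure_storage_cost = on_prem_storage_cost * (1 - 0.60)    # "60% cost reduction"

dev_test_tco = 50_000                                     # assumed traditional Dev/Test TCO
savings_low = dev_test_tco * 0.40                         # lower bound of "40-75% TCO savings"
savings_high = dev_test_tco * 0.75                        # upper bound

print(azure_storage_cost)
print(savings_low, savings_high)
```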

Azure and SAP Integration Architecture

The following diagram gives a high-level architecture for running SAP and third-party data within the Azure data platform and Azure Analytics Services to build intelligent solutions.

Azure Data Platform combines the business operational data with streaming and analytics data:

  • To provide a data science service that data scientists, developers, and other enterprise applications can use
  • To provide an easy interface to access, analyze, and visualize data and gain valuable insights

Whether the data comes from SAP applications, Azure big data services, Azure analytics services, Azure Data Lake, or external data sources, Azure makes it easy to manage the data and enforce policy and governance while providing developers with a seamless interface. With easy extensibility of the data platform to bring together data from many sources, and IoT Hub to collect data from various IoT devices, Azure empowers customers to derive maximum value from their data.

We will go in depth on the integration of SAP data with Azure’s data services and on how customers can derive business value from these integrations.

Azure and SAP Integrations

SAP integrations are available for various Azure services:
1. Azure Data Factory and SAP
2. Power BI and SAP
3. Azure Analysis Services and SAP
4. Azure Data Catalog and SAP
5. Using iPaaS to integrate Microsoft Business Apps With SAP
6. Azure Active Directory and SAP

iPaaS: Bringing Microsoft Business Apps Workflows With SAP

Azure’s integration platform as a service (iPaaS) offers an easy way to integrate Microsoft business applications with SAP technologies and other third-party applications. Azure Logic Apps and other Azure services, such as Azure Functions and Azure Service Bus Messaging, work together to provide this integration layer. Azure Logic Apps helps implement and orchestrate visual workflows for business processes using 100+ connectors across different protocols, applications, and systems running on Azure and on-premises. Azure Functions can then be tapped to supply custom logic, either as functions or microservices, which is in turn leveraged by Azure Logic Apps. The Azure Service Bus messaging layer can then help decouple the various steps in the integration process. If necessary, the API Management service can be used to handle HTTP triggers.
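
The decoupling pattern described above can be sketched with the standard library: a Logic Apps-style workflow step publishes a message to a Service Bus-like queue, and an Azure Functions-style worker applies the custom logic. This is a conceptual analogue, not Azure SDK code:

```python
# Conceptual stdlib analogue of the pattern above: a workflow step hands a
# message to a Service Bus-like queue, and a Functions-like worker applies
# custom logic. All names here are illustrative, not Azure SDK calls.
import queue

service_bus = queue.Queue()                      # decouples producer and consumer

def logic_app_step(order_id):                    # workflow step: publish a message
    service_bus.put({"order_id": order_id, "status": "received"})

def function_custom_logic():                     # serverless-style custom logic
    msg = service_bus.get()                      # consume independently of the producer
    msg["status"] = "validated"
    return msg

logic_app_step(42)
print(function_custom_logic())   # {'order_id': 42, 'status': 'validated'}
```

The value of the queue in the middle is that producer and consumer never call each other directly, so each step of the integration can scale or fail independently.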

SAP Cloud Platform in the Garage Virtual Event | Internet of Things (IoT) in the Cloud

Azure Logic Apps also connects Office 365 and Dynamics 365 applications with other enterprise applications, SaaS applications, SAP technologies, and other third-party applications. This brings business data stored in Office 365 and Dynamics 365 together with other data sources, making it easy to drive critical business workflows.

Azure Logic Apps Connector to SAP

Azure Logic Apps is the cloud-based iPaaS offering that helps organizations connect disparate data sources, including SaaS and enterprise applications. By offering many out-of-the-box integrations, Logic Apps lets you seamlessly connect data, applications, and devices across the cloud and on-premises to develop complex business workflows.

Key Benefits

  • Simplify and implement complex, scalable integrations and workflows for enterprise applications on the cloud, on-premises and Office 365
  • Bring speed and scalability to enterprise integration; Logic Apps scale up and down based on demand
  • Easy user interface with designer
  • Powerful management tools to tame the complexity
  • Easy to automate EAI, B2B/EDI, and business processes

In this section, we briefly discuss the integrations that help Azure Logic Apps work with SAP technologies. The SAP ERP Central Component (ECC) connector allows Azure Logic Apps to connect to on-premises or cloud SAP resources from inside a logic app. The connector supports message and data integration to and from SAP NetWeaver-based systems through Intermediate Documents (IDocs), Business Application Programming Interfaces (BAPIs), or Remote Function Calls (RFCs).

The connector allows the following three operations:

  • Send to SAP: Send IDoc or call BAPI functions over tRFC in SAP systems.
  • Receive from SAP: Receive IDoc or BAPI function calls over tRFC from SAP systems.
  • Generate schemas: Generate schemas for the SAP artifacts for IDoc or BAPI or RFC.
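
To make the two message styles concrete, here is a minimal Python sketch of the payload shapes involved: an IDoc-like envelope and a BAPI call descriptor. The segment and parameter names are illustrative, not a real SAP schema, and the actual connector exchanges these over tRFC rather than as Python dicts:

```python
# Illustrative shapes of the two message styles the connector exchanges.
# Segment and parameter names are placeholders, not a real SAP schema.

def build_idoc(doc_type: str, segments: list[dict]) -> dict:
    """Minimal IDoc-like envelope: a control record plus data segments."""
    return {"control": {"IDOCTYP": doc_type, "DIRECT": "outbound"},
            "data": segments}

def build_bapi_call(name: str, **params) -> dict:
    """Minimal BAPI call descriptor: function name plus import parameters."""
    return {"function": name, "import_parameters": params}

idoc = build_idoc("ORDERS05", [{"segment": "E1EDK01", "CURRENCY": "EUR"}])
bapi = build_bapi_call("BAPI_SALESORDER_GETLIST", CUSTOMER_NUMBER="0000001000")
print(idoc["control"]["IDOCTYP"], bapi["function"])
```

"Generate schemas" in the list above corresponds to producing the formal XML schema for structures like these, so a logic app can validate and map the messages it sends and receives.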

For more information, you can use these resources:

  • Azure Logic Apps documentation
  • Connectors for Azure Logic Apps
  • Connect to SAP systems from Azure Logic Apps

SAP Cloud Platform Training | SAP HCP Training | SAP SCP Training

Azure is the right cloud platform to meet the demands of such data-driven enterprises. With the wide variety of services available in the Azure portfolio, organizations can easily bring together all their data sources, including data residing in systems such as SAP; analyze the data; build models; deliver them for consumption by developers through an API; and visualize the results using powerful, easy-to-use tools.

Introducing SAP HANA Spatial Services Application (2019 Edition)

Whether it is data commercialization using a data-science-as-a-service platform or building a powerful predictive or decision analytics platform, Azure’s diverse set of data and integration services, such as Azure Data Catalog, Azure Data Factory, Azure Logic Apps, Azure Analysis Services, and Power BI, with their tight integration with SAP technologies, are well positioned to help enterprises in their data-driven journey. Finally, Azure Active Directory provides the enterprise-grade identity management and security that is critical for protecting valuable organizational data. This whitepaper has given you an overview of how these technologies can be used to gain useful insights or build intelligent applications in your journey to get the most value from your SAP data on Azure.

More Information: