• IBM Consulting

    DBA Consulting can help you with IBM BI and Web-related work. IBM Linux is also part of our portfolio.

  • Oracle Consulting

    For Oracle-related consulting, database work, support, and migration, call DBA Consulting.

  • Novell/RedHat Consulting

    For all Novell SUSE Linux and SAP on SUSE Linux questions related to OS and BI solutions. And of course also for the great Red Hat products such as Red Hat Enterprise Linux, JBoss middleware, and BI on Red Hat.

  • Microsoft Consulting

    For consulting services related to Microsoft Server 2012 onwards, Microsoft Windows clients (Windows 7 and higher), and Microsoft cloud services (Azure, Office 365, etc.).

  • Citrix Consulting

    Citrix VDI-in-a-Box, desktop virtualization, and Citrix NetScaler security.

  • Web Development

    Web development: static websites, CMS websites (Drupal 7/8, WordPress, Joomla), and responsive and adaptive websites.

20 February 2019

Quantum Supremacy Is Coming, and It's Here to Stay!

Quantum Information and Computation for Dummies

Quantum computers are devices capable of performing computations using quantum bits, or qubits.

The first thing we need to understand is what a qubit actually is. A “classical computer,” like the one you’re reading this on (either desktop, laptop, tablet, or phone), is also referred to as a “binary computer” because all of the functions it performs are based on either ones or zeros.

D-Wave seminar at Nagoya University: "An Introduction to Quantum Computing"

On a binary computer the processor uses transistors to perform calculations. Each transistor can be on or off, which indicates the one or zero used to compute the next step in a program or algorithm.

There’s more to it than that, but the important thing to know about binary computers is the ones and zeros they use as the basis for computations are called “bits.”

Quantum computers don’t use bits; they use qubits. Qubits, aside from sounding way cooler, have extra functions that bits don’t. Instead of only being represented as a one or zero, qubits can actually be both at the same time. Often qubits, when unobserved, are considered to be “spinning.” Instead of referring to these types of “spin qubits” using ones or zeros, they’re measured in states of “up,” “down,” and “both.”


Qubits can be more than one thing at a time because of a strange phenomenon called superposition. Quantum superposition in qubits can be explained by flipping a coin. We know that the coin will land in one of two states: heads or tails. This is how binary computers think. While the coin is still spinning in the air, assuming your eye isn’t quick enough to ‘observe’ the actual state it’s in, the coin is actually in both states at the same time. Essentially until the coin lands it has to be considered both heads and tails simultaneously.
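The coin analogy maps neatly onto a few lines of code. Below is a minimal illustrative sketch in plain Python (not tied to any quantum library): a qubit's state is just a pair of amplitudes, and the squared magnitudes give the odds of seeing "heads" (0) or "tails" (1) when you finally look.

```python
import math

# A qubit in an equal superposition, like the spinning coin:
# state = alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
alpha = 1 / math.sqrt(2)   # amplitude for "heads" (|0>)
beta = 1 / math.sqrt(2)    # amplitude for "tails" (|1>)

p_heads = abs(alpha) ** 2  # probability of measuring 0
p_tails = abs(beta) ** 2   # probability of measuring 1

print(p_heads, p_tails)    # both outcomes equally likely
```

Until the "coin lands" (the qubit is measured), both amplitudes are live at once; measurement picks one outcome with these probabilities.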

A clever scientist by the name of Schrödinger explained this phenomenon with a thought experiment involving a cat that, he argued, had to be considered both alive and dead at the same time.

Quantum Computing: Untangling the Hype

Observation theory

Qubits work on the same principle. An area of related study called “observation theory” dictates that a quantum particle behaves like a wave when it isn’t being watched, but snaps to a definite state when measured. Basically, the universe acts one way when we’re looking and another way when we aren’t. This means quantum computers, using their qubits, can simulate the subatomic particles of the universe in a way that’s actually natural: they speak the same language as an electron or proton, basically.

Different companies are approaching qubits in different ways because, as of right now, working with them is incredibly difficult. Since observing them changes their state, and using them creates noise – the more qubits you have the more errors you get – measuring them is challenging to say the least.

This challenge is exacerbated by the fact that most quantum processors have to be kept at temperatures near absolute zero (colder than space) and require an amount of power that is unsustainably high for the quality of computations. Right now, quantum computers aren’t worth the trouble and money they take to build and operate.

In the future, however, they’ll change our entire understanding of biology, chemistry, and physics. Simulations at the molecular level could be conducted that actually imitate physical concepts in the universe we’ve never been able to reproduce or study.

Automatski - RSA-2048 Cryptography Cracked using Shor's Algorithm on a Quantum Computer

Quantum Supremacy

For quantum computers to become useful to society we’ll have to achieve certain milestones first. The point at which a quantum computer can process information and perform calculations that a binary computer can’t is called quantum supremacy.

Quantum supremacy isn’t all fun and games, though; it presents another set of problems. When quantum computers are fully functional, even modest systems in the 100-qubit range may be able to cut through binary security like a hot knife through butter.

This is because those qubits, which can be two things at once, figure out multiple solutions to a problem at once. They don’t have to follow binary logic like “if one thing happens do this but if another thing happens do something else.” Individual qubits can do both at the same time while spinning, for example, and then produce the optimum result when properly observed.

Currently there’s a lot of buzz about quantum computers, and rightfully so. Google is pretty sure its new Bristlecone processor will achieve quantum supremacy this year. And it’s hard to bet against Google or one of the other big tech companies. Especially when Intel has already put a quantum processor on a silicon chip and you can access IBM’s in the cloud right now.

No matter your feelings on quantum computers, qubits, or half-dead/half-alive cats, the odds are pretty good that quantum computers will follow the same path that IBM’s mainframes did. They’ll get smaller, faster, more powerful, and eventually we’ll all be using them, even if we don’t understand the science behind them.


Quantum algorithms are usually described, in the commonly used circuit model of quantum computation, by a quantum circuit which acts on some input qubits and terminates with a measurement. A quantum circuit consists of simple quantum gates, each of which acts on at most a fixed number of qubits. Quantum algorithms may also be stated in other models of quantum computation, such as the Hamiltonian oracle model.[5]

Quantum algorithms can be categorized by the main techniques used by the algorithm. Some commonly used techniques/ideas in quantum algorithms include phase kick-back, phase estimation, the quantum Fourier transform, quantum walks, amplitude amplification and topological quantum field theory. Quantum algorithms may also be grouped by the type of problem solved, for instance see the survey on quantum algorithms for algebraic problems.[6]

Precision atom qubits achieve major quantum computing milestone

Algorithms based on the quantum Fourier transform
The quantum Fourier transform is the quantum analogue of the discrete Fourier transform, and is used in several quantum algorithms. The Hadamard transform is also an example of a quantum Fourier transform over an n-dimensional vector space over the field F2. The quantum Fourier transform can be efficiently implemented on a quantum computer using only a polynomial number of quantum gates.
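To make the Hadamard case concrete, here is a small illustrative sketch in plain Python (a brute-force classical simulation, exponential in the number of qubits): the n-qubit Hadamard transform has matrix entries (-1)^(x·y)/√(2^n), where x·y is the bitwise dot product of the basis-state labels mod 2.

```python
import math

def hadamard_transform(amps):
    """Apply the n-qubit Hadamard transform (the QFT over F_2^n)
    to a state vector of length 2**n, given as a list of amplitudes."""
    dim = len(amps)
    norm = 1 / math.sqrt(dim)
    out = [0.0] * dim
    for y in range(dim):
        for x in range(dim):
            # (-1)^(x . y): parity of the bitwise AND of the labels
            sign = -1 if bin(x & y).count("1") % 2 else 1
            out[y] += sign * amps[x] * norm
    return out

# |00> maps to the uniform superposition over all four basis states.
state = hadamard_transform([1, 0, 0, 0])
print(state)  # [0.5, 0.5, 0.5, 0.5]
```

Applying the transform twice returns the original state, since the Hadamard transform is its own inverse.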

Deutsch–Jozsa algorithm
Main article: Deutsch–Jozsa algorithm
The Deutsch–Jozsa algorithm solves a black-box problem which provably requires exponentially many queries to the black box for any deterministic classical computer, but can be done with exactly one query by a quantum computer. If we allow both bounded-error quantum and classical algorithms, there is no speedup, since a classical probabilistic algorithm can solve the problem with a constant number of queries and a small probability of error. The algorithm determines whether a function f is constant (0 on all inputs or 1 on all inputs) or balanced (returns 1 for half of the input domain and 0 for the other half).
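The one-query trick can be checked with a classical simulation of the circuit's measurement statistics. In the sketch below (plain Python, exponential-time by construction since it sums over all inputs), the amplitude of the all-zeros outcome after the Hadamard–oracle–Hadamard circuit is (1/2^n)·Σ_x (-1)^f(x): magnitude 1 when f is constant, exactly 0 when f is balanced.

```python
def deutsch_jozsa_is_constant(f, n):
    """Classically compute the all-zeros amplitude after the H-oracle-H
    circuit; |amplitude| is 1 iff f is constant, 0 iff f is balanced."""
    amp = sum((-1) ** f(x) for x in range(2 ** n)) / 2 ** n
    return abs(amp) == 1

constant_f = lambda x: 1      # always returns 1
balanced_f = lambda x: x & 1  # 1 on odd inputs, 0 on even inputs

print(deutsch_jozsa_is_constant(constant_f, 3))  # True
print(deutsch_jozsa_is_constant(balanced_f, 3))  # False
```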

Simon's algorithm
Main article: Simon's algorithm
Simon's algorithm solves a black-box problem exponentially faster than any classical algorithm, including bounded-error probabilistic algorithms. This algorithm, which achieves an exponential speedup over all classical algorithms that we consider efficient, was the motivation for Shor's factoring algorithm.

Quantum phase estimation algorithm
Main article: Quantum phase estimation algorithm
The quantum phase estimation algorithm is used to determine the eigenphase of an eigenvector of a unitary gate given a quantum state proportional to the eigenvector and access to the gate. The algorithm is frequently used as a subroutine in other algorithms.

Shor's algorithm
Main article: Shor's algorithm
Shor's algorithm solves the discrete logarithm problem and the integer factorization problem in polynomial time,[7] whereas the best known classical algorithms take super-polynomial time. These problems are not known to be in P or NP-complete. It is also one of the few quantum algorithms that solves a non–black-box problem in polynomial time where the best known classical algorithms run in super-polynomial time.
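The quantum part of Shor's algorithm finds the period r of a^x mod N; everything after that is classical number theory. The toy sketch below (plain Python) finds the period by brute force, which only works for tiny N, and then runs the same classical post-processing step: when r is even, gcd(a^(r/2) − 1, N) often yields a nontrivial factor.

```python
from math import gcd

def find_order(a, N):
    """Smallest r > 0 with a**r % N == 1 (the quantum subroutine's job,
    done here by brute force for a toy-sized N)."""
    r, val = 1, a % N
    while val != 1:
        val = (val * a) % N
        r += 1
    return r

def shor_classical_step(a, N):
    """The classical post-processing of Shor's algorithm."""
    r = find_order(a, N)
    if r % 2 == 1:
        return None  # odd period: try a different a
    candidate = gcd(pow(a, r // 2) - 1, N)
    return candidate if 1 < candidate < N else None

print(shor_classical_step(2, 15))  # order of 2 mod 15 is 4 -> factor 3
```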

Hidden subgroup problem
The abelian hidden subgroup problem is a generalization of many problems that can be solved by a quantum computer, such as Simon's problem, solving Pell's equation, testing the principal ideal of a ring R, and factoring. Efficient quantum algorithms are known for the abelian hidden subgroup problem.[8] The more general hidden subgroup problem, where the group isn't necessarily abelian, is a generalization of the previously mentioned problems, as well as graph isomorphism and certain lattice problems. Efficient quantum algorithms are known for certain non-abelian groups. However, no efficient algorithms are known for the symmetric group, which would give an efficient algorithm for graph isomorphism,[9] or for the dihedral group, which would solve certain lattice problems.[10]

Boson sampling problem
Main article: Boson sampling
The boson sampling problem in an experimental configuration assumes[11] an input of a moderate number of bosons (e.g., photons) getting randomly scattered into a large number of output modes constrained by a defined unitarity. The problem is then to produce a fair sample of the probability distribution of the output, which depends on the input arrangement of bosons and the unitarity.[12] Solving this problem with a classical computer algorithm requires computing the permanent of the unitary transform matrix, which may be either impossible or take a prohibitively long time. In 2014, it was proposed[13] that existing technology and standard probabilistic methods of generating single-photon states could be used as input into a suitable quantum-computable linear optical network, and that sampling of the output probability distribution would be demonstrably superior using quantum algorithms. In 2015, an investigation predicted[14] that the sampling problem has similar complexity for inputs other than Fock-state photons, and identified a transition in computational complexity from classically simulatable to just as hard as the boson sampling problem, dependent on the size of coherent-amplitude inputs.

Estimating Gauss sums
A Gauss sum is a type of exponential sum. The best known classical algorithm for estimating these sums takes exponential time. Since the discrete logarithm problem reduces to Gauss sum estimation, an efficient classical algorithm for estimating Gauss sums would imply an efficient classical algorithm for computing discrete logarithms, which is considered unlikely. However, quantum computers can estimate Gauss sums to polynomial precision in polynomial time.[15]

Fourier fishing and Fourier checking
We have an oracle consisting of n random Boolean functions mapping n-bit strings to a Boolean value. We are required to find n n-bit strings z_1, ..., z_n such that for the Hadamard–Fourier transform, at least 3/4 of the strings satisfy |f̃(z_i)| ≥ 1 and at least 1/4 satisfy |f̃(z_i)| ≥ 2. This can be done in bounded-error quantum polynomial time (BQP).[16]

Bob Sutor demonstrates the IBM Q quantum computer

Sounds of a Quantum Computer

Algorithms based on amplitude amplification
Amplitude amplification is a technique that allows the amplification of a chosen subspace of a quantum state. Applications of amplitude amplification usually lead to quadratic speedups over the corresponding classical algorithms. It can be considered to be a generalization of Grover's algorithm.

Grover's algorithm
Main article: Grover's algorithm
Grover's algorithm searches an unstructured database (or an unordered list) with N entries for a marked entry, using only O(√N) queries instead of the O(N) queries required classically.[17] Classically, O(N) queries are required even if we allow bounded-error probabilistic algorithms.
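Grover's iteration is simple enough to simulate with a state vector. The sketch below (plain Python, illustrative only) starts in the uniform superposition and repeats the oracle phase flip plus "inversion about the mean" roughly (π/4)·√N times, after which nearly all the probability sits on the marked entry.

```python
import math

def grover_search(n_items, marked):
    """Simulate Grover's algorithm on a list of n_items entries and
    return the final measurement probabilities."""
    amps = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(n_items))
    for _ in range(iterations):
        amps[marked] = -amps[marked]            # oracle: flip marked phase
        mean = sum(amps) / n_items              # diffusion: invert about mean
        amps = [2 * mean - a for a in amps]
    return [a * a for a in amps]                # measurement probabilities

probs = grover_search(64, marked=42)
print(probs[42])  # close to 1 after about (pi/4) * 8 = 6 iterations
```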

Bohmian mechanics is a non-local hidden-variable interpretation of quantum mechanics. It has been shown that a non-local hidden-variable quantum computer could implement a search of an N-item database in at most O(∛N) steps. This is slightly faster than the O(√N) steps taken by Grover's algorithm. Neither search method would allow quantum computers to solve NP-complete problems in polynomial time.[18]

Quantum counting
Quantum counting solves a generalization of the search problem: instead of just detecting whether a marked entry exists, it counts the number of marked entries in an unordered list. Specifically, it counts the number of marked entries in an N-element list, with error ε, making only Θ((1/ε)·√(N/k)) queries, where k is the number of marked elements in the list.[19][20] More precisely, the algorithm outputs an estimate k′ for k, the number of marked entries, with accuracy |k − k′| ≤ εk.

Solving linear systems of equations
Main article: Quantum algorithm for linear systems of equations
In 2009 Aram Harrow, Avinatan Hassidim, and Seth Lloyd formulated a quantum algorithm for solving linear systems. The algorithm estimates the result of a scalar measurement on the solution vector of a given linear system of equations.[21]

Provided the linear system is sparse and has a low condition number κ, and the user is interested in the result of a scalar measurement on the solution vector rather than the values of the solution vector itself, the algorithm has a runtime of O(log(N)·κ²), where N is the number of variables in the linear system. This offers an exponential speedup over the fastest classical algorithm, which runs in O(Nκ) (or O(N√κ) for positive semidefinite matrices).

Algorithms based on quantum walks
Main article: Quantum walk
A quantum walk is the quantum analogue of a classical random walk, which can be described by a probability distribution over some states. A quantum walk can be described by a quantum superposition over states. Quantum walks are known to give exponential speedups for some black-box problems.[22][23] They also provide polynomial speedups for many problems. A framework for the creation of quantum walk algorithms exists and is quite a versatile tool.[24]

Element distinctness problem
Main article: Element distinctness problem
The element distinctness problem is the problem of determining whether all the elements of a list are distinct. Classically, Ω(N) queries are required for a list of size N, since this problem is harder than the search problem, which requires Ω(N) queries. However, it can be solved in Θ(N^(2/3)) queries on a quantum computer. The optimal algorithm is by Andris Ambainis.[25] Yaoyun Shi first proved a tight lower bound when the size of the range is sufficiently large.[26] Ambainis[27] and Kutin[28] independently (and via different proofs) extended his work to obtain the lower bound for all functions.
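For contrast, the classical linear scan is a short routine in which each membership test plays the role of one query; it is this Ω(N) baseline that the Θ(N^(2/3))-query quantum walk algorithm improves on. A plain-Python sketch:

```python
def all_distinct(items):
    """Classical distinctness check: one pass over the list, where each
    membership test counts as one "query" in the black-box model."""
    seen = set()
    for x in items:
        if x in seen:
            return False
        seen.add(x)
    return True

print(all_distinct([3, 1, 4, 1, 5]))  # False: 1 appears twice
print(all_distinct([3, 1, 4, 5, 9]))  # True
```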

Triangle-finding problem
Main article: Triangle finding problem
The triangle-finding problem is the problem of determining whether a given graph contains a triangle (a clique of size 3). The best-known lower bound for quantum algorithms is Ω(N), but the best algorithm known requires O(N^1.297) queries,[29] an improvement over the previous best O(N^1.3) queries.[24][30]

Quantum Algorithms for Evaluating MIN-MAX Trees

Formula evaluation
A formula is a tree with a gate at each internal node and an input bit at each leaf node. The problem is to evaluate the formula, which is the output of the root node, given oracle access to the input.

A well-studied formula is the balanced binary tree with only NAND gates.[31] This type of formula requires Θ(N^c) queries using randomness,[32] where c = log₂(1 + √33)/4 ≈ 0.754. With a quantum algorithm, however, it can be solved in Θ(N^0.5) queries. No better quantum algorithm for this case was known until one was found for the unconventional Hamiltonian oracle model.[5] The same result for the standard setting soon followed.[33]
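The classical side of this bound comes from randomized short-circuit evaluation: evaluate the two children of a NAND gate in random order, and if the first returns 0 you can skip the second, since NAND(0, x) = 1. A minimal recursive sketch in plain Python (leaf count assumed to be a power of two):

```python
import random

def eval_nand_tree(leaves):
    """Evaluate a balanced NAND tree over a list of 0/1 leaves,
    visiting children in random order and short-circuiting on 0."""
    if len(leaves) == 1:
        return leaves[0]
    half = len(leaves) // 2
    children = [leaves[:half], leaves[half:]]
    random.shuffle(children)                 # randomized evaluation order
    if eval_nand_tree(children[0]) == 0:
        return 1                             # short-circuit: NAND(0, x) = 1
    return 1 - eval_nand_tree(children[1])   # NAND(1, x) = NOT x

print(eval_nand_tree([1, 1, 1, 1]))  # NAND(NAND(1,1), NAND(1,1)) = 1
```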

Quantum Computing deep dive

Fast quantum algorithms for more complicated formulas are also known.[34]

Group commutativity
The problem is to determine if a black-box group, given by k generators, is commutative. A black-box group is a group with an oracle function, which must be used to perform the group operations (multiplication, inversion, and comparison with identity). We are interested in the query complexity, which is the number of oracle calls needed to solve the problem. The deterministic and randomized query complexities are Θ(k²) and Θ(k) respectively.[35] A quantum algorithm requires Ω(k^(2/3)) queries, but the best known algorithm uses O(k^(2/3) log k) queries.[36]

BQP-complete problems
Computing knot invariants
Witten showed that the Chern–Simons topological quantum field theory (TQFT) can be solved in terms of Jones polynomials. A quantum computer can simulate a TQFT, and thereby approximate the Jones polynomial,[37] which as far as we know is hard to compute classically in the worst case.

Quantum simulation
The idea that quantum computers might be more powerful than classical computers originated in Richard Feynman's observation that classical computers seem to require exponential time to simulate many-particle quantum systems.[38] Since then, the idea that quantum computers can simulate quantum physical processes exponentially faster than classical computers has been greatly fleshed out and elaborated. Efficient (that is, polynomial-time) quantum algorithms have been developed for simulating both Bosonic and Fermionic systems[39] and in particular, the simulation of chemical reactions beyond the capabilities of current classical supercomputers requires only a few hundred qubits.[40] Quantum computers can also efficiently simulate topological quantum field theories.[41] In addition to its intrinsic interest, this result has led to efficient quantum algorithms for estimating quantum topological invariants such as Jones[42] and HOMFLY polynomials,[43] and the Turaev-Viro invariant of three-dimensional manifolds.[44]

Hybrid quantum/classical algorithms
Hybrid quantum/classical algorithms combine quantum state preparation and measurement with classical optimization.[45] These algorithms generally aim to determine the ground-state eigenvector and eigenvalue of a Hermitian operator.

The quantum approximate optimization algorithm is a toy model of quantum annealing which can be used to solve problems in graph theory.[46] The algorithm makes use of classical optimization of quantum operations to maximize an objective function.

Variational Quantum Eigensolver
The VQE algorithm applies classical optimization to minimize the energy expectation of an ansatz state to find the ground state energy of a molecule.[47] This can also be extended to find excited energies of molecules.[48]
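A toy version of the VQE loop fits in a few lines. The sketch below (plain Python, with illustrative names of my own choosing) uses a one-parameter single-qubit ansatz |ψ(θ)⟩ = cos(θ/2)|0⟩ + sin(θ/2)|1⟩ and the Hamiltonian H = Z, whose energy expectation is cos(θ); a crude grid search stands in for the classical optimizer and should land near θ = π with energy −1.

```python
import math

def energy(theta):
    """Energy expectation <psi(theta)| Z |psi(theta)> = cos(theta):
    Z gives +1 to |0> and -1 to |1>."""
    a0 = math.cos(theta / 2)  # amplitude of |0>
    a1 = math.sin(theta / 2)  # amplitude of |1>
    return (+1) * a0 ** 2 + (-1) * a1 ** 2

# Crude classical "optimizer": grid search over the ansatz parameter.
grid = (t / 100 * 2 * math.pi for t in range(101))
best_theta = min(grid, key=energy)

print(round(energy(best_theta), 6))  # -1.0, the ground state energy of Z
```

A real VQE would measure `energy` on quantum hardware and feed it to a gradient-free optimizer, but the feedback loop has the same shape.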

BIG THINGS HAPPEN when computers get smaller. Or faster. And quantum computing is about chasing perhaps the biggest performance boost in the history of technology. The basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales.

Quantum Computing for Computer Scientists

If the tech industry pulls off that, ahem, quantum leap, you won’t be getting a quantum computer for your pocket. Don’t start saving for an iPhone Q. We could, however, see significant improvements in many areas of science and technology, such as longer-lasting batteries for electric cars or advances in chemistry that reshape industries or enable new medical treatments. Quantum computers won’t be able to do everything faster than conventional computers, but on some tricky problems they have advantages that would enable astounding progress.

It’s not productive (or polite) to ask people working on quantum computing when exactly those dreamy applications will become real. The only thing for sure is that they are still many years away. Prototype quantum computing hardware is still embryonic. But powerful—and, for tech companies, profit-increasing—computers powered by quantum physics have recently started to feel less hypothetical.

The Mathematics of Quantum Computers | Infinite Series

That’s because Google, IBM, and others have decided it’s time to invest heavily in the technology, which, in turn, has helped quantum computing earn a bullet point on the corporate strategy PowerPoint slides of big companies in areas such as finance, like JPMorgan, and aerospace, like Airbus. In 2017, venture investors plowed $241 million into startups working on quantum computing hardware or software worldwide, according to CB Insights. That’s triple the amount in the previous year.

Like the befuddling math underpinning quantum computing, some of the expectations building around this still-impractical technology can make you lightheaded. If you squint out the window of a flight into SFO right now, you can see a haze of quantum hype drifting over Silicon Valley. But the enormous potential of quantum computing is undeniable, and the hardware needed to harness it is advancing fast. If there were ever a perfect time to bend your brain around quantum computing, it’s now. Say “Schrödinger’s superposition” three times fast, and we can dive in.

The History of Quantum Computing Explained
The prehistory of quantum computing begins early in the 20th century, when physicists began to sense they had lost their grip on reality.

First, accepted explanations of the subatomic world turned out to be incomplete. Electrons and other particles didn’t just neatly carom around like Newtonian billiard balls, for example. Sometimes they acted like waves instead. Quantum mechanics emerged to explain such quirks, but introduced troubling questions of its own. To take just one brow-wrinkling example, this new math implied that physical properties of the subatomic world, like the position of an electron, didn’t really exist until they were observed.

If you find that baffling, you’re in good company. A year before winning a Nobel for his contributions to quantum theory, Caltech’s Richard Feynman remarked that “nobody understands quantum mechanics.” The way we experience the world just isn’t compatible. But some people grasped it well enough to redefine our understanding of the universe. And in the 1980s a few of them—including Feynman—began to wonder if quantum phenomena like subatomic particles' “don’t look and I don’t exist” trick could be used to process information. The basic theory or blueprint for quantum computers that took shape in the 80s and 90s still guides Google and others working on the technology.

Mathematics for Machine Learning full Course || Linear Algebra || Part-1

Before we belly flop into the murky shallows of quantum computing 101, we should refresh our understanding of regular old computers. As you know, smartwatches, iPhones, and the world’s fastest supercomputer all basically do the same thing: they perform calculations by encoding information as digital bits, aka 0s and 1s. A computer might flip the voltage in a circuit on and off to represent 1s and 0s, for example.

Quantum computers do calculations using bits, too. After all, we want them to plug into our existing data and computers. But quantum bits, or qubits, have unique and powerful properties that allow a group of them to do much more than an equivalent number of conventional bits.

Qubits can be built in various ways, but they all represent digital 0s and 1s using the quantum properties of something that can be controlled electronically. Popular examples—at least among a very select slice of humanity—include superconducting circuits, or individual atoms levitated inside electromagnetic fields. The magic power of quantum computing is that this arrangement lets qubits do more than just flip between 0 and 1. Treat them right and they can flip into a mysterious extra mode called a superposition.


Physicist Paul Benioff suggests quantum mechanics could be used for computation.

Nobel-winning physicist Richard Feynman, at Caltech, coins the term quantum computer.

Physicist David Deutsch, at Oxford, maps out how a quantum computer would operate, a blueprint that underpins the nascent industry of today.

Mathematician Peter Shor, at Bell Labs, writes an algorithm that could tap a quantum computer’s power to break widely used forms of encryption.

D-Wave, a Canadian startup, announces a quantum computing chip it says can solve Sudoku puzzles, triggering years of debate over whether the company’s technology really works.

Google teams up with NASA to fund a lab to try out D-Wave’s hardware.

Google hires the professor behind some of the best quantum computer hardware yet to lead its new quantum hardware lab.

IBM puts some of its prototype quantum processors on the internet for anyone to experiment with, saying programmers need to get ready to write quantum code.

Startup Rigetti opens its own quantum computer fabrication facility to build prototype hardware and compete with Google and IBM.

You may have heard that a qubit in superposition is both 0 and 1 at the same time. That’s not quite true and also not quite false—there’s just no equivalent in Homo sapiens’ humdrum classical reality. If you have a yearning to truly grok it, you must make a mathematical odyssey WIRED cannot equip you for. But in the simplified and dare we say perfect world of this explainer, the important thing to know is that the math of a superposition describes the probability of discovering either a 0 or 1 when a qubit is read out—an operation that crashes it out of a quantum superposition into classical reality. A quantum computer can use a collection of qubits in superpositions to play with different possible paths through a calculation. If done correctly, the pointers to incorrect paths cancel out, leaving the correct answer when the qubits are read out as 0s and 1s.
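The cancellation of "incorrect paths" is just interference of amplitudes, and the smallest demonstration is applying a Hadamard gate twice. An illustrative plain-Python sketch: the first application creates the superposition; in the second, the two paths leading to 1 cancel while the paths to 0 reinforce, returning the qubit to a definite 0.

```python
import math

def hadamard(amps):
    """Apply a single-qubit Hadamard gate to a state [amp0, amp1]."""
    a0, a1 = amps
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

once = hadamard([1.0, 0.0])   # equal superposition of 0 and 1
twice = hadamard(once)        # paths to |1> cancel: definitely 0 again
print(once, twice)
```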

For some problems that are very time consuming for conventional computers, this allows a quantum computer to find a solution in far fewer steps than a conventional computer would need. Grover’s algorithm, a famous quantum search algorithm, could find you in a phone book with 100 million names with just 10,000 operations. If a classical search algorithm just spooled through all the listings to find you, it would require 50 million operations, on average. For Grover’s and some other quantum algorithms, the bigger the initial problem—or phonebook—the further behind a conventional computer is left in the digital dust.
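The phone-book numbers above check out with straightforward arithmetic (plain Python):

```python
import math

n_names = 100_000_000

grover_lookups = round(math.sqrt(n_names))  # on the order of sqrt(N)
classical_avg = n_names // 2                # linear scan, average case

print(grover_lookups, classical_avg)        # 10000 vs 50000000
print(classical_avg // grover_lookups)      # classical needs 5000x more
```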

Leading the Evolution of Compute: Quantum Computing

The reason we don’t have useful quantum computers today is that qubits are extremely finicky. The quantum effects they must control are very delicate, and stray heat or noise can flip 0s and 1s, or wipe out a crucial superposition. Qubits have to be carefully shielded, and operated at very cold temperatures, sometimes only fractions of a degree above absolute zero. Most plans for quantum computing depend on using a sizable chunk of a quantum processor’s power to correct its own errors, caused by misfiring qubits.

Recent excitement about quantum computing stems from progress in making qubits less flaky. That’s giving researchers the confidence to start bundling the devices into larger groups. Startup Rigetti Computing recently announced it has built a processor with 128 qubits made with aluminum circuits that are super-cooled to make them superconducting. Google and IBM have announced their own chips with 72 and 50 qubits, respectively. That’s still far fewer than would be needed to do useful work with a quantum computer—it would probably require at least thousands—but as recently as 2016 those companies’ best chips had qubits only in the single digits. After tantalizing computer scientists for 30 years, practical quantum computing may not exactly be close, but it has begun to feel a lot closer.

What the Future Holds for Quantum Computing
Some large companies and governments have started treating quantum computing research like a race—perhaps fittingly it’s one where both the distance to the finish line and the prize for getting there are unknown.

Google, IBM, Intel, and Microsoft have all expanded their teams working on the technology, with a growing swarm of startups such as Rigetti in hot pursuit. China and the European Union have each launched new programs measured in the billions of dollars to stimulate quantum R&D. And in the US, the Trump White House has created a new committee to coordinate government work on quantum information science. Several bills were introduced to Congress in 2018 proposing new funding for quantum research, totaling upwards of $1.3 billion. It’s not quite clear what the first killer apps of quantum computing will be, or when they will appear. But there’s a sense that whoever is first to make these machines useful will gain big economic and national security advantages.

Empowering the quantum revolution with Microsoft Q# Language


What's a qubit?
A device that uses quantum mechanical effects to represent 0s and 1s of digital data, similar to the bits in a conventional computer.

What's a superposition?
It's the trick that makes quantum computers tick, and makes qubits more powerful than ordinary bits. A superposition is an intuition-defying mathematical combination of both 0 and 1. Quantum algorithms can use a group of qubits in a superposition to shortcut through calculations.

What's quantum entanglement?
A quantum effect so unintuitive that Einstein dubbed it “spooky action at a distance.” When two qubits in a superposition are entangled, certain operations on one have instant effects on the other, a process that helps quantum algorithms be more powerful than conventional ones.

What's quantum speedup?
The holy grail of quantum computing—a measure of how much faster a quantum computer could crack a problem than a conventional computer could. Quantum computers aren’t well-suited to all kinds of problems, but for some they offer an exponential speedup, meaning their advantage over a conventional computer grows explosively with the size of the input problem.
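The scale of a quantum speedup is easiest to see with numbers. As one well-known illustration (Grover's square-root speedup for unstructured search, not a claim about any particular machine), the query counts diverge dramatically as the input grows:

```python
import math

# Unstructured search over N items: ~N queries classically,
# ~sqrt(N) queries with Grover's algorithm.
for n in (10**6, 10**12):
    print(f"N={n:>15,}  classical~{n:>15,}  quantum~{math.isqrt(n):>9,}")
```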

Back in the world of right now, though, quantum processors are too simple to do practical work. Google is working to stage a demonstration known as quantum supremacy, in which a quantum processor would solve a carefully designed math problem beyond the reach of existing supercomputers. But that would be a historic scientific milestone, not proof that quantum computing is ready to do real work.

As quantum computer prototypes get larger, the first practical use for them will probably be for chemistry simulations. Computer models of molecules and atoms are vital to the hunt for new drugs or materials. Yet conventional computers can’t accurately simulate the behavior of atoms and electrons during chemical reactions. Why? Because that behavior is driven by quantum mechanics, the full complexity of which is too great for conventional machines. Daimler and Volkswagen have both started investigating quantum computing as a way to improve battery chemistry for electric vehicles. Microsoft says other uses could include designing new catalysts to make industrial processes less energy intensive, or even to pull carbon dioxide out of the atmosphere to mitigate climate change.

Quantum computers would also be a natural fit for code-breaking. We’ve known since the 90s that they could zip through the math underpinning the encryption that secures online banking, flirting, and shopping. Quantum processors would need to be much more advanced to do this, but governments and companies are taking the threat seriously. The National Institute of Standards and Technology is in the process of evaluating new encryption systems that could be rolled out to quantum-proof the internet.

Special Purpose Quantum Annealing Quantum Computer v1.0

Tech companies such as Google are also betting that quantum computers can make artificial intelligence more powerful. That’s further in the future and less well mapped out than chemistry or code-breaking applications, but researchers argue they can figure out the details down the line as they play around with larger and larger quantum processors. One hope is that quantum computers could help machine-learning algorithms pick up complex tasks using many fewer than the millions of examples typically used to train AI systems today.

Despite all the superposition-like uncertainty about when the quantum computing era will really begin, big tech companies argue that programmers need to get ready now. Google, IBM, and Microsoft have all released open source tools to help coders familiarize themselves with writing programs for quantum hardware. IBM has even begun to offer online access to some of its quantum processors, so anyone can experiment with them. Long term, the big computing companies see themselves making money by charging corporations to access data centers packed with supercooled quantum processors.

What’s in it for the rest of us? Despite some definite drawbacks, the age of conventional computers has helped make life safer, richer, and more convenient—many of us are never more than five seconds away from a kitten video. The era of quantum computers should have similarly broad-reaching, beneficial, and impossible-to-predict consequences. Bring on the qubits.

28 January 2019

Oracle Autonomous Database - How Oracle 12c Became Oracle 18c/19c

Oracle Autonomous Database

Oracle Autonomous Database combines the flexibility of cloud with the
power of machine learning to deliver data management as a service. It
enables businesses to:

• Safely run mission-critical workloads using the most secure, available, performant, and proven platform - Oracle Database on Exadata

• Migrate both new and existing OLTP or Analytics applications

• Deploy in both the Oracle Public Cloud and on Cloud at Customer in their own data centers, providing the easiest and safest cloud migration and hybrid cloud enablement

• Cut administration costs up to 80% with full automation of operations

• Cut runtime costs up to 90% by billing only for the resources needed at any given time

• Protect themselves from cyber-attacks and rogue employees by automatically encrypting all data and automatically applying any needed security updates online

• Guarantee 99.995% uptime to ensure mission-critical applications are always available. Downtime is limited to under 2.5 minutes per month, including maintenance
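The 99.995% figure can be sanity-checked with simple arithmetic (assuming a 30-day month):

```python
# 99.995% availability leaves 0.005% of the month for downtime.
minutes_per_month = 30 * 24 * 60                      # 43,200 minutes
allowed_downtime = minutes_per_month * (1 - 0.99995)
print(round(allowed_downtime, 2))                     # ~2.16 minutes, under 2.5
```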

Oracle Database Support Best Practices

Like an autonomous car, the Oracle Autonomous Database (Autonomous Database) provides a level of performance and reliability manually managed databases can’t deliver. Compared to a manually managed database, the Autonomous Database costs less to run, performs better, is more available, and eliminates human error.

Larry Ellison Introduces Oracle Autonomous Database Cloud


You tell the Autonomous Database the service level to achieve, and it handles the rest. The Autonomous Database eliminates human labor to provision, secure, monitor, backup, recover, troubleshoot, and tune databases. This greatly reduces database maintenance tasks, reducing costs and freeing scarce administrator resources to work on higher value tasks.

Oracle Cloud Infrastructure Webinars:

Since the Autonomous Database is based on the extremely feature rich and proven Oracle Database, on the Exadata platform, it is able to run both OLTP and analytic workloads up to 100X faster. It includes many performance enhancing Exadata features such as smart flash cache, automatic columnar format in flash cache, smart scan, Exafusion communication over the super-fast InfiniBand network, and automatic storage indexes.

Oracle PaaS: Data Integration Platform Cloud: Product Overview Animated Video

In addition, when it comes time to upgrade or patch, the Autonomous Database can replay the real production workload on a test database to make sure the upgrade does not have any unexpected side effects on a mission-critical system.

Autonomous database automatically tunes itself using Machine Learning algorithms including automatically creating any indexes needed to accelerate applications. Users get the ultimate simplicity of a “load and go” architecture in which they can simply load their data and run SQL without worrying about creating and tuning their database access structures.


The Autonomous Database is more secure than a manually operated database because it protects itself rather than having to wait for an available administrator. This applies to defenses against both external and internal attacks.

Security patches are automatically applied every quarter. This is much sooner than most manually operated Oracle databases, narrowing an unnecessary window of vulnerability. Patching can also occur off-cycle if a zero-day exploit is discovered. By applying patches in a rolling fashion across the nodes of a cluster, the Autonomous Database secures itself without application downtime.

Oracle Multitenant: New Features in Oracle Database 18c

Patching is just part of the picture. The database also protects itself with always-on encryption. Customers can control their own keys to further improve security.

In the future, Oracle’s Data Masking and Redaction technologies will be used to safeguard sensitive data by concealing it for some users or workloads and masking it on test databases.

Create Autonomous Data Warehouse Cloud Connection with Data Integration Platform Cloud


The Autonomous Database is more reliable than a manually operated database. At startup, it automatically establishes a triple-mirrored scale-out configuration in one regional cloud datacenter, with an optional full standby copy in another region. The Autonomous Database automatically recovers from any physical failures whether at the server or datacenter level. It has the ability to rewind data to a point in time in the past to back out user errors. By applying software updates in a rolling fashion across nodes of the cluster, it keeps the application online during updates of the database, clusterware, OS, VM, hypervisor, or firmware.

If the database detects an impending error, it gathers statistics and feeds them to AI diagnostics to determine the root cause. As a final safety net, the Autonomous Database runs nightly backups for you.

Oracle Database Release Uptake and Patching Strategies
In the future, when it is time to update the Autonomous Database, it will be possible to replay the full production workload on a parallel testing environment to verify the safety of the update before it is applied to a mission-critical environment.

Oracle will offer a 99.995% uptime guarantee for the Autonomous Database. Oracle understands that
mission-critical systems run 24x7. Unlike other cloud vendors, Oracle provides an uptime guarantee that includes planned maintenance and all other common sources of downtime in its calculations.

Optimized for Different Workloads

Modern automobiles are specialized by workload: family car, van, pickup truck, sports car, etc. In the same way, the Autonomous Database consists of a single set of technologies available in multiple products, each tailored to a different workload:

Data Warehousing. The Oracle Autonomous Database for Data Warehousing is the simplest and most
efficient database for data marts, reporting databases, and data warehousing. Available January 2018.

OCI Level 100 Autonomous Database

OLTP and mixed workloads.

The Oracle Autonomous Database for OLTP is designed to run mission-critical enterprise applications, including mixed workloads and real-time analytics, with no compromise on app performance. Coming in 2018.

In the future, Oracle will also bring the autonomous principles of self-driving, self-securing, and self-repairing to other kinds of databases:

• NoSQL. Delivers transactional operations on JSON documents and key-value data. Available in 2018.
• Graph. Automatically creates graph representations from tabular and JSON data for discovery of new connections through network analysis. Coming in 2018.

Melbourne Groundbreakers Tour - Hints and Tips

In addition, the Autonomous Database provides IT leaders with a cloud-native enterprise-class foundation for new app and data science development.

• Increase app developer productivity. The Autonomous Database instantly provides app developers with a platform that offers the variety of data management methods their apps require, with the simplicity of a self-managing database. App developers simply push a button to provision a mission-critical-capable database.

• Simplify data science experimentation. Data science, like all science, boils down to experimentation. The Autonomous Database’s built-in machine learning capabilities, along with its self-driving and self-securing capabilities, make it easy for data science teams to experiment with datasets that are otherwise locked away in operational silos for performance or security reasons.

Oracle Autonomous Database - Fear Not.... Embrace it - AutonomousDB-01


For IT leaders who want to move enterprise IT to a cloud foundation, the Autonomous Database offers the smoothest and easiest transition.

• Oracle Public Cloud, Cloud at Customer, or both. The Autonomous Database runs in both the Oracle Public Cloud and Cloud at Customer environments. This means IT leaders can have the management ease and subscription pricing of cloud for all enterprise workloads, including those that must stay in-house for regulatory, data sovereignty, or network latency reasons.

• Go cloud-native without app changes. Because the Autonomous Database is still an Oracle database, existing apps can be quickly and easily moved to this new cloud-native data management platform with no app changes.

With Autonomous Database, major cost savings and agility improvements come quickly, not after years or decades of application rewrites.

Oracle Autonomous Data Warehouse Cloud Service


The transition to the cloud must improve the availability of mission-critical workloads, not put them at risk.

The Autonomous Database is built on top of the most widely proven and sophisticated database in the world: Oracle Database, which is capable of running any type of workload in a highly secure, available, and scalable fashion.

The Autonomous Database runs on the best database platform in the world: Exadata. Exadata is a cloud-architected scale-out platform that uses the latest technologies, including NVMe flash and InfiniBand networking, together with unique database optimizations in storage, compute, and networking, to deliver leading performance, scaling, and availability at the lowest cost.

Oracle on Serverless Computing: Developing FaaS with Oracle

Oracle’s long experience and track record ensures that the transition to the cloud is safe and smooth. The largest enterprises and governments in the world already run all types of mission-critical workloads with Oracle Database on Exadata including:

• Multi-petabyte warehouses
• Ultra-critical applications like financial trading of trillions of dollars daily
• Highly sophisticated and complex business applications like SAP, Oracle Fusion Apps, Salesforce, etc.
• Massive enterprise database consolidations to reduce the cost of fragmented database deployments


Administering a mission-critical database is traditionally very expensive because it requires manual provisioning, securing, monitoring, patching, backing up, upgrading, recovering, troubleshooting, testing, and tuning of a complex, highly available scale-out deployment with disaster recovery protection. The extensive automation provided by Autonomous Database dramatically simplifies these tasks, reducing administration costs up to 80%.

Traditional database deployments need to provision for the peak possible workload and add a substantial margin of safety on top of that. But peak workloads tend to occur infrequently, leaving most of this costly capacity idle the majority of the time. Oracle’s Universal Credits subscription model for cloud deployments allows customers to pay for just the resources they use. Autonomous Database allows elastic adjustment of compute and storage resources so that only the required resources are provisioned at any given time, decreasing runtime costs by up to 90%.
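The runtime-cost claim is essentially the ratio of average to peak usage. A minimal sketch with made-up numbers (the 100/10 CPU figures and the unit rate are assumptions for illustration, not Oracle pricing):

```python
peak_cpus = 100   # capacity provisioned for the rare peak workload
avg_cpus = 10     # assumed average usage over the month
hours = 720       # hours in a 30-day month
rate = 1.0        # cost per CPU-hour, in made-up units

fixed_cost = peak_cpus * hours * rate    # provision-for-peak model
elastic_cost = avg_cpus * hours * rate   # pay-for-what-you-use model
saving = 1 - elastic_cost / fixed_cost
print(f"saving: {saving:.0%}")           # saving: 90%
```

With these assumed numbers, a workload that averages 10% of its peak capacity pays 90% less under elastic billing, which is where headline figures like "up to 90%" come from.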

Under the Hood of the Smartest Availability Features in Oracle's Autonomous

New application development often suffers from many months of delays waiting for database provisioning, testing, and tuning. With Autonomous Database, new applications don’t wait at all, saving tens of thousands of dollars per application and enabling much faster innovation.

The Autonomous Database subscription includes many management, testing, and security capabilities that previously had to be licensed separately, including:

• Data Encryption
• Diagnostics Pack
• Tuning Pack
• Real Application Testing
• Data Masking, Redaction and Subsetting
• Hybrid Columnar Compression
• Database Vault
• Database In-Memory (subset) – in Autonomous Data Warehouse
• Advanced Analytics (subset) - in Autonomous Data Warehouse

To implement full data management workflows, other clouds use a combination of multiple specialized databases such as a queuing database, OLTP database, JSON data store, reporting database, analytics database, etc. Each database is independently developed and therefore has its own data model, security model, execution model, monitoring model, tuning model, consistency model, query language, analytics, etc. Data needs to be transformed and copied between these specialized databases. While moving data between specialized databases can make sense for some extreme high-end applications, it adds enormous unnecessary cost and complexity to the large majority of applications.

Oracle RAC - Roadmap for New Features

Furthermore, it severely compromises security since protection is limited by the worst
system in the workflow. The Autonomous Database handles all these functions in a single database with no need for complex data movement and provides integrated analytics across all data types.

Behavior Changes, Deprecated and Desupported Features for Oracle Database 18c

Review for information about Oracle Database 18c changes, deprecations, and desupports.

About Deprecated and Desupported Status
In addition to new features, an Oracle Database release can modify, deprecate, or desupport features, and introduce upgrade behavior changes for your database.

Simplified Image-Based Oracle Database Installation
Starting with Oracle Database 18c, installation and configuration of Oracle Database software is simplified with image-based installation.

Initialization Parameter Changes in Oracle Database 18c
Review to see the list of new, deprecated, and desupported initialization parameters in this release.

Deprecated Features in Oracle Database 18c
Review the deprecated features listed in this section to prepare to use alternatives after you upgrade.

Desupported Features in Oracle Database 18c
Review this list of desupported features as part of your upgrade planning.

Terminal Release of Oracle Streams
Oracle Database 18c is the terminal release for Oracle Streams support. Oracle Streams will be desupported from Oracle Database 19c onwards.

Feature Changes for Oracle Database 18c Upgrade Planning
Use these feature changes to help prepare for changes that you include as part of your planning for Oracle Database 18c upgrades.

MAA - Best Practices for the Cloud

Oracle Database 18c – Some important changes

Posted on March 7, 2018 by Mike.Dietrich

I know that Oracle Database 18c is available in the Oracle Cloud and on Exadata engineered systems only right now. But actually I’ve had conversations with some customers who downloaded Oracle 18c on-prem software for Exadata and installed it on their systems. Therefore it may be useful to talk about Oracle Database 18c – Some important changes.

Oracle Database 18c – Some important changes
I will highlight some important changes but of course won’t cover all of them here.

You may recognize the first change after downloading the image: the installation and configuration of Oracle Database software is simplified with image-based installation. You’ll extract a zip file (db_home.zip) into the directory where you’d like your Oracle installation to be located, and then call the runInstaller. Be aware: it now sits directly in the $ORACLE_HOME, not in the $ORACLE_HOME/oui subdirectory:

runInstaller is a script in this case. The name is kept for persistence.

Melbourne Groundbreakers Tour - Upgrading without risk

Furthermore there are two other new features:

RPM based installation in Oracle 18c
It performs preinstall checks, extracts the database software, reassigns ownership to the preconfigured user and groups, maintains the inventory, and executes all root operations required. It works for single-instance database and client software installations.

Read-Only Oracle Home in Oracle 18c
In Oracle 18c you can configure read-only homes. In this case all the configuration data and log files reside outside the Oracle home, and you can deploy it as a software image across multiple servers. Apart from the traditional ORACLE_BASE and ORACLE_HOME directories, the following directories contain files that used to be in ORACLE_HOME: ORACLE_BASE_HOME and ORACLE_BASE_CONFIG.

An interesting thing happens if you call the installer kept for internal purposes in $ORACLE_HOME/oui/bin: it will start with a different versioning, i.e. as an Oracle 12.2 OUI. The install script runInstaller is in the $ORACLE_HOME directory, and it will greet you with Oracle 18c - not Oracle 12.2.

Oracle Streams

Yes, the deprecation of Oracle Streams was announced in the Oracle Database Upgrade Guide a while ago, and Oracle 18c is now the terminal release for Oracle Streams. Beginning with Oracle 19c, the Oracle Streams feature won’t be supported anymore. Please note that Oracle Multitenant, whether Single- or Multitenant, never implemented Oracle Streams functionality.

Oracle Multimedia

Beginning with Oracle 18c, Oracle Multimedia is deprecated. In case you’d like to remove Oracle Multimedia from your database, please see this blog post: Remove Oracle Multimedia. In addition, Multimedia DICOM is desupported with Oracle 18c as well.

Please note (and thanks Felipe for asking):
The Locator will become a top-level component once Oracle Multimedia gets removed, and will therefore no longer depend on Multimedia. This will happen in the first Oracle release in which Multimedia is no longer installed by default, or is removed as part of an upgrade.

Deprecated and Desupported Features  in Oracle Database 18c
Please find the full list of deprecated features in Oracle Database 18c in the Database 18c Upgrade Guide. Furthermore you’ll find a list of desupported features and parameters in Oracle Database 18c within the same book.

The Oracle Autonomous Database #12c #18c

Cool New Features for Developers in 18c and 12c

I normally write an article for each conference presentation I do, but my presentation by this name is a series of live demos, so an article isn't really appropriate. Instead here is a links page to all the articles referenced by the conference presentation of the same name.

This is not supposed to be an exhaustive list of new features, just some that stand out for me, and some that others may not have noticed.

  • JSON Data Guide (12.2)
  • SQL/JSON (12.2)
  • PL/SQL Objects for JSON (12.2)
  • Real-Time Materialized Views (12.2)
  • Row Limiting Clause (12.1)
  • Qualified Expressions (18)
  • Polymorphic Table Functions (18)
  • Approximate Query Processing (12.1, 12.2, 18)
  • Private Temporary Tables (18)
  • External Table Enhancements (12.2 & 18)
  • Case Insensitive Queries (12.2)

Hope this helps. Regards Tim...  ( https://oracle-base.com/articles/misc/cool-new-features-for-developers-in-18c-and-12c )

Faster Insights with Cloudera Enterprise on Oracle Cloud Infrastructure

Qualified Expressions in PL/SQL in Oracle Database 18c

Qualified expressions provide an alternative way to define the value of complex objects, which in some cases can make the code look neater.


  • Qualified Expressions with Record Types
  • Qualified Expressions with Associative Arrays

The basic syntax for a qualified expression is as follows:

  typemark(aggregate)

The typemark is the type name. The aggregate is the data associated with this instance of the type. The data can be specified using positional or named association syntax. That all sounds a bit complicated, but it's similar to using a constructor for an object and will be obvious once you see some examples.

Qualified Expressions with Record Types
Records with large numbers of columns can be a little clumsy to work with. Qualified expressions can simplify code in some circumstances.

The following example shows three ways to populate a record in Oracle 18c. The first method, available in previous releases, involves a direct assignment to each column in the record variable. The second method uses a qualified expression where the aggregate uses positional notation. The third example uses a qualified expression where the aggregate uses the named association syntax.

  DECLARE
    TYPE t_rec IS RECORD (
      id   NUMBER,
      val1 VARCHAR2(10),
      val2 VARCHAR2(10),
      val3 VARCHAR2(10),
      val4 VARCHAR2(10),
      val5 VARCHAR2(10),
      val6 VARCHAR2(10),
      val7 VARCHAR2(10),
      val8 VARCHAR2(10),
      val9 VARCHAR2(10)
    );

    l_rec t_rec;
  BEGIN
    -- Pre-18c - Direct assignment to record columns.
    l_rec.id   := 1;
    l_rec.val1 := 'ONE';
    l_rec.val2 := 'TWO';
    l_rec.val3 := 'THREE';
    l_rec.val4 := 'FOUR';
    l_rec.val5 := 'FIVE';
    l_rec.val6 := 'SIX';
    l_rec.val7 := 'SEVEN';
    l_rec.val8 := 'EIGHT';
    l_rec.val9 := 'NINE';

    -- 18c - Qualified expression using position notation.
    l_rec := t_rec(1, 'ONE', 'TWO', 'THREE', 'FOUR', 'FIVE', 'SIX', 'SEVEN', 'EIGHT', 'NINE');

    -- 18c - Qualified expression using named association.
    l_rec := t_rec(id   => 1,
                   val1 => 'ONE',
                   val2 => 'TWO',
                   val3 => 'THREE',
                   val4 => 'FOUR',
                   val5 => 'FIVE',
                   val6 => 'SIX',
                   val7 => 'SEVEN',
                   val8 => 'EIGHT',
                   val9 => 'NINE');
  END;
  /

The first and last examples show clearly which columns get which values, but they take a bit more space. The qualified expression using position notation is more compact, but relies on you knowing the order of the columns. In this case it's easy, as the type is declared directly above. It would be less obvious if the type were defined in a package specification.

Things look a little different if we are only dealing with a subset of the columns. In the following example the qualified expression using named association looks neater, but still similar to the direct assignment to the record columns.

  DECLARE
    TYPE t_rec IS RECORD (
      id   NUMBER,
      val1 VARCHAR2(10),
      val2 VARCHAR2(10),
      val3 VARCHAR2(10),
      val4 VARCHAR2(10),
      val5 VARCHAR2(10),
      val6 VARCHAR2(10),
      val7 VARCHAR2(10),
      val8 VARCHAR2(10),
      val9 VARCHAR2(10)
    );

    l_rec t_rec;
  BEGIN
    -- Pre-18c - Direct assignment to record columns.
    l_rec.id   := 1;
    l_rec.val1 := 'ONE';
    l_rec.val9 := 'NINE';

    -- 18c - Qualified expression using position notation.
    l_rec := t_rec(1, 'ONE', NULL, NULL, NULL, NULL, NULL, NULL, NULL, 'NINE');

    -- 18c - Qualified expression using named association.
    l_rec := t_rec(id => 1, val1 => 'ONE', val9 => 'NINE');
  END;
  /

The difference becomes more apparent when the same variable is used for multiple sparse records, each referencing different columns in the record. In the following example the same record variable is used twice for each method. In the first pass the val1 and val9 columns are set. In the second pass the val2 and val8 columns are set. After each assignment the values of the val1 and val9 columns are displayed. The qualified expressions represent a new instance of the record, so all the unused columns are blanked explicitly or implicitly. Without the qualified expression it is up to the developer to blank the previous values manually.

  DECLARE
    TYPE t_rec IS RECORD (
      id   NUMBER,
      val1 VARCHAR2(10),
      val2 VARCHAR2(10),
      val3 VARCHAR2(10),
      val4 VARCHAR2(10),
      val5 VARCHAR2(10),
      val6 VARCHAR2(10),
      val7 VARCHAR2(10),
      val8 VARCHAR2(10),
      val9 VARCHAR2(10)
    );

    l_rec t_rec;
  BEGIN
    -- Pre-18c - Direct assignment to record columns.
    l_rec.id   := 1;
    l_rec.val1 := 'ONE';
    l_rec.val9 := 'NINE';
    DBMS_OUTPUT.put_line('(1) Record1 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);

    l_rec.id   := 2;
    l_rec.val2 := 'TWO';
    l_rec.val8 := 'EIGHT';
    DBMS_OUTPUT.put_line('(1) Record2 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);

    -- 18c - Qualified expression using position notation.
    l_rec := t_rec(1, 'ONE', NULL, NULL, NULL, NULL, NULL, NULL, NULL, 'NINE');
    DBMS_OUTPUT.put_line('(2) Record1 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);

    l_rec := t_rec(1, NULL, 'TWO', NULL, NULL, NULL, NULL, NULL, 'EIGHT', NULL);
    DBMS_OUTPUT.put_line('(2) Record2 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);

    -- 18c - Qualified expression using named association.
    l_rec := t_rec(id => 1, val1 => 'ONE', val9 => 'NINE');
    DBMS_OUTPUT.put_line('(3) Record1 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);

    l_rec := t_rec(id => 1, val2 => 'TWO', val8 => 'EIGHT');
    DBMS_OUTPUT.put_line('(3) Record2 val1 = ' || l_rec.val1 || '  val9 = ' || l_rec.val9);
  END;
  /

(1) Record1 val1 = ONE  val9 = NINE
(1) Record2 val1 = ONE  val9 = NINE
(2) Record1 val1 = ONE  val9 = NINE
(2) Record2 val1 =   val9 =
(3) Record1 val1 = ONE  val9 = NINE
(3) Record2 val1 =   val9 =

PL/SQL procedure successfully completed.

We can even use a qualified expression in the definition of a default value. In the following example a procedure accepts a record type as a parameter, which has a default value specified using a qualified expression.

  TYPE t_rec IS RECORD (
    id   NUMBER,
    val1 VARCHAR2(10),
    val2 VARCHAR2(10)
  );

  PROCEDURE dummy (p_rec IN t_rec DEFAULT t_rec(id => 1, val1 => 'ONE')) AS

Perth APAC Groundbreakers tour - SQL Techniques

Qualified Expressions with Associative Arrays
When dealing with associative arrays we have the option of assigning values to the individual elements of the associative array, or creating a new associative array using a qualified expression. The following example uses a PLS_INTEGER as the index of the associative array.


  DECLARE
    TYPE t_tab IS TABLE OF VARCHAR2(10) INDEX BY PLS_INTEGER;

    l_tab t_tab;
  BEGIN
    -- Pre-18c - Direct assignment to elements of the collection.
    l_tab(1) := 'ONE';
    l_tab(2) := 'TWO';
    l_tab(3) := 'THREE';

    -- 18c - Qualified expression using named association.
    l_tab := t_tab(1 => 'ONE',
                   2 => 'TWO',
                   3 => 'THREE');
  END;
  /

This example uses a VARCHAR2 as the index of the associative array.


  DECLARE
    TYPE t_tab IS TABLE OF VARCHAR2(10) INDEX BY VARCHAR2(10);

    l_tab t_tab;
  BEGIN
    -- Pre-18c - Direct assignment to elements of the collection.
    l_tab('IND1') := 'ONE';
    l_tab('IND2') := 'TWO';
    l_tab('IND3') := 'THREE';

    -- 18c - Qualified expression using named association.
    l_tab := t_tab('IND1' => 'ONE',
                   'IND2' => 'TWO',
                   'IND3' => 'THREE');
  END;
  /

Remember, the qualified expression creates a new instance of the associative array, so any previously defined elements are removed. In this example we create an associative array with three elements, then immediately assign a two element associative array. If we try to reference the element with index 2 we get a NO_DATA_FOUND exception.


  DECLARE
    TYPE t_tab IS TABLE OF VARCHAR2(10) INDEX BY PLS_INTEGER;

    l_tab t_tab;
  BEGIN
    -- 18c - Qualified expression using named association.
    l_tab := t_tab(1 => 'ONE',
                   2 => 'TWO',
                   3 => 'THREE');

    l_tab := t_tab(1 => 'ONE',
                   3 => 'THREE');

    DBMS_OUTPUT.put_line('2=' || l_tab(2));
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      DBMS_OUTPUT.put_line('I knew this would cause a NDF error!');
  END;
  /
I knew this would cause a NDF error!

PL/SQL procedure successfully completed.

In the following example a procedure accepts an associative array as a parameter, which has a default value specified using a qualified expression.


  PROCEDURE dummy (p_tab IN t_tab DEFAULT t_tab(1 => 'ONE', 2 => 'TWO', 3 => 'THREE')) AS

Sangam 18 - Database Development: Return of the SQL Jedi

The best upcoming features in Oracle Database 19c

By DBA RJ in Events, Oracle Database General

In the Oracle Open World 2018 event that happened in San Francisco last week, from October 22nd to 25th, much has been said about the trends and strategy paths that Oracle is taking in both OCI and in Oracle Database.

Melbourne Groundbreakers Tour - Hints and Tips

As DBAs we are always excited about upcoming features, so I will share below some of the main things I spotted at OOW. Please note that this can change, and we don't even have a beta release yet.

1 - Stability
First of all, it was very clear that Oracle's main focus for the 19c database will be stability. This will be the final release of the "12cR2" family. It was repeated multiple times: "don't expect to see many new features in this release", which in my opinion is really great.

Since then, Oracle has been implementing a lot of core changes in Oracle Database (like multi-tenancy, unified audit, etc.) and it's still very hard nowadays to find a stable 12c release to recommend. is my favorite one, however many bugs are unfixed and it lacks a secure PDB layout (PDB escape techniques are pretty easy to exploit). 18c will probably be ignored by all as it was a "transition" release, so I hope that 19c becomes the real stable one, as 11g was for its release family. Let's see...

Perth APAC Groundbreakers tour - 18c features

Now comes the real features...
2 - Automatic Indexing
This is indeed the most important and one of the coolest features I've ever seen in Oracle Database. Once this kind of automation is implemented and released, it will open the door to many other product automations (like automatic table reorganization, automatic table encryption, or anything you can imagine).

The automatic indexing methodology is based on the common approach to manual SQL tuning: Oracle will capture the SQL statements, identify candidate indexes, and evaluate which of them benefit those statements. The whole process is not simple.

Basically, Oracle will first create those indexes as unusable and invisible (metadata only). Then, outside the application workflow, Oracle will ask the optimizer to test whether those candidate indexes improve SQL performance. If performance is better for all statements when the index is used, it becomes visible. If performance is worse, it remains invisible. And if it only performs better for some statements, the index is marked visible only for those SQLs (via SQL Patch, maybe).

The automation will also drop indexes that are made obsolete by newly created indexes (logical merge), and remove automatically created indexes that have not been used in a long time. Everything is customizable. For more details, we need to wait for the beta release!
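If Oracle exposes this automation through a PL/SQL control package, configuration might look roughly like the following sketch (the DBMS_AUTO_INDEX package name, its parameters, and the schema name are assumptions based on how the feature was later documented, not something confirmed at the time of writing):

```sql
-- Turn automatic indexing on for the database:
-- IMPLEMENT creates the indexes, REPORT ONLY would only recommend them.
EXEC DBMS_AUTO_INDEX.configure('AUTO_INDEX_MODE', 'IMPLEMENT');

-- Restrict the feature to a specific schema (SALES is a made-up name).
EXEC DBMS_AUTO_INDEX.configure('AUTO_INDEX_SCHEMA', 'SALES', allow => TRUE);

-- Drop unused auto-created indexes after 90 days.
EXEC DBMS_AUTO_INDEX.configure('AUTO_INDEX_RETENTION_FOR_AUTO', '90');
```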

3 - Real-time Stats + Stats Only Queries
With these features, you will be able to turn on real-time statistics for some of your database objects, and it will be possible to run some SQLs that query only the object stats, without doing a single logical read! Cool, isn't it?

4 - Data-guard DML Redirect
When you have a physical standby opened in read-only mode and you plug into it some reporting tool that needs to create an underlying table or insert some log lines in order to operate, you have a problem. With this feature, you can define some tables (or maybe schemas, it's not clear yet) on which you will be able to run DML. Oracle will redirect that DML to your primary and reflect the changes back on your standby, without impacting those tools. This can be dangerous if not configured properly, but it will also allow us to do many new things.
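In practice this might be switched on with an init parameter or a session setting; the sketch below uses the ADG_REDIRECT_DML names that later appeared in the 19c documentation (the parameter name and the report_log table are assumptions for illustration):

```sql
-- On the Active Data Guard standby, allow DML to be redirected to the primary:
ALTER SYSTEM SET ADG_REDIRECT_DML = TRUE SCOPE = BOTH;

-- Or enable it only for the current read-only session:
ALTER SESSION ENABLE ADG_REDIRECT_DML;

-- This INSERT, issued against the read-only standby, would then be executed
-- on the primary and flow back to the standby through redo apply.
INSERT INTO report_log (msg) VALUES ('report started');
COMMIT;
```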

5 - Partial JSON Update support
When you update a JSON data column today, Oracle needs to upload the whole new column value to the database and validate it. With this, we will be able to update just a part (like a single tag) of a JSON document.
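A partial update could then look something like this sketch, using the merge-patch style JSON_MERGEPATCH function that eventually shipped in 19c (the orders table and its doc column are made-up names):

```sql
-- Suppose orders.doc holds JSON like {"id": 1, "status": "NEW", "items": [...]}.
-- Instead of rewriting the whole document, patch only the "status" member:
UPDATE orders
SET    doc = JSON_MERGEPATCH(doc, '{"status": "SHIPPED"}')
WHERE  id = 1;
```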

6 - Schema-only Oracle accounts
Oracle 18c introduced passwordless accounts, meaning you could connect to your schema only through some sort of external authentication, like Active Directory. Now Oracle has gone further, creating a true schema-only account (meaning there will be no way to authenticate as it at all).
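Such an account might be created with a NO AUTHENTICATION clause, as in this sketch (the clause follows the syntax that later shipped in 19c; the user and tablespace names are made up):

```sql
-- A schema that only owns application objects and can never be logged into:
CREATE USER app_owner NO AUTHENTICATION
  DEFAULT TABLESPACE users
  QUOTA UNLIMITED ON users;

GRANT CREATE TABLE, CREATE PROCEDURE TO app_owner;
```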

7 - REST APIs
Oracle is trying to make the whole Oracle Database "REST aware", meaning that in the near future you will be able to perform ALL kinds of DB operations using REST APIs (like creating a database, creating a user, granting a privilege, or adding a new listener port).

8 - Partitioned Hybrid Tables
Remember in very old times, when we didn't have partitioned tables and had to implement partitioning manually using views + UNION ALL over many tables? Thank goodness, since Oracle 8 (released in 1997) we haven't needed that. Now Oracle has finally gone one step further: you can have a hybrid partitioned table, meaning each partition can be of a different type or source (for example, one partition is an external table and another is a traditional internal table).

With Oracle 18c XE limited to 12GB of user data, this feature will be cool, as we will probably be able to offload some of the data externally.
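A rough sketch of what the DDL might look like, based on the EXTERNAL PARTITION ATTRIBUTES syntax the feature later shipped with (the table, directory, and file names are all assumptions for illustration):

```sql
CREATE TABLE sales_hybrid (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER
)
EXTERNAL PARTITION ATTRIBUTES (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY sales_dir
  ACCESS PARAMETERS (FIELDS TERMINATED BY ',')
)
PARTITION BY RANGE (sale_date) (
  -- cold data lives outside the database, in a flat file
  PARTITION p_2017 VALUES LESS THAN (DATE '2018-01-01')
    EXTERNAL LOCATION ('sales_2017.csv'),
  -- current data is a normal internal partition
  PARTITION p_2018 VALUES LESS THAN (DATE '2019-01-01')
);
```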

9 - EZConnect Improvements
EZConnect is something very useful for making quick connections without requiring a TNS entry. The problem is that, until now, if you wanted to use some name-value pairs like SDU, RETRY_COUNT or CONNECT_TIMEOUT, this wasn't possible and you would end up using TNS. Now in 19c you will be able to write something like:

sqlplus soe/soe@'//salesserver1:1521/sales.us.example.com?connect_timeout=60&transport_connect_timeout=30&retry_count=3'

It will also allow multiple hosts/ports in the connection string (typically used for load-balancing client connections).
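A multi-host string might then look like the sketch below (the host names are made up, and the comma-separated host list follows the Easy Connect Plus syntax that Oracle later documented):

```sql
-- Two listeners on different hosts; the client tries them with
-- address-list semantics, useful for load balancing and failover:
sqlplus soe/soe@'salesserver1,salesserver2:1521/sales.us.example.com?retry_count=2'
```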

10 - Some other cool features
There are many other features that we have to wait for the Beta release to understand better. Below are some of them:

Improvements for count distinct and group by queries
Sharding now supports multiple PDB shards in a CDB
SQL JSON Enhancements
RAT and ADDM at PDB level
Data Dictionary Encryption
Database Vault Operations Control
SQL Developer Web

Sangam 18 - The New Optimizer in Oracle 12c

Final thoughts...
As I said, compared to Oracle 12cR1, 12cR2 or 18c, Oracle has greatly reduced the total number of features introduced. That's why I'm excited about 19c: this is the first time in a while that I've heard Oracle say it will invest in stability. I hope this is true.

If you want to test 19c before the others, subscribe to the Oracle Beta Program at https://pdpm.oracle.com/.

Sangam 18 - The Groundbreaker Community

Oracle Database 18c : Now available on the Oracle Cloud and Oracle Engineered Systems


We’ve covered some of the bigger changes in Oracle Database 18c, but there are many more that we don’t have space to cover here. If you want a more comprehensive list, take a look at the new features guide here.


You can also find more information on the application development tools here





If you’d like to try out Oracle Database 18c, you can do it here with LiveSQL.


For More information on when Oracle Database 18c will be available on other platforms please refer to Oracle Support Document 742060.1