• IBM Consulting

    DBA Consulting can help you with IBM BI and web-related work. IBM Linux solutions are also part of our portfolio.

  • Oracle Consulting

    For Oracle-related consulting, database work, support and migration, call DBA Consulting.

  • Novell/Red Hat Consulting

    For all Novell SUSE Linux and SAP on SUSE Linux questions related to the OS and BI solutions. And of course also for the great Red Hat products such as Red Hat Enterprise Server and JBoss middleware, and BI on Red Hat.

  • Microsoft Consulting

    For Windows Server 2012 onwards, Windows 7 and higher on the client, and Microsoft cloud services (Azure, Office 365, etc.) related consulting services.

  • Citrix Consulting

    Citrix VDI-in-a-Box, desktop virtualization and Citrix NetScaler security.

  • Web Development

    Web development: static websites, CMS websites (Drupal 7/8, WordPress, Joomla), responsive websites and adaptive websites.

24 November 2011

My external RAC memory

RAC Cheat sheet
This is a quick and dirty cheat sheet on Oracle RAC 10g. As my experience with RAC grows I will update this section; below is a beginner's guide to the commands and information that you will need to administer Oracle RAC.


Files and Directories

Useful Views/Tables

Useful Parameters


General Administration

CRS Administration

Voting Disk
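
As a starting point for the sections above, here are a few of the everyday CRS and voting disk commands in Oracle RAC 10g (a minimal sketch; the database name ORCL, instance name ORCL1 and all paths are placeholders for your own environment):

```shell
# Check the overall health of the CRS stack
crsctl check crs

# List the configured voting disks
crsctl query css votedisk

# Back up a voting disk while the cluster is down (device and target paths are placeholders)
dd if=/dev/raw/raw1 of=/backup/votedisk.bak

# Check the status of all instances of a RAC database
srvctl status database -d ORCL

# Start and stop a single instance
srvctl start instance -d ORCL -i ORCL1
srvctl stop instance -d ORCL -i ORCL1
```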

22 November 2011

With the end of the year approaching, let us help you modernize your IT infrastructure!

It will help you save money and run your applications much more efficiently!

21 November 2011

IBM System p

IBM System p for Oracle Data Warehousing

Main Advantages

• Reduces the complexity and risk of Oracle Data Warehousing deployments
• Incorporates IBM System p™ and System Storage™ solution options into IBM and Oracle Data Warehousing reference configurations developed collaboratively as part of the Oracle Information Appliance Initiative
• Offers a family of validated and tuned pre-sized configuration building blocks to support a range of raw data, user and query workload requirements
• Provides repeatable, balanced building blocks to scale out the data warehouse
• Seamlessly incorporates Extract, Transform and Load (ETL) and application loads into the data warehouse

IBM System p
A reliable, scalable, resilient server infrastructure is critical for application deployments that rely on an Oracle database. IBM System p technology is a smart choice for UNIX® and Linux® operating system-based Oracle database deployments and for businesses that demand powerful, flexible, reliable and secure computing solutions. An extensive system family with 1- to 64-core scalability is competitively priced, with 64-bit IBM POWER5™, POWER5+™ and POWER6™ technology designed to lower software, energy and space costs through leadership performance and unique IBM capabilities that can enable a dramatic increase in individual system utilization. A choice of more than 10,000 IBM AIX® and Linux applications supports a broad array of business requirements. And models can be selected for specific workloads, from front-end business intelligence (BI) applications to back-end data warehouse (DW) infrastructure.

System p technology leadership
IBM System p models are based on advanced IBM POWER™ dual-core chip technology and deliver outstanding price/performance, mainframe-inspired reliability features and innovative virtualization capabilities. Add to that IBM’s unique simultaneous multithreading technology, which allows two application threads to run at the same time, and it’s easy to see why the IBM System p server delivers outstanding performance.

System p virtualization

System p virtualization technology can dramatically increase server utilization, allowing workloads to be easily consolidated and enabling you to do more than ever before with a single server. Available as an option on all System p models, Advanced POWER Virtualization allows you to run multiple applications on AIX and Linux operating systems at the same time in separate, secure partitions using as little as one-tenth of a processor, allowing a reduction in the number of servers and peripheral devices needed. And the System p Capacity on Demand function available on selected models is designed to scale processing power and memory up and down as business needs fluctuate, with no disruption to your business.

IBM System Storage

Disk storage is a critical element in an Oracle Data Warehousing environment. Multiple solution options are available including:

• IBM System Storage DS8000™ family - designed to deliver robust, flexible, highly available, and cost-effective disk storage to support continuous operations for mission-critical workloads
• IBM System Storage DS4000™ family - scalable, modular Fibre Channel disk storage designed with growth, reliability, and availability in mind, from entry-level to enterprise environments and performance oriented to capacity-oriented applications

For more information regarding IBM System Storage product compatibility with the Oracle database please visit:

IBM and the Oracle Information Appliance Initiative

IBM has developed data warehouse reference configurations, called Information Appliance Foundations, as part of the Oracle Information Appliance Initiative. These foundations combine Oracle Data Warehousing components with IBM System p and System Storage products. These Oracle Information Appliance Foundations provide validated, balanced configurations for data warehouses that combine system resources such as hardware, storage, I/O and networking into data warehouse building blocks that can be combined to address different scalability needs in a linear fashion. They’re designed to support varying loads based on varying raw data size, concurrent user load and varying query complexity.

The development and use of repeatable building blocks allows high-performance data warehouses to scale through the use of a modular design approach to the data warehouse, ETL functionality, business intelligence tools and applications. The modular building blocks are designed to deploy into existing client business intelligence and OLTP (On Line Transaction Processing) infrastructures. For more information about the Oracle Information Appliance Initiative please visit:

Business intelligence reference architecture for Oracle on IBM Systems.

Figure 1 outlines the IBM reference architecture for the deployment of Oracle business intelligence components on IBM Systems and IBM System Storage products. The architecture is a high-level system design. It is free of implementation details and provides a high level description of the solution components.

The key elements of the reference architecture are the:
• Software Architecture Components – these define the overall structure and relationships among the key functional elements of the data warehouse, the infrastructure software and BI server repository
• Systems Architecture - this defines a proven approach for ensuring that a balanced set of system resources is in place to deliver expected performance based on the primary drivers of data warehouse performance – including compute power, network bandwidth, and storage capacity and bandwidth

The software architecture is made up of three primary components or groups. The first group is made up of those components that comprise the data warehouse. These include the relational database (Oracle’s latest Enterprise Edition release with Oracle Partitioning), storage management (Oracle Advanced Storage Management product for logical volume management of the database objects) and ETL functionality (Oracle Warehouse Builder). For existing deployments the architecture is flexible enough to allow third party ETL products such as IBM Ascential, Informatica, SAS or Business Objects to be used as the ETL driving mechanism.

The second group within the software architecture comprises the infrastructure including administration and management (Oracle Enterprise Manager), cluster control (Oracle Real Application Clusters (RAC)) as well as hardware (IBM System p) and storage (IBM System Storage).

The third software architecture group consists of the BI server repository (Oracle BI Enterprise Edition) and the individual applications that the client uses to access the data warehouse. These applications can be specific Oracle applications or third party tools from other vendors such as SAS, Business Objects, Cognos or Microstrategy.

The systems architecture that IBM has developed which supports the Oracle Information Appliance Foundation and the IBM business intelligence reference architecture for Oracle on IBM Systems will be described in the next section.

Functional components of the business intelligence reference architecture for Oracle on IBM Systems

The logical flow of data from the data source to delivery can be defined as a series of processes that accommodate data integration, data warehousing and data analytics. Each of these areas exhibits different workload and resource characteristics. These characteristics can be defined and resource requirements articulated through best practices and workload sizing and capacity planning methodologies.

Figure 2 outlines these different areas and defines the approach of repeatable building blocks (called Oracle nodes) for ETL, data warehousing and analytics applications. These building blocks or nodes relate back to the reference architecture and comprise the systems architecture described earlier. The use of nodes in this fashion allows the client to integrate existing ETL and application deployments into the data warehouse using appropriate sizing techniques. Each of the nodes is designed using IBM System p and System Storage hardware with associated interconnect technology to scale in an Oracle RAC-managed environment. The nodes provide flexibility, allowing a deployment to start in an SMP environment and then to scale out in an Oracle RAC deployment as the data warehouse grows. Each functional node (Figure 3) is sized based on workload characteristics in terms of I/O bandwidth, memory requirement and CPU utilization. The holistic design ensures that data integration nodes are balanced not only across the internal resources available but also in terms of performance with the other nodes in the entire business intelligence solution. The Oracle data warehouse node is designed to provide balanced, scalable performance of the data warehouse as multiple nodes are connected together into a RAC-enabled data warehouse deployment.

The individual nodes are the core components of the Oracle Information Appliance Foundation (Figure 4). The business intelligence solution is deployed based on a detailed sizing exercise which defines the number of Oracle data integration nodes, Oracle data warehouse nodes and Oracle analytics nodes that are required. The deployment uses the infrastructure components outlined in the reference architecture based on IBM System p, System Storage, Oracle RAC, Oracle Partitioning and Enterprise Manager.

Sizing functional building blocks How to size Oracle Business Intelligence and Oracle Data Warehousing solutions

IBM offers a process for sizing future hardware requirements when a client is looking to run Oracle Business Intelligence and data warehouse solutions on IBM hardware. This process is based on performance data and other information gathered from the client’s existing environment. This input is used to estimate the resources required to support one or more of the following scenarios:

• New Oracle BI or DW installations
• Additional applications for an existing Oracle BI or DW production environment
• Migrations to new IBM hardware platforms, such as System p or System x™ servers

In order to start the sizing estimate process, follow the instructions on the cover page of the Oracle Database sizing questionnaire. The questionnaire provides information on what needs to be completed and where to send the completed document for processing. Please work with your IBM representative or IBM Business Partner (DBA Consulting) in order to obtain a sizing estimate. To access the Oracle database sizing questionnaire please visit:

Once there, select the Oracle Database sizing questionnaire from the list of supported application systems. On the same web page you can access the Oracle Business Intelligence quick sizer from the same list. This is a simplified tool that provides general sizing guidelines on possible IBM hardware configurations when running Oracle BI. It does not replace the standard IBM sizing process but can be an excellent complement or starting point for the sizing estimate.

The Oracle BI quick sizer is a tool developed by IBM and Oracle. It is used as an initial sizing reference with clients who wish to build a new, or extend an existing, Oracle Database data warehouse. The BI quick sizer is designed to provide a reference point for discussion around the deployment of Oracle Database 10g and, where appropriate, Oracle RAC on IBM System p or IBM System x server technology. The Oracle BI quick sizer provides a discussion structure whereby the client can consider a number of factors:

• Comparisons of growth scenarios around scale-out or scale-up strategies
• Comparisons between System x with Linux and System p with AIX solutions.
• Comparisons of servers for different query workloads
• Storage configurations based on amount of usable disk and Host Bus Adapters (HBAs, also known as Fibre Channel adapters)

In addition to sizing methodologies for Oracle Data Warehousing, ETL and business intelligence applications, IBM has sizing methods and practices for many of the third party tools that can be seen in a complex heterogeneous business intelligence deployment including SAS, Business Objects, Cognos and Microstrategy. These sizing practices and methods are used to design the analytics nodes as part of the complete deployment. The availability of all these sizing paths provides a comprehensive sizing solution for the client sizing needs in this market segment.

Oracle data warehouse node for IBM System p and System Storage

As noted, the Oracle data warehouse node is the core component in constructing a scalable data warehouse which is the underlying driver of the business intelligence solution. Table 1 identifies the infrastructure elements of an Oracle data warehouse node that can be found in the Oracle BI quick sizer. The node provides 5TB of raw data warehouse space providing the flexibility for a moderate deployment to grow in a scale up fashion to 5TB before then scaling outwards in an Oracle RAC deployment for data warehouses greater than 5TB. The node is designed to provide balanced memory, I/O and processing power to efficiently execute application queries. The node is designed to service between 15 and 90 concurrent users depending on the complexity of the user queries.


One of the major areas of concern for business is the effort involved in architecting, developing and deploying complex data warehouses and the associated components of the business intelligence solution around ETL and analytic applications. IBM and Oracle have developed a powerful architectural model and approach to simplifying the deployment and reducing the risk of data warehouses based on the IBM System p for Oracle Data Warehousing. This solution fully supports the Oracle Information Appliance Initiative and provides a family of optimized and validated pre-sized configuration building blocks for data warehouses to support a range of raw data, user and query workload requirements.

For more information
To explore other System p and Oracle Data Warehousing solutions or to find out more about other joint solutions from IBM and Oracle, please contact an IBM sales representative at 1 866 426-9989, or visit us at:

For more information about the IBM System p please visit:

For more information about IBM System Storage please visit:

Or contact IBM Partner: DBA Consulting

Blog: http://drsalbertspijkers.blogspot.com/

20 November 2011

I am IBM Partner now!

IBM has also a First Class BI Solution that
DBA Consulting is offering to you!

IBM has a first-class Business Intelligence solution that DBA Consulting is offering to you from now on. You can count on first-class consultancy services, first-class software at competitive rates and license costs, and reliable hardware that has proven itself for over 100 years now!
IBM is number two in the Gartner Magic Quadrant (see earlier blog post), just ahead of Oracle and just behind Microsoft. According to Gartner, that is. I think all three BI solutions are great and highly customizable to your needs and dashboard wishes.

With the IBM Cognos 10 BI Suite, however, you are in for a treat.

Cognos Business Intelligence provides:
• Predictive analytics, allowing users to perform advanced analysis and publish and communicate the results to the broader user community
• What-if analysis for users to create and evaluate scenarios on the fly

Collective intelligence

“The very nature of an organization is different
people doing different things. The only way to
make it work is communication. And the basis for
communication is shared understanding of goals,
of process, of performance, of data.”
Dr. Greg Richards, IBM Center of Performance Management, Telfer School of Business, University of Ottawa.

Organizations need more agile decision-making to respond to
market opportunities and challenges. In the case of the
workplace, the whole is greater than the sum of its parts.

Smart organizations pull creative elements out of compartments or silos and integrate them into the mainstream, so that everyone understands and can influence the greater whole.

In a collaborative environment, people proactively exchange
knowledge and cooperate with one another, eliminating
communication barriers and improving the organization’s
ability to be ready for what comes.

In other words, collective intelligence ensures a more informed
and aligned business – one that is more efficient, effective and
adapts quickly to internal and external change. Collaboration
closes the loop from insight to action, and enables everyone to
work together, agree, decide and act.

Cognos Business Intelligence addresses these requirements by
providing collaborative intelligence to share insights and gain
alignment. Collaboration and workflow allow the business to:

•Establish decision networks to share insights and drive toward
collective intelligence.
• Provide transparency and accountability to drive alignment
and consensus.
• Communicate and coordinate tasks to engage the right people
at the right time.

Establish decision networks

Cognos Business Intelligence accommodates the social
networking needs of users in your organization, so they’re able
to share insights, solicit input from peers and enable better
corporate memory.

The software provides integrated access to blogs, wikis and
message boards from the Cognos Business Intelligence
workspace. As a result, users can connect with the right people.

They can establish decision networks and threaded discussion
areas to assemble and discuss relevant issues. For example:
publishing a BI report via a blog to share with peers over
the web.

Provide transparency and accountability

Without process and data standards, organizations lack a single
version of the truth. Annotations, taxonomies and metadata
help organizations to correctly source information, identify
author input and gain a common definition of terms.

Consensus and accountability across the same shared platform
ensures consistency and gives users confidence in the
information they are using.

Cognos Business Intelligence provides:

• Ability to annotate reports to insert commentary for
additional context.
• A centrally managed set of terms and taxonomies.
• Metadata showing report information, technical views and
visual maps of where underlying information is sourced.
• Alignment to business structures, to better support and reflect
the organization

IBM Cognos 10: Intelligence Unleashed
Smarter Decisions. Better Results.

Cognos 10 delivers a revolutionary new user experience
and expands traditional business intelligence (BI) with
planning, scenario modeling, real-time monitoring and
predictive analytics. With the ability to interact, search
and assemble all perspectives of your business, Cognos 10
provides a limitless BI workspace to support how people
think and work.

Cognos 10 enables organizations to outperform with:

• Analytics that everyone can use in a BI workspace that sharpens individual skills to answer key business questions
• Collective intelligence with built-in collaboration and social networking to connect people and insights and gain alignment
• Actionable insight everywhere, in mobile, real-time and business processes, to instantly respond at the point of impact
Built on a proven technology platform, Cognos 10 is
designed to upgrade seamlessly and to cost-effectively
scale for the broadest of deployments. Cognos 10
provides you and your organization the freedom to see
more, do more—and make the smart decisions that drive
better business results.

You can have a complete BI solution from IBM with:

  •          Cognos 10 for BI
  •          Tivoli for Security and maintenance
  •          Lotus Notes for Communications and presentation and reporting out of the dashboard
  •          WebSphere for Middleware (57% lower cost than WebLogic)
  •          IBM SOA for Customizations of individual Dashboard sources and presentations.

IBM has been in the hardware business for over 100 years now as well. I(nternational) B(usiness) M(achines) can also deliver this hardware to you through DBA Consulting as their new Business Partner.

A small selection of possible servers:

  • System x
  • Blade Center Systems
  • System z

And for storage that is highly flexible and really competitively priced you can use the Storwize V7000 Unified, which comes with a very easy-to-use maintenance GUI.

There is also the possibility of virtualization.

If you need a database to store your data, you can choose between:

  • DB2
  • Informix
  • Oracle
  • SQL Server 2008 R2
As operating system you can choose between:

  • IBM z/OS
  • SUSE Enterprise Server
  • RedHat Enterprise Server
  • Oracle Solaris
  • Open Solaris
  • Microsoft Windows Server 2008 R2

DBA Consulting can temporarily offer you 0% financing if you need it (until IBM decides otherwise)!

website: http://www.dbaconsulting.nl 
For more information contact us at: info@dbaconsulting.nl

18 November 2011

Oracle in Cloud Number 9!

Below you will find a reprint of a post from the Oracle Data Warehouse Builder blog.

Delivering software to support the cloud
By keith.laker on Feb 18, 2010
From a software perspective, developing a cloud strategy is all about the data, not moving it. For a long time now Oracle has advocated the basic principle of doing everything inside the database. When you move to the cloud this makes even more sense, because you do not want to be continually unloading, moving and reloading data into different engines.
Many of the data warehouse vendors actively promote the use of multiple processing engines to support their data warehouse solution. As a result you get something like this: 

Unfortunately, this is exactly the approach being put forward by the same data warehouse vendors in an attempt to get customers to move to the cloud. Their view of an EDW in the cloud looks like this:

In this scenario data is being loaded, unloaded, moved and reloaded multiple times which increases latency, allows errors to be introduced and makes it difficult to determine exactly where a piece of data actually came from. There is also the topic of data security to consider - and that is a big topic! All this data movement, unloading and reloading provides numerous opportunities for security breaches.

If you are going to develop a viable data warehouse cloud strategy then what is needed are some simple rules that can be used to check the suitability of your preferred database platform:
  • Flexible data model not fixed data model
  • Data loading based on ELT not ETL
  • Analytics inside the database not outside
  • End-to-end security not disconnected security
  • Effective resource management not ineffective resource management
Data Model Strategy -> Flexible not fixed data model
Oracle Database is not restricted to a single type of data model. This provides the flexibility to support real-time data loading as well as the complex analytics needed to support today's BI queries. Most importantly, as the business changes (new companies acquired, new products added and old products decommissioned) it is important to have a data model that can easily move with the business and not hold it back. This is especially true when considering a cloud-based strategy for the data warehouse, since one of the main drivers of moving to a cloud-based environment is "increased flexibility".

Oracle has developed and proven its reference architecture through numerous customer engagements over the last 14 years. During this time the model has evolved to build on the changes in the capability of the underlying database technology and tools. Each new release of the Oracle Database adds new data warehousing, security and availability features that make it significantly quicker and easier to implement this reference architecture.

The goal of Oracle’s Data Warehouse Reference Architecture is to deliver a high-quality integrated system and information at a significantly reduced cost over the longer term. It does this by recognizing the differing needs for Data Management and Information Access that must both be delivered by the Warehouse, applying different types of data modeling to each in a layered and abstracted approach.
The Reference Architecture is intended as a guide and not an instruction manual. Each layer in the architecture has a role in delivering the analytical platform required to support next-generation business execution. The Reference Architecture gives us something to measure against so we can understand what we compromise by making specific architectural, technical and tool choices. It works equally well for new Data Warehouse developments as it does for developing a roadmap to migrate an existing one.
Below is an overview of the main elements of the reference architecture:


Data loading strategy -> ELT not ETL

Many data integration (DI) tools rely on their own engines to perform data transformations; this is because many databases have very weak data transformation engines. Therefore, most DI tools extract data from a source system, move that data into their own processing engine and perform transformations in a row-based manner. Finally, the data is then pushed into the target - the data warehouse. The situation gets more complex when you start including processes to manage data quality, data lineage, data discovery and so on.
This approach means customers have to manage multiple servers and their network takes a beating every time the ETL jobs are run because large data sets are moved around the network being passed from engine to engine. Yet the need to use multiple engines with associated dedicated hardware is often cited as an excellent reason for moving to a cloud based strategy. This removes the need to manage all those servers and software licenses. Data can freely move around the cloud taking advantage of the latest versions of each piece of software.
Yet it is the volume of data and the complexity of the transformations (ETL and data quality) that make it vital that processing is done within the data warehouse database engine, under the control of the database workload management features. The Oracle Database has specialized and optimized data transformation features such as set-based operations, error logging, pipelined table functions and regular expressions.
The Oracle Database license includes the Warehouse Builder (OWB) which follows the approach of extracting the data from the source system, loading into the target (DW) and then applying the required data transformations using the power of the Oracle Database. Customers using OWB do not need to buy additional hardware to run their ETL or additional tools beyond their normal enterprise database license. Therefore, using OWB it is possible to "cloud-enable" the ETL process directly within the database.
As all ETL jobs are under the control of the database workload manager the priority and access to resources can be managed from one central console. Using OWB's macro language ("experts") it is possible to write wrappers around normal processes that users might want to do such as load the contents of an Excel worksheet into a table. This way, users can "build" and execute their own ETL jobs using the same ETL tools and repository as the IT team. Then when something needs to be changed the impact on the whole environment can easily be determined.
Processing data inside the database makes sense. Take the analysis to the data not the other way round!
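
As a small illustration of this set-based, in-database ELT approach, a single transform-and-load step might look like the sketch below. The table names are hypothetical, and the error table would be created beforehand with DBMS_ERRLOG.CREATE_ERROR_LOG:

```sql
-- One set-based ELT step: transform staged rows and load them directly
-- into the warehouse table, diverting bad rows to an error table
-- instead of aborting the whole load.
INSERT /*+ APPEND */ INTO dw_sales (sale_id, sale_date, amount_usd)
SELECT s.sale_id,
       TO_DATE(s.sale_date_txt, 'YYYY-MM-DD'),
       s.amount * s.fx_rate
FROM   stg_sales s
LOG ERRORS INTO err$_dw_sales ('nightly load') REJECT LIMIT UNLIMITED;
```

Because the whole transformation runs as one SQL statement inside the database, it is executed set-based rather than row by row, and it falls under the control of the database workload manager like any other query.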
Analytics Strategy -> inside the database not outside
As with ETL, it makes sense to do as much processing inside the database as possible, since this is where all the data and real processing power are located. Personally, I think the challenge for most Oracle customers is knowing what is inside the database. The latest version of Enterprise Edition offers data mining, OLAP/multi-dimensional models, spatial, text mining, and support for unstructured data. By keeping all these types of analysis within the database engine it is possible to run cross-functional analysis that is simply not possible in other data warehouse databases/engines.
Imagine being able to analyze the result from a data mining model using spatial analytics and then applying a top 10 and bottom 10 query to highlight winners and losers. Could you do this using the cloud? Of course, but it would probably mean unloading data from the enterprise data warehouse into a data mining engine, pushing the results to a spatial engine and creating a federated query across the spatial and data warehouse datasets to run the winners-and-losers query. That all takes time, and time is what most business users do not have - even without considering who is going to write the ETL to move all that data around!
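
The top 10 and bottom 10 part of that analysis, for instance, is a single in-database query with analytic functions. A sketch, assuming a hypothetical customer_scores table holding the mining-model output:

```sql
-- Top 10 and bottom 10 customers by model score in one pass,
-- with no data ever leaving the database.
SELECT customer_id, score,
       CASE WHEN top_rank <= 10 THEN 'winner' ELSE 'loser' END AS segment
FROM (
  SELECT customer_id, score,
         RANK() OVER (ORDER BY score DESC) AS top_rank,
         RANK() OVER (ORDER BY score ASC)  AS bottom_rank
  FROM   customer_scores
)
WHERE top_rank <= 10 OR bottom_rank <= 10
ORDER BY score DESC;
```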

End-to-end security not disconnected security
One of the biggest challenges around cloud computing is data security. Why? Because data is continually on the move from one engine to the next and all that movement is not encrypted, some engines have an encryption process (usually unique to them) and others have nothing. How do you know who is accessing your most sensitive data and more importantly how do you know where it is being moved to?

There is an easy answer to this: don't put sensitive data in the cloud! The only problem is that the sensitive data is usually the gateway to a lot of very important analysis. Therefore, you either stop moving data around, or you apply strong encryption and authorization policies or you do both. Fortunately, Oracle offers both! Using Oracle Database as the foundation of a DW cloud strategy means you can use Oracle's transparent security features to lock down sensitive data and stop unauthorized access. Data remains locked inside the Oracle Database where you can use the built-in analytic power to run queries across effectively secured data sets.
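
As an example of that lock-down, sensitive columns can be protected with Transparent Data Encryption so the data is encrypted on disk yet remains queryable by authorized users. A sketch with hypothetical table and role names, assuming an encryption wallet has already been configured and opened:

```sql
-- Encrypt the sensitive column at rest; SQL access is unchanged
-- for users who hold the right privileges.
CREATE TABLE customers (
  customer_id NUMBER PRIMARY KEY,
  name        VARCHAR2(100),
  ssn         VARCHAR2(11) ENCRYPT USING 'AES256'
);

-- Combine encryption with standard access control
GRANT SELECT ON customers TO analyst_role;
```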

Effective resource management not ineffective resource management
If you are going to manage resources with the cloud in an effective way then you need to be able to control all aspects of the data warehouse workload. Most database systems, including those with cloud platforms, will provide some degree of control over the processing directly within the database.
The Oracle Database Resource Manager (DBRM) allows the DBA to prioritize workloads and restrict access to resources for certain groups of users. This allows the DBA to protect high-priority users or jobs from being impacted by lower-priority work. The DBRM does this by allocating CPU time to different jobs based on their priority. The amount of resources allocated to a specific workload or user can depend on the percentage of CPU time, the number of active sessions, the amount of available space, and so on.

The addition of Exadata to the data warehouse platform provides the Oracle DBA with one significant advantage for managing workloads: it extends DBRM's capabilities to include the coordination and prioritization of I/O bandwidth consumed between databases, and between different users and classes of work. This is only possible with Oracle and is the direct result of the tight integration between the database and the storage layer. Exadata is aware of what types of work are running and how much I/O bandwidth they consume. Users can therefore have the Exadata system identify various types of workloads, assign priority to these workloads, and ensure the most critical workloads get priority.

To support a data warehouse cloud strategy that covers both pure data warehousing and mixed-workload environments, you may want to ensure that different users and tasks within a database are allocated the correct relative amount of I/O resources. For example, you may want to allocate 70% of I/O resources to interactive users on the system and 30% of I/O resources to batch reporting jobs. This is simply not possible, or at best extremely complex to achieve, with other vendors' databases. With Oracle, this is simple to set up and enforce using the DBRM and the I/O resource management capabilities of Exadata storage.
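
A 70/30 split like the one above can be expressed directly as a DBRM resource plan. The sketch below uses the DBMS_RESOURCE_MANAGER package with hypothetical plan and consumer-group names (sessions would still need to be mapped to the groups, which is omitted here):

```sql
BEGIN
  DBMS_RESOURCE_MANAGER.CREATE_PENDING_AREA();

  -- Two consumer groups for the two classes of work
  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP('INTERACTIVE',   'Interactive users');
  DBMS_RESOURCE_MANAGER.CREATE_CONSUMER_GROUP('BATCH_REPORTS', 'Batch reporting jobs');

  DBMS_RESOURCE_MANAGER.CREATE_PLAN('DW_PLAN', 'Warehouse mixed-workload plan');

  -- 70% to interactive users, 30% to batch reporting at the top level
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan => 'DW_PLAN', group_or_subplan => 'INTERACTIVE',
    comment => '70% of resources', mgmt_p1 => 70);
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan => 'DW_PLAN', group_or_subplan => 'BATCH_REPORTS',
    comment => '30% of resources', mgmt_p1 => 30);

  -- Every plan must include a directive for OTHER_GROUPS
  DBMS_RESOURCE_MANAGER.CREATE_PLAN_DIRECTIVE(
    plan => 'DW_PLAN', group_or_subplan => 'OTHER_GROUPS',
    comment => 'everything else', mgmt_p2 => 100);

  DBMS_RESOURCE_MANAGER.VALIDATE_PENDING_AREA();
  DBMS_RESOURCE_MANAGER.SUBMIT_PENDING_AREA();
END;
/
```

On Exadata, the same consumer groups are what the storage cells use to prioritize I/O bandwidth, so one plan governs both CPU and I/O.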

With Oracle Database 11g you get an integrated and complete software platform to support a cloud strategy.

In this model Oracle Database provides an intelligent cloud, or iCloud, compared to the more traditional "dumb" cloud which is being heavily promoted by many of the current data warehouse vendors as they rush to prove their cloud credentials. Oracle offers "iCloud" as the way forward for your data warehouse strategy, which really is the only way forward.

To read about the Oracle Cloud Reference Architecture please follow this link to the Oracle Data Warehouse Reference whitepaper:

Oracle Data Warehouse Reference White paper

and another compelling white paper from Rittman Mead, one of Oracle's important BI partners:

Real-time Data Warehousing by Stewart Bryson

Please contact DBA Consulting for more information on our BI solution for customers.

Website: DBA Consulting
Blog : http://drsalbertspijkers.blogspot.com/