(For an update on Oracle 12c databases, please refer to the following article: Oracle 12c In-Memory Database is Out - Hardly Anybody Notices.)
Contemporary large servers are routinely configured with 2 TB of RAM. It is thus possible to fit an entire average-size OLTP database in memory, directly accessible by the CPU. There is a long history of academic research on how to best utilize relatively abundant computer memory, and this research is becoming increasingly relevant as databases serving business applications head towards memory-centric design and implementation.
If you simply place the Oracle RDBMS's files on solid-state disk, or configure a buffer cache (SGA) large enough to contain the whole database, Oracle will not magically become an in-memory database (IMDB), nor will it perform much faster. In order to properly utilize memory, IMDBs require purposely architected, confi...
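Oracle's purpose-built answer in 12c is the In-Memory Column Store option, which must be explicitly sized and enabled rather than inherited from a large SGA. As a minimal sketch, assuming a reachable 12c+ instance, a suitably privileged account and a hypothetical SALES table (the connection details below are placeholders), enabling it from Python with the python-oracledb driver might look like this:

```python
# Minimal sketch: marking one table for Oracle's In-Memory Column Store.
# Assumes a reachable Oracle 12c+ instance, suitable privileges and a
# hypothetical SALES table; user, password and DSN are placeholders.
import oracledb  # the python-oracledb driver

conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb")
cur = conn.cursor()

# INMEMORY_SIZE must already carve memory out of the SGA for the IM store,
# e.g. a DBA runs: ALTER SYSTEM SET INMEMORY_SIZE = 2G SCOPE=SPFILE; (restart).

# Flag the table for population into the in-memory column store.
cur.execute("ALTER TABLE sales INMEMORY PRIORITY HIGH")

# Check population status from the data dictionary.
cur.execute("SELECT segment_name, populate_status FROM v$im_segments")
for name, status in cur:
    print(name, status)

conn.close()
```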
(For the latest information on the Oracle 12c database update, please refer to the following article: Oracle 12c Database and How It Relates to SAP Hana.)
RDBMSs are stable and mature products. While there is nothing radically new on the horizon that would challenge Codd's relational theory and the related advances in data processing, there are some developments that force established vendors like Oracle to come up with new features and products.
Column Stores and Oracle
The column store concept has been around for quite a while. Vendors like HP Vertica grabbed some market share in the data warehousing seg...
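To make the idea concrete, here is a toy sketch in plain Python (not how Oracle or any vendor actually implements it) of why analytic scans favor a columnar layout: the aggregated column sits in one dense, cache-friendly array, while a row store drags every record's other columns along.

```python
# Toy illustration of row store vs. column store layout (not a real
# vendor implementation): an aggregate over one column touches far
# less data when values are stored contiguously per column.
rows = [  # row store: each record keeps all of its columns together
    {"order_id": 1, "region": "EU", "amount": 120.0},
    {"order_id": 2, "region": "US", "amount": 80.0},
    {"order_id": 3, "region": "EU", "amount": 95.5},
]

columns = {  # column store: one contiguous array per column
    "order_id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120.0, 80.0, 95.5],
}

# A row-store scan visits every record even though only 'amount' is needed.
total_row = sum(r["amount"] for r in rows)

# A column-store scan reads a single dense array; repetitive columns such
# as 'region' also compress well (e.g. via run-length encoding).
total_col = sum(columns["amount"])

assert total_row == total_col
print(total_col)
```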
The Oracle database is a relational database management system that mostly complies with the ACID transaction requirements (atomicity, consistency, isolation, durability). This means that each database transaction will be executed in a reliable, safe and integral manner. In order to comply with ACID, the Oracle database software implements a fairly complex and expensive (in terms of computing resources, i.e., CPU, disk and memory) set of mechanisms, such as redo and undo logging, memory latching and metadata maintenance, that make concurrent work possible while maintaining data integrity. Any databa...
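To see why durability in particular is expensive, here is a toy write-ahead (redo) logging sketch in Python. It illustrates only the principle behind redo logging, not Oracle's actual log format: the change record is forced to stable storage before the in-memory state changes, so a crash can be replayed.

```python
# Toy write-ahead (redo) logging sketch -- the principle behind a redo
# log, not Oracle's format. A change is forced to stable storage before
# the in-memory state is updated, so a crash can be replayed.
import json, os

LOG = "redo.log"
state = {}  # the in-memory "database"

def update(key, value):
    # 1. Append the redo record and fsync it -- this is the expensive,
    #    durability-critical step that ACID compliance pays for.
    with open(LOG, "a") as f:
        f.write(json.dumps({"key": key, "value": value}) + "\n")
        f.flush()
        os.fsync(f.fileno())
    # 2. Only then apply the change in memory.
    state[key] = value

def recover():
    # Rebuild the in-memory state by replaying the redo log after a crash.
    recovered = {}
    if os.path.exists(LOG):
        with open(LOG) as f:
            for line in f:
                rec = json.loads(line)
                recovered[rec["key"]] = rec["value"]
    return recovered

update("acct:1", 100)
update("acct:1", 250)
assert recover() == {"acct:1": 250}
```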
Many open source and proprietary products are available that can handle Big Data. Which one is the best fit for this task?
Today's classic RDBMSs and tools are able to quickly load the data, process it and present results in an easy-to-understand format. You can use SQL or a programmatic interface to process the data randomly or in batch, and RDBMSs keep data safe, protected against hardware and software failures.
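As a small illustration of that load-and-query workflow, here is a sketch using Python's built-in sqlite3 module to stand in for any classic RDBMS; the table and data are invented for the example.

```python
# Sketch of the classic load-then-query workflow, with Python's built-in
# sqlite3 standing in for any traditional RDBMS; table and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)

# Bulk load (batch processing).
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EU", 120.0), ("US", 80.0), ("EU", 95.5)],
)

# Declarative SQL turns raw rows into an easy-to-read result.
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(f"{region}: {total:.2f}")

conn.close()
```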
Standard tools and products are not able to cope with Big Data requirements, which are not dissimilar to what is involved in processing today's regular data sets, jus...
Amazon Web Services' EC2 cloud is a full-scale public data center offering services that are in many aspects far ahead of the ancient practices present in regular IT environments. Fast provisioning and virtually unlimited scale make old-fashioned server procurement and installation look like what they are: past century's practices. This article will not deal with the usual objections to cloud computing, revolving around change management difficulties (how to incorporate the new environment into the existing IT infrastructure), security, reliability, performance, etc., since we think that all t...
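To make "fast provisioning" concrete, here is a hedged sketch that launches an EC2 instance with the boto3 SDK. The AMI ID, key pair name and region are placeholders, and real use would also need AWS credentials and, typically, a security group.

```python
# Minimal sketch of "fast provisioning" on EC2 using boto3; the AMI ID,
# key pair and region below are placeholders, not working values.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t2.micro",
    KeyName="my-keypair",             # placeholder key pair
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# Wait until the instance is running -- minutes, versus weeks of
# old-fashioned server procurement.
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
```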