Sometimes, you’re on a team, and you’re busy banging out the code, and somebody comes up to your desk, coffee mug in hand, and starts rattling on about how if you use multi-threaded COM apartments, your app will be 34% sparklier, and it’s not even that hard, because he’s written a bunch of templates, and all you have to do is multiply-inherit from 17 of his templates, each taking an average of 4 arguments, and you barely even have to write the body of the function. It’s just a gigantic list of multiple inheritance from different classes and hey, presto, multi-apartment threaded COM.
This page contains a selection of funny database videos: humor for DBAs and database developers.
As information management gains acceptance, good governance becomes even more critical to the whole process of retrieving, acquiring, organizing, and maintaining information. The crucial factor in information and decision process analysis is an improved design-thinking attitude. Only when decision-makers use a good process and methodology for making decisions under constrained circumstances can people’s information management needs and desires be made technologically feasible. Because all information is ultimately managed by individuals, conflicts between facts and myths arise wherever there is human intervention. I have come across the following eight myths in information management.
In the world of Business Intelligence, Excel is the devil and BI tools are the savior. Spreadsheets are a satanic element we’re trying to drive from unrepentant departments. This is because centralized data is good and distributed data is bad.
I *love* the term “spreadmarts”.
Myth: LDAP is a directory.
Truth: LDAP is an access protocol.
How many times have you heard (or even used) the term “LDAP directory?” It seems that the words “LDAP” and “directory” have been used together so often that they have essentially become synonymous, leading to some unconscious misstatements, which can lead to more important mistakes.
In truth, LDAP is an access protocol, as the AP in its name clearly states – not a directory. In light of this fact, the frequently repeated reference to “storing data in LDAP” seems rather nonsensical. After all, you can’t store data in a protocol, right? No one says, “Let’s store the data in TCP/IP” because they know that TCP/IP is a protocol that specifies the format of data transmissions over a network – not a physical location for holding data. Ditto for LDAP. For some reason, though, the incongruity of such a statement doesn’t seem to register when framed in the context of directory data.
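One way to make the protocol-versus-store distinction concrete: LDAP defines the syntax of what goes over the wire, such as search filters (RFC 4515), while the data itself lives in whatever backend the directory server (OpenLDAP, Active Directory, etc.) uses. A minimal sketch in pure Python, with a hypothetical helper name, showing the protocol-level escaping rules:

```python
def escape_filter_value(value: str) -> str:
    """Escape special characters in an LDAP search-filter value (RFC 4515).

    Hypothetical helper for illustration; real LDAP client libraries
    provide their own escaping functions.
    """
    escapes = {'\\': r'\5c', '*': r'\2a', '(': r'\28', ')': r'\29', '\x00': r'\00'}
    return ''.join(escapes.get(ch, ch) for ch in value)

# This filter string is what would be sent over the wire to a directory
# server; the server's backend -- not "LDAP" -- is where the data lives.
filter_str = "(cn=%s)" % escape_filter_value("Smith (admin)")
print(filter_str)  # (cn=Smith \28admin\29)
```

The point of the exercise: everything above is wire format. Saying the data is “stored in LDAP” is like saying a web page is “stored in HTTP.”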
The rest of the “Myth/Truth” exposition is equally rewarding.
It just isn’t easy or simple or inexpensive.
The extract-transform-load (ETL) system, or more informally, the “back room,” is often estimated to consume 70 percent of the time and effort of building a data warehouse. But there hasn’t been enough careful thinking about just why the ETL system is so complex and resource intensive. Everyone understands the three letters: You get the data out of its original source location (E), you do something to it (T), and then you load it (L) into a final set of tables for the users to query.
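The three letters can be sketched in a few lines, which is exactly why the 70 percent figure surprises people. The following is a deliberately toy example (CSV source and SQLite target are my assumptions, not anything from the original); the real complexity hides in data quality, volume, and change management, none of which appear here:

```python
import csv
import io
import sqlite3

# A toy ETL pipeline. The source data and table name are invented for
# illustration only.
raw = "id,name,amount\n1, Alice ,10.5\n2,Bob,3\n"

# E: extract rows from the original source
rows = list(csv.DictReader(io.StringIO(raw)))

# T: transform -- trim stray whitespace, cast strings to proper types
clean = [(int(r["id"]), r["name"].strip(), float(r["amount"])) for r in rows]

# L: load into a final table for users to query
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER, name TEXT, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 13.5
```

Everyone can write the version above in an afternoon; the back room consumes 70 percent of the project because production sources are dirty, huge, and constantly changing.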
Analysis is overrated.
It’s worth reading about why.
Nearly every business intelligence (BI) initiative has struggled with the issue of preserving the experience gained. Crucial information is uncovered, fundamental decisions are made, tradeoffs are negotiated, and most of it is lost before the first major enhancement.
If you have to continually revisit design decisions or relearn technology or even the business issues, your ability to sustain continual improvement is severely compromised. Is this more critical in business intelligence?
How can the CEO and CFO be assured of the integrity of the information they must attest to for SOX, for example, and that the corporation is keeping accurate and complete records? Board members and executives should be asking the chief audit executive (CAE) the following questions:
Let’s see – record every record accessed by every “person”. A simple table scan of the subsidiary ledger for a SUM(YTD_ACTUAL) will generate 200,000 access records. If I do it 10 times in a day (refine the report) I generate 2 million access records. Multiply that by 100 users.
I’m not sure that the statement (question) “Capture data access, automatically tracking whenever data is modified or viewed by any means” is ever fully considered from the unit-record-level reality.
Clearly one cannot track access to every record by every “user” since access to the tracking records is recursive and will immediately melt down the process.
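The back-of-the-envelope arithmetic above is worth writing out, because the numbers escalate fast even before the recursive tracking-of-the-tracking problem kicks in:

```python
# Arithmetic from the subsidiary-ledger example: one table scan touches
# 200,000 records, the report is refined 10 times a day, and 100 users
# do the same thing.
rows_scanned = 200_000   # records read by one SUM(YTD_ACTUAL) table scan
runs_per_day = 10        # report refined ten times in a day
users = 100

per_user = rows_scanned * runs_per_day   # access records from one user
total_per_day = per_user * users         # across all users

print(per_user)        # 2000000
print(total_per_day)   # 200000000
```

Two hundred million audit rows per day, for one ledger and one query pattern – and auditing reads of the audit table itself would multiply that again.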
Commentary: Facts and Fables about Data Warehousing
Facts and Fables are introduced monthly in our Kimball Group Design Tips.