And if your data warehouse is so big that Stephen Hawking has a theory about it, then you might really need to ask yourself questions about its efficiency.
An objection remover is a claim made during the sales cycle intended to counter a fear or concern that you have. Objection removers occupy a gray zone in between legitimate benefits and outright misrepresentations. While an objection remover may be “technically” true, it’s a distraction intended to make you close the door on your concern and move forward in the sales process without careful thinking.
Have you ever wondered why companies buy software licenses from BI vendors for hundreds or even thousands of “seats” and only use a small fraction of them? Or why business people continue to use Microsoft Excel as their primary BI product even though their IT group selected a different BI tool and may even want to ban Excel? And why is it that IT groups boast about the corporate data warehouse’s terabytes of data while business users lament their lack of data?
The underlying answer to questions like these is that business and IT groups have “relationship issues.” Maybe they need couples therapy.
Now, however, the industry has reached a point where a new means for integrating, animating and publishing corporate information is needed. Fortunately, the solution is emerging via a technology we are all comfortable with for other purposes such as entertainment, e-commerce and “surfing” – the Web browser.
Ad-hoc? Managed? Secured?
How can the CEO and CFO be assured of the integrity of the information they must attest to for SOX, for example, and that the corporation is keeping accurate and complete records? Board members and executives should be asking the chief audit executive (CAE) the following questions:
Let’s see – record every record accessed by every “person.” A simple table scan of the subsidiary ledger for a SUM(YTD_ACTUAL) will generate 200,000 access records. If I run it 10 times in a day (refining the report), I generate 2 million access records. Multiply that by 100 users.
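The arithmetic above can be checked with a quick back-of-the-envelope sketch (the figures are the ones assumed in the scenario: a 200,000-row ledger, 10 report refinements per day, 100 users):

```python
# Back-of-the-envelope volume for per-record access auditing.
rows_per_scan = 200_000      # records touched by one SUM(YTD_ACTUAL) table scan
scans_per_user_per_day = 10  # report refinements per user per day
users = 100

audit_rows_per_user = rows_per_scan * scans_per_user_per_day
audit_rows_per_day = audit_rows_per_user * users

print(audit_rows_per_user)  # 2000000 access records for one user
print(audit_rows_per_day)   # 200000000 access records per day, firm-wide
```

Two hundred million audit rows per day, before anyone audits access to the audit table itself.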
I’m not sure that the statement (question) “Capture data access, automatically tracking whenever data is modified or viewed by any means” is ever fully considered from the unit-record-level reality. Clearly one cannot track access to every record by every “user,” since access to the tracking records is itself recursive and will immediately melt down the process.
Service-orientation describes a new method for architecting connected systems, and is based upon three simple concepts:
A service is a program that other programs interact with using messages.
A client is a program that makes services usable to people.
A connected system is a collection of inter-connected services and clients.
Instead of integrating disparate applications via direct object activations, as in distributed object systems, applications expose a set of “services” that other applications can utilize. With service-orientation, applications running on different platforms and built with different development tools can fully interoperate. In addition, application developers and system integrators do not require specific knowledge of the underlying object models, type systems and protocols of the software being integrated. Instead, all applications expose their services using standards-based protocols and policy definitions that can be dynamically acquired. This “uncoupling” of the applications that comprise connected systems enables simpler integration, more flexibility in adapting systems to change over time, and more reliable operations.
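The three definitions above can be sketched in a few lines of plain Python. This is only an illustration of the message-based contract, not any particular SOA stack; the names (`quote_service`, `client_lookup`) and the JSON message shapes are invented for the example. The point is that the service and the client share nothing but a message format: neither touches the other’s internal objects.

```python
import json

def quote_service(request_json: str) -> str:
    """A 'service': a program other programs interact with using messages."""
    request = json.loads(request_json)
    # Internal logic stays hidden behind the message contract.
    price = {"WIDGET": 9.99, "GADGET": 24.50}.get(request["sku"])
    return json.dumps({"sku": request["sku"], "price": price})

def client_lookup(sku: str) -> float:
    """A 'client': makes the service usable by exchanging messages only."""
    response = json.loads(quote_service(json.dumps({"sku": sku})))
    return response["price"]

print(client_lookup("WIDGET"))  # 9.99
```

Swapping the in-process call for HTTP, a queue, or any other transport changes nothing in either party’s logic, which is the “uncoupling” the paragraph describes.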
Is this the next technology framework change for the system that is yet to be complete?
How do we cope with building new systems that take years to develop and deploy when the whole infrastructure changes before the first “complete” product can be built?
Religious wars. We have them in IT all the time. Everyone debates architecture. The “high priests” of our trade guard it. The approaches espoused by Ralph Kimball and Bill Inmon are at the center of one of the biggest data management religious battles.
At my job, data quality has suddenly become an issue. For years we have ignored this question, even though we knew that our data quality was poor and resulted in less than optimal customer service.
Summary: In Part 1 of this article, Haughey established some basic and essential definitions for analytical modeling. In this second part, he will address a number of issues critical to analytical modeling of real applications. Along the way, he will discuss how they are handled in logical modeling, physical design and dimensional design.
Summary: This article reexamines the concept of data design for analytical systems such as the data warehouse. It takes a close look at dimensional modeling and defines its proper role and context. It also positions ER modeling, dimensional modeling and other forms of modeling within a general framework.