InfoWorld: Identity’s federated future | September 03, 2004 | By Neil McAllister

That’s a federated system in action. Out of mutual self-interest, using simple authentication at the point of transaction, participating banks have agreed to trust one another to supply funds from their respective vaults. The banks remain separate entities, but the flow of transactions is shared, creating a federated network.
Forrester Research: The ETL Tool Market is Back and Growing | DM Review | Industry Led, Industry Read
With IT budgets reduced and frozen in 2002, spending on extraction, transformation and load (ETL) tools grew a paltry 3 percent over 2001. When budgets thawed in 2003, spending shot up 17 percent due to pent-up demand. These extremes do not accurately represent the ETL market’s growth potential, so Forrester forecasts a moderate growth rate of 10 percent for 2004-2006.
The Publishing Requirements for Industry Standard Metadata (PRISM) specification defines an XML metadata vocabulary for managing, aggregating, post-processing, and multi-purposing magazine, news, catalog, book, and mainstream journal content. PRISM recommends the use of certain existing standards, such as XML, RDF, the Dublin Core, and various ISO specifications for locations, languages, and date/time formats. In addition, PRISM provides a framework for the interchange and preservation of content and metadata, a collection of elements to describe that content, and a set of controlled vocabularies listing the values for those elements.
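To make that concrete, here is a minimal sketch of building a PRISM-style description in Python with the standard library. The RDF, Dublin Core, and PRISM namespace URIs follow the published specs, but the article record itself and the `describe_article` helper are my own illustration, not part of PRISM.

```python
# A toy PRISM-flavored RDF description of one magazine article.
# Namespace URIs are from the RDF, Dublin Core, and PRISM 1.2 specs;
# the helper function and sample data are illustrative only.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"
PRISM = "http://prismstandard.org/namespaces/1.2/basic/"

def describe_article(title, creator, pub_name, cover_date):
    """Build an rdf:RDF tree describing a single article."""
    root = ET.Element(f"{{{RDF}}}RDF")
    desc = ET.SubElement(root, f"{{{RDF}}}Description")
    ET.SubElement(desc, f"{{{DC}}}title").text = title
    ET.SubElement(desc, f"{{{DC}}}creator").text = creator
    ET.SubElement(desc, f"{{{PRISM}}}publicationName").text = pub_name
    ET.SubElement(desc, f"{{{PRISM}}}coverDate").text = cover_date  # ISO 8601 date
    return root

rdf = describe_article("Identity's federated future", "Neil McAllister",
                       "InfoWorld", "2004-09-03")
print(ET.tostring(rdf, encoding="unicode"))
```

The point of the controlled vocabularies is that an aggregator receiving this fragment can dispatch on `prism:publicationName` or `prism:coverDate` without guessing at ad hoc field names.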
Study: Unpatched PCs compromised in 20 minutes | CNET News.com
According to the researchers, an unpatched Windows PC connected to the Internet lasts, on average, only about 20 minutes before it’s compromised by malware.
DM Review – Business Dimensional Modeling: Back to the Future
As I reflect on the future of business intelligence, I cannot help but think of what we have accomplished so far. Over the past two decades, there have been incredible changes in the business climate and amazing advancements in technology. As an industry, we have accumulated extensive experience, much of which is available through a variety of publications. Notable advancements include:
- Improved overall reliability of the entire environment.
- Ability to effectively build, administer and query very large and complex analytical databases where it is not uncommon to have multiterabyte dimensional data marts.
- Development of a thriving tools business for ETL and data management technologies.
- Emergence of metadata standards that are beginning to be adopted.
- Harnessing of the Internet for cost-effective delivery to many more people.
- Increase in the number of well-defined architectures and methodologies to discuss, debate and provide guidance.
Software That Lasts 200 Years
Many things in society are long-term
In many human endeavors, we create infrastructure to support our lives, which we then rely upon for a long period of time. We have always built shelter. Throughout most of recorded history, building or buying a home was a major starting step to growing up. This building would be maintained and used after that, often for the remainder of the builder’s life span and in many instances beyond. Components would be replaced as they wore out, and the design often took the wear and tear of normal living into account. As needs changed, the house might be modified. In general, though, you thought of a house as having changes measured in decades.
OpenVMS: An Old Dog Still Doing New Tricks
Thought by many to be long since dead and buried, the OpenVMS operating system persists inside many enterprises.
Reality IT: Data Mining – If Only It Really Were about Beers and Diapers | DM Review
At my job, we use data mining tools in order to figure out what the heck is really going on. Data mining has been around for quite some time now. About 10 years ago it was even considered by many BI vendors to be the “next big thing” after ad hoc querying and OLAP tools.
Data Integration: The Common Problem – Working with Merge/Purge and Household | DM Review
Of all the issues related to accumulating and identifying unique data during a data warehouse implementation, perhaps the single most difficult to control is data quality. And perhaps the hardest task once the data is cleansed and deemed “good” is merge/purge, or householding.
Householding is new to me. Must have been ignoring the business side of things for too long.
It certainly reads like he has a product to sell.
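For anyone else new to householding: the core idea is just grouping cleansed customer records that plausibly belong to the same household. Here is a toy Python sketch under that assumption; the field names and the exact-match-on-normalized-fields rule are my own simplification, not any vendor’s merge/purge logic, which would add fuzzy matching, survivorship rules, and suppression lists.

```python
# A toy householding pass: group customer rows that share a normalized
# surname and street address. Field names and the matching rule are
# illustrative; real merge/purge engines use fuzzy matching and scoring.
from collections import defaultdict

def normalize(s):
    """Crude canonical form: uppercase, drop punctuation, collapse whitespace."""
    kept = "".join(ch for ch in s.upper() if ch.isalnum() or ch.isspace())
    return " ".join(kept.split())

def household(records):
    """Return households keyed by (normalized surname, normalized address)."""
    groups = defaultdict(list)
    for rec in records:
        key = (normalize(rec["last_name"]), normalize(rec["address"]))
        groups[key].append(rec)
    return dict(groups)

customers = [
    {"last_name": "Smith", "address": "12 Oak St."},
    {"last_name": "SMITH", "address": "12 Oak St"},
    {"last_name": "Jones", "address": "9 Elm Ave."},
]
homes = household(customers)
print(len(homes))  # the two Smith rows collapse into one household
```

Even this crude version shows why the task is hard: every normalization choice (punctuation, abbreviations like “St.” vs “Street”) changes which records merge, and a wrong merge silently purges a real customer.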
The death of data warehousing
by Michael M Carter
March 11th, 2004
In the post-Enron economy, businesses need a better-faster-cheaper way to get at data and turn it into intelligence. The developing regulatory environment will require new levels of openness and transparency for publicly traded corporations of all shapes and sizes.