Still holding, not selling: Dublin Core vs. Schema.org

I was recently asked why I use Dublin Core rather than Schema.org. What follows is my response:

The short version:

In this field, you tend to be cautious about new standards. Schema.org is now almost three years old; Dublin Core is, well, old enough to buy beer in some states.

For some context on just how many metadata standards there are, have a look at the Glossary of Metadata Standards (Indiana University). I hope I won't one day find Schema.org in that graveyard of possibilities.

The long version:

Often, the choice of standard depends on the purpose. Let me start by saying that we do not use Dublin Core in the expectation of it being implemented that way (although it would be nice); we primarily use it to keep our descriptive metadata schema architectures stable while we are designing them. Database architects are free to choose whatever standard they would like to build their systems on (if they use one at all — rich metadata hasn't really hit its prime yet). Most developers and DBAs know Dublin Core, so it's an easy pass-off.

Why do we (or even I) choose to architect with Dublin Core? Since its release, with the expected minor updates and expansions included, it is still the single most complete general (non-industry-specific) framework created. In general, it has proven reliable, relatable, effective, robust, scalable, extensible and simple. It is also ISO-canonized as ISO 15836 (and the ISO is kinda a big deal), and it is best for descriptive metadata of ALL kinds, not just for the web. There isn't a single digital asset that cannot be categorized and aligned to one of the 15 core elements (and their refinements). It also allows for the creation of a (flat) ontology relationship model. So for now it still remains the gold standard in metadata architectures.
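To make that concrete: one common, lightweight way Dublin Core shows up in practice is as HTML meta tags, using the long-standing `DC.` naming convention. A minimal sketch — the title, creator, date, and identifier values here are hypothetical, not from any real record:

```html
<head>
  <!-- Declare the Dublin Core element set being referenced -->
  <link rel="schema.DC" href="http://purl.org/dc/elements/1.1/" />

  <!-- A few of the 15 core descriptive elements -->
  <meta name="DC.title" content="Still Holding, Not Selling" />
  <meta name="DC.creator" content="Jane Archivist" />
  <meta name="DC.date" content="2014-05-01" />
  <meta name="DC.format" content="text/html" />
  <meta name="DC.identifier" content="http://example.com/posts/dc-vs-schema" />
</head>
```

The same element names work just as well as column or field names in a database schema, which is part of why the framework travels so easily between catalogers and developers.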

Rich metadata, like I've mentioned, hasn't really hit its prime yet. There is lots of ongoing debate in the metadata community about coming into alignment on many of these standards (there's more than just Schema.org out there — everyone is screaming DITA now, too). Until that happens and Schema.org, or whichever new kid on the block, is more widely adopted, Dublin Core remains the most recognized and widely used standard among metadata specialists across all domains of industry.

I must acknowledge Marko Hurst’s inspired contributions to this fine response. He continues to be my spiritual metadata leader.

Some more thoughts:
Much like Beta vs. VHS, it's now becoming RDFa vs. microdata. RDFa is the more flexible of the two: you're not locked into a single vocabulary the way you are with microdata. But being the best doesn't mean it won't suffer the same fate as Betamax. There's more on this topic in a great post by Manu Sporny called The False Choice of Schema.org.
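To show what that lock-in difference looks like, here is the same hypothetical article marked up both ways. Microdata ties the whole item to one `itemtype` vocabulary; RDFa 1.1 lets you declare prefixes and mix vocabularies — here, Schema.org plus a Dublin Core term — in the same markup:

```html
<!-- Microdata: the item is bound to the Schema.org vocabulary -->
<div itemscope itemtype="http://schema.org/Article">
  <span itemprop="name">Still Holding, Not Selling</span>
  <span itemprop="author">Jane Archivist</span>
</div>

<!-- RDFa: vocabularies can be mixed via prefixes -->
<div vocab="http://schema.org/" typeof="Article"
     prefix="dc: http://purl.org/dc/terms/">
  <span property="name">Still Holding, Not Selling</span>
  <span property="dc:creator">Jane Archivist</span>
</div>
```

Both snippets describe the same thing, but only the RDFa version can pull in Dublin Core terms alongside Schema.org ones without leaving the model.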


The Role of Digital Art History: Report by Samuel H. Kress Foundation

A colleague just pointed me in the direction of a new report out on Art History’s transition (or lack thereof) to digital, written by Diane M. Zorich, for the Samuel H. Kress Foundation. Her report seeks to gain a deeper understanding of why art historians feel such “ambivalence” toward digital art history.

Zorich covers everything from the limiting infrastructure of these domains to the impact that the "new" digital publishing model is having on the age-old "publish or perish" model. This shift to digital is one that art historians have been very reluctant to embrace (and in some cases have even been hostile toward). Digital is just a "trend" after all, right?
