Gartner includes Semantics in its list of the top 10 disruptive technologies for the next four years, as item number 10. Some commenters on the linked blog say it should rank higher (assuming the list is ordered), while dismissing virtualization and multicore as "boring trends".
It should be no surprise that I think semantics should be taken off the list. It's not a big thing for the next four years, if it's a big thing at all. Multicore and virtualization are riding an exponential curve, which is the only real way to be disruptive. I believe semantics is not only difficult, but also linear. Thus, our progress in the semantic space will be far outpaced by exponential trends. I do believe we'll make progress, but my prediction is that it will come from brute force, aided by Moore's law over time.
Multicore isn't exciting per se, but the disruption it will drive in the software space is already visible. Concurrency is already huge if you look in the right places, and its increasing ubiquity will start to sink in very soon.
May 31, 2008
May 18, 2008
Interoperability is Hard
True interoperability between independent software products is hard. There are multiple levels on which you need to guarantee compatibility (read: agreement among all parties) in order for any deep interoperability to work correctly, including: transport, schema, semantics, ownership, and identity.
A lot of so-called interoperability works fairly well by severely limiting one or more of these aspects (most often identity and ownership), which is fine. However, most of the conversations I hear tend to revolve around the transport, paying minimal attention to schema, and almost none to semantics.
I think this is because 95% of your implementation time tends to get eaten up by the transport, which fools you into thinking it's the hardest part. In fact, semantics is often the hardest part, but that work is all done in design, which can easily get ignored. The problem is that a transport defect can be found and fixed in one product, whereas a semantic defect usually requires a schema change, which affects all products (and therefore usually doesn't get fixed, leading to poor interoperability).
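A toy sketch of the distinction, with hypothetical services and field names of my own invention: both sides can agree perfectly on transport and schema while still disagreeing on what the data means.

```python
# Hypothetical example: two services agree on a schema ({"amount": <number>})
# but not on semantics -- the sender encodes cents, the receiver assumes dollars.

def service_a_serialize(price_dollars: float) -> dict:
    # Service A's internal convention: amounts are integers in cents.
    return {"amount": round(price_dollars * 100)}

def service_b_parse(message: dict) -> float:
    # Service B assumes the amount is already in dollars.
    return float(message["amount"])

msg = service_a_serialize(19.99)       # schema-valid message: {"amount": 1999}
assert msg["amount"] == 1999           # transport and schema both "work"
assert service_b_parse(msg) != 19.99   # but the meaning was silently lost
```

No validator catches this: the message is well-formed at every layer below semantics, which is why the defect only surfaces once both products are deployed, and why fixing it means renegotiating the schema with everyone.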
Therefore, good interoperability requires that the meaning of your data be understood and agreed upon by all parties before you settle on the schema. This turns out to be quite hard to do; it's difficult even when you control all the moving pieces.
I'd like to think there's a way to decouple parts of interoperability to be able to iterate on standards after they are entrenched, but I haven't found it yet.