Wednesday, August 13, 2008

Is the Semantic Web Biased?

Someone getting in the way of Tim finally achieving the semantic web.
Will The Semantic Web have a Gender?

But seriously, Corinna Bath has great points about how the attempt to 'simplify' the world into semantic goodness effectively disenfranchises those not directly involved (developing nations, alternate cultures, women, art majors ;) ).

Also, she has very good points about the accuracy, subjectivity, fuzziness and temporal relevance of knowledge (details about a celebrity indiscretion have one meaning for those reading about the event the next day and an entirely different one two years later in a war zone).

It is very important that our tools do not close out possibilities, ideas and, more importantly, individuals from our society, as our tech tools increasingly become the place we live our entire lives.

From the interview with Corinna Bath at the Semantic Web Company

Another class of gendering processes is based on gender stereotypes and the existing division of labour. In these cases, binary assumptions about women and men are not reflected or the (gender) politics of domain is ignored. Thus, the existing structural-symbolic gender order is inscribed into computational artefacts and will be reproduced by use. Furthermore, gendering is enforced by de-contextualisation, naturalization, the use of dichotomies and naïve realism. By abstraction, classification and formalization the two problems of incorporating diversity (or differences) and rupturing the reproduction of gender inequality in artefacts shift to the level of epistemology and ontology on which technological concepts are based.


Whew. I think I parsed that correctly; I had to verify my understanding of some of the big words, and I provided handy references for the lazy.

Friday, March 28, 2008

ROTFLMHO: Science Fiction Mavens Offer Far Out Homeland Security Advice

I can only assume that there was a bit too much 'liquid refreshment' available at this panel discussion. Science Fiction Mavens Offer Far Out Homeland Security Advice


My favorite Sci-Fi writers obviously went a bit overboard here, but I am sure they had a bit of a laugh.


I can't believe Niven made that comment out loud; I won't even repeat it here.


Actually it is a bit of a shame, because these guys do have a lot to offer the security apparatus, just not at this panel discussion.


I should read David Brin's The Transparent Society. From what I have heard, it is very hard to disagree with his basic premise: whatever we wish to happen, we all ARE going to be monitored every second of every day, so how do we manage our (non-existent) privacy in that context?


Thanks to Bruce Schneier for pointing this out to me here.

Monday, December 17, 2007

Why is Javascript almost Mandatory?

My Comment on Groklaw

I just posted this on Groklaw in response to PJ asking why Javascript was necessary for the Web. These are my thoughts on the matter: not right, not wrong, but mine.

Javascript might not be mandatory in some parts of the internet, but it allows web pages to leap out of the bad GUI design of the previous generation.

Web sites characterised by page after page of detailed content, which do not change from refresh to refresh, turn users off. The guts of an easy-to-use site is a FAST, responsive interface, which can be easily done with Javascript (not impossible with other tools, and easy for Flash and other rich clients). There is a reason the early internet had such dismal 'stickiness': without a good interface, the content had to be sooooo much more attractive to keep the audience. Site owners might be able to mitigate responsiveness issues with hardware/software/bandwidth/content delivery networks, but cannot fix them without active code being run on the client.

NOTE: Geeklog (and blogs generally) allows you and the reader to optimise pages into very large 'chunks' of comments; the only disconnect comes when we feel the need to comment, as we lose the context of our comment when posting.

NOTE2: Successful sites almost universally have trivial-to-use GUIs with very little overhead, and that is not an accident.
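The "active code on the client" point can be sketched in a few lines. This is a minimal, hypothetical example (the `/comments/latest` endpoint and the `comments` element id are assumptions, not from any real site) showing the era-appropriate XMLHttpRequest pattern: patch one region of the page instead of reloading everything.

```javascript
// Pure helper: turn a list of comment objects into an HTML fragment.
function renderComments(comments) {
  return comments
    .map(function (c) { return '<li>' + c.author + ': ' + c.text + '</li>'; })
    .join('');
}

// Browser-only part: fetch the latest comments and patch the list in
// place, so the rest of the page never reloads (the FAST interface the
// post is talking about).
function refreshComments() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/comments/latest');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      var comments = JSON.parse(xhr.responseText);
      document.getElementById('comments').innerHTML = renderComments(comments);
    }
  };
  xhr.send();
}
```

The same effect is simply not achievable with static pages alone: the server can make each full page cheaper to deliver, but only client-side code can avoid the full-page round trip.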

Thursday, November 15, 2007

If Social Networking Sites *Really* Wanted to Interoperate

This is funny with all the hoo-hah around OpenSocial, Facebook, walled gardens and so on. O'Reilly really has great content, nay the best on the internet.
If Social Networking Sites *Really* Wanted to Interoperate

Tuesday, November 13, 2007

SOA vs Distributed Objects and performance

I don’t think this article goes into enough detail, but the author delves into why object systems (such as EJB) have required architectural additions to simplify and control the performance problems, e.g. EJB session beans to optimise and control access to EJB entity beans.

http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=507

Also, I think he is onto an important point: a service-oriented architecture is more suited to optimisation on a network because it makes the network nature, and the trade-offs, of a remote call intuitable (entirely my interpretation of his words).

Also, if the caller of a service is requesting data, then a large graph of data can be moved entirely into the caller's context for rapid use, managing the overhead and failure possibilities once rather than hundreds of times (literally). This is exactly why SQL over SQL*Net can be easily optimised for network traffic.
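A toy latency model makes the "once, not hundreds of times" point concrete. This is an illustrative sketch only; the numbers (50 ms round-trip latency, 1 ms per item, 100 items) are made up, not measured from any real system.

```javascript
// Toy cost model: total time = round trips * per-call latency
//                            + items * per-item processing time.
function totalCost(callCount, perCallLatencyMs, perItemMs, items) {
  return callCount * perCallLatencyMs + items * perItemMs;
}

// Fine-grained (distributed-object style): one remote call per item.
var fineGrained = totalCost(100, 50, 1, 100);   // 100 round trips

// Coarse-grained (SOA style): one call returns the whole graph of 100 items.
var coarseGrained = totalCost(1, 50, 1, 100);   // 1 round trip

// fineGrained = 5100 ms, coarseGrained = 150 ms
```

The per-call network latency dominates, which is why paying it once for a whole graph of data wins so decisively over paying it per object.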

When looking at a problem in a SOA way, it seems that you naturally evolve a more optimal way to use the network. I don’t have enough experience with SOA to know all the pitfalls of that orientation, though there will definitely be some: marshalling/demarshalling overhead, lack of data locking, brittleness of implementation due to change, cascading hidden service dependencies, and service versioning issues.

When I went to an IBM seminar on SOA, it was quite illuminating when they talked about a SOA ‘call’ with hundreds of parameter data elements. I actually thought it was rubbish to design a call with so many parameters, but I really don’t know enough to comment on the design of these systems. It might indeed be ‘good’(tm) to create such a call when the service is ‘state-free’, i.e. implements an algorithm that acts only on parameter-supplied data. But a service/interface with so many parameters is sure to be volatile, and that volatility could affect callers of the service.
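One common way to temper that volatility is to pass a single named-field request object instead of hundreds of positional parameters. A minimal sketch, with every name hypothetical (the `priceQuote` service and its fields are invented for illustration, not from the IBM seminar):

```javascript
// A 'state-free' service taking one request object rather than a long
// positional parameter list. Old callers keep working when new optional
// fields are added, because defaults cover the fields they omit.
function priceQuote(request) {
  var currency = request.currency || 'USD';     // newer optional field
  var discount = request.discountRate || 0;     // newer optional field
  return {
    currency: currency,
    total: request.unitPrice * request.quantity * (1 - discount)
  };
}

// An older caller that predates the discountRate/currency fields:
var quote = priceQuote({ unitPrice: 10, quantity: 3 });
// quote.total === 30, quote.currency === 'USD'
```

This does not eliminate the coupling, but it turns "any new parameter breaks every caller" into "new optional fields break no one", which is about the best a wide, stateless interface can hope for.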

What do other people think? Am I off the mark?

Monday, November 5, 2007

Celebrate because Movember is upon us

The month formerly known as November has just got a facelift.
Sponsor my Mo for Prostate Cancer: Movember donation.

Tuesday, October 9, 2007

Agile Indian vs Mainstreet Cowboy

I feel the need to distinguish the “Agile Indian” from the “Mainstreet Cowboy”.

Agile is an interesting and powerful set of methodologies, but I hate to see distinct terms collapsed into one, especially to the detriment of Agile. I also hate to compete inside my organisation against Cowboy coding masquerading as Agile.

Let us start by distinguishing “Cowboy” coding.


  • Just leap into the project and start programming as fast as you can.
  • No thought to testing each function.
  • The emergent behaviors and corner cases in the product may not be considered up-front, only when they are proven to be a problem do they get addressed.
  • Whack a Mole style development, just hit the problem in front of you.

Then distinguishing “Agile” programming.

A set of methods that allow the following results.

Small / Quick / Short feedback loops:

Allowing any mistake or wrong direction to be quickly and cheaply rectified.

Techniques: continuous testing, client on site, fast iterations, small teams

Optimal process control:

Any process that is not required increases the cost, reduces morale, increases risk.

Techniques: no quotes; stories, not endless/perfect a priori requirements; optimal metrics; one methodology per project.

I hope this makes sense, comments are encouraged.