Martin’s Weblog

PIE and MASH: a Lens For a Semantic Web

We are only a few weeks into the year and it already seems clear that one of the major trends will be integration activity to “orchestrate” information sources – to create lenses to MASH and focus information for our Personal Information Environments.

Activity Stream integration

Social networks were a big factor in 2008, and social networkers were among the first off the blocks in 2009 to catch my attention, with a meeting on January 9th at the offices of Six Apart to discuss standards for activity streams. People belong to different social networks but cannot easily (if at all) communicate between these networks – solving this problem would be like the day when users on different email systems could finally email each other.

News Stream Integration

Another early set of activities that caught my attention was the discussion about RSS overload and the need to deal with it somehow. RSS is an essential tool for pulling information into your environment, but with the dramatic growth of the web even RSS has trouble coping. Michael Kowalchik describes how our feed readers, and our use of them, are based on the older email paradigm of inboxes and a must-read-all-items attitude. Kowalchik says that both feed readers and our attitudes to information need to change – “people will increasingly want to experience information, not be slaves to it”. Kowalchik also describes Dave Winer’s “River of News” concept, which informed many news aggregators including Grazr – “the name ‘grazr’ comes from grazing information, not drowning in it.”
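
To make the “river” idea concrete, here is a minimal sketch of a feed reader that presents a single time-ordered stream instead of an inbox of unread counts. It assumes the third-party Python feedparser library; the feed URLs are placeholders, not real sources.

```python
# A minimal "river of news" sketch: merge a few feeds into one
# time-ordered stream and keep only the most recent items, rather than
# tracking unread counts. Assumes the third-party feedparser library;
# the feed URLs are placeholders.
import time
import feedparser

FEEDS = [
    "https://example.com/feed-one.xml",
    "https://example.com/feed-two.xml",
]

def river(feed_urls, limit=20):
    items = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for entry in parsed.entries:
            published = entry.get("published_parsed") or time.gmtime(0)
            items.append((published, source, entry.get("title", "")))
    # Newest first; anything below the cut simply flows past, unread.
    items.sort(key=lambda item: item[0], reverse=True)
    return items[:limit]

if __name__ == "__main__":
    for published, source, title in river(FEEDS):
        print(time.strftime("%Y-%m-%d %H:%M", published), "|", source, "|", title)
```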

Activity and News Stream MASHING

Another item that caught my attention was the way on-line social media responded to the Hudson River plane crash. There have been many stories about news breaking first on social networks and about how the major news corporations make use of material from people’s camera phones, but what caught my attention this time was the way in which social media itself could offer coverage. Kevin Sablan’s Almighty Link used storytlr to gather feeds from Twitter, Flickr, YouTube and Vimeo to create an aggregated, real-time “story”. He describes how “the hard part was editing, or what Tim Windsor calls curating, the approximately 700 bits of information into some semblance of a disjointed story”. The result was “a stream of moments captured by individual storytellers, the ‘lifestream’ not of a person, but an event.” There was also a Hudsonplane FriendFeed room, which could be regarded as a “web 2.0 viralism mashup” equivalent of a newsroom for the event.

Beyond Google – The Real Time Web

Writing for RWW, Bernard Lunn uses the web 2.0 response to the Hudson plane crash to illustrate the way in which the web has moved from IBM (mainframe) to Microsoft (client-server) to Google (on-line) and is now moving beyond Google’s grasp and into real time. He argues that “It’s the Real-Time Web that will unseat Google. This idea has been percolating for a while, but it took a plane landing in the Hudson River to make it obvious. Google cannot be real-time. It indexes the historical web, and it does it better and faster than anyone else.”

PIE and MASH: a Lens For a Semantic Web

With all the activity and news streams flooding into my on-line environment, my river of news feels more like a rapid – I want something to pre-process the streams and present me with a river instead of a torrent. I want to be able to search and define sources, aggregate them, and sort their presentation according to my own criteria. For example, I would like to pull in items on cloud computing from Twitter, YouTube, blogs, and traditional news sources and web sites. The part that I think will develop this year is the difficult next step of pre-processing the information sources. Quantitative pre-processing tools exist – tools like PostRank will examine social bookmarking statistics, blog hits, referrals and comment quantities to rank feeds – but what I would like is some form of qualitative pre-processing. This is the difficult part, because what do I mean by qualitative? At the moment my qualitative assessment of information is associated with people and recommendations. To find news I check Twitter first and see what my network is talking about, then I check RSS feeds from RWW, Mashable and the like. In terms of quality, I would need a way to weight feeds according to mentions of sources and people – not just numbers of hits.
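
As a rough illustration of what I mean by qualitative pre-processing, the sketch below weights incoming items by mentions of sources and people I already trust, blended with a simple quantitative signal. The trusted names, weights and item fields are illustrative assumptions, not a description of PostRank or any existing tool.

```python
# A rough sketch of qualitative pre-processing: score each incoming item
# by how often it mentions sources and people I already trust, then rank
# the stream by that score. Names, weights and item fields are
# illustrative assumptions only.

TRUSTED = {"readwriteweb": 3, "mashable": 2}

def score(item):
    text = (item["title"] + " " + item["summary"]).lower()
    qualitative = sum(weight for name, weight in TRUSTED.items() if name in text)
    # Blend in a quantitative signal (comment count here) so popular but
    # unconnected items are not lost entirely.
    return qualitative * 10 + item.get("comments", 0)

items = [
    {"title": "Cloud computing round-up", "summary": "via ReadWriteWeb", "comments": 12},
    {"title": "Press release", "summary": "", "comments": 40},
]
for item in sorted(items, key=score, reverse=True):
    print(score(item), "-", item["title"])
```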

In order to apply qualitative criteria to information sources, either the information sources must carry additional information (metadata like tags, statistics, Microformats and RDFa) or a tool must be able to extract data from the context of the information source – how it is associated in the web, how richly it is associated, and with what. I seem to be talking about the semantic web, and this is not surprising, as semantics (meaning) is largely about associations and relationships between things – the more meaningful something is, the more deeply and richly it is associated with other things and meanings.
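
As a small sketch of the first route – information sources carrying their own additional information – the snippet below pulls the machine-readable hooks a page might already have (rel-tag microformat links and RDFa property attributes) so a tool can see what an item is associated with. It assumes the third-party BeautifulSoup library, and the HTML fragment is illustrative only.

```python
# A small sketch of extracting the machine-readable hooks a page may
# already carry (rel-tag microformat links, RDFa property attributes).
# Assumes the third-party BeautifulSoup library; the HTML is illustrative.
from bs4 import BeautifulSoup

html = """
<div about="https://example.com/post/1" typeof="sioc:Post">
  <span property="dc:title">Cloud computing and the campus</span>
  <a rel="tag" href="https://example.com/tag/cloud-computing">cloud computing</a>
  <a rel="tag" href="https://example.com/tag/semantic-web">semantic web</a>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
tags = [a.get_text() for a in soup.select('a[rel~="tag"]')]
properties = {span["property"]: span.get_text()
              for span in soup.find_all(attrs={"property": True})}

print("tags:", tags)              # ['cloud computing', 'semantic web']
print("properties:", properties)  # {'dc:title': 'Cloud computing and the campus'}
```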

People are getting used to Personal Information Environments (PIE) – systems like iGoogle or Netvibes where you can suck in various information sources and display them in various ways. However, PIE tools look set for a revolution in 2009 if Marc Canter’s DiSo Dashboard proposal gains traction. By implementing the DiSo Dashboard proposals, popular PIEs could extend and integrate across social networks and lifestream activity as well as RSS mega-aggregation.

I’m hoping that tools will become smarter in 2009 and help me manage my information sources more meaningfully – I will be keeping an eye on the DiSo project in general and the DiSo Dashboard idea in particular.

January 18, 2009 Posted by | media, semantic web, web 2 | 1 Comment

Approaching Clouds – First Impressions

Increasing amounts of our lives are mediated by IT, and developments in educational, social and technical culture require organisations to develop systems that deliver on those expectations.

Back in June 2008 I wrote “MLE to PLE: a framework for considering systems”, which attempted to categorise approaches and offer criteria to help evaluate systems.

This post looks at the systems for learning being considered at EHWLC to meet those expectations, and gives my first impressions.

Product on-site

This is the traditional approach – purchase software and hardware and install them in your own systems centre. The system we have been looking at is Microsoft SharePoint.

In many ways SharePoint presents the issues of any traditional on-site product. I have found SharePoint to be time-consuming and overly complex. Due to the logistics involved (product “manufacture” and provision to customer sites), I have found SharePoint to be out of date at the time of delivery. It offers a traditional perspective on web 2.0, focused on Office documents, when what I am looking for is “in-situ” web page creation and editing where you only need a browser. We are trying to move away from the sharing and circulation of Word documents and Excel spreadsheets, yet SharePoint encourages this – not surprising really. One advantage of SharePoint is its tight integration with your internal organisational systems (if you are using Active Directory). However, with the increasing number of non-organisational users you may wish to include (e.g. franchise partners), this approach presents problems.

Product hosted

Instead of installing a product in your own systems centre, this approach uses the systems centre of a third party to run (host) your system, which you access via interfaces across the Internet. The third party can offer business continuity and security. This approach offloads the work of running the data centre systems but presents the limitations of the product. The system we are considering is the ULCC hosted/serviced e-learning offering.

We have only just started looking at the ULCC hosted service. I am hoping that it errs more towards a service than a hosted product. One of the problems with an on-site product is that we are all so busy that finding the enormous amount of time required to get a system of the scale we are considering up and running is very difficult. With the ULCC e-learning services we hope to be able to contract technical implementation time from the service providers, so that actually provisioning a service becomes a possibility. One of the major areas I will be looking at is the interfaces we can use to connect to our other systems.

Service – Cloud (Organisational)

With this model you use the systems centre of a third party to run (host) your system but are not concerned about the technology behind the service – your focus is on the service itself. We have been experimenting with two cloud services for many months: Microsoft Live@edu and Google Apps for Education.

Neither of these systems is fully ready yet, and neither offers everything I want, or in the format I want, but the potential is fantastic. For both of these systems we have batch-provisioned user accounts from files that can be generated by our MIS systems, and both systems are very easy to administer. Both provide the services which Microsoft and Google offer on their public cloud sites (blogs, email, collaborative workspaces etc.).
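
For a flavour of what that batch provisioning step involves, here is a toy sketch that turns an MIS export into a bulk account-upload file. The column names on both sides, the username scheme and the domain are illustrative assumptions; the real Live@edu and Google Apps upload formats each have their own required fields.

```python
# A toy sketch of batch provisioning: turn an MIS export into a bulk
# account-upload file. Column names, username scheme and domain are
# illustrative assumptions, not the real Live@edu / Google Apps formats.
import csv
import secrets

def provision(mis_export_path, upload_path, domain="example.ac.uk"):
    with open(mis_export_path, newline="") as src, open(upload_path, "w", newline="") as dst:
        reader = csv.DictReader(src)   # expects columns: student_id, first_name, last_name
        writer = csv.writer(dst)
        writer.writerow(["email", "first_name", "last_name", "initial_password"])
        for row in reader:
            username = f'{row["first_name"]}.{row["last_name"]}{row["student_id"]}'.lower()
            writer.writerow([
                f"{username}@{domain}",
                row["first_name"],
                row["last_name"],
                secrets.token_urlsafe(8),   # one-time password, reset at first login
            ])

provision("mis_students.csv", "bulk_accounts.csv")
```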

Organisational DIY

If you are lucky enough to have your own programmers, this approach uses your own specialists to design and build your own system. This could be on-site, hosted or in the cloud. We are taking this approach with Centime. We have identified a great deal we would like to work on, such as RSS feeds, interfaces, and “in-situ” web page creation and editing. A major problem is the time and resources required to engineer these features.

Personal DIY – pure MASH

With this approach we use and integrate whatever people (learners, staff and so on) choose to use. We have been developing awareness and skillsets in many cloud systems for storage, blogging, feed aggregation, website creation etc.

I have found this approach fast-moving, dynamic and exciting. The main problem has been with the “paradigm” – most users are unfamiliar with, and seem uncomfortable with, the freedoms and self-responsibility of a personal DIY approach to their IT. Another problem has been integrating the diverse systems into something coherent.

First Impressions

My first impressions are that none of the systems offers a complete solution. What I would like to see is:

– A system that is inclusive of all our users – current staff, students and partners, but also prospective users and those who have left us (alumni).

– A system that is extremely easy to use and administer

– A system that provides data interfaces for college systems to use (something to identify the user to the system plus associated data)

– A system that is dynamic – easily and quickly able to change (agile)

The full Personal DIY MASHUP approach is, I feel, the direction we need to point ourselves in, and we should use those systems that help us move in that direction.

Microsoft SharePoint is too complex, slow to change and backward-looking, but it is likely to have a place in a limited, traditional organisational deployment – perhaps as a development of our staff intranet and a replacement for the Pool drive.

Microsoft Live@edu and Google Apps for Education – I have a “philosophical” problem with these: why provision college-associated Microsoft Live or Google accounts when people can do this themselves? Does a student really want to use a college-associated email address (e.g. martin.king@gspace.wlc.ac.uk) for the rest of their life? More likely, these services can be used for a traditional secure project in the cloud, and this is where our early experiments with these systems have taken place, e.g. departmental collaborative spaces and calendars.

For me this leaves a combination of Organisational DIY (Centime) and service/hosted systems (the ULCC hosted/serviced e-learning), provisioned in such a way as to facilitate pure-MASH personal DIY.

As a test of these, one of the first projects I would like to look at is the replacement of college-provisioned student email with students’ own email.

December 7, 2008 Posted by | cloud, IT and education, web 2 | 2 Comments

Video Confession 14: Web 2 for ESOL students on work experience

Our core IT systems run from an Uninterruptable Power Supply (UPS) backed up by a diesel generator that can provide power for up to 24 hours. The UPS provides a nice stable clean supply of electricity for our core systems from its large set of batteries. On the morning that the video was taken our UPS system was undergoing maintenance and our core systems were running from the raw mains for three hours – I was a little nervous.

The ESOL department at Hammersmith has benefitted from the input of Liz Boyden working on the PET project to experiment with new technologies in teaching – particularly the application of Web 2 systems. Web 2 is all about interaction and communication and it has been taken up remarkably well in ESOL – a subject full of interaction and communication.

In the video ESOL teacher Helen mentions the Web 2 systems they have used in their teaching (Microblogging using Twitter, standard blogging using Blogger, Audio podcasting using Podomatic, and unstructured blogging using Tumblr).

Helen describes the use of Tumblr with 16-to-19-year-old students on their two-week work experience.

The traditional way to document work experience is for students to be issued with paper work experience diaries which they would fill in each day and then return to the teacher at the end of the work experience period. During this period the opportunity for interaction with college teachers and fellow students is limited, the students feel relatively isolated and are not motivated to fill in the traditional paper diaries – a task which is often seen as a chore.  

Instead of paper diaries the ESOL team helped the students to set up their own Tumblr accounts and taught them how to use them with a few introductory exercises before they went out and had to use them on their own while on work experience. Using an on-line system like this has many advantages:

– The students could interact with fellow students and their tutors while on work experience – they were less isolated and could read about the experiences of their fellow students.

– The students were motivated to complete the log each day – many included photos

– The students were learning and using IT (the latest IT at that) in a real-world setting and in a meaningful way.

– Tutors had day to day feedback from students

– Documentation was on-line and easily accessible by all, instead of in a pile of paper forms.

The project found that all the work experience locations had computers which the students could use to access the Internet to type up their logs – this is the 21st century and we are in London so I shouldn’t have been so concerned. Whilst Internet access is increasingly pervasive, we would have to have a contingency for students who were not able to access the Internet daily.

May 17, 2008 Posted by | IT and education, video blog, web 2 | Leave a comment

video confession 10

“The Network is the Computer” – we have always connected everything, but we are preparing for a paradigm shift to “The Network is our computer” (1) by anticipating and encouraging the use of web 2 systems. Such use will demand more of our network and of our systems. To prepare, the college has started some major upgrades and developments – over the next 12 months we are:

* Upgrading inter-site circuits from 100Mbps to 1Gbps (to cope with higher demand)

* Installing physically diverse inter-site and internet backup connections (to improve continuity)

* Re-engineering our routing and Internet access (to offer new features)

* Installing new core switches from Extreme (to cope with higher demand, improve reliability and add features)

* Installing new Wireless access from Aruba (to cope with higher demand, improve speed, reliability and coverage)

* Using virtual systems (to offer a quicker response to new system deployment and improve continuity).

Virtual machines have proved to have real benefits and at the college we have been using virtual servers and desktops in development, to deploy services faster and to improve continuity.

One of the pressures on IT systems these days is the tremendous demand for storage and I anticipate that virtualisation can help with this too. Interestingly, networking plays a key role in recent virtual storage scenarios.

One form of storage virtualisation is to treat storage as a service and make use of massive external systems – Google (Google Docs, YouTube video), Flickr for photos and Microsoft Livespace for file storage. However, the pressure for ever-increasing internal storage continues, and I will be looking at two systems.

The easiest place to start is with standard file storage, where space rather than performance is the issue – things like user home folders (Z: drive), shared file storage like our “Pool” folders (P: drive), the technicians’ storage area and the media storage areas (where marketing and the design team keep lots of photos and videos). For this, a standard large NAS (Network Attached Storage) system should give us what we are looking for, and I shall be looking at NAS first.

Another interesting option is to look at virtual storage for virtual servers where performance is not crucial – for this NAS could also be used but I will also have a look at iSCSI systems.

In the video

Our senior management team are away at a budget conference, which means that all their offices are empty and available – I managed to bag the principal’s office to hold a meeting with suppliers to talk about network and storage virtualisation.

Kevin from Vanix gets in a good plug for his company as “one of the UK’s premier network integrators – Guildford, Paris, Peckham”. Kevin briefly describes the work of Vanix on our Extreme backbone network and their work on our early installation of Aruba 802.11n equipment. 

Jon from Onstor distributor Zycko talks about “off-loading our budget” – storage virtualisation has many advantages but could be a little expensive by the sounds of it – the Onstor NAS systems are certainly big and impressive.

The link here explains virtualisation and provides the explanation below.

“In general terms, virtualisation refers to the abstraction of computer resources so they can be logically assigned. It is a technique for hiding the physical characteristics of computing resources from the way in which other systems, applications or end users interact with those resources.”

 (1) Thanks to Mark Gobin for the phrase “The Network Is Our Computer”

March 8, 2008 Posted by | video blog, virtualsation, web 2 | Leave a comment

Web 2 – Beyond Space and Time

Traditional organisational structure has historically been determined by the constraints of space and time. Space and time have determined who you are able to interact with. A striking symbol of traditional organisational life is the meeting – the traditional organisational decision-making method – a system defined by space and time.

Culture can be defined as the circulation of meanings within a community – people in a community share the same experiences and the meanings that are derived. When we talk about company culture we are often referring to the values shared by the people in the company space at certain times.  Business “gurus” are aware of the importance of company culture and the importance of “getting together” and “doing things together” – whether this is around the coffee machine, in company events or even on team building exercises.

The organisation provides a structure for organisational culture – space, time and systems for access and interaction. The traditional organisation provides for its people – individuals slot into the company structure and carry out defined roles. The first stages of IT usually automate the manual system and this is the case with most company IT systems – they implement the organisational structure that has been derived through time and space – the company provides systems which define how its people interact and operate.

Companies are embedded within society – their employees, customers and products are part of society. Companies are also exposed to the same forces as society – globalisation, developments on the Internet, the pressure to produce faster and cheaper etc. The traditional method of decision making in companies is under pressure to cope – it can be difficult and costly to bring people together in time and space for meetings with the speed and flexibility required by a faster changing world. IT is mostly responsible for these pressures and hopefully IT can provide a method of meeting them – the problem and the solution come from the same set of tools.

Web 2 and Internet social networking systems provide the tools for new ways of interacting, and they are changing the cultural and business environment we live in – I can’t help thinking of the ostrich or dinosaur analogies here: we can ignore the environment, carry on as we are, become irrelevant and eventually extinct. The alternative is to be sensitive to changes in the environment, adapt accordingly, exploit those changes, and remain relevant and successful – or at least survive.

It has taken many years but companies now routinely use the Internet and “web 1” as an internal and external communications medium (email, web sites, on-line purchasing etc) – the Internet is integral and embedded.

Awareness and use of Web 2 and social networking is more difficult than for “web 1”. Web 1 implemented current organisational structure, so people and organisations could understand and use it. Although there are specific toolsets for Web 2, many people and organisations feel uncomfortable about their use – this is because their use is based upon a different paradigm (paradigm 2). Paradigm change is fundamental and as a result can be quite uncomfortable – it depends on your mindset. However, with increasing numbers of people (employees and customers, and of course businesses and markets) using web 2 and social networks, companies and other organisations must start to consider these technologies or risk becoming irrelevant.

Web 2 and social networking allows us to go beyond the traditional constraints of space and time – we can interact with people globally at different times – we no longer need to collect bodies in the same room at the same time or worse still collect bodies in multiple spaces at the same time (video conferencing).

Without the constraints of space and time boundaries can be made softer. Web 2 and social networking are useful in crossing boundaries – customers can interact with employees, employees with other employees and employees with those in other organisations. For example, some companies are now using web 2 and social networks to replace traditional focus groups – real customers providing feedback to real employees about products in an interactive way.

Organisations should be starting to explore and experiment with the use of web 2 and social networking – this could be like pushing against an open door as increasing numbers of customers and employees are already using these systems and are familiar with the way they work. Organisations should consider running fewer meetings and explore setting up social network groups to create inclusive decision groups that cross boundaries.

IT departments should be leading on the application of web 2 and social networks – more than most departments IT is familiar with constant change and development – they should be able to explore and exploit the use of these new technologies.

December 2, 2007 Posted by | IT and society, paradigm 2, web 2 | 1 Comment

The user interface is dead – long live the user interface

How far back shall we go? I’ve been around long enough to have experienced what could be described as “physical interfaces” like punched cards and paper tape, and waited 24 hours to receive a fan-fold printout of the results. In those days you had to try to get the program right first time, as the time penalties for mistakes were severe. I remember the command-driven interfaces of mainframe terminals, personal computers and applications like WordStar – if you wanted to control the machine back then you had to do it on the machine’s terms and learn its language. I remember the menu-driven screens of applications like Multiplan and Word, and the relief of having the computer start to interact with me by offering some appropriate choices instead of me having to enter a command, worry about syntax and spelling, and get some strange numbered error code to look up.

I remember the Apple Lisa, the mouse and using HyperCard (unfortunately I didn’t get to use the Xerox Star). What a “tipping” point this was – a complete, intuitive package where the computer interface was more on our terms. I remember the various Macs and versions of Windows through the years, and have tried speech input and tablets.

However, the user interface is going to get shaken up again – in the way that the earliest interfaces were rooted deeply on the computer side, the next interfaces will be rooted deeply on the user side.

Personalisation is one of the hot topics on the web and is the key to the coming shake-up in the user interface. Once, a major element of a system was its user interface design; in the next generation this could be much less significant, or non-existent. Instead the focus will be on the information interfaces (APIs) that the user, their personal environment or their tools can connect to.

This is part of Web 2. If your system provides RSS feeds and APIs then the user can get information from your site without looking at your site – the user can take an RSS feed for a summary of updates, or mash up your content with other information and create something new. It is now increasingly popular to provide system interfaces for social networking environments, so a user could access your site via an application written by you or a third party – inside Facebook, for example.
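
As a sketch of the idea, the snippet below exposes a site’s updates as a tiny JSON interface so that users – or applications written by third parties – can pull the information into their own environment without ever visiting the pages. The /updates path and the data are illustrative, not any particular site’s API.

```python
# A minimal sketch of exposing updates as a machine interface rather
# than a human interface. The /updates path and the data are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

UPDATES = [
    {"title": "New course dates published", "url": "https://example.ac.uk/news/1"},
    {"title": "Library opening hours changed", "url": "https://example.ac.uk/news/2"},
]

class UpdatesAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/updates":
            body = json.dumps(UPDATES).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)   # consumers render this however they like
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), UpdatesAPI).serve_forever()
```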

The Facebook programming interfaces have created quite a revolution on the web. This is one area where Google are behind, and Google’s response and developments over the next six months will be interesting. Currently Google are working on a common social networking programming interface called OpenSocial that will work with most of the social networking sites – apart from the big names of Facebook and MySpace, it seems.

The development and spread of public APIs leads to the “programmable web” and the start of the evolution to Web 3 and what I call paradigm 3 where software agents interact automatically and exchange information. More on this later.

The traditional system user interface is disappearing – users will choose how to view your system – long live the user interface.

November 4, 2007 Posted by | ICT, web 2 | 5 Comments

What is 2

What you do rather than what you use

It is possible to get too carried away with trying to describe, explain and define what 2 is and try artificially to exclude certain things because they aren’t 2 enough.  The process of looking too hard at it could make it disappear.

The essence of 2 is social and involves participation, collaboration and DIY.  Cavemen did it, animals do it (think of pack hunters for example), we do it in our everyday lives all the time and it can even take place in some meetings.

What marks out the current wave of 2 is that it involves the virtual and mediation by ICT – Facebook for example attempts to do virtually what we do with physical presence – meet, share things, chat etc.

The important thing for me is seeing 2 in whatever flavour – wholly virtual, wholly physical, or any degree in between.

Teaching with role play and simulation, with no technology at all, is as valid as teaching using Second Life, for example. The crucial thing is to engage the students and use methods which connect with them – today this often means on-line environments like Facebook, YouTube, blogs etc.

September 19, 2007 Posted by | IT and education, paradigm 2, virtualsation, web 2 | Leave a comment

Power tools for paradigm 2

Back in the 1970s there were predictions that scientific and technical developments would lead to an age of leisure in the 21st century – with machines to do our work for us we would be working half as much. Like the paperless office, this has failed to materialise; instead we seem to be working harder and longer. What has happened is that our machines allow us to do more in the same period of time – we still work the same long hours.

The problem with working in paradigm 2 is coping with the jump in the amount of information generated as more and more people participate and produce. We have power tools like electric drills and saws to assist us with mechanical needs – we need power tools to help us with our information needs.

Here are a few of the “power tools” required for paradigm 2.

Email – automatic processes, rules and filtering

With overwhelming volumes of email you will need some way of dealing with it automatically. Possible developments in email systems include automatically flagging message priorities according to message criteria (sender, subject, keywords etc.). For now, being able to put items accurately and automatically into a junk mail filter is very useful.
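
A minimal sketch of that kind of rule-based triage is shown below, assuming an illustrative message structure and rule set; a real deployment would sit behind IMAP or the mail server’s own filtering.

```python
# A small sketch of rule-based email triage: flag each message's priority
# from sender, subject and keyword rules before it reaches the inbox.
# The rules and message fields are illustrative assumptions.

RULES = [
    (lambda m: m["sender"].endswith("@wlc.ac.uk"), "high"),      # internal mail first
    (lambda m: "urgent" in m["subject"].lower(), "high"),
    (lambda m: "newsletter" in m["subject"].lower(), "low"),
]

def triage(message, default="normal"):
    for test, priority in RULES:
        if test(message):
            return priority          # first matching rule wins
    return default

inbox = [
    {"sender": "principal@wlc.ac.uk", "subject": "Budget meeting"},
    {"sender": "noreply@vendor.example", "subject": "Weekly newsletter"},
]
for message in inbox:
    print(triage(message), "-", message["subject"])
```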

Search

With so much information “out there”, and “in here” on your own systems, you need some way of finding what you need. Google are at the heart of Web 2 development, and it is Google, of course, who took search to a new level. Although I organise my emails and file storage in folders, I have found the new desktop search systems very useful indeed. Having search technology available, and being proficient in its use, will be vital.

RSS – Feeds

It will become impossible to visit and keep up to date with all the pages you will need – creating your own summary pages that take updates from the web pages you are interested in will become vital.

The Future

One development I see as essential in the near future to help us cope is automated proxies. These will be systems programmed by us to process, decide and respond automatically on our behalf, and to present us with a summary of actions taken so that their programming can be adjusted. Such systems would present us with a virtualised view, and it is quite possible that these tools would lead to what we could regard as paradigm 3.
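
A speculative sketch of such a proxy follows: rules I supply decide what happens to each incoming item, and the proxy keeps a log of its actions so I can review what it did and adjust the programming. All of the names and rules here are illustrative.

```python
# A speculative sketch of an "automated proxy": apply user-supplied rules
# to each incoming item, act on the user's behalf, and keep a log of the
# actions taken for later review. Names and rules are illustrative.

def proxy(items, rules):
    log = []
    for item in items:
        for condition, action in rules:
            if condition(item):
                log.append((item["title"], action))
                break
        else:                        # no rule matched
            log.append((item["title"], "held for review"))
    return log

rules = [
    (lambda i: i.get("from_trusted_network"), "shared to my lifestream"),
    (lambda i: i.get("duplicate"), "discarded"),
]
items = [
    {"title": "Hudson plane coverage", "from_trusted_network": True},
    {"title": "Press release", "duplicate": True},
    {"title": "Unknown blog post"},
]
for title, action in proxy(items, rules):
    print(f"{action}: {title}")
```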

August 26, 2007 Posted by | ICT, IT and society, paradigm 2, web 2 | 1 Comment

user empowerment on web 2

One of the great things about web 2 is the way you can publish something and then refer many people to it. For example, rather than emailing a video attachment to lots of mailboxes, upload the video to YouTube and just email the link.

One of the things I wanted to be able to do with blogging is publish information or an opinion and then send a link to people. I had been using Blogspot (Blogger) as my blogspace, but many people had trouble reading pages I emailed to them as links to blog entries. I tested the same function with WordPress and found no problem, so I have imported my whole Blogspot blog into WordPress – it took about 30 minutes from creating a WordPress account to having all my posts imported.

The moral of the story is that users are empowered – it can be so quick and easy to switch systems that suppliers will have to pay attention to the issues that are important to users. Compare this to the FUD (Fear, Uncertainty and Doubt) that suppliers used to tie users up with in the past.

August 18, 2007 Posted by | web 2 | 1 Comment

What I like about Web 2

Web 2 isn’t suitable for all applications but it has a significant part to play in our future computer use.

Why I like web 2

It’s Free and easy and “out there”
There has been free software available since the beginnings of the PC era but you would have to get it on a disk or download it and then install it on your computer. I never liked this – I had to worry about just what it was I was installing – could it be trusted and what would it do to my computer. I would hear so many stories about how items of free software would mess up people’s registry and cause strange effects on computers and how you were stuck for good with the software as you couldn’t uninstall it. As a result I rarely used any free software.

I like Web 2 because I don’t have to install anything on my computer. I can try things without messing up my computer. The other great thing is that I can access it from any Internet-connected computer – it’s on the Net rather than on my computer.
I like Web 2 because it’s free – I don’t mind the ads, and many of the products don’t even have ads.

I like web 2 because it’s easy – I haven’t found any “product” difficult to start using. The creators have to make their “products” easy to use because it is so easy for people to go somewhere else. “Products” also have to be easy and self-evident, as the business model can’t support helpdesks – users are on their own, so the “product” has to be easy.

Diversity
There are just so many “products”, and the type of thing they can deal with is expanding all the time. There is just so much choice, and there is no dominant player.

It’s naturally collaborative
Because your application and its information is already “out there” on the Net it is easy to make it accessible to others and to share it. Trying to share applications and information on your computer is not a natural act – you have to install P2P features and then worry about the security of your computer.

As an IT manager
I like web 2 because it puts the user in control – users can choose their preferred applications and take responsibility.
I like web 2 because I don’t have to install applications on user computers.
I like web 2 because I don’t have to worry about application backups.
I like web 2 because I don’t have to worry about security of the application site.
I like web 2 because I don’t have to worry about application updates.

The application site “producers” have to carry out and worry about backups, security and updates. Instead of spending so much time installing, maintaining, securing and updating applications IT departments can spend more time with users advising and guiding in the use of their computers.

August 5, 2007 Posted by | web 2 | 3 Comments