William Gibson’s quote “The Future is Already Here – It’s Just Not Evenly Distributed” is a powerful and practical idea for working out what is going to happen in the short term – extrapolate from the current edge and current trends. These days new technology is announced and piloted very early – there are few surprises in the short term in terms of technology developments. The problem with short term predictions is that we often exaggerate the scale and impact of predicted developments.
Bill Gates summed it up when he said “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten”. Like compound interest, an exponential function is just a fixed percentage of growth that compounds – change is occurring around us all the time and, like a slow boiling frog, we only jump when we become aware of it. Another factor in ICT change in particular is the network effect (the value and effectiveness of a communication technology increases with the number of users) – this acts as a sort of natural selection, applying both negative and positive feedback to exponential growth.
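The compound interest analogy can be made concrete in a few lines of Python – a sketch, with the 7% growth rate and the time spans purely illustrative – showing why a fixed percentage of growth feels negligible over two years yet dramatic over twelve:

```python
def compound_growth(initial, rate, periods):
    """Value after repeatedly applying a fixed percentage of growth."""
    value = initial
    for _ in range(periods):
        value *= 1 + rate
    return value

# Illustrative: 7% yearly growth barely moves in 2 years...
print(round(compound_growth(100, 0.07, 2), 2))   # 114.49
# ...but more than doubles over 12 years
print(round(compound_growth(100, 0.07, 12), 2))  # 225.22
```

The same mechanism underlies both the Gates quote and the boiling frog: each individual step is small, but the steps multiply rather than add.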
The problem with long term developments is that they are subject to exponential and combinatorial factors – chaotic things that we are not good at understanding at the best of times. To compound things, change cycles themselves are becoming faster.
In the short term nothing much appears to happen, while longer term changes are often beyond our understanding.
Information technology wants to be personal, abundant, cheap, easy, convenient, open, small, mobile and connected – “resistance is futile”.
The balance of technology in education is weighted to the institution – we depend upon institutionally provisioned hardware and software from data centres and servers to “end user” computers – this is an expensive, resource intensive, centralised and locked down model struggling to meet the demands of what people want from technology.
Continuing on the current trajectory, every room will eventually be an IT suite or every student will have a college computer – how could I provision, support, maintain and secure up to 20,000 computers? We need a new approach. Educational technology must seek a lighter, simpler, less resource intensive approach to technology – it must learn to let go of technology and step away from the diminishing returns of the technology treadmill. Instead, education should provide a platform for technology use – a feasible and sustainable model for the next era – the “fifth wave of computing” – personal, abundant, cheap, easy, convenient, open, small, mobile and connected.
The traditional response is for education to provide resources but better choices can usually be readily selected by people from the web. Education needs to de-institutionalise and reduce its own technology – allow the balance to shift to personal technology by exploring DIY and self service approaches.
All our learners have on-line presence and identities – why provide institutional versions? Allow learners to use their own resources and on-line identity. Allow learners to select their own email and their own applications – some will use Google apps, some will use Microsoft Live apps while others might prefer Zoho, Facebook office or local apps such as OpenOffice or even Microsoft Office. If learners don’t have on-line resources then this is itself an area for education, for education should be about learning for life.
Shift investment from computers and servers to the network. Shake off the ghost of internal client-server thinking – think global – think open – think web only. Create pervasive wireless guest access and increase both internal and Internet bandwidth. Encourage learners and staff to use their own IT on your guest network – let the network be our computer – let the network be the technology platform for learning.
Education teaching and education IT could both share a common new approach – facilitation. Facilitate the use of resources rather than provide the resources themselves. In the same way that teaching is considering facilitation, coaching and guidance styles, so too could education IT.
Technologists often have an almost obsessive addiction to “the next big thing” – a technology fetishism and a determinist faith in the power of technology to transform education.
Education is stressed by the need to balance a great many competing factors including finance, legal regulations, government requirements and market competition as well as learning needs – a stress that often results in organisational anxiety and a conservative approach to new technology.
The conflux of educational anxiety and technology addiction has in many cases created an addicted, anxiety ridden institutionalised educational technology monster.
The monster mash is a depressive, agoraphobic, addictive, obsessive compulsive ritual dance.
Dead and decaying technology is toxic and harmful but the monster is addicted and craves increasing doses to sustain itself in an all consuming self destructive habit.
Technology pushers fool the monster into trying ever more toxic technologies to keep it and its “users” dependent.
Education has become dependent on technology and has to purchase, power, support and maintain more and more equipment, computers, servers, storage and software each year to satisfy an expanding desire for technology in education.
Education has to deploy ever more complex and expensive technology in order to cope – increasingly needing expensive external specialists.
Education’s dependency on technology is almost 24/7/365 – how long could a typical institution last without a technology fix?
The monster seeks comfort from the familiar, private and closed places – it fears and avoids large, open, public and/or unfamiliar places where there are few places to hide.
Education perpetuates familiar first phase technologies and applications such as locally installed, local area network client and server products.
The monster comforts itself with repetitive self-reinforcing ritualistic behaviours.
Education seeks comfort in conforming to self-constructed norms of technology use – learner:computer ratios, e-board installations, VLE/MLE and the use of technology in lessons. Ritualised technology becomes repetitive, rigid, self-reinforcing and difficult to change. Education becomes focused on preserving the rituals of technology rather than the function.
Despite all its hard work the monster cannot find love.
For technologists education doesn’t go far enough and for education the technology is too wild and risky.
The Monster Mash
The monster mash is a complex, expensive, rigid, and slow moving dance – increasingly ridiculous yet scary and increasingly damaging to education and learning.
New technologies allow education to provide increasing amounts of IT, provisioned faster and more flexibly, while also exerting traditional practices for availability, security, control and standardisation. However, there is a price – these new technologies are far more complex than before. Consider the complexity of load balanced server clustering, storage area networking or a typical institutional email system.
The complexity of our systems is expensive – not only in terms of capital but also in terms of time, skills and increasingly in terms of external support and maintenance.
The scale of educational IT is expensive – the rise in quantity outweighs the fall in unit costs – while the cost of computer hardware has fallen we use many more and while the cost of software has fallen over the years we use more.
The scale of educational IT is expensive to support and maintain – we need increasing numbers of technical people to keep all this ticking over.
There is also a cost in terms of preparing and delivering education while doing the monster mash – consider the amount of time spent preparing attractive PowerPoint presentations or populating a VLE for classroom use. This is the old e-board and VLE debate where for me the “E” stands for expensive – consider the opportunity costs of these technologies alone.
To deploy, support and maintain at scale, institutional IT is pretty standardised – new technologies such as virtualised clients may allow some variety around a standard theme but they are all generally predefined menu selections.
To protect and secure at scale, institutional IT is pretty locked down – people often can’t install programs of their choice on educational computers.
Consider the effect of this standardised lock down on learning. A learner may not be familiar with tools you provide so must first learn your tools before they can apply them to their learning – the tools become a stumbling block and get in the way of learning.
Traditional institutional IT is designed to provide a fixed, standardised and controlled provision at scale – it is not well suited to providing a personalised, flexible provision in scope. New features appear in free public consumer IT regularly and often, yet consider the process of upgrading an institutional application or email system for all your people.
Free the Monster
However comforting the monster mash may be it now has an existential problem and risks harming everyone around it. The Monster mash is a big turn off for many people these days.
While slow moving, rigid, complex and expensive, its addictive, depressed, agoraphobic, obsessive compulsive nature makes the monster parasitic and difficult to escape.
Shock tactics and cold turkey could be fatal for both the monster and the host – we must treat the underlying problems of addiction and anxiety appropriately with exposure and response prevention. With support the monster must confront its fears and discontinue its escape and avoidance responses. The Monster must learn that it can be safe in open, public spaces and that it can reduce and maybe one day eliminate its dependence on tradition and ritual. Over time educational technology may once again lead a less complex, expensive, rigid and slow moving life – one day the monster may lead a happy and fulfilling life.
I hope to explore some technology and education for the monster in future blogs.
This blog takes a brief look at how industrial processes have shaped our culture, identity and our ideas of information.
The defining factor of the 20th century was fossil fuel (especially oil) which today provides an energy equivalent to 22 billion slaves and allowed an exponential extension of 19th century industrialism to do things faster, bigger and more. The 20th century as an industrial age was dominated by material things, materialism and industrial processes – manufacture, distribution, consumption, disposal and the identity, political and power structure consequences of this.
Fossil fuel became abundant and cheap and 20th century systems could afford to be energy intensive – the globalisation of material things became possible – the earth became a global factory. China for example could manufacture from raw materials transported from multiple countries using oil transported from multiple countries and then transport manufactured goods to multiple countries. Transportation is present at every stage – the energy and pollution costs are now apparent.
Politics and economics became focused on production and people’s identities became focused on consumption – we have become defined by what we have – the things we buy rather than the things we do or make. The consequences of political and economic power can be read from this.
Information is intangible and must be represented in some way. Given that the Internet didn’t exist through most of the 20th century, information was by necessity locked up in material forms of representation and in the 20th century industrial systems associated with material things – energy intensive manufacture, distribution, consumption and disposal, along with the political and power structure issues that result.
The 21st century Internet provides a new perspective on 20th century information – the energy intensive manufacturing and transportation costs involved and the advantages pertaining to those who own the means of production and distribution. Consider what is required to produce a magazine – from the felling and processing of trees to make paper to the printing and distribution and the eventual disposal and waste.
Information was constrained and limited by its physical embodiment in objects – it was expensive, scarce, difficult to change and to share. You may need to travel to a bookshop or library to get a book – there would only be a limited number of books, you couldn’t easily make and distribute your own book or comments on a book. The same issues apply to other forms of information such as audio and visual information – consider the industrial processes involved in the music business to manufacture and distribute CDs.
Embodying information in physical objects slows it down and freezes it – in the same way that paper is a dead tree you could regard a book as dead information – there is no interaction. It seems a bit extreme but you could regard a library like an information graveyard where you can go to read inscriptions on the tombstones – the books as tombstones – dead information.
Computers and software as information technology have themselves also been part of the 20th century industrial production-consumption dynamic. Mainframes were born in the middle of the 20th century and naturally created a centralised information and control model. Punk music and the personal computing trend both started in the late 1970s as an attempt at personal production – both were eventually assimilated by the very mainstream industry processes they were a rebellious response to. Personal computing is dominated by major industry businesses. Software “production” is still dominated by manufacture, distribution and installation – it is partially “dead” software. Computers and anything digital are subject to rapid produce and consume cycles – we need ever faster machines to run ever bigger software. The irony is that the machines of the information age have become the epitome of the industrial age.
William Gibson’s quote “The Future is Already Here – It’s Just Not Evenly Distributed” is a powerful and practical idea for working out what is going to happen in the short term – extrapolate from current edge and current trends. Using my crystal ball to throw the light of the recent past into the near future I see network effects creating exponential growth in certain areas and it is on these areas I have focused on below.
Cloud Applications Get Very, Very Good
Google Apps became serious in 2008 and improved as the year passed – they demonstrate the potential of on-line applications with new abilities that come naturally from being on-line (collaboration, web integration and data lookup) and platform neutrality (they work on Mac, Linux and Microsoft) – and they are free!
In 2009 Microsoft will be releasing some form of on-line application – the probable result will be to validate the model and expand the market. Microsoft’s activities will be influential and important in bringing the cloud to the awareness of mainstream users. The next version of Office is expected to have “cloud additions” like Office Live – which allows you to store and share files in the cloud, but you still need local Microsoft application software to use them – Software plus Service (Microsoft software plus Internet service). This approach is useful – Microsoft will be able to sell software (and this would be a compelling upgrade) and the “normal” Office user will get cloud access within the comfort zone of their familiar Office environment – the common Microsoft embrace and extend strategy. More interesting will be the browser based Microsoft Office that is due out in 2009 – although Microsoft say it will offer “light editing” of Office files, many expect it to be a very slick operator.
We need competition in the cloud but the scale and resources required needs major investment and infrastructure – competition in this area really needs to be among the really big companies (apologies to Zoho who have a very extensive and nice cloud suite). The competition between Google and Microsoft will be good for cloud applications and the expectation is that Google have major improvements planned as a response to Microsoft’s moves.
Although Google apps are already very good my expectation is that by the end of 2009 cloud applications will be very, very good.
Lifestreaming Becomes Mainstream
During 2008 Twitter achieved its billionth message (tweet), went exponential and was adopted almost everywhere e.g. in government, politics and business. Like many major developments Twitter is relatively simple and this is what has driven its growth – it is easy to start, easy to use, easy to use in different ways and is highly extendable. For example, the Twitter API carries 10 times the traffic of the visible Twitter site – this is used by 3rd party Twitter user interfaces but also for systems interfaces – e.g. to feed into Facebook. A suggestion of what to expect is the way web sites and blogs are now embedding Twitter streams, and for a change Twitter themselves even produced Twitter badges to assist with this. Increasing participation is speeding up the Internet and simple lifestreaming systems like Twitter give people a foot in the door and an important presence in a “Global Village” of 1.5 billion that is growing at 20% per year. Whenever I come across someone I expect them to have Net presence – I expect them to have a lifestream and a blog. Network effects are important – as more people use lifestreaming, and Twitter in particular, the more important and the bigger it will become.
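As a sketch of the kind of thing third parties build on an API like Twitter’s, here is a minimal Python example that extracts messages from a JSON timeline – the payload and field names are simplified illustrations for this sketch, not the real Twitter wire format:

```python
import json

# Illustrative sample of what a timeline API might return
# (field names simplified; not the actual Twitter payload)
SAMPLE_TIMELINE = """[
  {"user": "alice", "text": "Trying out lifestreaming"},
  {"user": "bob",   "text": "The network effect in action"}
]"""

def latest_messages(payload):
    """Extract (user, text) pairs from a JSON timeline string."""
    return [(t["user"], t["text"]) for t in json.loads(payload)]

print(latest_messages(SAMPLE_TIMELINE))
```

The point is how little code a systems interface needs – which is exactly why the API carries so much more traffic than the web site itself.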
Devices Get And Use Senses
The Graphical User Interface (GUI) enriched and enhanced computing – it liberated us from the command line with a more natural mode of working and allowed “normal” people to work with computers. However, the very success of the GUI has blinkered our vision of other approaches. As computers permeate our everyday lives and environments they need to be more responsive to our daily lives and environments. Consider the Wii and the iPhone – both are less powerful in hardware terms than similar products but both have been wildly successful due to the new way they interact with the user and the environment, and both have brought in a new set of users. The Google Mobile App for iPhone is already pretty amazing and suggests even more of what will be possible. Consider the impressive list of interfaces already present on the iPhone – location (GPS), visual (screen, camera), audio (speaker, microphone), touch, tilt, proximity, vibration – we should expect further exploitation of these. Other manufacturers will be seeking to catch up and improve upon the iPhone – I think we will see some very impressive sensory applications for mobile devices in 2009. I’m looking forward to being able to talk to and listen to the net – e.g. ask “Phone – is there a traffic problem around here?” and have it search traffic reports in my location and respond “yes – burst water main 1 mile ahead”. Microsoft Tag and Amazon’s iPhone application suggest some ways in which we can use sensors in our smartphones.
The Web Gets More Programmable
With the web getting more participative, information is generated at a higher speed and in greater quantities – tools to manage and cope are essential and those tools need to get smarter. This could be more a wish than a prediction but there are signs that smarter tools (agents) will become available to us. The “ecology” developing around Twitter gives some good examples. Twitter provides only a simple web interface but the Twitter API is heavily exploited by third parties to provide a huge range of applications based around Twitter. These applications indicate what can be done for users – Twitchboard, which listens to your Twitter account and forwards messages on to other internet services based on what it hears, or Shozu, which provides a system to integrate many of your services from your mobile. The more adventurous user may like to try Microsoft’s simple mashup creator Popfly while more advanced users may wish to try Yahoo Pipes, the Yahoo Application Platform, Google App Engine or Microsoft’s Azure to create their own applications like this Twitter search using Google App Engine or this Twitter search using Microsoft Azure. By the end of 2009 I am expecting to see more people using RSS, integration services and even programming services.
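RSS is the simplest of these programmable-web building blocks, and a minimal reader needs nothing beyond the Python standard library. This sketch parses an inline sample feed rather than fetching a live one, so the feed content is purely illustrative:

```python
import xml.etree.ElementTree as ET

# Inline sample feed (illustrative content, RSS 2.0 shape)
SAMPLE_RSS = """<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>Cloud applications get very good</title></item>
  <item><title>Lifestreaming becomes mainstream</title></item>
</channel></rss>"""

def item_titles(rss_text):
    """Return the title of every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_text)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(SAMPLE_RSS))
```

Point the same dozen lines at any real feed URL and you have the seed of the smarter filtering agents discussed above.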
Netbooks Go Massive
Small used to mean expensive but now small means affordable – the combination of economic problems and the need for mobility means that netbooks will be overwhelmingly popular – everyone I meet just loves them. The Asus Eee was like the punks who set a new fashion trend which was appropriated by the major fashion houses so that it became mainstream. Now every manufacturer offers netbooks and there is more of a continuum of models from the smallest to the largest, so that the concept of a netbook starts to lose meaning – is a 12in netbook really a netbook?
The Cloud Goes Massive
In the same way that mobile phones are given away as part of a network contract, we see computers being given away as part of Internet contracts – for what use is a computer without a network these days? With the economic problems people will be choosing between computing (local compute power and applications) and networking – cloud computing is there at the right time to offer a solution, with free (or relatively cheap) cloud services and free (or relatively cheap) local computing. Network effects will expand the cloud – as more people use it the more useful it becomes and the more people use it. Microsoft’s development of their cloud services (Live), Office and Windows 7 will all “legitimise” the cloud for mainstream users and expand the market further.
Many comment that our on-line activities will leave little lasting record; however, it is certainly true that our recent past is better documented than ever before. You can access my Twitter activity from this time last year – here is my first tweet for example – and you can see all my past blogs, including my Predictions for 2008, to see how good, bad or ugly they were.
William Gibson’s quote “The Future is Already Here – It’s Just Not Evenly Distributed” is a powerful and practical idea for working out what is going to happen in the short term – extrapolate the current edge and current trends. Things usually get smaller, lighter, easier, cheaper, more functional and more commonplace – computers and telephones are good examples of this. We must also beware of technological determinism – we have to consider the complex interaction of contextual factors (economy, culture etc.) that can change the “trajectory” of any extrapolation.
Using my crystal ball to throw the light of the recent past into the near future – it all seems quite cloudy to me and everything I see is on-line. The big theme for the year will be on-line everything, and as more goes on-line, network effects will cause more to go on-line, resulting in an explosive growth in on-line activity. Despite (or maybe even because of) economic problems, 2009 could be a significant year for the information age – when many 20th century physical industrial activities are moved on-line.
Let’s try to focus some of this.
The industrial processes of the 20th century to represent, distribute and consume information will continue to disappear – information is intangible anyway and so is ideal for on-line virtualisation.
Audio: Audio set the example of how information can move away from the physical – the stories of Napster and iTunes are now history, and the Nokia music service of 12 months of unlimited, free, perpetual but protected downloads takes this model almost as far as it can go. With music content “infinitely” copy-able and accessible, monetisation has to move to an incidental model of distribution deals and sponsorship. The SeeqPod service indicates that on-line streaming/access could be a significant development. It will always be useful to have off-line copies of your favourite music but the advantages of streaming are there for everyone. For consumers there can be an “infinite” range to listen to on-demand without management – just search. For the industry, on-line listens can be instrumented (pun intended) and monetised with “incidental” and direct marketing. There is of course better control – music can be published free at source (e.g. from Sony), with the publisher monetising through “incidental” services but also allowing 3rd party API and streaming access for downstream services. I like the concept of on-line downstream “radio stations” such as Blip.fm, Last.fm and Pandora. Indeed Pandora offers an excellent example of the benefits of the on-line model in driving further interest by what could be described as audio surfing.
Video: YouTube and the iPlayer provide good examples of what we might expect in video. iPlayer has many advantages (on-demand viewing), especially with the BBC “transmitting” live on the net via iPlayer. YouTube in particular has become such a mainstream distribution method – standard TV channels, organisations, political parties and of course individuals are all there (e.g. BBC, Channel 4, Obama, Google – and have a look at the Governator).
Words: Newspapers and magazines all have good on-line presence and for many their on-line activity is increasingly necessary and important. Pew research finds that the Internet has overtaken newspapers as a source of news for many people, and for young people the Net is the main source of news. All the newspapers now offer excellent RSS feeds and various incidental services such as reader and journalist blogs, podcasts and various systems interfaces for systems such as Facebook and Twitter. The New York Times indicates how “newspapers” may develop – with the news of their API development program to “make the NYT programmable”. To start 2009 the NYT released Represent – it mashes geographical information with various web data to present information about the politicians who represent geographical areas in New York. Books I feel will also succumb eventually – the physicality of a book (cover, typeface etc.) is much like the physicality of old vinyl records. During 2008 e-book readers became a lot better and the advantages for industry and consumers over paper became tangible. Although the e-book reader really is useable now, it is another purchase and item to carry around and look after – I would prefer to access books on my smartphone or laptop/netbook. I think that e-books will break into the market before consolidating onto standard equipment – a prediction for 2010 I reckon.
I can’t resist it, but in 2009 “the medium is the message” (sorry). Possibly the biggest development in media will be the way it all gets mixed up – once they all share the same medium then they can mix and match. 2008 saw the start of this and it is becoming increasingly common – 2009 will see a lot more of it. Already in 2008 we see newspapers with plenty of additional media content – the Guardian Tech Weekly for example has audio, video, blogs, Facebook, Twitter and text – and the Bivings Report indicates just how active newspapers are on the web – e.g. all have RSS feeds, 75% accept comments on articles and most now have free access. Cross platform media outlet Current indicates how media may develop – in their US election coverage they have been combining video coverage with input from Twitter, Digg and 12seconds.TV.
All this and I haven’t even covered how YouTube is going live and HD.
The message is – if you want media you need to get on-line.
In the spirit of “I never make predictions and I never will”, this blog looks at some of the possible consequences of cloud computing.
Part 1 – Summary of some of the factors affected by Cloud Computing
Cloud computing throws all the issues of traditional IT up in the air and creates a level playing field. Small companies can make use of the same resources as the largest global enterprise – on a “pay as you go” or even free model, the only difference is in the scale of use rather than installation. The cost of entry is significantly reduced. Small companies get access to enterprise level technology and reduce capital expenditure. This could also be a big benefit to people in developing countries, who could in theory run large, enterprise class IT from a computer with a broadband connection – assuming they can get access to a broadband connection – Google is helping with this however.
Cloud computing and IT as a service reduce development and implementation time and costs and facilitate fast delivery. It is no longer necessary to wait for delivery, then install, configure, maintain, support and update traditional on-site services; instead you can point a browser at a supplier and configure your service – it is a lot, lot faster. In theory at least, cloud services can be changed, expanded and shrunk as required, and new services created by “stitching” together existing and new services.
Innovation often takes you down a dead end and the cost in time and money often puts people off innovation – it is safer to follow known working models than to risk early advantage. Cloud computing and IT as a service reduce the resources used to innovate so that the consequences of failure are smaller.
Where once the Internet was used as a communications channel between “islands” of business, systems and people it can now be used to host systems and business. Collaboration and interoperation through firewalled systems is a problem – interoperation is a lot easier and natural in the cloud environment.
Traditional IT has been centred on physical location and protected from the Internet by firewalls. Such systems are well suited to the tradition of people travelling to work at organisational premises. Providing flexible external access under these conditions has been difficult – usually by allowing increasing numbers of staff external access to internal resources via a VPN through the firewall which protects the internal systems from the Internet. Physical location loses meaning when you use cloud computing – the services are on the Internet – it doesn’t matter where you access them from.
Through mass market scale and consumer interest in computing, the cost of computers has dropped considerably. Where computer companies once looked at business as customers they now look at consumers, and as a result most business IT is now driven by consumer trends. Consumers want to operate their IT equipment like gadgets – turn them on and use them – they don’t want difficult configuration and setup or to worry about security and updates. The proportion of device cost taken by software becomes increasingly difficult to justify as hardware cost continues to fall – when a device might cost just £200 it seems difficult to then pay another £200 for software.
Internet growth itself could be argued to be an example of the network effect – as more people use it, the more useful it becomes, so more people use it and the more it expands. Cloud computing depends upon highly available, reliable Internet connections – people and organisations will be prioritising their Internet connections, and this alone will add to growth in Internet availability.
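One common formalisation of the network effect is Metcalfe’s law, which values a network by its number of possible pairwise connections rather than by its number of users – a rough model rather than a measurement, sketched here in Python:

```python
def metcalfe_value(users):
    """Possible pairwise connections in a network of `users` members
    (the n*(n-1)/2 figure behind Metcalfe's law)."""
    return users * (users - 1) // 2

# Doubling the users roughly quadruples the potential connections
print(metcalfe_value(10))  # 45
print(metcalfe_value(20))  # 190
```

This super-linear growth in connections is why each new user makes the network more attractive to the next, which is exactly the feedback loop described above.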
Part 2 – The predictions
As the Internet mediates more of our experience we should expect a step change as we live our lives more in Internet time.
Things will change faster (when faster is the new fast) as the cloud and Internet enable things to be done so much quicker than in traditional ways.
Competition will be keener than ever before as smaller organisations can compete effectively with larger organisations in certain sectors – especially in innovative new service delivery.
We should expect to see remarkable product and service innovations as barriers to innovation are lowered and the speed and ability to innovate are improved.
We should expect to see an explosion in collaboration of all kinds – between systems (Mashups), people and organisations.
Cloud based systems operate outside traditional firewalled “island” systems and we should expect to see the concept of company and organisational operation challenged.
We should expect to see growth in flexible remote working – a “de-location” of work, life and service as it becomes less necessary to be within the premises of a company to work for that company.
We should expect cheaper and easier to use Internet access devices, from traditional laptop and mini laptop style units to more consumer oriented gadget type units as well as amazingly functional smartphones – there will be more “plug, play and go” and a growth in non-Microsoft software and new software types.
We should expect pervasive Internet access. As device costs fall and Internet access improves we should expect to see all manner of devices and gadgets with Internet access, and we should increasingly expect to get on-line anywhere, anytime – hyperconnectivity.
The trends above interact in a positive feedback loop, so the changes become ever faster. If network effects are present, then as more people collaborate and innovate more quickly, change will happen faster and faster.
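As a rough illustration (an assumption added here, not a claim from the text), one common formalisation of the network effect is Metcalfe’s law, which values a network by the number of possible connections between its users:

```python
def network_value(users: int) -> int:
    """Potential pairwise connections between `users` members
    (Metcalfe's law, used here purely as an illustration)."""
    return users * (users - 1) // 2

# Doubling the user base roughly quadruples the potential connections -
# the kind of positive feedback described above.
print(network_value(10))   # 45
print(network_value(20))   # 190
```

The exact formula is debated, but any value function that grows faster than the user count produces the same self-reinforcing growth.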
It isn’t possible to single out any one factor in the development of cloud computing – like many significant developments, there is a set of interacting and reinforcing factors that together set the conditions for it. In no particular order, here are some of the factors that set the conditions in which the cloud can develop and be a natural resource.
Internet access
Good Internet access is a necessity for Cloud computing and over the last few years Internet access has become progressively cheaper, faster, more reliable and more pervasive – it is increasingly common to be able to get on the net from almost anywhere in the developed world.
In order to use and develop cloud computing you must have good internet access.
Collaboration
Education and work today ask for and require more collaboration.
Collaboration is something that has been “bolted on” to “old style computing” and can be difficult. Trying to work on a document with a group of colleagues using email or file sharing is frustrating and version management is a nightmare. Email attachments fly backwards and forwards, or users can’t edit a shared document at the same time.
The cloud is natural for collaboration – collaboration has been built in from the beginning and the resources are naturally “out there” and accessible to collaborators.
One other big advantage of using the Cloud for collaboration is compatibility – collaborators only need a browser and an Internet connection. With “old style” IT each collaborator would have to have the same application installed on their computer to access the files being used for collaboration. If using Word, for example, each person would need to have Word installed on their computer (buy it, install it, maintain and secure it) and then have the right version to read the files (Word 2003 cannot natively open Word 2007 files). Using the Cloud, people can collaborate from computers running systems from Microsoft, Apple or Linux, for example.
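The contrast between emailing copies around and editing one shared document can be sketched in a few lines (a hypothetical toy model, not any real collaboration API):

```python
# Toy model only - not a real API.

# "Old style": each collaborator edits a private emailed copy,
# so the copies drift apart and must be merged by hand.
copies = {"alice": ["draft v1"], "bob": ["draft v1"]}
copies["alice"].append("alice's change")
copies["bob"].append("bob's change")
print(copies["alice"] == copies["bob"])   # False - two conflicting versions

# "Cloud style": one shared document, so every edit lands in the
# same place and everyone always sees the same single version.
shared_doc = ["draft v1"]
shared_doc.append("alice's change")
shared_doc.append("bob's change")
print(shared_doc)
```

The point of the sketch is simply that a single shared copy removes the merge step entirely.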
24/7 Mobile and remote work
Education and work today ask for and require more flexible mobile and remote working – from homework to work placement and partnerships outside the organisation.
External access is something that has been “bolted on” to “old style computing” – organisations use network “firewalls” to protect their private networks and providing external access to these private resources is awkward. Either resources are placed outside the firewall (in a DMZ) or “tunnels” are provided to allow external access to internal resources. As more and more people require external access the whole concept of firewalls and tunnel access becomes difficult to sustain.
The cloud is a natural for flexible mobile and remote work – the resources are naturally “out there” and accessible from anywhere with Internet access. You can create a document in the cloud and work on it at home, in work, at a meeting with a partner organisation and so on.
Consumerisation and Personalisation
Education asks and requires more personalisation. The IT industry is increasingly focused on consumer issues rather than corporate issues. Students and young workers are comfortable with IT and can use their own resources to get things done.
“Old style computing” was formed from business use of IT and is focused on control – the ability of users to “do their own thing” is designed out of such systems. No wonder IT “users” in companies get frustrated with corporate IT. This is all made worse by the “bloat” and complexity of modern applications, which makes it so difficult for “normal” people to look after their “old style” IT – they become dependent upon the IT department or those who know how to deal with this stuff. Cloud systems avoid all this – no need to spend hours installing applications and dealing with computer issues like driver problems – just point a browser.
The cloud is a natural for consumerisation and personalisation – the resources are naturally “out there” and available for people to use on their own terms. People can choose and use their own communications tools and applications, from social networks to webmail, cloud documents, microblogging and so on. It is possible for people to make use of the “natural resources” of the cloud and get things done themselves without having to wait for overworked corporate IT departments to come along and do it for them.
Security and continuity
More of our lives are mediated by the Internet and criminal activity on the net is increasing and becoming more sophisticated – the security of our on-line presence is increasingly important.
“Old style computing” was formed from business use of IT on private networks and “standalone” isolated computers – security grew out of physical access to IT (from being inside a firewall to access to the computer itself). Security measures regarding the Internet and the unknown have been “bolted on” and have proved difficult and only partially effective – consider the monthly security patches for Microsoft software through to the very concept of a firewall. The problem for traditional computing is that people and organisations need to keep abreast of security issues and practice and to secure the increasing amount of equipment they use.
The cloud has been built with security in mind – rather than starting life isolated behind a firewall or cut off from a network, the cloud is naturally “out there” and exposed from the start. The big advantage is that security can be delegated to experts in the cloud – I’m sure that dedicated experts at Google, Amazon or Microsoft can keep their cloud systems more secure than I can keep my computer, for example.
The “old style” of IT uses locally running applications and files, such as Word and Word files (although these applications and files may be delivered from a server, they run on the local computer). Both of these present security problems. The local files and applications are a target for viruses and hackers – the majority of viruses are now aimed at applications rather than operating systems. The other problem is that people carry these files around or email copies – these local files can be accessed by anyone with physical access to a computer or storage device, and the majority of data breaches have involved lost or stolen laptops and removable storage.
Jack Schofield recently talked about cloud security – comparing the cloud to Fort Knox and traditional IT to gas stations – the “cost-benefit” involved makes targeting small installations a better option than targeting large, well secured installations. A recent article at silicon.com describes the advantages of using the cloud for security.
Another aspect of security is business continuity and availability. “Old style computing” developed before the Internet and required you to buy, install and maintain your own computers and for people to become IT experts. Increasingly, people and organisations want to spend their time and money using IT rather than dealing with IT (hardware and software complexities, for example). A small company no longer needs to employ its own IT staff and run its own servers for file sharing, databases, web and email – these resources can be accessed from web browsers while the likes of Microsoft, Amazon or Google manage the hardware and continuity – take a look at Microsoft’s “Fort Knox”. If I had a penny for every person I have come across who has lost files to problems with removable media or computer hard drives I would be a wealthy man – forget local data corruption or loss – place your data in the cloud.
In summary, where you have good Internet access and wish to develop any of collaboration, 24/7 mobile and remote work, consumerisation and personalisation and have security and continuity concerns then these are ideal cloud conditions.
Application, storage and communications services operating on the Internet and accessed through a web browser.
The Internet is a network of networks and the phrase “Cloud Computing” comes from internet diagrams that use a cloud symbol to hide the complexity of the way networks are connected. My network is connected to your network somehow through the “cloud” – I don’t need to know the details of how this happens. I only need to know how to connect to my internet service provider.
The idea in the use of the phrase “Cloud” is to simplify and hide complexity and to focus on service.
If you use a webmail system (e.g. Googlemail, Hotmail) then you already have some experience of cloud computing.
Many older email systems operated as client–server. The client was an email program that you needed to install on your computer. Servers provided a system for email clients to send and receive email messages between each other. Messages were downloaded and stored on the client computers. This meant that messages stored on one computer wouldn’t be available to you on another computer.
Webmail combines the application, storage and communications aspects of email into one service available from any computer with a web browser and an Internet connection. Webmail provides a good, familiar example of what cloud computing is all about and provides my definition of cloud computing – application, storage and communications services operating on the Internet and accessed through a web browser.
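The difference can be sketched as a toy model (hypothetical, not real mail-protocol code): a POP-style client typically downloads messages to one machine, while webmail leaves them with the service.

```python
# Toy model only - not real mail-protocol code.

# Client-server, POP style: messages are downloaded to one machine
# and (typically) removed from the server, so other computers
# no longer see them.
server_inbox = ["message 1", "message 2"]
home_pc = list(server_inbox)   # download to the home computer
server_inbox.clear()           # the server copy is gone
print(server_inbox)            # [] - nothing left for the work PC

# Webmail / cloud style: the messages stay with the service, and any
# browser session simply reads the same server-side store.
webmail_store = ["message 1", "message 2"]
view_from_home = webmail_store   # same data...
view_from_work = webmail_store   # ...from any machine
print(view_from_home == view_from_work)   # True
```

Keeping the data with the service, rather than on one device, is the property that makes webmail the archetypal cloud application.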
Webmail has been around for ages now but the development of new web programming technologies in recent years has allowed the advantages of cloud computing to be applied to most areas of computing – hence the rapid development of new applications – here are a few popular examples of cloud computing:
For more information about Cloud computing follow the links below
The blog offers an outline of the main approaches to provisioning systems and offers some categories to help you when considering and selecting systems.
It is possible to build much of a PLE with any of the system approaches below, but bear in mind that real world systems will be a combination of some or even all of them.
Product – on-site
Using a supplier’s product and installing, developing and maintaining it on-site.
Product – hosted
Using a supplier’s product but having a 3rd party host the system for you – you manage and access it across the net.
Service – “cloud”
Using a supplier’s service – you are not aware of the underlying technology or system, only of the service you get, e.g. access to email, blogs and shared workspaces.
Bespoke
Using your own specialists to program and design your own system.
This could be on-site, hosted or in the cloud.
An example of this approach is Centime, which originated at EHWLC and is developed in partnership with a small number of other educational organisations.
User owned
Using and integrating whatever the users (learners and staff etc) choose to use.
Examples are the use of people’s own on-line identity (e.g. openId), email, blogs and social networks integrated with organisational data.
Evaluation Criteria – Outline
The headings below can be used as an outline for more detailed work when it comes to considering what system is right for you.
What ownership options does the system offer regarding the use and fate of material held in the system, and do they meet your requirements for ownership.
– What happens when the author leaves the organisation.
– Who can say how the resource is published (private to organisation, public to specific users or fully public)
– “Copyright” – who decides if it can be copied and by whom
– What happens when there is a disagreement about a resource – liability, conflict resolution etc
These questions can be considered for both staff and learner authored resources.
What service levels does the system offer for availability, security and performance and how do they meet your requirements for service.
What management options does the system offer and do they meet your requirements for management.
For example in provisioning and controlling user access and resources how easy is it to create accounts; change accounts; change access to resources; remove accounts and provision group resources, spaces and permissions.
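These management operations can be made concrete as a checklist-style interface (a hypothetical sketch, not any vendor’s real administration API):

```python
# Hypothetical sketch of the management operations above -
# not any real vendor's administration API.

class UserAdmin:
    def __init__(self):
        self.accounts = {}   # username -> set of accessible resources

    def create_account(self, username):
        self.accounts[username] = set()

    def grant(self, username, resource):
        self.accounts[username].add(resource)

    def revoke(self, username, resource):
        self.accounts[username].discard(resource)

    def remove_account(self, username):
        del self.accounts[username]

# The question to ask of a candidate system is how easy each of
# these operations is in practice, and at what scale.
admin = UserAdmin()
admin.create_account("jsmith")
admin.grant("jsmith", "group-workspace")
print(admin.accounts)   # {'jsmith': {'group-workspace'}}
```

Group resources, spaces and permissions would extend the same pattern with group-level grant and revoke operations.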
What data integration options does the system offer and how do they meet your requirements for data.
For example – how easy is it to use the system in combination with the data systems used by the organisation.
How does the system meet your requirements to adapt and change – does the system allow you to deliver what you want, and how easily can the system be developed to do what you want.
Teaching and Learning
What teaching and learning options does the system offer and how does the system meet your requirements for teaching and learning.
What user experiences does the system offer and how does the system meet your requirements for user experience. How easy is the system to use – is it suitable for your users.
How good a fit is the system with your organisational culture.
For example – do your people like a clearly defined framework to work within, or are they comfortable with experimentation and change.
Future potential and issues
What future potential does the system offer and does this meet your requirements for future developments.
What skillsets does the system require and do you have these or are you able to develop them, buy them in or contract them out.
What are the costs and the cost types (e.g. capital vs operational) of the system and can you afford them. Consider all the associated costs – the cost of equipment installation, maintenance and operation; software licences; staffing costs (training, development etc); and the costs of meeting the criteria above, e.g. in supplying the skillsets, data integration, service levels etc.