View infographic that accompanies this post.
In the UK we are used to accusations of being over-traditional and stuck in our ways, and our education system is no exception. In my final year at university I experienced American lecturers for the first time. They made fun of our class for expecting to just turn up, sit down and be lectured. They wanted dialogue, response, audience participation – they taught us in a way that, in contrast to my previous experience, seemed almost futuristic. But I was wrong – the future of education is not Americans making you say constructive things in lectures; apparently the future is in MOOCs. But what is a MOOC?
A ‘loser’ or a ‘moronic bonehead’? A cocktail bar in Leeds? A municipality in the Netherlands? Korean food? Or an acronym for Massive Open Online Course? It’s the latter, but just in case that doesn’t make things clearer – MOOCs are open-access courses that operate on a vast scale: available to anyone, online, for free. They are proving extremely popular – Coursera, which was set up by two lecturers at Stanford, has upwards of 1.5 million students enrolled from across the world, and many MOOCs are attracting tens of millions of dollars in venture capital.
In September of this year, the first students to pay £9,000 a year enjoyed freshers’ week in the UK. In anticipation of this threefold increase in fees, applications earlier in the year were down by 12%. Our higher education system is becoming a privatised marketplace where education is bought as a ticket to a bigger wage packet. Meanwhile another form of higher education is emerging in the form of MOOCs. The ideology behind the MOOC is that knowledge and education should be free, available to all and sought for their own sake. The aim is to democratise education – Sebastian Thrun, founder of the MOOC Udacity, claims “It’s the beginning of higher education for everybody”.
But where will this divide lead? 2012 is being heralded as ‘The Year of the MOOC’, but what will 2013 hold? In the marketplace of higher education it appears that nothing can compete with the MOOC; they are free and open to all. In July the first UK university – the University of Edinburgh – joined Coursera. So are MOOCs going to replace our outdated, corporatised universities? No: if we stop being scared of the success of MOOCs, we can see that the two systems of higher education are not necessarily in conflict; in fact, they are naturally complementary.
MOOCs need universities. Most MOOCs host courses directly from universities. Udacity, edX and Coursera were all started by lecturers or universities. It is unlikely that they would be aiming to undermine themselves; more likely, they see huge potential in elearning. These MOOCs derive credibility directly from the institutions that choose to offer courses through them. There are MOOCs that offer independent courses, most notably Khan Academy, but even these rely heavily upon universities. MOOCs are predominantly taken to supplement or refresh degrees; they are most valuable when used in this way, rather than as stand-alone courses. In 2004 the UK elearning university UKeU was scrapped, with officials claiming that “universities were more interested in “blended” learning involving a mixture of IT, traditional, work-based and distance learning to meet the diverse needs of students – rather than concentrating on wholly e-based learning”. The Gates Foundation is a great supporter of MOOCs, but its grants are mainly focused on the development of MOOCs “as a supplement to traditional courses, rather than a replacement for them”.
Equally, universities need MOOCs. MOOCs are essentially Learning Management Systems (LMSs) with the password restrictions removed. Most higher education institutions use some form of LMS to enrich teaching. For example, UCL uses the LMS Moodle, which is accessible to all current UCL students and staff with a UCL email address. Users can use LMSs to share resources such as documents, handouts and videos of lectures; to communicate, work together, and organise and structure work; and to administer online assessment. Universities can harness the potential of MOOCs to augment existing courses. The benefits of elearning cannot be denied, and universities need to adapt to stay relevant. The fact that so many students are signing up to supplement their current university courses suggests that universities are missing something. Opening up courses via MOOCs benefits in-class students by producing a more diverse class for discussion and greatly improved elearning resources.
There is not just one way to learn. Everybody enjoys learning different things, and in different ways. Perhaps we are witnessing the rise of a new way of teaching and learning, but it doesn’t necessarily have to replace the others. More choice can only be a good thing. As Stephen Downes, a MOOC founding father, stated: “MOOCs don’t change the nature of the game; they’re playing a different game entirely”.
This week we lament the loss of Ceefax. The information service has been running on televisions since the 1970s but is now outdated and underused. The digital communication of information has evolved and Teletext has died out. Ceefax is replaced by the BBC’s Red Button but ultimately the internet is winning in the digital communication jungle: it is more visually appealing, faster and contains far more information. But obviously this isn’t the end of the evolutionary line, so what does the future hold?
Several factors can influence the future of the internet – government intervention, corporate behaviour and us, the public. Government policy and business decisions shape internet supply, availability and functionality but we drive usage and demand. In response to unwelcome changes by the former, websites have been set up to complain or monitor effects, books have been written and large scale protests have taken place.
Tension is increasing between two opposing views of the internet – as a haven for freedom of speech and expression, or as something within the jurisdiction of legal and moral rules. The two are not necessarily mutually exclusive; offline, we normally consider ourselves to have a right to freedom of speech whilst at the same time remaining culpable for offences. Yet in the case of the internet, each side seems to believe that making even the slightest concession to the other would open the floodgates to a worst-case scenario – be that a heavily censored internet under complete government control or a hive of illegal and immoral activity.
The argument is between pragmatism and idealism: do we accept that the internet must be regulated in some respects, or do we maintain an ideology of the internet as free, universal and limitless? There is a huge debate surrounding the issue, with influential supporters on both sides, and the way in which resolution – if possible – occurs will dictate the future of the internet. Modern technology is ‘completely out of control’ according to the Lord Chief Justice, Lord Judge – but is this in practice or in principle? Sarkozy argues that the internet ‘isn’t a parallel universe’ – why should we allow anything online that we legitimately do not permit offline? Meanwhile, Neelie Kroes, the vice-president of the European Commission, calls for the removal of ‘digital handcuffs’, in agreement with Sir Tim Berners-Lee’s belief that the internet is ‘for everyone’.
But this is not just a verbal dispute – this year we have seen action on both sides. In the UK we have seen the removal of videos featuring and promoting gang culture from YouTube, a crackdown on illegal downloads and the proposal of an ‘opt-in’ system devised to protect children from online pornography. There have even been multiple arrests over offensive tweets in cases of racism and other types of abuse, prompting questions over whether this type of action is too ‘heavy handed’. From the other side, we saw websites such as Google and Wikipedia take part in a blackout protest against US government anti-piracy proposals, over allegations that they would lead to government censorship.
The issue with individual governments exerting control over the internet is that the internet, in that it consists of the world wide web, is intended to be world wide. Sir Tim Berners-Lee claims that ‘This is a question of principle, it’s a right to be able to access [the web] anywhere’. Government controls introduce localised differences, raising worries that the future could bring a series of fragmented, independent internets. This is already noticeable on a small scale – the internet looks different depending on where you are in the world. Many countries ban specific websites containing political or religious content, and some block social media sites completely. This year we have seen Twitter introduce and implement a new ‘Country Withheld Content’ feature, allowing the removal of specific content from one country only. It was recently used to remove neo-Nazi content in Germany and France, but not the rest of the world.
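The mechanics of per-country withholding are simple in principle: content carries a list of countries where it is withheld, and visibility is decided per viewer. The sketch below is purely illustrative – the names, data structure and function are assumptions for this post, not Twitter’s actual implementation or API.

```python
# Illustrative sketch of per-country content withholding.
# Each piece of content maps to the set of countries (ISO codes)
# where it must not be shown; everywhere else it stays visible.

WITHHELD = {
    "tweet_123": {"DE", "FR"},  # withheld in Germany and France only
}

def visible(content_id: str, viewer_country: str) -> bool:
    """Return True if the content should be shown to a viewer in this country."""
    return viewer_country not in WITHHELD.get(content_id, set())
```

The point the sketch makes is that the same content exists globally but is filtered at display time, which is exactly why the web looks different depending on where you are.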
Perhaps protestors are too idealistic in regarding the internet as something ‘universal’, because this is an ideal rather than the reality of the internet as we know it. The internet did not begin freely open to all and is only now being restricted – it may be unrestricted as an idea, but as an actual entity it is limited by hardware and physical infrastructure, which are not equally available everywhere. A digital divide has long existed between developed and developing countries, preventing equal access to the web. Seen in this light, the recent government interventions we have witnessed look less like a drastic and sudden attempt at control.
So is the internet out of control, uncontrollable or beyond control? Which side is right, or perhaps more importantly, which side wins will shape the future of the internet. We can’t predict the future of the internet as clearly as these children from the 90s but one thing is clear – hyperbolic slippery slope arguments are not what we need, because if we remain at a standoff then we miss opportunities for mutual benefit.
On 24th August this year Apple won $1.05b in a lawsuit against Samsung for patent infringement. This is just one stage in an ongoing battle between the two companies over intellectual property rights. Samsung challenged the verdict and has recently retaliated by extending its counter-claims to include the iPhone 5. It appears that the war between them is likely to rage on.
What does this mean for us, the consumers? If the situation continues with a constant back and forth of claims and counter-claims, we might lose interest, because it doesn’t seem to have any direct effect on us. Why should we care if one multi-billion dollar company has to pay another a billion dollars?
We should care if we want newer designs, more choice and more innovative mobile phones. Here’s why: intellectual property patents, which are the subject of such disputes, are designed to protect ideas. They protect the investments made in the generation of these ideas. New ideas lead to new innovations and as consumers we benefit from new innovations because they provide us with more choice and better products. The Apple/Samsung dispute raises the issue of whether these same intellectual property patents can sometimes stifle creativity and innovation instead of protecting them.
US Judge Richard Posner recently claimed that the US system of patent protection can be “excessive”, and many commentators have questioned the relevance of intellectual property patents in a digital age, given the incremental nature of technological advances. Some go a step further and claim that by preventing imitation, patents prevent innovation. This is the philosophy of highly popular open source systems, which are owned by no one and freely available. According to this side of the argument, patents create barriers to competition – and perhaps Steve Jobs would have agreed, having once expressed the sentiment “good artists borrow, great artists steal”.
It is a problem if patents stop acting as incentives for companies to invest in R&D and instead shelter companies from industry competition. Competition is good; it is what drives business forward. Industries develop and grow through companies learning from and building on each other; if they don’t – or indeed can’t – do this, then progress is slowed and the consumer loses out. However, it has been a long time since phones just phoned, and if additional functions are recognised as an industry standard they can be protected by essential patents. These are licensed to competitors on ‘fair and reasonable terms’ in order to prevent barriers to innovation and competition.
But it is the presentation of such functions, and their interaction with users, that forms the focus of recent disputes. Apple’s legal claims against Samsung are focused on physical design, visual design and features such as ‘scroll-down and bounce up’ or ‘tap to zoom’. This is where the other side of the argument surfaces: patents – even if excessive – force companies to invent different approaches to products in order to compete. This type of competition is arguably more valuable because it creates new ideas. Different user interfaces may introduce switching costs for consumers, in terms of time spent learning to navigate a new handset, and different software complicates app design, but they do provide the consumer with a real choice between alternative products.
We need to find the best way to protect ideas, promote competition and create incentives for companies to keep producing new, exciting, cutting edge designs. Perhaps patents – at least in their current form – are not the best way to do so. The Apple vs. Samsung case is so complicated that it may never be fully resolved but spending large amounts of money, time and effort in court doesn’t seem the most efficient way to try.
Glancing through the myriad predictions which are spat out every January (“The 37.5 Biggest Things in Digital in 2011!”), an interesting trend emerges – namely, counter-trends. 2011, say some commentators, will be the year that people turn off. The arduous quest – for information, for connectivity and for communication – has reached a kind of saturation point. People have had enough.
That is not to say that there isn’t a wealth of new, exciting tech waiting to invade our collective consciousness in 2011. Near Field Communication (NFC) looks set to make a big impression this year. The wireless data exchange technology inside Oyster cards is rumoured to be a feature of the iPhone 5, with Google Android and RIM also announcing that they will be releasing NFC-enabled phones in 2011. By this time next year, tapping your phone will probably be the standard way of negotiating such troublesome physical obstacles as train barriers, hotel doors and venue bouncers. Likewise 3D printing is deemed so important by one tech blog that it is afforded its very own 2011 preview list.
But despite (and partly because of) all these exciting advances, along with the falling price and rising availability of 2010’s technologies, another pattern is appearing. Influential agency JWT identify ‘digital downtime’ as one of their 100 Things to Watch in 2011. Their prediction that such breaks from the technology will be commonplace in an attempt to ‘foster creativity’ seems to unfairly pre-suppose that ‘digital uptime’ (a real feature of 2010) somehow stifles creativity. But you can see the point – there are already signs of a growing nostalgia for traditional practices (personal service, knitting) and things (vinyl, physical books).
Likewise, there has been a decline in the fervour for transparency which characterised the politics of those most unlikely political bedfellows, Barack Obama and Boris Johnson. Remove from your mind the image of those two fellows sharing a bed for a moment, and you will realise that developments in 2010 have changed people’s attitudes to the distribution of information. As the WikiLeaks saga developed at the end of 2010, perceptions of Julian Assange’s role became more ambiguous. Leaving aside security concerns around the disclosure of some pieces of communication, international diplomacy operates on the basis that some things should remain private. Similarly, Vince Cable received criticism when his comments about Rupert Murdoch were revealed, but there was also a feeling that private conversations between politicians and their constituents should remain just that: private.
And this, of course, leads us to Facebook. The controversy over its privacy settings has raised serious concerns about the ways in which personal data is collected, stored and shared on the internet. David Fincher’s excellent The Social Network interestingly highlighted Mark Zuckerberg’s ideological belief in the sharing of information – this belief is not, it turns out, shared by everyone. Facebook’s complicated web of privacy settings is seen by many as pernicious and exploitative. It has led some users to look to alternatives. Elsewhere, as previously discussed in this blog, concerns have been raised about location services.
All of this presents a challenge to all those involved in shaping the way that people use the internet. For me the answer can be found not in switching off completely, but in two other new developments. Firstly, the establishment of clearer and firmer rules on how users are ‘used’ by the big players – and secondly the acquisition of more power and independence by those users.
Social media has already destroyed the traditional one-way relationship between brands and consumers, between broadcasters and audiences. Now, with the coalition government championing a big society (whatever your views on that), 2011 will see the growth of businesses, outlets and schemes set up by the people, for the people. WhipCar, the p2p car sharing network, is a good example. Freecycle, which allows users to donate unwanted items rather than discard them, is another. Meanwhile Made.com and Naked Wines have enjoyed great initial success by taking consumers direct to manufacturers – cutting out the middlemen and bringing down prices. With developments like these, 2011 promises to be an exciting year for digital.
It was only ever a matter of time before our two main channels of media communication were united. The Internet has revolutionised everything from accessing news to purchasing music – our social lives are managed online, and now the opportunity to transform television has been given the green light.
For those of you who are new to the idea, YouView (née Project Canvas), in a nutshell, is TV delivered over the Internet. It is a collaboration between broadcasters (the BBC, ITV, Five, Channel 4 and Arqiva) and broadband network providers (BT and TalkTalk) to develop a subscription-free, web-linked TV service combining Freeview digital channels with on-demand content such as iPlayer. This long-awaited IPTV project, hailed as the ‘Holy Grail’ for future public service broadcasting by BBC Director General Mark Thompson, promises to ‘change the way we watch television forever’, and is coming to our living rooms in early 2011. Such proclamations are to be expected from one of the project’s main backers – but they leave the rest of us wondering whether we really need another set-top box to add to our collection and whether IPTV really is the way forward.
The answer from the YouView consortium is, unsurprisingly, a resounding ‘yes’. It maintains that this simple and free-to-access service, with its easy-to-navigate interface, will soon be a necessity for all UK homes. YouView Chief Executive Richard Halton says the scheme is a great alternative for those who lack the ability or inclination to pay a monthly subscription for similar services offered by companies such as Sky and Virgin. These rivals are predictably unimpressed by YouView’s developments. But their complaints to Ofcom that YouView will stifle competition are undermined by the fact that they will always have the lure of additional premium channels to tempt viewers.
The evolution of Project Canvas has been something of a roller coaster. It didn’t exactly have an easy start, with the failure of a similar BBC project (Kangaroo) back in 2008 still looming and vocal criticism coming from the likes of Richard Branson and Rupert Murdoch. To make matters worse, Five opted to pull out of the deal in July (they later decided to rejoin). Recently, the scheme has earned a little more support, and Project Canvas was re-christened ‘YouView’, a name touted for some time, in September. Perhaps it’s just a happy coincidence that this name bears an uncanny resemblance to both Freeview itself and a certain global video sharing site owned by Google. A more appropriate moniker might have been ‘iView’, in keeping with ‘iPlayer’ or, better still, ‘iTV’ – although I’ve definitely heard the latter somewhere before.
In terms of functionality, YouView will enable you to watch so-called ‘Linear TV’ (the channels currently offered via Freeview and Freesat) as before, along with video-on-demand services like iPlayer and 4oD. In addition, you’ll be able to access popular sites like YouTube, Facebook and Flickr and on-demand pay TV – films, US drama and sport – all with a wave of your remote control. A recent YouView press release also boasted that it would be a potential platform for local TV services, making it ‘easier for viewers to discover and interact with localised content’.
It’s true that there’s nothing particularly revolutionary about YouView. It has the usual suite of features you’d expect – HD, a video recorder and the ability to pause/rewind live TV – but what it does do is combine this with the enormous potential of the Internet in one nifty, take-home box. The fact that VoD services are available on something other than a laptop screen (or a Virgin Media package) will be the biggest draw for some.
On top of this, as an open platform, YouView is set to boast a whole array of interactive features – apps, widgets, games, you name it. This presents a massive opportunity for content, device and application developers to dip their toes into the IPTV market. The implications for viewers (or perhaps ‘users’ would now be a better term) look exciting.
It will be interesting to see whether this BBC-backed venture pays off. As competition to take over the small screen hots up from a clutch of other big names like Apple and Google, we have to wonder whether YouView will be the one to make the cut. If you’ve been following YouView’s development, or would like to comment on any of the above, please get in touch!
Just a quick one, but I felt this needed more than a single line in the feed on our homepage. I had an illuminating meeting at the Leading Edge programme at the SSAT this Monday. It’s rare that you meet someone in such total agreement with you as to what technology can do in the classroom, and I would have walked away sufficiently impressed by that occurrence alone were it not for the little demo I got of the SSAT’s learn AR tool. You’d think that AR in the classroom would start off as a gimmick, but for the most part this stuff goes beyond just allowing students to visualise things more clearly – it allows them to do things they might not otherwise be able to.
Cue a Geiger counter experiment (can’t get a screenshot for love or money) that some schools can’t carry out because they can’t get hold of the right materials: one marker for the counter, one marker for the radioactive source, and another to represent whatever you’re putting between them to compare the absorption of different materials. Engaging, safe, cheap, magic. Not surprised it was a hit at BETT.
As the dust settles on this year’s BETT Show, bloggers have been frantically sharing their thoughts on the 2010 instalment of the educational technology behemoth.
It was my first time. I had been given many warnings as to the overwhelming nature of an event which brings together 30,000 people amongst more green and purple than a Teenage Mutant Ninja Turtles convention. But none of the warnings could have prepared me for the sheer scale of BETT.
It was really nice to see mycurriculum.com get a lot of visibility and attention on QCDA’s stand. The website is looking really good now and it was great to see the branding up and demos taking place.
Ray Barker, Director of British Educational Suppliers Association (BESA), the trade association for the educational supply industry, identified two major themes of this year’s BETT in an interview with Teachers TV. Firstly, Mr. Barker said that this year’s show was “very practitioner-led”, with a focus on professional development and training for teachers.
Secondly, he emphasized the importance of “pupil voice, learner voice” and of “the kinds of technologies that young people are using.” Google and YouTube both exhibited for the first time this year, and the Playful Learning area seemed to be a big hit too – at least with the students who were taking part in the gaming. Some bloggers have commented that there may have been too much emphasis on the “playful” and not enough on the “learning” here. The pupils certainly weren’t complaining.
Whatever the value of the games exhibited here, this seems to me to be a worthy shift in attitude (if indeed it is a shift in attitude). The potential for fun on show at BETT – from 3D video to “serious” gaming – is encouraging. Schools have traditionally tended to fear technology, often feeling more inclined to ban new devices than integrate them into the learning experience.
If BETT 2010 does mark, or at least reflect, a greater willingness to blur the boundaries between work and play and to help pupils enjoy learning more, then this can only be a good thing for young people and those children just entering the education system. In fact I rather envy them.