Cloud Working + MOOCs

///

Making It Count

June 15, 2012 – 3:00am
Massive open online courses, or MOOCs, are not credit-bearing. But a pathway to college credit for the courses already exists — one that experts say many students may soon take.

That scenario combines the courses with prior learning assessment — a less-hyped potential “disruption” to traditional higher education — which is the granting of credit for college-level learning gained outside the traditional academic setting.

Here’s how the process could work: A student successfully completes a MOOC, like Coursera’s Social Network Analysis, which will be taught this fall by Lada Adamic, an associate professor at the University of Michigan. The student then describes what he or she learned in that course, backing it up with proof, in a portfolio developed with the help of LearningCounts.org or another service, perhaps offered by a college.

Read more: http://www.insidehighered.com/news/2012/06/15/earning-college-credit-moocs-through-prior-learning-assessment#ixzz1y96PgtqR
Inside Higher Ed

///
frog design (@frogdesign)
27/07/2011 21:06
Why it’s okay to outsource your memory to the cloud: http://bit.ly/naASGV

Why it’s okay to outsource your memory to the cloud

Jul. 15, 2011, 10:03am PT

Science magazine has published some research into how our memories are influenced by the availability of computers as a source of information, and this has some in a tizzy about the implications of outsourcing our brains. Author Nick Carr, for example — who has written a whole book about how the web is changing the way we think and making us more shallow — says he worries this phenomenon is going to make us less human in some way. But is that really a risk? I don’t think so. I, for one, am glad to outsource the duty of remembering miscellaneous facts to the cloud, because it leaves me free to do more important things.

In a nutshell, the Columbia University psychologists who published the study performed a number of experiments designed to test whether subjects remembered certain things better or worse when they were told that the information — such as “An ostrich’s eye is bigger than its brain” — would be stored in a computer somewhere or would be available through a search engine. Not surprisingly perhaps, people’s memories were somewhat less reliable when they knew the answers they were seeking would be stored for later retrieval (there are more details at the Columbia website).

Implanting forgetfulness?

Carr says he’s worried that by losing these facts and details we store elsewhere, we will become less human in some way, or lose some core of ourselves. But is that really what’s happening? I don’t think so. It’s not like I’m suddenly going to forget my son’s first steps (oh, that’s right — I have daughters!) because I use Google to look up who starred in that movie we watched a couple of years ago, or to figure out who the head of the United Nations is. It’s worth remembering that the invention of writing triggered similar fears, as Plato reminds us in The Phaedrus, quoting the King of Thebes:

If men learn this [writing], it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder.

Carr also makes the argument in his book “The Shallows: What the Internet is Doing to Our Brains” that we are becoming not just dumber as a result of the web, but also (supposedly) less interesting, because our brains are being trained to focus on the ephemeral and the trivial instead of the important things we should be spending time on. I took issue with this kind of fear at the time, as did some others, and I think Carr is being similarly alarmist in this case. Besides, if we use the cloud to remember the trivial and ephemeral for us, wouldn’t that be a good thing by Carr’s definition?

Do we still need to memorize things?

I know that in my parents’ time, memorization of huge lists of facts and figures and Shakespearean sonnets was standard, because that was the criterion by which knowledge was judged. But what difference does it really make if I can’t remember when the War of 1812 was? (That’s a joke, by the way.) Is my experience of the things that matter in life going to be impaired because I don’t know who signed the Magna Carta? I can see how this would be a problem if a trivia game suddenly comes up while I am camping in the woods, but other than that, I don’t see why I shouldn’t outsource that to the cloud — the same way lots of people used to outsource it to Encyclopedia Britannica.

As one commenter on Google+ mentioned when I shared the Science magazine article in my stream, the benefit of having something like the Internet available at all times is that it is the most comprehensive collection of knowledge ever invented (although obviously not all of it is correct). How can that not be a good thing? Said Justin Fogarty:

The plus side is that the whole of human knowledge is nearly at our fingertips. I will not miss card catalogs, the Dewey decimal system or heavy book bags.

Computers can’t really replicate memory anyway. All they can do (so far, at least) is store facts — but facts are not memories. What real memories are made up of is smells and sounds and emotions, and no computer or cloud-based system can store those things. But what the cloud can do quite well is store my phone numbers and the photos I took on a particular day or the tweets I sent (something an app called Momento is extremely good at) and leave me free to relive the memories associated with those facts.

To me, that’s a fair trade — the cloud remembers all the boring and mundane details and facts of my life (yes, I use Facebook to remember when people’s birthdays are, as I expect a lot of people do) and I get to focus on the things that are really important.

Post and thumbnail photos courtesy of Flickr users Stefan and Tim O’Brien


___*

Pete Cashmore (@mashable) 19/07/2011 16:38
Why Every College Should Start Crowdsourcing – http://on.mash.to/oTmw71

Why Every College Should Start Crowdsourcing

Michelle Lindblom

Michelle Lindblom is a Communications Associate at JG Visual, an Internet strategy company that works with organizations to develop and implement their online presence. You can connect with Michelle on the JG Visual Facebook Page and on Twitter.

Just like all large organizations, universities have their fair share of problems. These problems come in all shapes and sizes and can sometimes seem unrelenting. The usual method of attack for solving these problems involves sending out surveys, forming committees, setting up forums or hiring consultants. For any college that craves a change from the norm, it may be time to turn directly to the community for a solution.

This is where crowdsourcing comes in. Crowdsourcing refers to the notion of outsourcing to a crowd (hence the name). Essentially, when an organization needs a solution to a problem, instead of investing time and money into generating a solution internally, the organization opens up the problem to a crowd of people for mass collaboration. This method of decision-making is a perfect fit for universities. Read on to find out why.

___*

History of MOOC

The term MOOC is said to have been coined by two separate individuals: Bryan Alexander (http://infocult.typepad.com/infocult/2008/07/connectivism-course-draws-night-or-behold-the-mooc.html) and Dave Cormier (http://davecormier.com/edblog/2008/10/02/the-cck08-mooc-connectivism-course-14-way/). The label was loosely applied to a course (CCK08) organized by George Siemens and Stephen Downes (who, incidentally, runs an incredibly interesting educational newsfeed called OLDaily, to which you can subscribe). CCK08, or Connectivism and Connective Knowledge, was a fully open course that could be followed online and for free (there was also a paid, certified option). The idea behind the title of this course is important, as it derives from Connectivism, a theory which (paraphrasing heavily here) says that learning and training in this era will be successful if we learn how to connect and build relevant networks. This idea of connecting to each other to construct knowledge is one of the key dynamics of a MOOC.

If you want to read up on the subject of MOOCs or Connectivism, have a look at the references page of this guide.

After this first CCK course, several other courses followed: CCK09, PLENK2010, CCK11, LAK11 (Learning Analytics and Knowledge)… but as with all great things, these courses did not emerge out of the blue. The MOOCs were following the trend of the Open Education movement described by Iiyoshi and Kumar (2008). The open educational movement focused on open technology, open content and open knowledge. The MOOCs have given rise to a more specific focus on the actual human networking factor within these open courses. Open Online Courses (OOCs) “are live courses, which include direct participation of teachers and rich and valuable interaction among participants” (Fini, 2008, p. 3). This shift from technology to the participant is an interesting one, as it coincides with the research focus shifting from (mobile) technology to sociology, pedagogy…

Dave Cormier has recorded some great videos describing both what MOOCs are and how they function. The videos are short and enlightening.

What is a MOOC?
Success in a MOOC

___*

George Siemens on Massive Open Online Courses

photo of Michel Bauwens

Michel Bauwens
14th May 2011


George Siemens, pioneer of connectivist learning, in an excellent interview conducted by Howard Rheingold:

“George Siemens, at the Technology Enhanced Knowledge Research Institute at Athabasca University, has been running “Massive Open Online Courses” (MOOCs). I talk to him about what a MOOC is, how it works, and the educational philosophy behind it.”

___*

http://designmind.frogdesign.com/

Why iCloud Will be as Important as the iPod

By Adam Richardson – June 13, 2011

Apple’s Worldwide Developers Conference keynote last week will be remembered for two things: the bloodbath of disrupted developers and apps it left in its wake, and that it was as important for cloud services as the iPod was for digital music and the iPhone was for smartphones.

The Developer Bloodbath

Despite the many cheers from the crowd of developers at the keynote, I reckon there were several hundred third-party developers and apps collectively put on notice (and maybe put out of business) by the various announcements. As the NY Times wryly put it, “How do you know if you’ve created a really great, useful iPhone app? Apple tries to put you out of business.” (The Times provides a handy list of apps now scrambling for a second act.)

In truth, quite a few of the things that Apple announced – such as a basic to-do list app, and ways of storing web articles offline for later reading – have become such fundamental needs for so many people that they deserved to be part of the core OS. Unfortunately, they are also the bread and butter of many niche developers who saw the same need and leapt to fill it in the intervening years. They will have to rethink and improve what they do, and many of them will, I’m sure.

Such is life in the shadow of an ecosystem behemoth. Apple giveth (the App Store, which gave independent developers more visibility and access) and Apple taketh away (obviating the need for those apps in the first place).

Apple has been pretty consistent in adopting good ideas from third parties into its core offerings. Perhaps most famously, Apple introduced the Dashboard feature (a precursor to the iconized app view on the iPhone), to loud complaints that it was ripping off a third-party developer, Konfabulator, which had created something very similar.

As problematic as this can be, it’s all part of Apple’s plan. Chetan Sharma put it succinctly: “Apple’s goal is to commoditize the software, Microsoft’s goal is to commoditize the hardware, Google – both”

Apple has a high tolerance for making software free, even if it makes life painful for its developers, because it makes almost all of its profit on hardware. For the time being at least, Apple has enough strength and/or momentum relative to Google, Microsoft, media companies and service providers that it can thrive with this approach.

The Mainstreaming of Cloud Services

The announcement of iCloud was met with both enthusiasm and incredulity.

Apple has been firing on all cylinders for years with hardware and software, but has consistently stumbled with services, whether it be the expensive and lackluster MobileMe (the launch of which even Jobs had to admit at the keynote was “not our finest hour”), or the weak reception to its music “social networking” service Ping. (This isn’t a new phenomenon – anyone remember eWorld?) The only service area where Apple has really sung is with its retail stores.

With iCloud, Apple is cinching up the ecosystem it has painstakingly built up, cinching it so tight that it will become increasingly difficult for others – even ones as big as Google – to crack open.

MobileMe was an expensive, under-performing sideshow, but iCloud aims to reach deep into all the other Apple devices and make them all work together better. What was announced on Monday is surely only a hint of what lies ahead over the next 18 months as iCloud, iOS, and OS X all finally get in sync.

Ironically, iCloud aims to improve on what was arguably the worst part of MobileMe – iDisk, a basic cloud storage feature. Given Jobs’ obvious frustrations with MobileMe, I can’t believe he would let yet another half-baked attempt out the door, especially not one that is now a major strategic piece of the puzzle. Based on the massive data center Apple has invested in, they’re not joking around.

Since the iPad launched, its lack of a file system has meant it’s not a true laptop replacement. One of the brilliant ideas about Dropbox is that it essentially puts the file system in the cloud and moves it off the device entirely. iCloud apparently opens the door for the same thing, with even deeper integration. Today, with near-ubiquitous broadband and 4G/LTE networks starting to roll out that offer home-broadband speeds while mobile, this suddenly becomes a workable solution. (Bandwidth caps, tiered pricing, the disappearance of all-you-can-eat data plans? Yes, there are flies in the ointment, but the long-term trend is clear.)

Linking Cloud, Apps, Devices, and OS’s

Consider two things that were discussed separately in the keynote: journaling in the next rev of the OS, Lion (which means no more saving – a file is continuously saved as it’s worked on), and continuous cloud syncing. Voila – you have your most up-to-the-second work constantly saved to the cloud, and made available on every other device.
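
To make that combination concrete, here is a minimal sketch in Python of the pattern being described: continuous local saves paired with continuous cloud syncing. It is purely illustrative; the CloudStore and AutoSavingDocument names are invented for this example, and neither Lion’s Auto Save nor iCloud exposes an API like this.

```python
# Illustrative sketch only: CloudStore stands in for whatever sync service a
# platform provides; it is not a real Apple or iCloud API.
import hashlib
import time
from pathlib import Path


class CloudStore:
    """Hypothetical cloud endpoint that keeps every pushed revision per document."""

    def __init__(self):
        self._revisions = {}  # doc_id -> list of (timestamp, content)

    def push(self, doc_id, content):
        self._revisions.setdefault(doc_id, []).append((time.time(), content))

    def latest(self, doc_id):
        return self._revisions[doc_id][-1][1]


class AutoSavingDocument:
    """A document with no explicit 'save': every edit is persisted and synced."""

    def __init__(self, path: Path, cloud: CloudStore):
        self.path = path
        self.cloud = cloud
        self.doc_id = hashlib.sha1(str(path).encode()).hexdigest()

    def edit(self, new_text: str):
        # "No more saving": each edit is written to disk immediately...
        self.path.write_text(new_text)
        # ...and the same revision is pushed to the cloud right away,
        # so any other device can pull the up-to-the-second state.
        self.cloud.push(self.doc_id, new_text)


if __name__ == "__main__":
    cloud = CloudStore()
    doc = AutoSavingDocument(Path("draft.txt"), cloud)
    doc.edit("First sentence.")
    doc.edit("First sentence. Second sentence.")
    # A second device would simply ask the cloud for the latest revision.
    print(cloud.latest(doc.doc_id))
```

The point of the pattern is simply that persistence and syncing happen on every change, not when the user remembers to hit save.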

My feeling is that iCloud will prove to be similar to IBM launching its PC in 1981. Prior to that point, the PC market was highly fragmented and dominated by niche players, and had little mainstream appeal. The arrival of IBM on the scene gave PCs a stamp of credibility and stability, and they gained sharply more acceptance. IBM made PCs “easy” to get into, made them relevant, and created the archetype which others would mimic for decades.

Apple pulled off the same feat with mp3 players and smartphones, for largely the same reasons. So it will be with iCloud. Cloud services are not new (neither were mp3 players or smartphones), and the fact is that much of our critical data already lives in the cloud, via various web apps, service subscriptions, and email. But until now the various services have been poorly integrated, and offered by startups that many people don’t feel comfortable handing their data over to, whether for security or long-term availability/stability reasons.

They haven’t been ready for the mainstream, and iCloud will come to be seen as the turning point that changes that. For consumers who don’t yet get the relevance of the cloud, media syncing across devices provides the carrot to get them into the concept.

MG Siegler looks at the different approaches to the cloud being taken by Apple, Google and Amazon, and notes that “Apple’s belief is clearly that users will not and should not care how the cloud actually works.” Exactly. This is what Apple does best – taking complicated things that most people don’t care about and making them easy and understandable for a mainstream audience.

AVP of Marketing Strategy Adam Richardson is the author of Innovation X: Why a Company’s Toughest Problems are its Greatest Advantage. His book is the manual for leaders looking for clarity about the emerging challenges facing their businesses. You can follow Adam on Twitter @richardsona.

