By Roger Strukhoff
September 14, 2012 08:18 PM EDT
I remember thinking as a 22-year-old just out of college that I would be able to make wise business decisions because I'd read Plato's Republic. Even studied it in a Classics class, no less. Such is the hubris of youth, or at least of mine.
I'm drawing an analogy here with the question I raised in yesterday's piece about how to create a specialized cloud-computing college degree.
A quick review of some CS departments in the US finds requirements that include many semesters of programming, a few of software engineering (including working with frameworks), and some new-fangled, specialized courses for developing mobile apps. An intro to cloud computing can be found as a special topic here and there.
Would it be beneficial to education, IT, and humanity if a more-or-less standardized cloud-computing regimen were to emerge? And if it did, would it create a new generation of IT employees who thought they were now able to drive an enterprise-wide cloud strategy because they'd taken a course in RESTful services?
Actually, CS students should probably be required to read Plato's Republic to get a grasp on some of the ethical dilemmas they'll face through the use of the technology they design and deploy. Sun co-founder Bill Joy would routinely cite the Greeks when talking about how he approached his work.
What does it take to become proficient with cloud? Does it truly require a lot of experience (seeing what can and does go wrong, understanding past approaches that caused past problems) in addition to getting up to speed on the various platforms and environments one will encounter with cloud? How much can be taught at the undergrad level? What courses should be included, even if the newly minted grad cannot quite yet be deemed a cloud guru?