Commercial certification is now, very visibly, beginning to replace the older academic routes into the IT industry - so why has this come about? Industry now acknowledges that, when it comes to the relevant skills, certified accreditation from the likes of Cisco, Adobe, Microsoft and CompTIA is closer to the commercial mark - saving both time and money. The training concentrates on the skill-sets actually required (alongside an appropriate level of related knowledge), as opposed to spending months and years on the background detail and 'fluff' that computing degrees are prone to get tied up in, often just to fill out a syllabus.
I remember when I started in IT, there were no certification courses at all. Degree courses were pretty focused on the conceptual side of IT, and I was very lucky to start my career with a bunch of people who were very good at implementing stuff, because a) university taught me fuck all of use in the real world, and b) I could easily have wound up working for incompetents, which would probably have set my personal standards a bit lower.
But when certification came along, we were all pretty jaundiced about it. And when I was employing people, I was much more interested in people who wanted to be good at what they did than in people who had a bunch of letters, whether degrees or certifications, after their names.
And I still feel the same today.
However, I'm probably in a minority here, and for all sorts of reasons I reckon it's now probably better for you to have both a degree and some kind of technical certification under your belt. It gives your employer some confidence that you have both a theoretical and a practical grasp of technology.
It sucks, really. I really don't believe in certification, but I know that in today's highly competitive job market, having some kind of certification is almost mandatory.