Corporate leaders, especially those considered thought leaders, have every right -- and an assumed expectation -- to voice their opinions on public policy. As such, I've got no inherent issue with Facebook's founder and chief, Mark Zuckerberg, proclaiming his positions publicly -- even if I happen to strongly disagree with them.
Mark and other industry leaders have something most others do not: the power of the pulpit. It lends their opinions an inherent weight and reach that many could only dream of. Yet this advantage comes with a social responsibility not to abuse or otherwise misuse the privilege.
There are two subtle, yet very troubling, problems which stem from Zuckerberg's outright (now quite public) shaming of coworkers who disagree with his political viewpoint on Black Lives Matter. And thanks to Gizmodo, where news of this story first broke, the tainted prism through which Zuckerberg's position is being framed says it all:
Mark Zuckerberg Asks Racist Facebook Employees to Stop Crossing Out Black Lives Matter Slogans
The first area of contention I have, which most of the media has either avoided or forgotten to address, is the notion that Zuckerberg's coworkers who don't see eye to eye with him on Black Lives Matter are, as Gizmodo insinuated, racists.
Not only is this characterization blatantly false and self-serving, but it's also an assertion on which a majority of America disagrees with Zuckerberg. As public polling data (referenced below) shows, Zuckerberg's narrow view of BLM is in the minority, with most Americans -- and, most importantly, even most African Americans -- not holding a favorable view of the #BLM cause and preferring the All Lives Matter tagline instead.
The bigger issue at play is perhaps how this may be one of the more important recent examples of corporate figures crossing the line from acceptable political advocacy to outright unwanted thought policing of their staff.
As the face of the largest social network in history, Zuckerberg risks taking his personal beliefs beyond corporate policy positions and instilling a culture of zero dissent among the ranks of his workforce. This blurring of the lines between protected free speech and the future potential of, say, a Facebook Charter of Acceptable Free Speech could not only spell trouble for Facebook employees, but also set a terrible precedent for workers in other, non-IT industries.
It sounds crazy, Orwellian, and perhaps far-fetched, but do realize that this barrier is about to be crossed by one of the largest countries in the world: China. It comes in the form of a little-covered initiative, known as the Social Credit System (SCS), that will become mandatory by 2020 for all citizens of the state.
Some parts of the internet took aim at a chilling photo Mark released from his appearance at Mobile World Congress, showing him surprising hundreds of guests while they were immersed in Gear VR headsets. A possible allusion to a modern, Orwellian future dictated by the whims of our IT industry leaders? That's up for interpretation, but a reality some say may not be far-fetched.
Taking its cues from the American credit ratings system, this far-reaching program will not only take into account financial standing, loan history, purchasing history, and related aspects of personal life, but it goes one scary step further.
Relationships, gossip, social media presence, and political dissent will all factor into the "rating" each person gets. Just as the American credit rating system allows for the indirect control of unwanted behavior in financial decisions, China is hoping to keep its citizens' political thoughts and opinions of the government firmly in check. Yes, thought policing in its most acute and tyrannical form.
This is also the same country which still actively blocks Facebook access to nearly all citizens within its borders. It's not hard to believe that Mark Zuckerberg would love to expand his social empire into this untapped market.
And if that potentially means playing by China's rules, would he oblige? Would the SCS get unfettered Big Data taps into the never-ending hooks Facebook enjoys in its users' lives?
That's a great question, which makes you wonder. Given Mark's recent propensity towards directing intra-company censure on political issues he deems unworthy, where would the Facebook founder put his foot down?
Data Suggests Zuckerberg's #BLM Stance Doesn't Match America's
It can't be denied that Facebook is made up of employees who, just like the rest of society, represent America at large. Judging by the media coverage afforded Zuckerberg's blanketing of those who disagree with the Black Lives Matter tagline, the assumption being made is that Mark either holds the majority viewpoint or that Facebook's staff exists in a vacuum. Both assertions couldn't be further from the truth.
This is why publicly available data, which has been widely distributed online, is critical to establishing that these workers weren't "fringe" activists in any way. Judging by released polling data, a wide majority of Americans don't see eye to eye with the name or the motives of Black Lives Matter.
Numbers released by Rasmussen last August captured the mood of America at large when it comes to #BLM. An overwhelming 64 percent of interviewed blacks claimed that, contrary to Zuckerberg's beliefs, All Lives Matter, not Black Lives Matter, was the phrase which more closely aligned with their views. This sentiment held true for 76 percent of other non-black minority voters, as well as 81 percent of whites.
Even further, a plurality -- 45 percent of all those interviewed -- believe that the justice system is fair and equal to black and Hispanic Americans. And perhaps most shocking, a resounding 70 percent of voters believe that crime levels in low-income, inner-city communities are a far bigger issue than police discrimination against minorities.
It's not surprising that inner-city crime is so high on the list of concerns for average Americans. Inner-city death tolls are paraded across local news segments with such daily frequency that most people are numb to their dire meaning. And most acutely, much of this crime epidemic comes in the form of "black-on-black" crime, as penned by DeWayne Wickham of USA Today.
Wickham says:
While blacks are just 12.6 percent of the nation's population, they're roughly half of people murdered in this country each year. The vast majority of these killings are at the hands of other blacks.
And DeWayne's questioning of the #BLM movement's response at large continues, addressing its refusal to discuss black-on-black crime:
Why such a parsing of contempt? Maybe the people who've taken to the streets to protest [Trayvon] Martin's killing don't care as much about the loss of other black lives because those killings don't register on the racial conflict meter. Or maybe they've been numbed by the persistence of black-on-black carnage.
As such, the credibility of the #BLM movement at large continues to dissipate in the eyes of Americans. A follow-up Rasmussen study of 1,000 likely voters, conducted in mid-November 2015, captures the public's dismay with the group's approach and message.
According to the numbers published, by a 2-to-1 margin, voters don't think #BLM supports reforms to ensure all Americans are treated fairly by law enforcement. A further 22 percent of those surveyed just "aren't sure" whether the group truly supports its stated goals.
And on the question of whether blacks and other minorities receive equal treatment from authorities, the share of those who believe this to be true went down only one percentage point from the August 2015 study, to 44 percent.
A full 30 percent of African American voters said that #BLM doesn't support reforms, with another 19 percent not being sure. A sizable 55 percent of whites said the same about #BLM.
While the racial makeup of Facebook's employee base has been mentioned numerous times in the media recently, this has little bearing on whether there was just cause for Zuckerberg to so publicly throw his staffers' political viewpoints into the spotlight and frame them as malicious or, metaphorically, racist.
As the numbers above show, the overwhelming majority of the American public, including black Americans, considers itself better represented by All Lives Matter. Why is Zuckerberg's personal will and viewpoint more important than that of not only his coworkers, but, more importantly, that of America? It's a question I just can't find a reasonable answer for.
Let's make this very clear up front: while Zuckerberg framed his wrath through the prism of coworkers espousing racist feelings by replacing #BLM markings with the more inclusive #ALM alternative, this mischaracterization of the larger issue shouldn't fool anyone. Mark has a personal, vested interest in pushing his own beliefs onto Facebook's workers. This would never have become such a story if he had allowed public discourse to occur on its own, just as it does outside the protected walls of Facebook offices.
While scouring the web to try and make sense of Zuckerberg's interpretation that Black Lives Matter inherently, actually, really means All Lives Matter, I found numerous discussions online trying to justify how BLM places black deaths on a pedestal -- while being supposedly careful to stay inclusive, as its supporters duplicitously claim.
This roundabout, purposefully misguided framing of what most of America finds to be a divisive name is what naturally gives rational thinkers like myself cause for alarm. Why would these people, including Zuckerberg based on his wildly publicized reaction, not prefer the term which Americans at large say represents them best -- All Lives Matter?
Some say actions speak louder than words, and BLM has been doubly guilty of deeds which run counter to its stated goals of equality. Take, for example, the Nashville chapter of BLM's attempt to take a page out of the segregation era by trying to host a people-of-color-only meeting of its members at the Nashville Public Library. Mind you, this happens to be a taxpayer-supported institution and, as such, open to the entire public, regardless of color.
The Nashville BLM chapter leader, Josh Crutchfield, didn't deny that their branch has this rule. "Only black people as well as non-black people of color are allowed to attend the gatherings. That means white people are excluded from attending".
The chapter, naturally, claimed that the Library's decision to bar the meeting came down to white supremacy in the local government. If that reasoning makes any sense to the average person who may support BLM's objectives, I'd love to hear in what manner. That a taxpayer-funded establishment couldn't discriminate entrance based on color in the year 2016 is apparently quite a disappointment for Nashville BLM. I'm perhaps more surprised that the media hasn't picked up on this story to a greater degree.
In a similar display of rash white shaming and public berating of innocent bystanders, BLM protesters decided to hold an impromptu rally inside the Dartmouth College library -- with hooting and chanting in tow.
Protesters were reportedly shouting "F– you, you filthy white f–-", "f– you and your comfort", and "f– you, you racist s–". The college newspaper went into detail about the incident and the kinds of abuse that innocent students, who weren't even taking an opposing view, had to endure.
Is this the movement and approach that Zuckerberg so proudly decided to stand behind from his pulpit? While the rest of America judges BLM through a lens of totality that takes its actions into account, Mark is seemingly blinded by the idealism he perhaps wishes to establish as Facebook's public policy stance.
To put a crescendo on the discussion of BLM's shortcomings, I found the words of 1960s civil rights activist Barbara Reynolds to provide quite the clarity on the issues at play. She wrote a lengthy op-ed on how BLM has the right cause but absolutely the wrong approach:
At protests today, it is difficult to distinguish legitimate activists from the mob actors who burn and loot. The demonstrations are peppered with hate speech, profanity, and guys with sagging pants that show their underwear. Even if the BLM activists aren’t the ones participating in the boorish language and dress, neither are they condemning it.
If BLM wants to become a serious bearer of the message it is trying to convey, it needs to ditch the white shaming, hate rhetoric, and brazen acts of violence. And most importantly, it needs to distance itself from the distasteful actions taken by sizable portions of its base. Sitting idly by without denouncing bad behavior committed in a movement's name tars the movement at large. Perhaps only then would Zuckerberg have a leg to stand on when it comes to upholding the right and just side of history.
BLM, to its likely dismay, also doesn't care that the only African American formerly in this year's presidential race, Ben Carson, happens to agree that All Lives Matter more accurately reflects the sentiment he holds. "Of course all lives matter -- and all lives includes black lives," Mr. Carson said. "And we have to stop submitting to those who want to divide us into all these special interest groups and start thinking about what works for everybody".
Thought Policing Goes Digital in China: A Model for Zuckerberg?
If you're interested in getting a feel for the direction of an organization, or a country for that matter, just look at the stated motives of those in charge. And through this prism, China's goals for the Social Credit System overlap overtly with Facebook's own mission statement.
Per the Facebook Investor Relations website, Zuckerberg's corporate intentions for the company are quite simple:
Founded in 2004, Facebook’s mission is to give people the power to share and make the world more open and connected. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.
It's interesting, then, to read the stated intentions behind China's proposed SCS, as put forth by the State Council (per New Yorker):
Its intended goals are "establishing the idea of a sincerity culture, and carrying forward sincerity and traditional virtues," and its primary objectives are to raise "the honest mentality and credit levels of the entire society" as well as "the over-all competitiveness of the country," and "stimulating the development of society and the progress of civilization".
The words chosen for each directive's goals are markedly different, but the intersection of their larger motives is quite the same: making the world more connected and better as a whole.
Yet for the cautious outsider, it's not difficult to see why you may connect some logical future dots between Facebook and a state-imposed SCS like China is rolling out in 2020. Per the New Yorker piece on how one Chinese company is molding its variant of a SCS, "Tencent’s credit system goes further, tapping into users’ social networks and mining data about their contacts, for example, in order to determine their social-credit score".
It's no secret that China employs one of the grandest firewalls in the world, blocking its millions of users from "sensitive" topics deemed unhealthy and politically off-limits by the State. Facebook's website sits on this massive blacklist... so far. A little negotiating between Mark and the Chinese authorities may change matters quickly.
The SCS trials are already live in China, with users having access to mobile apps (one shown above) which gamify the rating process. The various tenets of what makes up one's score -- financial dealings, judicial rulings, mutual contacts, political dissent, and more -- are all built into the algorithms being used. It's no secret that Facebook APIs feeding this data-mining machine would be digital gold to the ruling Communist party in suppressing freedom of thought. (Image Source: ComputerWorld)
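To make the mechanics concrete, here is a minimal sketch of how such a composite score could be computed. To be clear: the actual SCS algorithms have never been published, so every field and weight below is a hypothetical of my own, purely for illustration.

```python
# Purely illustrative sketch of a composite, SCS-style score.
# The real algorithms are not public; every component and weight
# here is an assumption, not a description of any actual system.

WEIGHTS = {
    "financial_history": 0.40,  # loan repayment, purchasing record
    "judicial_record":   0.25,  # court rulings, fines
    "social_graph":      0.20,  # scores of one's mutual contacts
    "political_speech":  0.15,  # flagged posts, "dissent" markers
}

def social_credit_score(citizen: dict) -> float:
    """Weighted sum of normalized (0.0-1.0) behavioral components."""
    return sum(weight * citizen.get(component, 0.0)
               for component, weight in WEIGHTS.items())

# An on-time borrower with a clean record, dragged down by flagged posts:
print(social_credit_score({
    "financial_history": 0.9,
    "judicial_record":   1.0,
    "social_graph":      0.7,
    "political_speech":  0.2,
}))  # -> ~0.78
```

The unnerving part isn't the arithmetic, which is trivial; it's the breadth of the inputs, and the fact that a friend's low score (the social_graph component) can drag yours down with it.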
Is it far-fetched to believe China would grant Zuckerberg passage onto the whitelist if he only agreed to allow the eventual SCS to tap into any data point Mark's datacenters can store and organize?
Let's not forget the lengths to which Mark will go in feigning goodwill just to gain a few Facebook users. His company was most recently embarrassed when its Free Basics service was barred from India and Egypt because of its hypocritical stance on net neutrality. Zuckerberg's vision of connectivity is firmly seated in an experience that puts Facebook front and center, which Free Basics made no effort to hide.
Mark's critics on Free Basics were so fired up that they gathered enough signatures to send an open letter to Zuckerberg -- spanning support from 65 organizations across 35 countries. The crux of the debate which connects the SCS situation to this latest corporate blunder shines a light on one very critical point: Mark's company will go to great lengths to ensure global penetration.
And having Facebook blocked in China is likely a very troubling, omnipresent problem for the company. Any kind of legalized access for Facebook on the mainland would in some way have to entail a willingness to oblige the SCS. It simply defies logic to see this reality end in any other outcome if the two are to become eventual bedfellows.
The more important question here is whether Zuckerberg would have any moral dedication to withholding Facebook from China if SCS penetration is demanded in exchange. It's no secret that Mark's obligation to enhance Facebook's bottom line trumps much of his policy direction.
His company is on the forefront of the H-1B visa debate, arguing that more skilled foreign labor is needed to fill talent shortages here in America -- something that, as someone who employs IT industry workers myself, I believe is a big fat falsehood. As others have pointed out, the numbers behind the argument don't add up, and Facebook is just disguising a desire for cheap foreign labor within this newly created false argument.
How dangerous could Facebook's cooperation with the SCS in China be? For average Americans, at least in the short term, honestly, not very. The dictatorship in China will tweak and hone the SCS, sucking Big Data out of the likes of Facebook for years to further repress those of its own people who choose to express a free political will. That's always been the big untold agreement in Chinese society: you can have a grand Western-style capitalist economy, but don't dare to import any of those democratic free speech principles.
But the larger moral dilemma at hand is how Zuckerberg would use the experiment in China to encroach on Facebook users, or staff, closer to home. The title of this article alludes to the potential rise of corporate Thought Police. Such a reality doesn't seem as far fetched when the technology is almost there to enable it, and bigwigs like Zuckerberg are showing more willingness to push top-down political agendas.
Zuckerberg isn't shy about the kind of society he wants to foster. Giving us Facebook was the beginning, but what comes next? I believe controlling and censoring the medium may be a logical next step. (Image Source: Yahoo News)
And by all means, I'm not the only one raising alarm over this possible reality. In the wake of Mark's BLM outburst, piece after piece after scathing piece came out against the notion of a Thought Police encroachment by Mark or other social media behemoths.
Such a reality would likely begin with unpublicized cooperation with the state of China in implementing API hooks between Facebook and their SCS. Naturally, Facebook would then have the technical insight into how China culls and acts on the data gathered, but more importantly, a public precedent for such a civil liberty dystopia.
While there is no automated approach to cleansing Facebook's corporate office whiteboards, lessons learned from a potential SCS implementation with China could very well help enact such technology against its own staff's Facebook accounts. Eradicating staff discourse which goes against Mark's approved thought positions would allow for testing broader US-based social media censure, and it could be twisted in such a way as to promote adherence to Facebook's corporate mission statement.
If trials went well, could we expect a rollout to all Facebook users? Again, if the dots on the above established logic can be considered viable, this wouldn't be far behind.
It's a dangerous dance to imagine: an ironic reality in which the supposed leading platform for internet free speech becomes the foremost bastion of political censorship and sanitized open discussion.
Here's hoping such a reality never comes to fruition. But if Mark's now public BLM outburst and subsequent staff shaming is any indication, beware a frightening future of social media that becomes curated by topical whitelists dictated by corporate Thought Police.
Dissenting opinions need not apply.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA, and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council, which shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
After spending the last few days soaking up as much as possible on the Apple-FBI San Bernardino iPhone spat, the evidence -- in my eyes -- has become crystal clear. Apple's planted itself on the wrong side of history here for numerous reasons, and is using nothing less than a finely scripted legalese tango in defending its ulterior motives.
As a part-time, somewhat auxiliary member of the tech media at large, I'm a bit embarrassed at how poorly this story has been covered by my very own colleagues. Many of those who should undeniably have a more nuanced, intricate understanding of the technical tenets being argued here have spent the last week pollinating the internet with talking-point, knee-jerk reactions.
Inadvertently, this groupthink is carrying Apple's misguided arguments forward to a populace that otherwise relies on the tech media's prowess in unearthing the truth in such matters. This is one such case where technical altruism is blinding everyone to the real story at play: Apple's design flaws -- in other words, inadvertent security bugs -- found in the older iPhone 5c.
For those that haven't kept up on this story, you can get a great primer on where the Apple vs FBI situation stems from and its surrounding core facets. ZDNet's Zack Whittaker has a great FAQ post that digs into the major topics at hand in an easy to understand manner. No need to retread already covered ground in this post.
Apple's Framed Narrative in a Twisted Prism of Encryption
The tech media is writing story upon story that makes mention of supposed backdoors the FBI is requesting, entailing things like "master keys" which could potentially unlock any encrypted iOS device. While there are far too many media stories I could link to as proof of this misinformation's spread, I'll point to posts like this one on Venturebeat, and even coverage on the otherwise usually judicious podcast This Week in Enterprise Tech (episode 177 is where the Apple/FBI case was dissected at length).
While this self-absolving narrative is making its rounds, let's not forget where this all began: with Apple itself, in its now-famous open letter published on Apple's website and signed off by no less than Tim Cook himself. And for that, shame on him.
Many in the media have mistaken this to be a case about phone encryption, due to Apple's framing of the discussion in such a light. In reality, the FBI is merely asking Apple to help create a special iOS firmware, for a single iPhone 5c, which would disable the forced wipe after 10 failed entries and remove the timeout delay between attempts. Apple's attempt to sway the narrative leads me to believe it is more concerned about corporate image than public safety. (Image Source: Mercury News)
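Some back-of-the-envelope math shows why those two firmware changes are all the FBI needs. Assuming the commonly cited figure of roughly 80 milliseconds per passcode attempt (the cost of iOS's deliberately slow key derivation) -- and to be clear, this is my own rough sketch, not a claim about the FBI's actual tooling:

```python
# Rough brute-force estimate once the retry limit and inter-attempt
# delays are stripped out by custom firmware. The ~80 ms per guess
# figure is the commonly cited cost of iOS's passcode key derivation;
# treat all of this as back-of-the-envelope.

SECONDS_PER_GUESS = 0.08  # ~80 ms of key derivation per attempt

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_seconds = combinations * SECONDS_PER_GUESS
    print(f"{digits}-digit PIN: {combinations:,} combinations, "
          f"worst case ~{worst_case_seconds / 3600:.1f} hours")

# 4-digit PIN: 10,000 combinations, worst case ~0.2 hours
# 6-digit PIN: 1,000,000 combinations, worst case ~22.2 hours
```

In other words, with the self-destruct and delays gone, a typical four-digit PIN falls in under 15 minutes. That is the entirety of the "hack" being requested.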
I know very well that as the leader of a massive publicly traded company, Cook has a duty first and foremost to his most critical stakeholders: Apple shareholders. But the finer point which Apple forgets in its shameless fight with the FBI is that the very sacred tenets of American democracy and capitalism have allowed his firm to grow to such unprecedented levels. There is a balancing act which needs to be struck in a free society standing at the fold between security and privacy.
The FBI is not asking for any kind of encryption "master key" here, let's be very clear. Such a request would be an overreach of the inherent division required to ensure the greater security of data for the masses. And it would be a request that I, as an IT professional and, more importantly, a member of this society, would be staunchly opposed to.
But this is not what has been asked of Apple, and not what's at stake for the company. This move is driven by a PR objective aimed at preserving Apple's ego and image in something it preaches so dearly: security.
FBI's Request Indirectly Forces Apple to Admit iPhone 5c Insecurity
If you're curious as to how I could come to such a conclusion, feel free to read through the same well-written, and lengthy, exposé on this situation which convinced me on the subject with clear technical validation and reasoning -- not purely emotional knee-jerk reaction. The post is on the blog of a company called Trail of Bits, which has noted deep expertise in security research.
Much of my very stance on the subject is also reflected in Mark Wilson's post from a few days ago right here on BetaNews. Even Trevor Pott of The Register penned a rather wordy, but pointedly accurate piece that confirms what the Trail of Bits blog post puts forth as a theory.
"What appears to be involved is a design flaw. Something about the iPhone 5C in question is broken," says Pott in his Register article. That's right, a design flaw which happens to be the complete lacking of the "secure enclave" which is detailed at length in the Trail of Bits blog piece.
If this were a newer A7 or newer powered iPhone, the FBI's chances of getting in without asking for the dreaded pandora's box "master key" (which doesn't exist) would be next to zero. But Apple never included this security facet on its earlier phones, and herein lies the very nuanced tenet of what the FBI truly wants to be able to leverage.
The FBI doesn't want and has never asked Apple for any kind of master key. It's asking for mere assistance in re-engineering its way through a known security flaw in Apple's iPhone 5c device which doesn't tie PIN entry and authentication to the internal data through the use of this secure enclave. While Apple won't admit as much, this is very much so a security flaw that Apple obviously will never be able to fix for iPhone 5c owners, and naturally, has every intention to re-architect the argument on this situation to deflect any potential for this criticism to reach critical mass.
And even more acutely, the FBI and Justice Department aren't asking Apple to make this available to "all" future iPhone 5c devices recovered in the course of policing. The DOJ says Apple has the free will to "keep or destroy" this special firmware after its purpose is rendered for the FBI's requests. So Apple's consumer-focused defense that it is being asked to "hack its own users" is just another attempt to misconstrue the real intentions of law enforcement here.
One important fact which some of the media has glossed over is that the iPhone in question was not even a personal phone of the shooter. The device was actually a work-issued device that was handed out by the San Bernardino County Department of Public Health, and in turn, is considered employer property with all accompanying rights that employers have over the data stored on those devices.
Apple's A7-powered and newer iOS devices all employ an internal lockbox known as the "Secure Enclave" which brokers access to the encryption keys used to access user data. The iPhone 5c lacks this very item, which makes the FBI's chances of getting into the San Bernardino shooter's iPhone very real -- and technically proven feasible by security experts. But Apple's ego, partially built on an image of security, naturally forces its arm into trying to trump the FBI's request. (Image Source: Troy Hunt)
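To illustrate why the enclave's absence matters, here's a toy model of the architectural difference, paraphrasing the Trail of Bits analysis. The names and structure are mine and purely illustrative -- this is not Apple's actual code:

```python
# Toy model: on the 5c, the retry counter, wipe-after-10, and delays
# all live in replaceable iOS firmware, and key derivation runs on the
# main CPU. Illustrative only -- not Apple's implementation.
import hashlib

class IPhone5c:
    def __init__(self, pin: str):
        self._key = hashlib.sha256(pin.encode()).hexdigest()
        self.firmware_enforces_limits = True  # what the FBI wants flipped

    def try_pin(self, guess: str) -> bool:
        if self.firmware_enforces_limits:
            pass  # normally: count the attempt, delay, wipe after 10 fails
        # With limits stripped by custom firmware, guesses can simply
        # be scripted as fast as the key derivation allows.
        return hashlib.sha256(guess.encode()).hexdigest() == self._key

phone = IPhone5c("4096")
phone.firmware_enforces_limits = False  # the requested custom firmware
found = next(f"{n:04d}" for n in range(10_000) if phone.try_pin(f"{n:04d}"))
print(found)  # -> 4096
```

On an A7-or-newer device, by contrast, the counter, the escalating delays, and the entanglement of the PIN with a per-device hardware UID are all enforced inside the Secure Enclave itself; new firmware on the main CPU can't strip those checks, which is exactly why the FBI's request only makes sense for the 5c.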
I'm not here to use Apple as a pincushion, as the industry at large needs to double down on putting its actions where its words are about security. But Apple deserves heat here, not only because it's putting shareholders ahead of national security, but because it has previously been guilty of trumpeting "security through obscurity", as I've covered at length in previous posts.
Any reasonable technology company is going to have bugs and defects in its devices and code. That's the nature of the beast, and understood by IT pros like myself. But Apple has built an empire in part by its clever marketing teams that have flaunted layers of security which supposedly beat and exceed those of any other company's competing products.
Sometimes, it is in the right and marketing matches reality.
But many times, as with the now-dead claim that OS X doesn't get malware, which I fought against for years, Apple put greenbacks before its responsibility to be honest about its software and device capabilities. And while the cessation of the famous "I'm a Mac" advertising campaign signaled a more subdued competitive stance on the OS X front, Apple's big moneymaker isn't desktop computers anymore; it's the iPhones it sells by the millions.
How does this round back to its reluctance to work with the FBI? Very simply: doing so would inadvertently admit that the iPhone 5c indeed has the security flaw which the FBI and the industry have exposed. And the problem for Apple is that it has created an ego bubble for itself, which fans have bought into, with security as a notable keystone.
If that keystone falls here, Apple's back to square one in winning back the fans that place i-Devices on a pedestal most other manufacturers only wish they had.
Put in other words, it's Apple's ego at stake here. And it takes that very, very seriously if you haven't noticed.
Apple Has an Undeniable Duty to the Society it Built its Fortunes On
We've clearly established some very agreeable, black-and-white facts surrounding this situation, based on everything I've linked to above.
As such, I'm convinced beyond any doubt that the ability to get into this phone exists, and that it can be exercised in a way that protects the universe of iPhone users at large from massive data grabs via legal overreach. Apple's refusal to help the FBI, as described earlier, is not grounded in technical validity, but rather driven by a corporate ego that has grown too large for its own good.
Apple's feel-good, impenetrable stances on its device security are at risk of being exposed to the masses. For a company that has built its fortune around peddling a larger-than-life notion of its own security prowess, this would spell downright disaster in the marketplace, especially given its newfound re-emergence in the previously reluctant Enterprise market.
But let's go beyond profitability reports and corporate egos, as the larger question here is the duty Apple owes to the citizens of this very nation -- a country now in need of a compassionate about-face by Apple, so we can connect the dots on a terrorist plot in a way that will not only help explain the events leading up to the San Bernardino massacre, but likely expose critical nuggets of information about other future plots or combatants.
Apple's attempt to paint this discussion in a sea of technicalities, and as a promotion of the privacy of its users at large, falls on its face when the facts are dissected in sunlight. If that very sunlight means Apple's design flaws must be vaulted into public discussion, so be it. That's the duty it owes its users: assurance that its designs don't merely exist in a lab, but are being tested, sifted, and penetrated to make future generations of hardware better on the whole.
While our democracy has been historically opposed to gross intrusions of privacy, as seen in the opposition to ad-hoc phone record dredging, a common sense approach towards nuanced security needs is something we cannot become blind to. Companies and advocates like Apple will try to smokescreen their intentions with public decrees like Tim Cook's blanket position on privacy, but even Apple's future has just as much at stake if terrorists can use an over-extended privacy veil as their own.
The day the Justice Department calls for blanket encryption "master keys" from Apple is the day I will stand with Apple. But that day is not today, as Apple has not been, and is not being, asked for any such thing.
Do the right thing, Tim Cook. Your company enjoys prosperity through the same democratic society that is pleading with you to put the future of our nation ahead of personal or corporate motives.
If future deaths could have been prevented via the data on that iPhone 5c you refuse to help unlock, what kind of responsibility will fall on Apple's shoulders? Only history will be the judge of that.
As an IT professional by day, I've been confounded by a question for some time. I've tossed it around in my technical circles, trying to get a feel for the true reasons behind Apple's double standard in not allowing OS X onto other platforms -- while gladly allowing Windows to run natively via Boot Camp.
How come Apple doesn't allow PC users to install and run OS X on the hardware of their choice?
I know very well there are business reasons it doesn't allow it. And I also know that the company has legal restrictions in place to prevent it from happening as well. But that doesn't answer the why of what I'm digging at; financial and legal restraints are merely artificial boundaries for something that is otherwise quite feasible, as I'll prove below.
Apple makes a lot of money on the hardware it sells with each OS X system, and it is a corporation, so 2+2 here makes sense. It has a moral obligation to shareholders to maximize profits for the business. And as such, it has constructed licensing legalese to help keep the kingdom of Apple computers strong.
But I wanted to step back and take a more holistic, almost philosophical approach to this debate. One that takes into account consumer choice, hardware innovation, technical feasibility, and other points of interest that may or may not have been tossed around.
So that I can get it out in the open, I'll fully admit my curiosity on this subject stems from my own personal reasons for never having purchased an Apple computer. Some would conclude that this makes me an Apple hater, but that's merely a convenient way for Apple loyalists to paint me as someone whose opinion has no merit. How wrong they are.
I'm a tinkerer at heart, and can't stand the closed nature of the hardware around Apple's computers. Likewise, I've never been satisfied with the limited choice Apple affords buyers of its computers. It has always adhered to a Henry Ford-esque mentality when it comes to choice, and that goes against every grain of my consumer free will, which gravitates towards more options, not fewer.
And perhaps my biggest standing objection to Apple has always been a philosophical one, stemming from my dislike of the crux of what supports the Apple OS X computer business: a refusal to allow OS X onto anything other than Apple-branded hardware. I'm a firm believer in keeping my dollar vote strictly aligned with companies that see eye to eye with me on things like consumer choice, software freedom, and price competition.
When it comes to these areas which I hold dear, Apple has never satisfied. As such, I've chosen to stay away from its products, which is my option as a consumer.
I know I'm not alone in questioning Apple's long held business practices. PCMag has covered the topic in the past, and online forum goers frequently opine on the merits of Apple's ways. Judging by online commentary, a big portion of Linux users stay on that platform because they refuse to allow Apple to control their system of choice.
Others, like Richard Stallman, go much further in outlining the reasons they refuse to buy Apple, covering things from its reliance on proprietary screws on devices to its love of DRM on most items sold in its online media stores.
For me, as an enthusiast and IT professional, I believe that Apple allowing OS X onto PCs would be a big move in showing the goodwill needed to win back lost trust from people like myself.
Would it happen? Could it happen?
Here's my top list of reasons why it definitely should happen.
8. Isn't Apple's Current OS X Stance Hypocritical?
Apple is no stranger to shamelessly saying one thing and doing just the opposite when it suits its interests. The most recent example of this blatant double standard is its introduction of an aptly named "Move to iOS" app on the Google Play store aimed at -- you guessed it -- converting the Android faithful back to Apple land.
Numerous outlets pointed out the hypocrisy of this shameless maneuver, seeing that Apple matter-of-factly rejects any app submission to the App Store which merely mentions another mobile operating system. Its official App Store submission policy makes no effort to hide this.
It goes without saying that one must ask the obvious: how come Apple has no problem with gladly helping users get Windows to work on its own machines, but refuses to budge in allowing OS X onto PCs? Wouldn't this be the fair, honest approach Apple could take to show its commitment to goodwill and a betterment of the technology world?
Its marketing department has tried to claim as much, if not in these exact words, over the last decade or so. In my eyes, this would merely be an extension of its already established corporate mantra.
The Apple faithful see no issue with this, but as someone deeply entrenched in this industry for a decade already, I've always wondered why no one has the audacity to call Apple out on arguably its biggest double standard.
The fruit logo company shows a similar opinion disparity when it comes to technology patents. Apple has a history over the last decade of calling out other companies (Samsung, Microsoft, others) for outright copying the "hard work" it invested in bringing certain items to market. Yet, when caught on the receiving end of such complaints, Apple insinuates that the patent system is "broken".
And on the political front, Tim Cook's outspoken stance on gay rights in the USA stands in stark contrast to what he has refused to say on the global stage. There's economic convenience in Cook's focus on gay rights pertaining only to the USA, because a large portion of the global markets Apple sells within have atrocious records on the rights of gay people and women, as Carly Fiorina pointed out. Tim Cook knows full well that causing too much of a stir in many of these Middle Eastern and Asian markets would spell catastrophe for Apple sales there.
It's no secret that Apple is now looking to make inroads even in Iran, where gay people can legally receive the death penalty for their "crime". Where's the outcry from Apple's loyalists?
Time and time again, Apple has shown no reluctance to take stances where economic realities uphold the best return on Cupertino's dollar -- even if it means blatant hypocrisy in holding such positions, whether on OS X for PCs or on gay rights.
7. OS X Already Runs on (Mostly) Standard PC Parts
Apple has been on an upward trajectory when it comes to using standard PC parts ever since it dropped the horrid PowerPC platform in 2006. This wasn't always the case. The 1990s were replete with Apple Macs that had proprietary boards and cards and memory chips. Repairing these machines with proper parts meant you always had to get the Apple variants -- which came with expected price premiums that kept Apple hardware pricing artificially inflated.
But those days are long gone. Apple learned its lesson and has been stocking every Mac desktop and laptop with (mostly) standardized components which can be purchased at no premium by any technician. This is great from a repair standpoint, and even better for another reason: it means there is little technical roadblock preventing OS X from running on traditional PCs. Intel x86 on regular PCs is the same as it is on Macs in almost every regard.
This point was proven in the market by a company called Psystar, which sold Mac clones for a fraction of what Apple sells its own systems for. Apple's legal department was able to squash the startup with ease in the courts, but the question of whether OS X could be reliably installed and sold on non-Apple hardware had already been answered as viable.
And today, this mentality lives on in various websites that offer easy instructions for running OS X on nearly any PC system -- a method dubbed "hackintosh" in tech circles online. We won't link to any of these so as to keep Apple's legal team away, but you can do your own searching. It's out there, it works, and it proves that the only party standing between OS X and regular PCs is Apple.
6. OS X Could Finally Become a Competitive Desktop Gaming OS
While gaming on OS X is better than it has ever been, that's not saying much. Some popular titles are available on it, but a large portion of hot upcoming or already released games that Windows enjoys have no plans on releasing onto OS X.
Examples include the new Star Wars Battlefront, Metal Gear Solid 5, Battlefield 4, Fallout 4, Rise of the Tomb Raider, and Just Cause 3, to name just a few.
I couldn't find a single example of a title that came out on OS X but not on Windows. Such a case doesn't exist from what I can tell, which explains why PC gaming is far and away Windows territory.
Does it have to stay this way? Absolutely not. Apple could grow OS X into a legit secondary PC gaming platform if it opened up usage on regular PCs. I'm of the belief that there are a few major reasons why the gaming industry doesn't waste its time on porting titles to OS X (on the whole, but not in all cases).
One major obstacle is Apple's arguably low market share, especially globally (currently just over 7 percent, according to Net Applications as of Sep 2015). Windows makes up over 80 percent of that space on the desktop/laptop side. It doesn't make fiscal sense to employ the time, energy, and money to make games for OS X with such a small sliver of the market. If PC gamers had the choice to purchase OS X for their PCs, giving Apple a competitive shot at otherwise-new Windows buyers, this may tip the OS X scale on a global level. As such, developers would likely take renewed interest in the platform as a whole.
Another item that stems directly from this low market share perspective is the time and effort that hardware device makers -- namely graphics giants like AMD and nVidia -- have to invest in getting performance on par with where it stands on Windows. The overall mindshare that has been dedicated to this on Windows has been growing for over two decades already. On OS X, comparatively little attention is placed on gaming performance for reasons stated above.
And finally, I think Apple's artificially premium pricing on its own hardware isn't helping matters when it comes to penetration. If educated consumers were given a choice of buying OS X on a plethora of competing systems, many of them would appreciate the choice in cost and quality of their machine. Segments of the market which otherwise can't afford an Apple would now be welcomed into the ecosystem their friends may enjoy, shrinking problem #1 I referenced a few paragraphs earlier.
While the gaming community has never traditionally been one that Apple has cared to cater to, Apple could easily grow OS X into a gaming competitor to Windows by simply opening OS X up to the PC market.
5. OS X Could Move Into New Avenues
It goes without saying that Apple opening up OS X to the PC market as a whole would have larger ramifications than just placating its critics. There are numerous secondary avenues that some have only dreamed of OS X being usable within, but that nasty licensing roadblock sits in the way. What dividends could be reaped from potentially opening up OS X to the masses?
Many, in fact. One major area that my company FireLogic has been involved in implementing for organizations is VDI solutions -- namely, Windows RDS backbones running on Hyper-V. I've penned previous deep dives on how fantastic the technology is with Windows Server 2012 R2. But the lowest common denominator in this equation has always been a Windows desktop as the endpoint.
Running OS X in a non-Apple virtual environment has already been proven technically feasible, as shown above as a proof of concept. If Apple tore down the licensing walled garden around OS X, it could turn into a potential VDI endpoint to compete with Windows. Increased competition would mean everyone wins. (Image Source: coolcrew23)
Is it implausible to believe that OS X could be farmed into an RDS-style or Citrix-driven environment for hosting end-user desktops? If licensing restrictions were taken away, and Apple played nice, this isn't as long a stretch as some may believe.
Some offices that have spent countless sums buying individual Mac desktops for staff could instead opt to keep their familiar work interfaces, but centralize administration and security of the solution on something like Microsoft Hyper-V or VMware ESXi. Unheard of today, but this could become an easy reality given the will from Apple.
Another obvious current no-go is OEM sales from vendors like Dell, Lenovo, HP, and others. Psystar proved there is a market for non-Apple OS X machines, even if the law wasn't on its side when it went to market. I'd be much less critical of Apple if it allowed others to sell OS X based computers and let the open free market set pricing for competing systems.
This would also allow for Apple to move back into being trusted by another big market segment which has soured towards Cupertino over the last decade...
4. The Enterprise May Take Apple Seriously Again
Two years ago, I penned a piece claiming Apple would never be embraced by the Enterprise ever again. Bold words, and I'm hoping Apple proves me wrong. It would only benefit the entire industry at large.
But as it stands, Apple has been sealing its fate with the Enterprise market for some years now. It shamelessly discontinued the last vestige of a proper Apple server, the Xserve, and oddly told the community to embrace Mac Minis or Mac Pros as server machines. While some companies have gone to great lengths trying to make this work -- a select few do succeed with style -- the rest of us are scratching our heads over how the heck Apple intended its style-first systems to ever fit cleanly into standard rack units.
It's nice to see that Rubbermaid organizers can double as Mac Mini racks for the office. But it goes to show the shortsighted vision of Apple's intentions for the Enterprise. Opening OS X up to standard x86 PCs would mean businesses could choose to purchase or build proper network closet servers running OS X -- and forego the shenanigans with racking Mac Minis or Mac Pros. (Image Source: Random-Stuff.org)
And while the Enterprise values systems that can be easily repaired with spare parts, Apple places meandering, archaic rules around how spare parts can be purchased by IT departments, and even took home the title of having one of the least repairable laptops ever with its 2012 MacBook Pro.
InformationWeek shared results a few years back from its Apple Outlook Survey, providing insight into the Enterprise's feelings on Apple's viability in big business, with some key figures which I have outlined before.
Could the Enterprise change its tune on Apple? It would take much more than just allowing OS X onto PCs, but I'm a firm believer that this would be a catalyst towards moving channel vendors -- the Dells, the VMwares, the Citrixes, and others -- into helping build and sustain a viable OS X presence in the Enterprise, beyond just the iOS penetration we see today, which may not have lasting presence.
Desktop computing is going nowhere quick, contrary to what some have been claiming for years now. Slowing tablet sales are already hitting the market. And recent stats show that a whopping 82 percent of IT Pros are replacing existing laptops/desktops with like systems -- NOT with tablets, as many have wrongly claimed. Only a minimal 9 percent of IT Pros are putting tablets out to replace dying desktops/laptops: a slim minority given how many years tablets have already been out in force.
By allowing OS X onto PCs, Apple could potentially reverse its course on the losing end of the Enterprise desktop/laptop market, and in turn, help foster the beginning of a supporting ecosystem dedicated to furthering OS X in the corporate world. It's not guaranteed, but it's as good of a shot as any at this point.
3. Overall Market Share Would Easily Rise
While still doing better than Linux or ChromeOS on the whole, Mac OS X has never been able to rise above the ten percent mark on any major market share chart. In my eyes, Apple is actually its own worst enemy. It's true.
For starters, the high cost of Apple-branded systems is a barrier to entry for a large majority of buyers who would otherwise consider an OS X machine. Apple's cheapest first-party systems all hover around the $1000 marker (give or take a few bucks), which is out of reach for, if not all, then a good majority of people (especially overseas buyers in emerging markets).
Take away the requirement that only Apple-branded hardware can run OS X, with OEM licensing extended to the market at large, and Apple could reverse OS X's struggles on the traditional laptop/desktop side, in my opinion. The market playing field would be substantially opened and leveled for OS X hardware, giving new buyers weighing Windows vs OS X a par-for-par competitive option.
This would satisfy many enthusiast critics such as myself, who have long criticized Apple for its artificially inflated pricing of now-standard computer hardware. Bringing down the price point of entry-level OS X systems would let consumers decide on the OS of their choice based on functional merit, not just on whether their pocketbook was large enough.
While there are no guarantees there would be large swings in market share benefiting OS X, I see no reason why Apple couldn't eke out a good 20-30 percent by opening up OEM licensing options for OS X. Increased adoption of OS X could then lead to Apple positioning its own systems as counterparts to Microsoft's Surface devices -- the premium experience for those who can afford it and want Apple's vision of computing on their desk.
But the masses would no longer be held at arm's length from choosing OS X if they really wanted to, due to artificial pricing floors. Consumers would end up as winners, and Apple would look like a hero of a company. A win-win.
2. Increased Competition for Windows = Consumers Win
In the sub-$1000 market for computers, Apple has zero presence today. Aside from refurbished systems or Craigslist hand-me-downs, you can't go to the store and find a Mac at this lower price point. As such, Windows has a stranglehold on what consumers can buy in this territory.
Sure, ChromeOS is an option and Linux has always been there, but I've written before about why Linux is also its own worst enemy when it comes to market share. For all intents and purposes, Windows controls the sub-$1000 market space for computers.
Why does this have to be the de facto standard? From a functional perspective, and from an ecosystem-of-apps perspective, OS X is by far the most seasoned alternative to Windows for traditional desktop/laptop users. Most major desktop apps are cross-compatible between both OSes, meaning that if it weren't for price, more consumers could opt to go OS X if they really wanted to.
And therein lies my argument for this point. Few would disagree that the intense competition within the Windows ecosystem has not only brought down prices for consumers, but also increased the overall quality of hardware and software. Competition drives innovation, not stagnation, and this important fact is why Windows has not only survived, but thrived, as a platform.
One can point to the relative lack of advancement on the Mac from a hardware perspective as one example of Apple's negative hold on OS X. Sure, there is no question Apple is using premium processors and other internals when it comes to raw horsepower, but that's not where I am going here. I'm specifically talking about Windows platform innovations which have come to market and offered entirely new usage experiences for consumers.
Touch on laptops and desktops? Apple is nonexistent there. Convertible hybrids? Apple's nowhere to be seen. Stylus support on desktops or laptops? Again, Apple has never had an inclination to allow such functionality. There are undoubtedly plenty of buyers out there that would love to see some of these options available for purchase.
Apple's tight control over the hardware ecosystem for its OS X platform has stifled its own innovation, and with its growing reliance on iOS devices for its revenue base, Apple has less and less incentive to steer outside of its comfort zones.
Giving the PC market a chance to do what it does best -- test new ideas for hardware combinations that make sense functionally and fiscally -- is perhaps one way OS X could stay relevant for the long term on the desktop.
Give others a chance to go where you refuse to, Apple.
1. Apple Fans Could Finally Have True Device Choice
This final point will probably have people either in complete agreement or vehement disagreement. But while many of the Apple faithful believe that Apple itself is the only one capable of creating OS X devices adhering to the Apple vision, I beg to differ.
While the status quo has tainted the opinion of many loyal buyers, I would ask loyalists to consider this: at any given point in time, the number of new Apple computers you can choose from on the market is somewhere in the range of 4-6 core models. While there are flavors offering more horsepower or battery life, the devices themselves all never stray too far from a common design baseline.
Many enjoy this limited set of choice in hardware. But from talking with others and reading comments online, there are just as many who hate the Henry Ford approach to hardware sales by Apple. Count me as part of this category.
On the Windows side, buyers have countless choices not only between form factors, but device brands and spec points. This plethora of choice has only benefitted in bringing new concepts to market, and giving consumers the ability to find the device that fits their needs best. Why wouldn't OS X fans benefit from similar open hardware choice?
While Apple's argument has always been that this limited set of hardware increases reliability, does this still hold fervently true? My company still offers residential computer repair for local customers, and we get more than our fair share of Mac systems in our office each year that suffer from hardware/software incompatibilities, failed hard drives, incessant "spinning wheels of death", and, growing with each passing month, malware infections that some believed were impossible.
If Apple cares about its dedicated fan base as much as it claims to, I would think that giving them the ability to choose the hardware platform that they run OS X on would only be beneficial for building and keeping the trust of its customers. Restricting hardware choice to a limited set of options solely for financial and business reasons may still prove to bring short term success, but I doubt it is viable for the longer term, as computing prices in general continue to fall.
Device choice wouldn't have to be limited to the traditional consumer form factors we have come to expect. It could come in the form of OS X servers made by enterprise giants Dell or Lenovo, just as an example. It could also be POS systems built on OS X for retail. It could even be integration platforms for the auto industry, akin to what Microsoft's in-car efforts and Android Auto already represent.
The penetration of OS X could go beyond the tried and true and open up new markets with more choices for vendors and consumers alike. And while Apple is convinced its fans would be losers in such a scenario, I think that couldn't be further from the truth.
If OS X is to continue to prosper as a platform, let it win in the market based on its own proven merits. It's time for Apple to tear down the moat around OS X and let it be free of its artificial restraints.
Eat your own dogfood, Apple, and consider Thinking Different on this one. You may make new believers out of some of us.
Main Image Credit: McdonnellTech.com
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
"UniFi is the revolutionary Wi-Fi system that combines Enterprise performance, unlimited scalability, a central management controller and disruptive pricing." That's the pitch thrown by Ubiquiti Networks right off the homepage for their popular UniFi line of wireless access point products. In many respects, that statement is right on the money.
But as the old adage goes, sometimes you truly do get what you pay for. And when it comes to UniFi, that tends to be my feeling more and more, seeing the gotchas we have had to deal with. We've continued to choose their access points, primarily in situations where cost is a large factor for our end customer. Who wouldn't want Enterprise level features at a Linksys level price?
I give Ubiquiti more than a decent ounce of credit for its altruistic intentions in the wireless market. They've spent the better part of the last five years trying to offer up an alternative to the big boys of commercial Wi-Fi -- the Ciscos, the Ruckuses, the Arubas, etc -- in the form of their UniFi line of products. With their entry-level access point, the UniFi Standard (UAP), coming in at under $100 USD out the door, it's hard not to notice them when shopping for your next wireless system upgrade.
When it comes to hardware build quality and aesthetics, their access points are absolutely top notch. The flying-saucer-like design choice of their Standard, LR, and Pro series units looks super cool, especially with the added visual flair of the ring LED that adorns their inner sphere. Their included mounting bracket is easy to install on walls or ceilings, and the access point itself merely "twists" into place to secure for final usage.
I cannot forget to mention that the pure wireless prowess of these units, especially from the UniFi Pro which has become a particular favorite of ours, is simply amazing. At the low price point these little saucers command, the coverage area we can blanket with just a few APs is astonishing. And once configured, they rarely ever need reboots in production -- we have had numerous offices running for 8-10 months or longer between firmware upgrades without a single call from clients about Wi-Fi downtime.
But the design and Wi-Fi power of the hardware itself are about where the fun with UniFi starts and stops.
One of my biggest qualms is with the way UniFi handles administration of its access points via the much-touted software controller that is included at no extra cost. While the new version 4 of the interface is quite clean, it's still riddled with a nasty legacy requirement: Java.
For the longest time, the UniFi controller refused to work properly with Java 8 on any systems I tried to administer from, and I had to keep clients held back on the bug-ridden Java 7 just to maintain working functionality with UniFi. OK, not a dealbreaker, but a pain in the rear. Not to mention the number of times the controller software will crash on loading, with only a reboot fixing the issue. The UniFi forums are full of threads like this discussing workarounds to the endless Java issues.
The software controller woes don't end there. For some new installs, the controller would refuse to "adopt" UniFi units at a client site -- forcing us to go through a careful tango of hardware resets, attempted re-adoptions, and countless manual adoption commands. After much trial and tribulation, most units would then connect to the controller, though some still refused and ended up written off as DOA duds.
My distaste for the UniFi controller further extends into situations where you have UniFi boxes deployed at numerous branch offices, all managed from a single controller. While Ubiquiti claims clean inter-subnet connectivity at standard layer 3, real world functionality of this feature is a hit-and-miss affair -- more often than not, on the miss side.
Initial adoption between branches can be time consuming and experimental, and even when connected, units will show up as disconnected for periods of time even though you have a clean site to site tunnel which has experienced zero drops.
Another little "gotcha" that Ubiquiti doesn't advertise heavily, and which is complained about on forums regularly like here and here, is the fact that their cheapest Standard units don't use regular (now standard) 48v PoE. Thinking of deploying 10 or more of their cheapest APs on your campus using existing PoE switches? Not going to happen. You'll need to rig up a less-than-favorable daisy chain of special 24v adapters that Ubiquiti includes with their APs -- one per AP.
Ubiquiti of course offers their own special switches that can provide 24v PoE, the ToughSwitch line, but this is little consolation for those that have invested in their own switch hardware already. And I see little excuse for them forcing this on shops, since their Pro and AC level units use standard 48v PoE. A calculated nudge on their part to get people to buy more expensive APs, or just a technical limitation they had to accept? Take your pick.
I'm not here to deride Ubiquiti over what is otherwise a fantastic piece of hardware. Their end-to-end execution, however, is where they suffer -- but they aren't alone in failing to deliver a well-rounded solution on all fronts.
The common theme I see in the Wi-Fi industry over the last decade or so is that you have rarely been able to get a product that satisfies all of the usual "wants" from business-grade Wi-Fi hardware at once: competitive pricing, quality hardware, solid vendor support, simple management, and freedom from expensive hardware controllers.
And therein lies my issue with most of the common vendors in the game. Ubiquiti offers great pricing and hardware, but has a software-based controller with numerous issues and offers zero phone support. Cisco's Aironet line has great hardware and tech support, but comes saddled with expensive hardware controller requirements and complex management and setup. Ruckus sits in a similar arena as Cisco, with some premium pricing to match.
Since our focus is primarily the small-midsize business customers we support, we've been on the prowl for decently priced gear that comes with rock solid support and ease of management, and that ideally gets rid of the need for hardware controllers -- not only due to the added cost, but also the requisite replacement and maintenance costs that go along with such controllers.
Not all hope is lost. Luckily, we found a product line that meets nearly all of our needs.
Enter Meraki
Last year, we grew quite fond of a company called Meraki for their excellent hardware firewalls. To be fair, Meraki isn't its own company anymore -- it's a subsidiary of Cisco now, with some persistent rumors saying that Meraki's gear will one day replace all of Cisco's current first-party networking gear. I went so far as to pen a lengthy review of why we standardized on their MX/Z1 line of firewall devices.
After battling with similar hits and misses on the firewall side, toying with the likes of ASAs, SonicWALLs, Fireboxes, and other brands, I found a fresh start with what Meraki offered in firewalls: competitive price points, extremely well built hardware packages, and top notch all-American 24/7 phone support when issues arose.
As a growing managed services provider (MSP), our company decided to standardize on Meraki across the board with regards to routers and firewalls. If a customer wishes to use us for managed support, they're either installing a Meraki firewall, or paying a premium for us to support the other guys' gear. That's just how heavily we trust their stuff for the clients who likewise entrust us for IT system uptime and support.
While we had been using Ubiquiti's UniFi access points for a few years already, biting our tongues about the less-than-desirable Java-based software controller, we weren't content with the solution for our most critical client Wi-Fi needs.
Meraki actually offered a webinar with a free (now discontinued) MR12 access point, and we've been hooked on the Meraki magic, as we call it, ever since. We used the unit to provide our own office with Wi-Fi until we moved late last year into our current space, and upgraded to the beefy entry-level MR18 access point. The WAP is mounted pretty centrally in our squarish 1300 sq ft office and provides stellar dual-band coverage for our space.
We even decided to pit the UniFi Pro AP against the MR18 and, for all intents and purposes, coverage and speed levels were neck and neck. Seeing as the UniFi Pro was a known quantity for us in terms of coverage and stability, it was great to see that Meraki's MR18 was as good as what Ubiquiti had been offering us for some time already.
In terms of hardware selection, Meraki offers a competitive set of (6) distinct options that are not overbearing (unlike EnGenius, which at any given time has over a dozen access points available) but offer enough choice given the scenario you are installing into.
Our go-to units tend to be the MR18 (802.11a/b/g/n with dual band 2.4/5GHz) or the MR32 (802.11a/b/g/n/ac/bluetooth with dual band 2.4/5GHz). The MR18 is the most cost effective option from Meraki, with the MR32 being installed in situations where AC future proofing is a requirement.
Both WAPs perform similarly in terms of coverage area per unit, with the MR32 having double the potential bandwidth if the right requirements are met on the client side. I wrote at length about the concept of high-bandwidth Wi-Fi and other related topics in a piece on Wi-Fi best practices from last month.
I will note that for some situations, where we are replacing a client's firewall with a Meraki device anyway, we sometimes opt for the Wi-Fi-enabled versions of their routers. Most SMBs we work with tend to go with the full-size MX64W in such cases, or for very small (or home) offices, the Z1 has been the little champ that could.
I personally use a Meraki Z1 in my own home condo, and have no issues with coverage -- but it definitely cannot compete on par with the beastly radios in a unit such as the MR18. It's about half as powerful as far as coverage goes in my unscientific estimates.
Meraki = No More Hardware Controllers
If you're coming from the lands of Cisco Aironet, Ruckus, Aruba, or any of the other competitors to Meraki (other than UniFi or Aerohive), then you will be quite familiar with the concept of the hardware controller. It's an expensive device, usually starting at $1000 on its own plus licensing, that sits in your network to perform one function: tell your WAPs what they should be doing.
In the era of ever-pervasive connections to the wider web, why in 2015 should we consider this the gold standard of Wi-Fi system control? While the likes of UniFi prefer to rely on a pudgy locally-installed software controller, Meraki has built a cloud-based infrastructure to provide command and control for its MR line of access points.
Meraki ditched the flawed concept of hardware controllers, and instead unified its entire management platform under a single, web-based cloud dashboard. It doesn't cost any extra, doesn't require you to spin up any servers of your own, and it's constantly updated and maintained by the experts.
For all the cloud-detractors out there, don't point your "I distrust the cloud" wands at this solution unless you've tried it. The number of times I haven't been able to access my Meraki dashboard in the last year I can count on one hand -- and even these times were brief, with no on-premise networking gear being affected.
When the cloud dashboard has issues or goes down for maintenance, all of the connected devices which rely on it fall back into a locally-managed mode; only management functions, like making changes or running reports, are affected. It's quite an ingenious design that leverages the power of the cloud, but is fully prepared for times when internet access may fall out for periods.
This leads into another neat aspect of the Meraki cloud controller system: all of the devices under the Meraki flag are controlled in a single, unified dashboard. If you're like our company, with a Meraki firewall and access point(s), then you have one single place to log into to manage all aspects of your network.
Add in switches or their MDM Systems Manager platform, and your management overhead doesn't increase any. You just get extra tabs on your dashboard to jump between.
So think about what we used to do back in the day to manage all the pieces of a growing business network. Your firewalls all had local interfaces which had to be dialed into for management -- like the Java-reliant ASDM for your Cisco ASAs. Switches all had a command line or hideous web-based interface. Wi-Fi access points may have had a hardware controller or separate software controller, like UniFi offers.
Meraki tosses that nasty, bungled mess out the window in favor of one interface with seamless oversight across all aspects with just a few clicks of the mouse. Need to throw a wrench into the plans of nefarious visitors on the guest Wi-Fi? You can find out what services they have been abusing and create policies to filter that particular traffic out, and further create bandwidth limits on the guest SSID -- all from the same dashboard in a matter of minutes.
Tasks like the above, which would have taken numerous interfaces and sequences to resolve, are child's play on the Meraki dashboard. Things that used to take potentially hours to implement and test can now be done in mere minutes. For our clients and my company, time is money, and I'm no longer losing both to handle menial tasks.
Firmware Updates are Automatic and Hands-Off
In my Meraki firewall review from months ago, I discussed at length the security problem that the networking industry exacerbates with the way they handle firmware updates. That is, firmware is either an afterthought entirely or a chore to update -- and many times, plagued by both issues.
Only half a year ago, hacking group Lizard Squad exemplified why it's critical that we take firmware updates seriously with the network gear we deploy in the wild. They took advantage of legions of home (and some business class) routers with their DDoS attack tool and wreaked havoc on Xbox Live and the PlayStation Network on Christmas Day 2014.
In plain terms, that little Linksys or Cisco in your home or office closet could very well have been an unknowing compatriot in Lizard Squad's botnet of compromised attack routers. It doesn't take an expert to figure out that this may have been just the tip of the iceberg. Why can't other devices, like WAPs, be next? It's naive to think they can't.
Meraki has an update model which should be applauded, in my eyes. It takes the responsibility away from the end user -- who likely won't have the ability or patience to perform timely updates -- and places it back on the vendor's shoulders. The cloud controller network that provides maintenance and configuration capability for their WAPs is the same one which automates the update process.
The only thing you have to choose is your update schedule. No worrying about different .rom or .rox files or whatever. The WAPs download their newer firmware, as released, and update according to your predefined schedule.
If you wish to play with beta firmware, there's a channel you can opt to be on right within your dashboard. We do this for our own office MR18 and firewall, but naturally, choose to have all clients on the stable release channel.
What's That Meraki Magic You Speak Of?
In my previous piece covering Wi-Fi best practices, I dug into many of the value-adds that make managing Meraki Wi-Fi networks so clean and simple. But I'll touch on some of the most important things which can be accomplished out of the box on these access points.
One of the very first things we set up for nearly every new Meraki Wi-Fi installation is traffic shaping for the purpose of bandwidth throttling. Think about it: your visitors rarely need as much bandwidth for their Facebook and YouTube browsing as your internal staff does. Why not ensure they aren't using larger pieces of the pie than they should be entitled to? Traffic shaping allows you to set per-SSID bandwidth controls (with manual overrides per client, or sets of clients) as needed.
I can use the same configuration page to then also apply QoS policies for things like VoIP, so that voice traffic always has the fastest lanes compared to other, less critical traffic. Remember, changes made here are applied to all access points in a single save -- I don't have to configure each individual WAP, like I had to back when we used EnGenius access points at client sites years back.
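To make that concrete, here's a rough sketch of the kind of per-SSID policy we end up expressing through the dashboard. The names and values are purely hypothetical illustrations of the concept -- this is not a real Meraki API or config format:

```python
# Hypothetical illustration only -- not a real Meraki API or config format.
# Each SSID gets its own shaping and QoS treatment, applied network-wide
# in a single save rather than per-WAP.
ssid_policies = {
    "Guest": {"per_client_limit_mbps": 5,    "qos_priority": "low"},     # small slice for visitors
    "Staff": {"per_client_limit_mbps": None, "qos_priority": "normal"},  # uncapped for internal users
    "VoIP":  {"per_client_limit_mbps": None, "qos_priority": "high"},    # voice gets the fast lane
}

for ssid, policy in ssid_policies.items():
    cap = policy["per_client_limit_mbps"]
    print(f"{ssid}: {'no cap' if cap is None else f'{cap} Mbps per client'}, "
          f"QoS {policy['qos_priority']}")
```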
Some organizations, especially those who offer storefront guest Wi-Fi for their patrons, may want to shut down their Wi-Fi SSIDs at night or on weekends when no one should be accessing them. Instead of having to manually shut down their Wi-Fi, Meraki offers integrated SSID scheduling which can do this for you according to predefined schedules. It's a great way to secure your network infrastructure during off-hours.
For large and complex guest Wi-Fi networks, like one we recently installed for a big banquet hall, Meraki offers an integrated NAT DHCP service that serves up near endless addresses on the 10.0.0.0/8 subnet. This means we don't have to worry about providing addresses to these visitors, further removing stress from internal DHCP servers and handing that responsibility to the Meraki cloud.
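For a sense of scale, Python's standard ipaddress module shows just how "near endless" a 10.0.0.0/8 pool really is:

```python
import ipaddress

# The private 10.0.0.0/8 range that Meraki's NAT DHCP mode draws guest addresses from.
guest_pool = ipaddress.ip_network("10.0.0.0/8")

# Roughly 16.7 million host addresses -- effectively endless for guest Wi-Fi.
print(f"{guest_pool.num_addresses - 2:,} usable addresses")  # 16,777,214
```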
Another neat item which helps keep the 2.4GHz spectrum open for those devices which need it is Meraki's Band Steering functionality. For WAPs that are on dual band operation, which is pretty much any standalone current Meraki WAP for sale, they will force 5GHz-capable clients to that band and keep the 2.4GHz channels open for older devices. For highly congested scenarios with large client counts, this is key for keeping Wi-Fi operational and in optimal condition for the biggest number of users.
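The decision logic behind band steering is simple to illustrate. This is my own toy sketch of the idea, not Meraki's actual implementation:

```python
def steer_client(supports_5ghz: bool) -> str:
    """Toy band-steering decision (illustrative only, not Meraki's code):
    dual-band capable clients get pushed to 5GHz, leaving the crowded
    2.4GHz channels free for older single-band devices."""
    return "5GHz" if supports_5ghz else "2.4GHz"

print(steer_client(True))   # modern dual-band phone -> 5GHz
print(steer_client(False))  # legacy 2.4GHz-only laptop -> 2.4GHz
```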
For offices where the highest levels of security must be maintained, such as HIPAA scenarios, VLAN separation between SSIDs is also an easy function to configure right on the dashboard. In combination with firewall rules, 100% segmentation between sets of subnets can be achieved across different SSIDs off the same access points.
Not every tech has the best handle on how to properly channel-map a Wi-Fi deployment between WAPs, so Meraki lets you lean on the dirt-simple Auto RF functionality in their dashboard. Meraki WAPs in a given network will constantly scan across all available channels and re-configure their channels of choice based on surrounding circumstances. This means that WAPs will not only optimize for usage based on internal interference within your organization, but more importantly, for external interference outside of your walls, which you rarely have much control over.
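Conceptually, the channel decision boils down to something like the following toy sketch. The survey numbers are hypothetical, and Meraki's real algorithm is of course far more sophisticated:

```python
# Hypothetical interference scores from an over-the-air scan (lower is better).
# On 2.4GHz, channels 1, 6, and 11 are the non-overlapping choices.
survey = {1: 0.72, 6: 0.31, 11: 0.55}

def pick_channel(scan_results: dict) -> int:
    """Pick whichever channel currently shows the least interference,
    internal or external -- the essence of an Auto RF-style decision."""
    return min(scan_results, key=scan_results.get)

print(pick_channel(survey))  # 6
```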
You can find dozens of other interesting value-added features which come as part of the standard Meraki software package on the official MR datasheet. I've only touched on the most common ones which we are frequently deploying in the wild.
Simple, Uniform Licensing
One of the other aspects about Meraki which I love is the simple per-device licensing model they employ. There are no separate licenses for WAPs, hardware controllers, or support. It's a single license that gets applied to any WAP you purchase, what they call the Enterprise License.
Before you ask, yes, the license is required for the life of the unit. If you choose a three year license and want to use the WAP past the third year, you need to re-up your license. But this model makes perfect sense, because this single license entitles you to the full package: cloud controller access, ongoing firmware updates, and vendor support.
You have the option of getting 1, 3, 5, 7, or 10 year licensing for your devices, with increased discounts per-year as you move up the license line, naturally. As a practical matter, and a balancing act when it comes to client budgets, we usually choose 3 or 5 year licensing for our clients.
Since their licensing is the same across the entire MR line of access points, this means you can easily upgrade to newer models down the road and NOT lose your licensing. You can merely disassociate the older models from the dashboard and add in the new ones, and the system will automatically assign the licensing to the new devices.
Licensing doesn't have to be tied to serial numbers; Meraki merely cares that the number of access points in your account matches the number of access points covered by your paid licensing. Straightforward and simple in my eyes.
Packaged With An Attention to Detail
I'm not one to go into the finer points of product packaging in the usual way that endless "unboxing" reviews prefer to. I care much more about the functional design and operation of a product than how nicely it fits into a box.
But this is one instance where I wanted to point out something that differentiates Meraki from the crowd. They have an immaculate attention to detail when it comes to their hardware packaging, as seen below:
(Image Source: YouTube)
Why does this even matter? Because anyone who has been on the installing end of Wi-Fi access points knows that the hardware and accessories bundled with a WAP can quickly turn a 30 minute install into a multi-hour affair. I had to mount a new MR18 for a healthcare client that is expanding an office just last week, and I was quickly reminded of how Meraki considers their experience premium from start to finish -- literally.
The necessary hardware for mounting your WAPs on walls, ceilings, in wood or plaster, and even on suspended ceiling braces, is all included with every WAP you purchase. No second guessing on whether you will be able to mount your new WAP on the material and position of your choosing.
While not shown specifically, all of the Meraki access points I have encountered are also bundled with premium backplates that serve as the actual brackets which the WAPs slide onto. This makes mounting/unmounting of WAPs for maintenance or other needs easier, and also reduces the damage that unevenly installed mounting screws do to your expensive gear.
I think some people call this "Apple level boxing" of product. Yes, it meets that standard.
Where Meraki Can Improve
While excellent in most areas, Meraki has room to grow and fix some issues. No vendor is without its downfalls.
Just like I mentioned in my MX firewall review earlier this year, Meraki is guilty of imposing some of the most arcane purchasing procedures for product that I have ever seen. Simply put, all of their gear has to be shipped directly from their main warehouse in California for US-based purchases. Unlike other gear that I can purchase at a moment's notice for clients from suppliers like CDW or Ingram Micro, Meraki imposes hyper-mediated policies that require gathering end-customer information up front, prior to any gear shipping out.
This leads to either dealing with near week-long lead times for any gear necessary, or paying for expensive two day or next day air for time sensitive needs. In today's market, where nearly any product under the sun can be procured from a bevy of supplier warehouses, many local to us in Chicago, this is the most painful part of dealing with Meraki product purchasing. In many cases, for emergencies where older gear needs to be changed out same-day, we cannot go with Meraki even when the customer would have wanted to.
Another gotcha which is a part of the Meraki world is the fact that any gear which loses licensed status ceases to function after a 30 day grace period. Some may see this as a downside of moving into a cloud-managed era, but it does not bother me that much. There is a lot of value-add that goes into what Meraki offers, in terms of support, cloud management, and updates, and this isn't an endless bucket of benefits you can get with a one time payment. I just want to make sure potential buyers are aware of this before they turn around and chastise Meraki after declining to re-up their licensing.
Meraki was featured in a case study as part of Issue 4, Volume 27 of the International Legal Technology Association Magazine. (Image Source: ILTA)
A minor complaint which I have about their general MR product line is that they don't have an AC-enabled offering in any device below the MR32. Seeing as Meraki is a premium product to begin with, it would be nice to see the MR18 (or a similar-class entrant) offer 802.11ac to keep pace with what Ubiquiti now offers at lower pricing with its UniFi AC WAP. In a year or so, it will be inexcusable not to have AC across the entire line of WAPs they offer.
And finally, for those who may be commingling Wi-Fi broadcasts from Wi-Fi-enabled routers (like the Z1 or MX64W) and Meraki's true standalone WAPs in a single network, the current state of affairs is far from ideal. We have one such client, a talent agency, which has an MX60W and an MR12 at their headquarters, and the way in which Wi-Fi oversight is managed between the devices is far too rigid. It's almost as if Meraki never anticipated situations where Wi-Fi-enabled routers and WAPs would share the same space. This needs to improve so that management of these signals can be unified with the same feature set across all Wi-Fi-enabled devices Meraki makes.
Closing Words
At the beginning of this piece, I had a wish list which covered all of the things that I was yearning for in a great Wi-Fi solution. How does Meraki stack up against these line items?
The vendor isn't without its flaws, as I mentioned in the previous section. But I'll give them a fighting chance to make it right in those areas, since they hit a home run in so many other respects: from a best-in-class cloud management system, to effortless automated patching, to high quality hardware that ranks up there with household names like Cisco Aironet, HP, and Ruckus.
To be completely honest, I was tired of solutions that only gave us 80% of what we wanted. Great hardware and coverage but buggy, Java-based software controllers. Or, excellent price points but hardware controller requirements and a lack of vendor support. Mix and match whatever recipe you wish; we've seen all possible combinations out in the wild.
Meraki offers up a truly enterprise class solution under a single umbrella which integrates seamlessly with its other network device pillars, like routers/firewalls and switches. As such, building complex, multi-branch networks that can be easily managed isn't an oxymoron any longer. It's quite easy to accomplish, and we manage such networks with ease on a daily basis now without the sweating associated with overseeing competing vendors' network gear.
We've been standardizing our managed services (MSP) customer networks under the Meraki flag for over a year now, and will continue to centralize on their solutions until someone can do it better at the same price point.
The future of software defined networking is clearly seated in the power of the cloud -- a cloud Meraki has been busy paving into a roadmap for the rest of the competition.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
If there's one common request I get from readers via email, it's that they want an updated deep dive on my thoughts regarding the whole Office 365 vs Google Apps debate. It's a topic I've written about on numerous occasions in the past -- dissecting the facts, wading through the FUD, and piecing together my honest opinion on who comes out on top.
But it has been years since I dove into the crux of the debate, homing in on why one suite beats another in the important aspects that set them apart. And in the world of IT, years are an eternity.
Unlike some in the tech media, who either have allegiance to one side or have little to no real world experience with both suites, I've got none of the aforementioned hangups. Not only have I rolled out both suites in organizations small and large, but I've been professionally certified by Google and Microsoft in each respective suite for years now.
It goes without saying that as such, I've got a keen eye for the gotchas that each suite comes with. I've been able to objectively compare the ins and outs of each suite's finer nuances and features. And most importantly, those critical "first 30 days" post migration to either suite? I've been there, playing witness to the good, bad, and the downright ugly.
Both Microsoft and Google are fighting a fierce battle for the hearts and minds of organizations of all sizes when it comes to cloud productivity suites. And a roller coaster ride is quite appropriate when describing the competitive nature of the two ecosystems. Google came out swinging with Google Apps many years ago, with a comfortable lead in features, stability, and pricing alike.
But in my eyes, come early 2013, Microsoft launched the latest wave of Office 365 and turned many of the tables. Google no longer enjoyed a de facto lead in the cloud productivity sphere; it was now fighting a Microsoft that had licked its wounds and come back for round two, anxious to regain lost territory.
Seeing as it's 2015 now, and it has been quite some time since my last in-depth showdown on the two suites, I figured I would update my sentiments. This time, however, I'm diving into more detail on the various competitive areas surrounding each suite.
In Part 1 of this special four-part mini-series, I'm covering the pricing aspect of Google Apps vs Office 365. Subsequent pieces will dive into cloud email, cloud file storage, and each suite's take on unified communications.
Enough with the small talk. Let's dive in.
Apples to Apples Cost Comparison Isn't Possible
I blame Microsoft. And I blame Google. They're both at fault in making a discussion about price in a simplified apples to apples way extremely difficult. Why do I say this?
Google, for its part, chooses to make Google Apps pricing very simple, and I applaud them for it. They really only have two core Google Apps plans -- one vanilla (standard) plan, and an upgraded plan with Vault and additional value-added capabilities. That's it.
As such, Google does a good job in trying to lead the discussion on pricing by saying they undercut Microsoft easily, and they chalk this up as a clear cut "win" for their camp. But not so fast.
It's entirely true that Google has the lowest price point available for an all-inclusive cloud productivity plan between the two giants. But if you dig into the finer points of what they consider all-inclusive, this tells a whole different story. And this was one of my major goals for this first piece on pricing -- exposing the innards of what each camp is bundling as part of their price points. Google is plenty guilty of some crafty smoke and mirrors here.
I've similarly got beef with Microsoft, but conversely, for their adherence to very complex, byzantine pricing structures. In fact, I spend numerous hours each week on the phone with clients trying to make sense of what direction they should head with Office 365 licensing. From cross-plan feature disparity to endless a la carte options, Microsoft is the master of the "build your own licensing" smorgasbord.
What does this truly mean? As a simple example, for a small business customer that wants basic cloud email, with Skype for Business and SharePoint Online, there are no fewer than five combinations of licensing I could come up with to satisfy these needs -- each with differing monthly price points and feature specifics.
Don't get me wrong. In many ways, I enjoy the freedom to build custom licensing scenarios where clients can save the most money possible each month. But for the average person, comparing all-you-can-eat plans with a la carte options, and trying to find the right mix for their situation, is overly complex and difficult. As such, most clients come to us calling for an SOS right from the start.
Hopefully I've made it apparent why an apples to apples cost comparison is almost a fruitless effort. After hours of knocking my brain around on how to build an objective comparison chart, I've thrown my hands in the air.
It's not possible. I tried.
As such, my comparison of the two giants is undoubtedly apples and oranges, but it honestly tries to evaluate their offerings across the most common "all you can eat" plans. If I allowed for the comparison of all the a la carte offerings Microsoft has on the table, this article would have been over 20,000 words long. I'll spare both of us the exhaustion.
It also goes without saying that, since this comparison cannot be stripped down to an apples to apples approach, the honest way to look at the two giants is through an overall qualitative lens. Who provides the most bang for the buck? Who has the most features packed into a given plan in contrast with the other camp's like-sized offering?
Looking at the two sides through that nuanced lens, it's much easier to identify clear winners between the camps.
My Methodology
While there were numerous directions I could have taken with a comparison, I decided the easiest and most honest way to compare Google and Microsoft was to split the head-to-head comparisons into two primary sections: small business plans, and enterprise plans.
Some may disagree with my decision, but this was absolutely necessary in my eyes for one very glaring reason: the feature mixes and price points in Microsoft's plans are somewhat different in each target market range.
While Microsoft caps its small business plans at organizations with 300 seats max, I honestly think that 300 seats is more enterprise-leaning than not. But I'm not here to discuss semantics with Microsoft's marketing department.
For the purposes of this article, we are breaking the comparison into a SMB head to head, and an Enterprise counterpart. This way Google's plans can directly face off with Microsoft's chosen end-customer segment sizes in a rational way.
Small Business Plans: Head to Head
Below is a concise table showing how Microsoft's Small Business Office 365 plans stack up against Google's two Apps offerings. As I've always done before in head to head comparisons, a vendor with a clear win in a given category is given a yellow highlight for that area in one (or both) of its plans.
(All Content in Comparison Table Valid As of 8-16-2015)
Let's get the obvious out of the way right off the bat: Google wins purely on cost. But that's where the debate on pricing starts and ends. As I made clear, an apples to apples comparison of the suites is far from possible because of the bundled value-adds each camp has tucked into its suite at the varying price points.
As such, from a qualitative perspective, Google Apps leaves much to be desired across both of its offerings, as my comparison chart concisely outlines.
For starters, I will give Google credit in the areas where it clearly takes wins from Microsoft. I like how Google refuses to place max seat counts on either of its plans. Now one can say that, well, since they only have two plan options they can't really place limits on seats for either level. But honestly, they very well could have forced all enterprise users into the higher priced Vault-enabled plan level by capping the lower end standard plan at some chosen seat count.
I'll also point out the other remaining Google "wins" which I tossed their way, albeit not without some hesitation. As opposed to Microsoft, which opts to dole out storage space that is function-specific (email has its own bucket; OneDrive has its own bucket; etc), Google instead has a single bucket which three core services borrow from.
Gmail (email), Drive (Docs), and Photos all share one common pool of space. In some ways, this can be a good thing. But the only way you can truly capitalize on it is if you meet two criteria:
1) You wish to pay for the higher Vault-enabled Google Apps plan for $10/user/month AND
2) Your organization has at least 5 users.
If you don't meet the above prereqs, Google forces you back down to 1TB of space per person with a Vault-enabled plan; or a small 30GB shared bucket if you are on the cheapest standard plan.
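Put in plain decision logic, my reading of Google's storage rules as described above (and as of this writing) looks like this:

```python
def google_apps_storage(vault_plan: bool, user_count: int) -> str:
    """My reading of Google's storage rules as of mid-2015: the unlimited
    pooled bucket requires BOTH the Vault-enabled plan and at least 5 users;
    everything else falls back to fixed allotments."""
    if vault_plan:
        return "unlimited pooled storage" if user_count >= 5 else "1TB per user"
    return "30GB shared across Gmail, Drive, and Photos"

print(google_apps_storage(vault_plan=True, user_count=4))    # 1TB per user
print(google_apps_storage(vault_plan=False, user_count=50))  # 30GB shared bucket
```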
But even with my reservations on all the gotchas and asterisks surrounding the requirements for the unlimited space offering, I tossed Google a bone on the email and personal cloud file storage categories. They snuck out on top, shady circumstances notwithstanding.
However, the best Google was able to do against Microsoft's small business plans was a four-category win for the Vault-enabled Google Apps plan level. Redmond took home wins in many other areas, some of which are critically important to many customers out there.
One of the biggest areas in which Google had a near no-show was in the office suite categories for computers and phones/tablets. It goes without saying that Microsoft's bundling of Office rights for up to (5) distinct computers, phones, plus tablets (that's five PER device type -- NOT five installs total across the board) is a very good deal any way you look at it.
Seeing as business users are accustomed to purchasing single-use-rights copies of Office for upwards of $200 or more per machine, this alone is well worth the price of entry. And Microsoft doesn't water down the rights to the lowly Home/Student edition either -- this is nearly full ProPlus-level Office, minus Access.
Google has no dedicated locally-installed apps to speak of, outside of some offline-enabled functionality in Drive for limited file types. Unlike MS Office on the desktop, a full blown, fully-featured suite of native apps, Google's Docs editors are restricted in numerous ways compared to what Office users have grown up with. For basic editing, Docs may get you by, but don't expect any of the fancy formatting from Word or powerful number crunching from Excel.
Some may also be curious as to why I split up cloud file storage into Personal and Enterprise subsections. This is because, for all intents and purposes, Google Drive is NOT purpose-built to act as a true replacement for an enterprise class file storage & management system. Google dances around what Drive really is, which is nothing more than a cloud sync and share platform.
Compared to SharePoint Online that Microsoft offers (which I discussed at length in a post one year ago), Google Drive revolves around a notion of file/folder ownership that sits at the individual user level -- that is, a bottom-up approach to content management. This means someone at ALL times needs to play owner of any set of files or folders.
For super small companies, this is not an issue. But once you have more than 5-10 users, and staff with complex shared folder structures need to leave the company, you have nothing less than an administrative mess on your hands. It was part of the reason my company ditched Google Drive (and likewise traditional file servers) for SharePoint Online.
In addition, SharePoint Online provides many other enterprise level features which are more in line with what traditional beefy file servers offer, such as granular & group-based permission sets, data loss protection rights, contextual enterprise search, and abstracted document libraries (file shares) which are managed top-down, and not bottom-up.
Another area where Google stumbles is with Outlook support. On the surface, they claim true Outlook compatibility with their GASMO add-in for Outlook. But as someone who has deep experience with the messes this tool has made at client sites, and how many upset customers have ditched the tool entirely, I'm not giving Google an easy pass in this area. Their Outlook support stance is half-baked at best.
It goes without saying that if you have a staff base which loves Outlook, or relies on its finer nuances, then you shouldn't be looking at Google Apps. Google's GASMO add-in may be fine for light Outlook users, but long time Outlook addicts will not be pleased over the long haul. Users on the Google Apps support forums have numerous, long threads running with various complaints, such as this one or this to name only a few.
While not a requirement for all companies, those who need to fulfill HIPAA compliance should note that Microsoft wins out doubly in this area. First off, Microsoft offers FULL HIPAA compliance across ALL services under the Office 365 umbrella. In another smoke and mirrors move on Google's part, they claim HIPAA compliance -- but dig under the surface, in their HIPAA implementation guide, and you'll find out that core services like Hangouts, Groups, and even Contacts cannot have any PHI stored within them.
That means any HIPAA covered entity which wanted to use Groups as a shared mailbox replacement wouldn't be able to do so. It also means that contact address books could not in any way have any PII stuffed into notes fields. And likewise, offices who were planning on using Hangouts to enable some form of telemedicine with patients must look elsewhere (Skype for Business is fully HIPAA compliant via the end-to-end encryption it utilizes).
Another aspect of HIPAA compliance related to cloud email is the ability for an organization to ensure encryption for all data in transit, in addition to at rest. While Microsoft doesn't bundle the necessary component into the small business plans, it does offer it for an economical $2/user/month extra, and it happens to be their own first party service (Azure Rights Management, for those curious).
Google now offers email encryption via a third party called ZixCorp, under the name Google Apps Message Encryption. Google has had a patchy past with email encryption, since it previously offered it as a first party add-on for its now-defunct Postini email spam and security suite. While I don't have an inherent objection to a third party offering, something this critical for more and more companies going forward needs to be easy to manage and cost effective. I just find Microsoft's lower price point here, and the "under one flag" approach, to be more palatable for the long haul.
Finally, for those who will be conducting online meetings with their new suite, Skype for Business as part of all Office 365 plans is decently more powerful out of the box compared to Google Hangouts. Not only is the 250 user max comfortably higher than Google's 15 user cap on Hangouts in meetings, but Google has had a horrendous track record with Hangouts as a stable platform.
PC World threw out a scathing report about this just a few months back, and other online musings like this add insult to injury. My own limited usage of Hangouts, for grad school needs and during a webinar-enabled presentation I put on at a local high school recently, has been pitiful and chock full of drops, errors, and bogus results.
In contrast, my entire company uses Skype for Business (formerly Lync) for all online meeting, IM, video, voice, and PSTN phone system needs, and it has been a relative pleasure. Not to mention, Microsoft makes S4B available on all major devices and platforms; unlike Google, which cherry-picks its apps for Android and iOS only. With Windows 10 running on 50 million devices only two weeks after its launch, Google's anti-Windows stance is troubling for a growing mobile and desktop user base.
Now that we've compared the two giants on the small business end of things, how do they fare in the enterprise realm?
Enterprise Plans: Head to Head
One of the biggest differences for the enterprise plans Microsoft offers is that it removes the seat count limits that stand with its SMB focused plans. Therefore, both Google and Microsoft have unlimited user growth headroom for their respective plans at all price points in this head to head.
Here's how they stack up:
(All Content in Comparison Table Valid As of 8-16-2015)
Seeing as I covered much of the meaty discussion about inter-provider differences in the SMB section above, I won't dive into those same topics here. Instead, I want to focus on the facets specific solely to Microsoft's enterprise plans and how they compare to Google Apps for Work.
Once again, it's clear that Google wins on a pure dollar basis. Their highest priced plan comes in at $10/month/user while Microsoft caps out at $20/month/user. But as I described in good detail earlier, there's substance to the price differential which needs to be accounted for.
What you're getting for your money is much more important than just what your raw per-user monthly cost is, as I frequently remind my clients. Paying for third-party add-ons to tack on this function or that feature down the road gets costly, quick.
Two other areas Google still holds the crown in are email storage space at its highest priced plan, and Google Drive space for personal cloud file storage.
However, as I noted previously, Google places a 5 user minimum on this unlimited space -- meaning that if you have 4 or fewer users, you're knocked back down to 1TB per person. I don't appreciate Google's arbitrary baselines around the "unlimited" bucket of space, and gave them the wins here with noted reservation.
Many of Microsoft's other previous wins on its Office 365 Business Premium level are retained by the E3 plan which competes pretty directly against Google's highest plan offering in the enterprise space.
It should be noted that Microsoft tosses E3 subscribers two great bones compared to Biz Premium: included full blown email archiving/legal retention as well as complimentary email encryption for all of these seats.
One thing that I did not spell out in the above charts, but should be definitely noted, is that unlike Microsoft, Google does not allow for mixing user sets with differing plans. For example, you couldn't place the bosses on Google Apps w/ Vault and leave the lower staff on Google Apps standard. This means that for larger organizations, you either have to swallow everyone getting the higher end licensing, or opt to stay at the standard lower level.
I think this is a hit in the gut, and a sly way for Google to retake some lost revenue from selling Google Apps at such low price points, especially at the standard level. I bet many organizations would like to get Google Apps w/ Vault for a subset of users but not have to license the entire staff at that price point.
Microsoft, to its credit, removed a big barrier I rallied against in previous years: its insistence on keeping up an invisible wall between SMB plans and Enterprise plans (note: a la carte plans are considered Enterprise plans in MS land). Since about late last year, those barriers have come down, and companies can freely mix/match licenses across all price points for their staff as they please, albeit still being capped at 300 seats for any of the SMB-focused plans.
This is great for companies, especially larger ones, that have needs to license subsets of staff with particular options (like email encryption, or bundling Office download rights) but then have other sets of users which only need basic email accounts -- which can be easily achieved on the cheap by obtaining bare Exchange Online licenses for the lowly $4/user/month.
In Google Apps land, it's an all or nothing proposition. For example, if you have 100 users, and want Vault rights for just half, you're getting Apps w/ Vault for all 100 at a monthly cost of $1000.
With Microsoft, we could concoct a licensing mix to get 50 users on Exchange Online for $200/month, and the rest get Business Essentials w/ Exchange Online Archiving added on for a total of $400/month -- or $600/month out the door. That's a sweet $400/month savings over Google, or $4800 less for the entire year.
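Here's that math spelled out, using the per-user prices quoted above (the roughly $8/user figure for Business Essentials plus Exchange Online Archiving is implied by the $400/month total for 50 users):

```python
users, archiving_needed = 100, 50

# Google Apps: all-or-nothing, so all 100 users pay the $10 Vault rate.
google_monthly = users * 10                                        # $1,000

# Microsoft mix: bare Exchange Online at $4/user for the basic half,
# Business Essentials + Exchange Online Archiving (~$8/user) for the rest.
ms_monthly = archiving_needed * 8 + (users - archiving_needed) * 4  # $600

print(f"Monthly savings: ${google_monthly - ms_monthly}")           # $400
print(f"Annual savings:  ${(google_monthly - ms_monthly) * 12}")    # $4,800
```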
It does add up, and these are some of the gotcha scenarios where Google Apps doesn't win on price. We have discussions with clients in predicaments like this all the time, and sometimes Google Apps wins out, but more often now, Microsoft, with its considerably more liberal licensing mix policies, is getting the lower-cost wins -- especially at larger organizations where large sets of users don't need all the fancy bells and whistles.
Tips on Pricing Google vs Microsoft for Your Potential Cloud Move
While I plan on writing a lengthy insider's guide to licensing Office 365 on the cheap, noting many of the tricks we use to save clients money on their monthly bills, here are some of the key points to remember when costing out the two giants for your own needs.
As is clear from the lengthy exposé needed to highlight the differences between plans and vendors, pricing discussions are far from cut and dried. While Google tries to attract those shopping on raw price tag alone, there are substantive differences to account for, and gotchas to be careful of, like their reluctance to allow mixing/matching Apps plan levels on the same account.
How do you objectively price out these suites in order to save the most money possible? Here are a few things from the blackhat FireLogic playbook of cloud email licensing.
The above tips should help you build a better baseline for objectively viewing your options based on actual business needs, rather than over-licensing your company unnecessarily, as is too often the case.
When is Microsoft or Google the Better Option?
While I provided the necessary facts above to help organizations make well rounded decisions on which path to take, there are some hard truths that hold steady on average when comparing Microsoft and Google.
Office 365 is usually better for organizations that rely on Outlook and the full desktop Office suite, need complete HIPAA compliance across all services, or want the freedom to mix and match license levels across different sets of staff.
And conversely, Google Apps is a better pick for organizations that shop primarily on raw price, have light email and document editing needs, and don't depend on Outlook or enterprise-grade file management.
While the above observations generally hold true based on what we see in the field, it goes without saying that there are always exceptions to the rule. Again, reach out to a trusted expert to get a third party opinion on which platform your organization may best be suited migrating to.
Further Reading
While pricing is an important part of the cloud productivity suite decision process, it surely isn't the only piece. I've structured this look at Office 365 and Google Apps as a four-part series.
Direct links to the other parts of the series will be made available once they are posted.
Part 2: Email
Part 3: Cloud File Storage
Part 4: Unified Communications
Have a question for me which wasn't made clear in the series? Feel free to reach out directly via email at dw (at) firelogic (dot) net and I will do my best to answer.
Main Image credit: ChurchMag
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
As someone who is always on the prowl for new podcast material to enjoy, I recently came across one hosted by a name familiar to many in the States. Joe Rogan -- currently best known as the lead color commentator on most large UFC PPV events -- doubles as the host of an interesting podcast under the simple title of "The Joe Rogan Experience."
Specifically, episode 680 of his podcast showed up on my phone today, and it featured a lengthy discussion with Steve Hassan on the intriguing topic of cults. Hassan is a mental health counselor who has personal experience with cult entrapment, as he was once a member of the Moonies. He dove into a variety of areas surrounding cults and their characteristics with Joe, and if interested, I recommend listening to his appearance on JRE 680. The real reason I plugged Joe's show is that during the episode with Hassan, the relevant topic of Apple naturally arose at one point, and the big question was thrown out into the wild: is Apple and its following a cult?
For substance, Joe mentioned how he remembered old colleagues discussing upcoming Mac OS releases with near mysticism; as if every new iteration was about to bring about a newfound revolution in computing. It was part comedy, part reality, but the notion that hit home was quite apparent.
Apple has become almost holy not due to any altruistic motivations of its own doing -- its profit motive burns as brightly as any large tech company -- but because of an almost fanatical following it has assembled.
Apple product launches are a big deal -- and for hardcore believers, almost ritualistic. Apple Store workers, shown above, are cheering on the lucky first buyers at an iPad launch years back. Apple's near evangelical following was a core facet of the 2009 documentary, Macheads. (Image Source: CNN)
This new religion is passed down to followers (end users) through the Scribes (tech journalists) that preach the gospel of Apple. Its places of worship are easily found (Apple Stores) across the world, and its mass gatherings (launch events) reinvigorate the word of its gospel and propel belief in the brand.
As someone who owns a managed IT services company full time, and writes tech op/ed when time permits, I know full well the fervent positions that the Apple faithful hold as virtue. Apple Macs are better than PCs. Apple doesn't get malware. Apple this, Apple that.
And even when I take the time to outline point by point, with references, some of the holes in stated Apple Doctrine -- for example, ripping through their long held claim to security through obscurity -- I'm lambasted by the faithful in comments and me-too blog posts around the net.
While I'm far from fazed by the heat tossed my way, it must be said that the most dedicated of the Apple community are, in some respects, blind believers. And therein sits one of the core facets of followings that blur the line between mere thought collectives and cults: instilling a mindset that the cause can do no harm and never be wrong.
The Religion of Apple
Joe Rogan and I are certainly not the first to paint Apple as slightly holier-than-thou in the tech sphere today. In fact, numerous academics, with far more science behind the matter, have come to similar conclusions.
A sociologist from Eastern Washington University, Pui-Yan Lam, authored a scholarly paper which was published in the Sociology of Religion journal in 2001.
That essay, May the Force of the Operating System be with You: Macintosh Devotion as Implicit Religion, touched on religious themes related to higher meaning, believers who claim to be persecuted, and metaphorical battles between good and evil.
Lam spoke to one Apple believer, who was quoted in the research as saying, "For me, the Mac was the closest thing to religion I could deal with".
Lam goes on to say in the paper that "The faith of Mac devotees is reflected and strengthened by their efforts in promoting their computer of choice".
For the most devoted of Apple fanatics, the fruit logo goes beyond a fashion statement. It represents a lifestyle of technology usage that begins and ends with anything to come out of Cupertino.
And when it comes to recruiting new faithful, Apple makes proselytizing quite simple for followers. Apple's bold, simple logo is prominently plastered on all of its products, allowing its entire user base to purposely or passively perpetuate a simple notion: join us.
Why else did Apple need to have a glowing fruit logo on nearly every Mac laptop sold since 1999? While that trend finally seems to have been bucked, walk into any Starbucks in metro areas like my home Chicago and you'll see the ever-present glow of that iconic logo on a majority of computers.
Whether you know it or not, Apple is soft-selling its brand through legions of, well, drones -- you know, those same drones it once ridiculed in its 1984 ad? Henry Ford famously claimed that a customer could have any color on their new Ford as long as it was black.
The same notion holds true for Apple buyers of today. They're more than welcome to choose any computer they please, as long as it runs Apple's software on Apple hardware and comes with Apple's proprietary connections and dongles. And of course, the shiny, glowing Apple logo.
When you're selling a lifestyle choice, as Apple clearly is today, conformity is key. The meaning of choice has been neutered down to the lowest common denominator which matters: you're going Apple, or you're not. It's as simple as that.
Just like most major religions, there's no negotiating on the core principles that the respective belief system is built upon. You can't build a following which allows followers to fragment ideals as they please. It just doesn't help foster the consolidated growth of the following.
Apple Stores share much in common with houses of worship. They have an idolatrous centrality around the iconic Apple logo. Products are elevated and isolated on booths akin to altars. Floors and facade are usually chock full of bold, smooth stone just like churches employ. Coincidence or just crafty observation? (Image Source: iPhone Savior)
Kirsten Bell, an anthropologist from the University of British Columbia in Canada, sings a similar tune to that of fellow academic Pui-Yan Lam.
"They are selling something more than a product", Kirsten was quoted in a piece on CS Monitor. "When you look at the way they advertise their product, it's really about a more connected life". Most major religions share in the belief that they are promising a better life for followers, Kirsten noted.
She also went on to highlight how tightly Apple's present-day image and logo are woven around the story and persona of Apple's iconic founder, Steve Jobs. While Microsoft in some ways has Bill Gates to fill this same void, the devotion Apple fanatics have to Jobs -- placing him on a pedestal of his own -- shows that he enjoys a larger-than-life presence in how followers perceive Apple.
Steve Jobs is Apple. Apple, likewise, is Steve Jobs. There is no distinction between the two. For many devotees, buying that new Apple laptop or iPhone is very much akin to paying homage to the supreme leader of Appledom.
Apple portrayed PC users as mere drones in its "1984" commercial spot. Thirty years later, it's safe to say that the tables have turned. Apple has dominant control over legions of users who have next to zero choice of hardware and are virtually locked into Apple's own walled software garden. Who's Big Brother now? (Image Source: Cult of Mac)
Kirsten noted how Apple's product reveals are always in locations "littered with sacred symbols, especially the iconic Apple sign itself." For the keynotes, an Apple leader like Jobs or Cook "addresses the audience to reawaken and renew their faith in the core message and tenets of the brand/religion".
While anecdotal and qualitative evidence points in one direction, does actual scientific testing come to the same conclusions? Well, it seems it does. A team of neuroscientists decided to put the theory to the test as part of a BBC TV documentary, Secrets of the Superbrands, which first aired a few years back.
The test scenario was pretty simple. Take brain scans of a self-proclaimed Apple lover (Alex Brooks, of World of Apple fame) and compare them to those taken of religious followers from various faiths.
What they found gives us the hard evidence to support what Lam and Bell alluded to in their academic studies. "The Apple products are triggering the same bits of [Brooks’] brain as religious imagery triggers in a person of faith".
One of the scientists from the study went on, "This suggests that the big tech brands have harnessed, or exploit, the brain areas that have evolved to process religion".
Even the creators of the aforementioned TV documentary series, Alex Riley and Adam Boome, noted that what they saw at a new Apple store opening in London reminded them more of "an evangelical prayer meeting" than "a chance to buy a phone or a laptop".
Every year, Apple's inner circle (journalists, developers) descends upon events like WWDC or product reveals. Since the average person is usually not invited, Apple's gospel is then spread through sermon, commonly delivered by devotees within the tech media at large. (Image Source: FastCompany)
This raises the obvious question: is near religious dedication to Apple, which is nothing more than a bona fide brand, healthy or ideal? Many out in the public sphere don't even skip a beat in admitting to their obsession with his holiness Apple.
Richard Seymour, tech contributor at The Guardian, titled his part-observation, part Apple Watch gluttony editorial, My name is Richard and I am an addict -- an Apple addict.
His opening paragraph didn't skip a beat describing how Apple's (arguably) underwhelming Apple Watch made him feel:
Help me. This is not normal behaviour. I saw the Apple Watch and didn’t think, "That is a preposterous piece of absurdly overpriced crap". I saw the Apple Watch and thought, "Come to me, you thing of beauty, for I must have Siri on my wrist".
Rounding out his piece, Richard points out the obvious among his fellow faithful. "Apple products are beautiful, smooth, ingenious, luxurious. As every Apple addict knows, they're 'just better'".
Will It Last?
Much of what I wrote above will simply get tossed out the window by a good majority of the blind faithful. This piece was never meant to act as a vehicle for changing beliefs or attitudes for or against Apple. My sole purpose was to answer the very question I put forth up front: are Apple and its following a cult?
Based on the evidence -- qualitative and anecdotal as well as scientific -- I tend to believe that Apple has surpassed the point of having a merely loyal brand following.
Its followers -- the ones in line all day at product launches, the ones who buy Apple products regardless of technical, functional, or financial merit -- consistently show signs of being aligned with something greater than just a brand they love.
For many of these fanatics, their lifestyle is defined by Apple. It's a status symbol; a state of being; an inclusiveness as part of a special community that outsiders just can't understand.
They don't see their devotion as potentially unhealthy -- it's who they are, who they've become, and Apple is to be undeniably thanked for allowing them into the exclusive "club".
Am I saying that all Apple users are brainwashed zombies toeing some Apple party line? Absolutely not. Many people use Apple products because they have a proven functional benefit they enjoy, or simply prefer them over the competition. That's natural and absolutely justified.
But there's a good portion of Apple's user base -- definitely its most vocal -- who feel they're intrinsically tied to Apple's greater fight for survival. That's Apple as a belief, not just a brand, mind you. This "inner circle" that fostered the cult-like following of Apple is who I'm referring to.
How long the cult of Apple can live on can't be predicted. There will no doubt be those who preach the gospel of Apple long after it's no longer considered the "hip, trendy" tech brand -- the diehards, so to speak. Whether it's Google, Microsoft, or some new entrant that takes over this throne, it really doesn't matter.
There will be a point in time when even Apple outlives its own coolness.
Like all things in the world of tech, nothing lives forever. Even if your Messiah is his holiness, Steve Jobs.
Photo Credit: Shutterstock/Andrey Bayda
Pitiful Wi-Fi implementation is so prevalent these days that people take it for granted. From the hotels we visit, to the cafes we frequent, down to the offices we call home 9-5 daily. And that's unfortunate, because when done right, Wi-Fi is an enabler for connecting us wherever, whenever.
Why can't most organizations get Wi-Fi right?
It comes down to this simple fact: Wi-Fi is more of an art than an exact science. Every vendor is guilty of pushing its own FUD around the market, enticing purchases of gear that is either chock full of features a client will never need, or underpowered for the environment it's intended to serve. Add in channel mismapping, SSID bloat, and click-thru portal hell, and you've got a recipe for what makes Wi-Fi stink.
While it's still hard to compare the entire market's offerings apples to apples, there are some battle tested best practices which have stood the test of time.
All of the following industry guidelines are gleaned from years of evaluating and installing Wi-Fi gear since the heyday of 802.11b. We're overhauling or implementing new Wi-Fi setups for our clients every month. And regardless of what gear you end up choosing, the same common sense planning approaches apply to any Wi-Fi network in the planning stages.
Is there such a thing as enterprise-class Wi-Fi for the SMB? Absolutely. My company FireLogic sets up such networks all the time using these best practices.
15. Ditch the Portal: Use SSID Scheduling and WPA2 Instead
Many organizations have a business and technical need to keep "squatters" in check. But more often than not, they tend to tarnish the right intentions with the wrong methodology. And if you've used any form of public corporate Wi-Fi in the last decade, you've encountered our nagging little friend: the portal.
While portals are commonly used as a stopgap to help prevent squatters and advertise Wi-Fi ToS for an organization, I have little love for these bastions of last-generation Wi-Fi planning. Why?
They are rarely mobile optimized, and tougher still to navigate on non-computer devices like traditional Kindles or game consoles/set-top boxes.
The local library in my hometown sticks out in my mind as a poster child for the problem with portals. On laptops, its click-thru loads just fine. But on tablets and mobile phones, getting the acceptance checkbox to behave properly is part battle of patience, part finesse of the finger to zoom in exactly right. For elderly users with poor vision added to the mix, this is downright painful on mobile devices.
Business-class, modern Wi-Fi equipment has functionality called SSID Scheduling (or availability) which stops abuse after hours. In short, the SSIDs of your choice can be automatically disabled during hours when no one should be accessing them anyway. Meraki's MR access points, which we use heavily, allow us to customize different schedules per day, per SSID, to match business needs.
Couple SSID Scheduling with a simple WPA2 password that gets changed every quarter for guest-facing SSIDs, and you've got the intended results of a captive portal without the awkward limitations that accompany them.
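To take the sting out of those quarterly password changes, the rotation itself can be scripted. Here's a minimal Python sketch of a guest passphrase generator -- the character set and length are arbitrary choices of mine, not a vendor recommendation:

import secrets
import string

def generate_guest_passphrase(length: int = 16) -> str:
    """Generate a random passphrase for the quarterly guest SSID rotation."""
    # Skip look-alike characters (O/0, l/1/I) so the password reads cleanly off a flyer
    alphabet = "".join(c for c in string.ascii_letters + string.digits if c not in "O0l1I")
    return "".join(secrets.choice(alphabet) for _ in range(length))

print("New guest Wi-Fi passphrase:", generate_guest_passphrase())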
14. Use More Access Points, Not Stronger Antennas
There is a common misconception among SMB owners that single high-powered routers or WAPs are most effective at blanketing a large office with good quality Wi-Fi. While this may be a cost effective compromise that works in some scenarios, it is NOT in any way the answer to blanketing large spaces with Wi-Fi, especially in high-density situations.
The biggest argument in favor of the "more WAPs" approach comes down to physical limitations on everyone's favorite Wi-Fi clients: mobile devices. As opposed to laptops, which have more plentiful antennas and higher powered Wi-Fi chips, tablets and smartphones battle with the realities of having to optimize for battery life.
And herein lies the misconception with these over-powered routers and WAPs. Just because you can get a powerful broadcast out over a long distance to a small client like a phone doesn't mean the other side can respond in kind. It's in essence like someone shouting out to you from afar when you don't have the vocal power to shout back. It's an asymmetrical broadcast situation which is heavily weighted against mobile devices.
The real world symptoms of such situations are seen all the time in poorly designed office Wi-Fi configurations. Clients show as having full or near full strength signal, but upon inspection, packet loss is wreaking havoc on their transmissions.
For smaller rollouts, single high-power WAPs may be OK -- and we go this route in some situations. But larger locations, both vertically and laterally in size, will do better with more access points that are strategically placed to balance the client load.
One final argument in favor of more WAPs is the fact that each access point has a practical limit to how many devices it can support. Most enterprise Wi-Fi manufacturers claim this to be in the 50-75 device range, perhaps slightly higher, but I use 50 as a simple, realistic figure. This ensures we are not placing too much reliance on a single WAP in situations where Wi-Fi performance is critical for areas with high concentrations of users.
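That per-WAP ceiling makes capacity planning easy to rough out. A quick Python sketch of the density math, using the conservative 50-client figure from above:

import math

def waps_needed(peak_clients: int, clients_per_wap: int = 50) -> int:
    """Estimate access point count from client density alone.
    Walls, floors, and dead spots will usually push the real number higher."""
    return max(1, math.ceil(peak_clients / clients_per_wap))

# A conference space expecting ~180 devices at peak:
print(waps_needed(180))  # -> 4 WAPs, before factoring in coverage obstacles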
13. WAP Placement: Mount 'Em High, In the Open
It goes without saying that your Wi-Fi signals are only as good as their ability to traverse the open air. Obstructions like walls, floors, and metal cabinets, and even lower-lying Wi-Fi interference sources such as microwaves, all degrade Wi-Fi signals to different degrees.
Every organization has different aesthetic and ethernet-port placement limitations to account for when placing WAPs. But when possible, there are a few simple guidelines to follow.
A lot of companies that call us in complain of Wi-Fi coverage problems even when their WAP counts are good. Their planning downfall was concealing WAPs in obscure locations close to the ground, where signal gets eaten up by traditional enemies of Wi-Fi like concrete or metal-studded walls.
Hang 'em high. That's the goal for the best Wi-Fi coverage possible. We always prefer ceiling mounting access points where allowed. This not only provides nice blanketing of large areas with excellent open airspace, but also ensures no obstructions are placed in front of WAPs to degrade signal. (Image Source)
Most business class Wi-Fi gear has a good balance on the performance and aesthetics front. Our favorite, Meraki's MR line, has clean-cut matte white designs, and others, like Ubiquiti's Unifi units, follow suit. Manufacturers are trying to create appealing designs that don't have to be concealed in closets anymore.
12. Only Use 20MHz Channel Width in High-Density Scenarios
Channel bonding technology, or 40MHz channel width as it's also called, is a great idea in theory, but it has numerous drawbacks in the real world. It was designed to allow 802.11n and 802.11ac Wi-Fi to achieve the high throughput figures manufacturers love to advertise. But it causes additional inter-channel interference and doesn't "play fair" with neighbors, per se.
Without getting too technical, channel bonding takes two or more standard 20MHz channels and binds them into one fat channel. The upside is doubled (or greater) bandwidth per client, allowing for faster transfers of extremely large files.
But the true justification for using fat channels is short-sighted in most scenarios. First off, the large majority of users have no need to transfer massive files over the LAN via Wi-Fi. Keep in mind that fat channels ONLY benefit local transfers -- WAN speeds rarely touch the 150Mbps+ range yet, so you are almost always limited by the WAN pipe, NOT the LAN pipe.
And secondly, for high-bandwidth scenarios, it's usually better to run hardwired ethernet, which is much better at sustaining near-gigabit speeds than Wi-Fi can be. If you have a business need to transfer large files on a consistent basis, hardwiring those machines is a much better investment and will save time as well, since file transfers will remain much faster over the wire for the foreseeable future.
The bigger issue with fat channels is that they eat up critical airspace with "bloated" signals compared to standard 20MHz channels. Not only does this reduce airspace for natural Wi-Fi neighbors you may have, but it also reduces the airspace for high density situations like conference halls or public cafes where client counts bloat quickly and easily. Giving so much airspace to each individual client in high-density situations causes performance headaches for everyone in the end.
Stick with 20MHz channel width, play fair, and allow your Wi-Fi deployment to serve the greatest number of clients possible. Fat channels are rarely, if ever, necessary in most Wi-Fi deployments, and we never roll them out.
11. Future Proof: Run Dual Ethernet Lines Per New WAP
For most situations where new WAPs are being installed, we're running dual ethernet drops per location. While there is little on the market yet that takes advantage of dual lines, numerous experts are pointing to 802.11ac Wave Two and beyond as requiring the use of dual lines at some point.
The reason is the increased baseline bandwidth that 802.11ac Wave Two and beyond will be pushing in deployments. Once we get closer to the 1Gbps barrier and beyond, a single ethernet cable won't be able to deliver both the power and the data necessary. While a few forward-thinkers believe that 10Gbps lines may be necessary at some point, most (including myself) see this as wishful thinking and quite unrealistic.
Running an extra Cat6 line to each new WAP location is only marginally more expensive. Compared to the cost of having electricians or low-voltage wiring folks engage in a separate project to run extra lines later, putting in the dual Cat6 lines now is realistic and ideal.
When 802.11ac Wave Two (or later) starts requiring dedicated PoE and Data lines, you'll be ready for an easy change out of your WAP equipment.
10. Don't Forget the Switches: Go Gigabit and PoE
Some businesses will plan on spending oodles for advanced access points, but will skimp on the switches for the backbone. This is a recipe for disaster. Lower end, or generic, switching equipment has software quality to match, and thus leads to hard-to-track issues that eat up consulting labor time, cause prolonged downtime, or both.
There's no need to overspend on the highest end switches out there, either. If budgets allow, going with Meraki's cloud managed switches is ideal to keep control and oversight centralized, but we routinely opt for lower cost Cisco Small Business or NETGEAR ProSafe products. Both alternatives come with lifetime warranties on the equipment just like Meraki's devices, minus the rock-solid 24/7 telephone engineering support.
Lastly, while switches that lack PoE are attractive for their low price tags, the pains this introduces when it comes to powering WAPs aren't worth the small cost savings. Remember, if you skip using PoE to push power over data lines to WAPs, you will need to account for power outlets near every single WAP being deployed. That gets expensive fast.
Add in the headaches from potential AC adapters getting unplugged from outlets, or having issues/dying, and you can see why this is a flawed approach and not recommended. The same advice goes for deploying VoIP phones, which I covered at length in a separate best practices guide.
When it comes to switches, stick to full gigabit for all ports, and get units that have as many ports pushing PoE as possible. You'll be glad you did.
9. Get Access Points With Bandwidth Limit Controls
This is another area where consumer-grade routers and access points just don't hold their own. Bandwidth logic, or bandwidth limiting as it may be called, is the capacity for access points to play traffic cop against hungry client devices. Seeing as most companies cannot afford or don't have access to Google Fiber-class WAN connections yet, we need to ensure clients are playing fairly on the wire.
The thinking behind this feature is to cap the amount of upstream/downstream bandwidth that any single client can suck up. Without controls -- which is what most consumer routers are stuck with -- your internet connection can quickly get drowned by just a handful of hogs downloading large files or using torrent services.
Meraki's WAPs and firewalls have native bandwidth limit controls, as shown above. Rate limiting on individual SSIDs, or across your whole network, is made dirt easy. For high density client environments, this is a downright requirement.
On almost every deployment we do with Meraki access points, we enable Bandwidth Limits with asymmetric caps on download/upload, reflective of the lopsided nature of most asymmetrical WAN pipes these days. For most regular offices with decent pipes, I place caps on guest Wi-Fi at about 500Kbps down / 150Kbps up, with slight adjustments per situation. We can make the caps for staff Wi-Fi much larger, or remove them altogether if necessary.
In high density Wi-Fi deployments, like a banquet/conference hall we just deployed recently, we bring this cap down lower into the 300Kbps down / 90Kbps up range.
If a Meraki firewall is deployed in the environment as well, we can extend these bandwidth caps to all wired devices too. This is a fantastic, easy to manage feature which allows us to eke the most out of smaller WAN pipes instead of purchasing larger WAN lines, which some owners mistakenly believe is the only way out of bandwidth-starved situations.
8. A Word on DHCP Scopes and Lease Times
When we plan high density Wi-Fi deployments at FireLogic, two things are always kept in mind: the number of DHCP scopes we will be advertising, and the lease times for client devices. Standard SOHO routers offer single scopes in the /24 range, which equates to 254 client addresses. After you trim out the addresses lost to static devices, staff computers, and other special needs, you can end up with very little left to dole out for guest devices, especially when you are hosting a lot of them.
What approach you take with higher density deployments depends on the equipment you have to work with. If you're sticking with Meraki WAPs, then you can easily enable SSIDs to use Meraki's NAT DHCP service, which opens up a subnet in the 10.0.0.0/8 scope and offers a practically inexhaustible amount of address space for clients. This also means you don't have to manage your own DHCP server for large loads or worry about associated downtime if things go awry.
But you don't always have situations that allow for Meraki's nice gear. When we set up a banquet hall a few months back that wanted to host upwards of 500-700 devices at any given time during business conferences, budgetary constraints on the client's end forced us to use standalone EnGenius access points, which meant we had to manually configure and push unique DHCP scopes for various VLANs off a Meraki MX100 firewall.
The situation worked out in the end, as we created half a dozen dedicated DHCP scope ranges in the /24 subnet range for pre-planned sets of APs to use. But it required some decent up front planning to determine which WAPs would run off which VLAN/subnet combos, to ensure that we didn't run out of addresses. Ideally, though, I'd rather not play with this, as guest Wi-Fi works great off Meraki NAT DHCP in other setups, including our own main office.
One last thing to keep in mind if you are manually planning VLAN/subnet separation between WAPs: lease time should be carefully considered. In a traditional /24 scope with 254 addresses, a standard 24-hour (or week-long) lease may bring a conference or other high density Wi-Fi situation to its knees unexpectedly; especially where clients are moving large numbers of devices between conference hall rooms constantly throughout the course of a day.
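To see how fast lease times can sink a scope, it helps to run the worst-case numbers. A rough Python sketch -- the device-churn figures are hypothetical, and it assumes every arriving device is new to the network:

import ipaddress

def scope_survives(subnet: str, reserved: int, new_devices_per_hour: int,
                   lease_hours: float, event_hours: float = 8.0) -> bool:
    """Worst case: do outstanding leases ever exceed the usable pool?"""
    # Subtract network/broadcast addresses plus static reservations
    usable = ipaddress.ip_network(subnet).num_addresses - 2 - reserved
    # Until a lease expires, every new device holds an address
    peak_outstanding = new_devices_per_hour * min(lease_hours, event_hours)
    return peak_outstanding <= usable

# A /24 with 30 statics and 60 new conference devices arriving per hour:
print(scope_survives("192.168.1.0/24", 30, 60, lease_hours=24))  # False -- dies mid-event
print(scope_survives("192.168.1.0/24", 30, 60, lease_hours=1))   # True -- short leases keep up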
7. Use Band Steering To Push Devices to 5GHz
The 2.4GHz spectrum of Wi-Fi is drowning in problems these days. The biggest issue comes down to how many competing access points are fighting for the same limited, non-overlapping channel airspace. In the late 1990s, when Wi-Fi was just becoming a reality, no one thought airspace would become so contested and oversaturated.
But we're at the point where 2.4GHz is so congested that it should be treated as a last resort for legacy devices -- with newer, more capable ones riding on 5GHz lanes when and where possible.
5GHz brings along the nicety of offering up 23 non-overlapping channels, which means interference from competing access points is far less of a problem. It also means that we can pack more devices into the same airspace compared to 2.4GHz.
To ensure we are providing the best possible experience for devices, Band Steering is something we deploy on any WAPs that support it. For Meraki's MR series of WAPs, it's an easy switch to flip on a per-SSID basis. Other devices also carry this feature. In high density deployments like conference halls or hotels, band steering should be a clear requirement for your new equipment.
Otherwise, leaving the decision up to non-intelligent devices will cause quick overuse of the 2.4GHz spectrum and in turn, lead to less than optimal results for all client devices.
6. What Kind of WAN Pipe Do You Need? An Easy Way to Calculate
I've written about bandwidth estimating before, and have usually relegated the practice to part science, part guessing game. Luckily, there is now an easier way to estimate your bandwidth needs for a WAN pipe, especially if you are enforcing bandwidth limits on your SSIDs.
The calculation you can use to estimate your backhaul (or WAN) needs is fairly simple in this scenario: multiply your per-client caps by the maximum number of concurrent clients you expect to serve.
So, in my 500Kbps down / 150Kbps up per-client limit example above, if I wanted to allow for 25 devices on my network, I'd need a pipe the size of: 25 x 500Kbps = 12.5Mbps down, and 25 x 150Kbps = 3.75Mbps up.
And that's it. Keep in mind these estimations assume that every client needs to use 100 percent of its given piece of the pie at all times. That is clearly not how situations play out in the real world. The numbers are meant as guidelines for sizing your WAN pipe needs, NOT exact figures in any way.
Depending on your scenario, Wi-Fi clients may need less bandwidth depending on the time of day, type of device being used, and type of application being utilized over the pipe. From there, you can fine-tune your WAN sizing.
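For those who want it in script form, the same worst-case arithmetic takes just a few lines of Python:

def wan_pipe_needed(clients: int, down_kbps: int, up_kbps: int) -> tuple[float, float]:
    """Worst-case WAN sizing: every client maxing its per-client cap at once."""
    return clients * down_kbps / 1000, clients * up_kbps / 1000  # in Mbps

down, up = wan_pipe_needed(25, down_kbps=500, up_kbps=150)
print(f"{down:.1f}Mbps down / {up:.2f}Mbps up")  # 12.5Mbps down / 3.75Mbps up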
Most new offices we are setting up these days are getting dual cable-coax business lines, or a hybrid of coax/fiber, coax/metro ethernet, or another viable combination. Keep in mind that firewalls like Meraki's MX-series units can perform load balancing, which effectively offers up bandwidth from two separate pipes to endpoint devices. Don't overspend on a single fat pipe if you can get away with more affordable dual pipes.
5. Use Common SSIDs Across WAPs with Roaming Enabled
Too many poor deployments we've come across have had every kind of SSID configuration under the sun. The worst offenders are the ones where each WAP is broadcasting its own unique SSID. And if dual band 2.4/5GHz is being used, they go a step further and separate the SSIDs per band, per staff/guest network, leading to potentially 4 or more SSIDs coming off each WAP.
This is obnoxiously difficult to manage, and creates a nightmare for devices that need to roam a location with multiple WAPs.
Avoid that colossal mess. Modern business class access points are built to allow for seamless roaming between WAPs on the same SSID, as long as coverage zones are slightly overlapping without dead spots. In homes and offices where we deploy Meraki, testing has shown that roaming across a single SSID between WAPs works extremely well as long as channel mapping is set to auto or is properly manually mapped.
Best practice for a multi-WAP office is to always use matching sets of universal SSIDs across WAPs. This ensures easy roaming for clients, and reduces the number of WPA2 keys people have to keep track of. Many enterprise class WAPs, like Meraki and Unifi units, allow for seamless roaming as well. (Image Source: No Strings Attached Show)
Best practice for business Wi-Fi these days is to use a single private SSID and a single guest SSID, and have these matching SSIDs cover both the 2.4 and 5GHz spectrums alike. Band Steering then comes into play, allowing clients that can work on 5GHz to move to that spectrum, with legacy devices sticking to 2.4GHz as needed.
4. Always Use Separate Staff, Guest Wi-Fi SSIDs
Any business class WAP worth its salt allows for the creation of a private, staff SSID where clients can access internal network resources, and a secondary, guest-only Wi-Fi SSID that maps clients to a different VLAN or standalone DHCP scope, like Meraki's included NAT DHCP service. The added benefit of the latter feature is that isolation is enabled by default, which means potentially nefarious guests cannot interact with other devices on the same subnet even if they wanted to.
Offices covered by HIPAA regulations should note that using a single Wi-Fi network for guests and staff is a BIG, obvious violation, and will easily fail risk assessments and any subsequent HHS audit until staff/guest Wi-Fi separation is enabled. This is because HIPAA rules place "reasonable expectations" on offices to keep devices storing/accessing PHI separated from outside guests.
I went into some more detail on what HIPAA covered entities need to worry about in a separate article.
3. You Don't Need a Hardware Controller-Based Solution
There is a lot of FUD being passed along by hardware vendors who have a natural vested interest in peddling their expensive hardware controllers. Ruckus, Cisco (first-party, non-Meraki), and Meru, to name just a few, are all guilty of this. Be advised that we've tested most of the hardware controller-based solutions out there, along with many software/cloud controller-based products.
Hint hint: the days of needing a hardware controller for awesome Wi-Fi are gone.
The good news? Software or cloud controllers work extremely well today -- so well, that we don't go with hardware controller-based solutions any longer. The hardware/licensing costs for the controllers alone, plus the added installation costs, are not worth it in my eyes. They don't provide any value-add over that of similar cloud-based controller solutions; don't perform any better; and are just one more item that can break down on a client's network.
We started off with, and still deploy on a limited basis, Unifi access points. This is the low cost leader when it comes to a software controller, but there are two main limitations that are not apparent up front. First, Unifi access points rely on a software controller that must be loaded and running on a server of some sort. While it doesn't need much horsepower, the Unifi software is notoriously plagued by its reliance on Java; as such, we've also seen configurations get corrupted between Java updates or Unifi controller upgrades.
Secondly, if you are planning on managing multiple sites with a single Unifi controller, the platform still has a tough time managing WAPs in different subnets. The functionality is advertised as being fully supported, but in reality, it is buggy at best, with access points on non-native subnets getting lost for periods at a time.
Hardware controllers introduce increased up-front product costs, long term licensing costs, labor costs, and one more link in the chain that can cause downtime. Going with a software, or cloud-based, controller system brings the flexibility of being able to stay up to date at all times, and manage your system from anywhere. Unifi APs come with a free software controller, and our favorite Meraki provides a cloud-based controller.
Our company prefers the 100 percent cloud based solution which Meraki offers. We can control all access points across a given client's sites from a single web-based dashboard, and likewise, we can access all our clients' Meraki networks from a simple dropdown menu on our MSP dashboard. It's fast, simple, and always up to date. We don't need to maintain a server of our own, either, and clients also get access to the dashboard if they want to be hands on.
Whatever path you decide to go down, be sure to do your homework. While there is nothing wrong with a hardware controller solution, the days when one was required to gain the bells and whistles of enterprise Wi-Fi are long gone. Software controlled (or cloud based) solutions like Meraki or Unifi are just as good, and in many cases better, than the more expensive hardware-based controller routes, even for the biggest rollouts.
2. Channels Matter: Use Auto-RF Assigning WAPs, or Do it By Hand
The crazy channel usage I've seen in the wild still astonishes me. Supposed "corporate" Wi-Fi networks have APs set to channels outside the 1/6/11 range, or overlapping channels in close proximity. Don't fall victim to Wi-Fi hell by making the same mistakes others have.
One common guideline applies to any Wi-Fi rollout, small or large: Auto RF assignment is king these days, as Wi-Fi becomes ever more prevalent. While we do our best to plan out channel assignment during installation, you have zero control over what your neighbors do over the long haul. Having access points that can rework their channels on the fly is key to sustaining an optimal Wi-Fi environment for your organization.
A solid Wi-Fi network is only as good as its channel map. Either leave the heavy lifting to automated logic, like Meraki's Auto RF feature on their MR WAPs, or do it yourself. While the 5GHz spectrum has plenty of non-overlapping channels to choose from, the 2.4GHz spectrum is rather landlocked with only channels 1, 6, 11 being considered kosher for usage. The above sample WAP layout shows how an office should be configured with overlapping WAPs in both spectrums. (Image Source: Cisco)
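If you're mapping channels by hand, even a tiny script can catch the classic mistakes. A minimal Python sketch of a 2.4GHz channel audit -- the WAP names and assignments are hypothetical:

VALID_24GHZ = {1, 6, 11}  # the only non-overlapping 20MHz channels in 2.4GHz

def audit_channels(plan: dict[str, int]) -> None:
    """Flag channels outside 1/6/11, plus duplicate assignments worth eyeballing."""
    seen: dict[int, str] = {}
    for wap, channel in plan.items():
        if channel not in VALID_24GHZ:
            print(f"{wap}: channel {channel} overlaps its neighbors -- move to 1, 6, or 11")
        elif channel in seen:
            print(f"{wap} and {seen[channel]} share channel {channel} -- fine only if far apart")
        seen[channel] = wap

audit_channels({"Lobby": 3, "Conference": 6, "Warehouse": 6})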
1. Great Wi-Fi Doesn't Have to Break the Bank
While you don't have to go broke on the most expensive wireless gear to achieve awesome Wi-Fi, it's also not recommended to buy the cheapest gear on the block. I've been installing and troubleshooting Wi-Fi in some capacity for the greater part of the last decade, and have put my hands on most of the common gear available on the market.
I will spare many of the "hot air" pushers on the Wi-Fi market some shame and not point them out by name. Instead, I'd rather recommend the equipment we have consistently seen work well in customer scenarios: Meraki's MR line, Ubiquiti's Unifi units, and EnGenius for tighter budgets.
What's the pattern among the WAP makers just recommended? None of them require a dedicated hardware controller. That's right -- we consider those days long over. We've helped on large corporate installs, namely with Meraki gear, and the lack of a hardware controller has never been a detriment for us.
Don't be afraid to evaluate gear before installation. Various vendors are willing to provide loaner or demo gear from the likes of Unifi or EnGenius, and Meraki offers weekly webinars where they hand out a free access point just for attending -- it's honestly the way we got hooked on their awesome gear.
Above all, keep this in mind: paying a little more up front for quality Wi-Fi gear usually results in fewer management headaches, more uptime, and much happier end users over the long haul. After literally over a hundred business and residential Wi-Fi installations, I can safely say that we have by far had the least trouble with well built business class Wi-Fi equipment.
Some clients that had us roll out Wi-Fi on the dirt cheap were frustrated when we had to come and overhaul their installs just a few years out because the implementations didn't meet their longer term needs. No Wi-Fi install will last an eternity; but you can help keep upgrade cycles in the 6-8 year range by planning out for the right gear up front.
Further Reading
A lot of the best practices I referred to are cross-referenced in other good reads for those looking to deploy rock-solid new Wi-Fi networks. The bigger the client load, and greater the distance your deployment needs to span, the more important it is to ensure you are properly planning your rollout.
Meraki has an excellent whitepaper which goes over many of the items you should be planning to implement, especially in high density deployments. Aerohive, a Meraki competitor in the cloud-managed Wi-Fi space, has a similar whitepaper that goes over some related concepts.
A great infographic-style guide was published by Ekahau which goes over a lot of great topics, including some good coverage of site survey planning.
And finally, Aruba published a lengthy whitepaper with some further considerations around highly reliable business Wi-Fi.
Photo credit: Shutter_M / Shutterstock
Too often, cloud hosted VoIP gets a bad rap on the internet. People bash provider A because call quality stunk, or give provider B a tough time because staff constantly had "fishbowl effect" issues with the service. I've read many of the reviews out there, and I'm here to set the record straight about cloud VoIP: the majority of these negative reviews are pointing fingers in the wrong direction.
Much of the cloud hosted VoIP negativity people see online is FUD -- partially peddled by customers with poor networks, and partially by some nefarious traditional premises-based VoIP telco providers trying to stem the wave of customers moving to the technology.
I'll be the first to admit that when done right, cloud hosted VoIP (or cloud PBX, as some refer to it) is a wonderful option for modern business telephone service. It's extremely cost effective; can be used nearly anywhere if needed; and requires none of the hardware investment that traditional on-prem systems necessitated in the past. But what exactly constitutes done right?
As a managed service provider (MSP), I've been responsible for the proper implementation of dozens of successful cloud hosted VoIP installations. And I've also had to render just as many SOS cleanups for those who jumped in with both feet without doing their homework. Furthermore, some clients that decided to take the VoIP plunge without consulting us or adhering to best practices ended up ditching their systems and pinning their misdirected blame on those "terrible providers", as they call them. Hence the net is full of so many untruthful reviews of otherwise good and honest providers.
If there's one thing I've learned about cloud PBX over the years, it comes down to one simple theory:
On Cloud Hosted VoIP, You Are Only As Good as Your Weakest Link
That's it. Short and sweet as a concept, but the real meaning here differs by client situation. Therein lies the devil that sits within the details. Your weakest link may be far different from someone else's weakest link. What shapes do weak links take when discussing cloud hosted VoIP?
I need look no further than real client situations for examples here. It could be a single terrible DSL WAN link from an internet provider. It could be cruddy Cat5 wiring within the walls, without a proper patch panel system at the rack backbone. It could very well be a terrible SMB router picked up from the local office supply store for under $100. You name it, we've seen it.
A clean network backbone for VoIP follows industry best practices, which includes discrete Cat6 wiring going into quality patch panels, which is then interconnected into business class switch equipment. A messy/incomplete network closet should be the first order of business before ordering VoIP service. (Image Source: L-Com.com)
Before you consider a transition to cloud hosted VoIP for your company's telephone needs, here are some tried and true best practices I've gleaned from years of handling this in the wild. We don't need any more disingenuous reviews of cloud VoIP providers that are getting an unjust bad rap due to circumstances beyond their control.
11. Questions You Should Be Asking Any Prospective VoIP Provider
Too many times, we are pulling people out of bad provider situations that could have been avoided with proper consultation from a neutral VoIP expert -- or with due diligence as simple as pegging a prospective VoIP provider with questions that put its feet to the fire.
Before signing on the dotted line, build the list of items you NEED to get answered.
Be sure to get input from key stakeholders in your organization BEFORE calling a sales rep, as you will likely build a question list with items that are custom to your needs. Fact finding up front is critical to avoid wasting time on providers who can't match your requirements list.
10. HIPAA Certainly Does Apply to Cloud VoIP
Are you a Covered Entity (CE) or Business Associate (BA) regulated by HIPAA's iron grip? Then the question of HIPAA compliance should be one of the first items you peg a VoIP provider with. Many VoIP providers love to wiggle around this big issue by stating that they are covered under the "conduit exception" in HIPAA regulations. Unless they are offering you service purely in the form of a simple analog telephone line for calling/faxing, they likely shouldn't be leaning on the "conduit exception" argument.
And as such, this is an area with a lot of hot air around it. I'll be quite honest here and say it like it is: the only cloud hosted VoIP provider I know of which is 100% guaranteed HIPAA compliant is 8x8. I'm not trying to do a cheap plug here -- these guys will sign a business associate agreement (BAA) on the spot for any client who needs one, and they have numerous pages online outlining their strict adherence to HIPAA compliance regulations.
HIPAA compliance is an all-encompassing methodology relating not only to the security of the technologies used on the provider's backbone, but also to how calls are fully encrypted; how data like voicemails and faxes is stored; and the policies and auditing the provider employs to uphold its compliance stance.
A cloud VoIP provider sidestepping the topic by claiming "conduit exception" isn't serious about gaining your business, and you should walk away.
VoIP provider 8x8 has an excellent blog post about why holding your VoIP carrier accountable for HIPAA compliance is super important.
9. Don't Go With the Cheapest Provider Around
As if this one needs restating. But too many clients shop purely on price point. If you are one of those driven toward cloud hosted VoIP for cost alone, you are likely sacrificing other important facets like stability/uptime or 24/7 support.
In fact, many of the "low cost" providers out there don't even have telephone support and rely on email based support or worse, forums. An example of a very low cost provider that advertises extremely cheap pricing, but skips on phone support, is CallCentric. Don't let cost alone override other considerations for a quality VoIP provider.
8. Don't Let Vendors Push On-Prem PBX Boxes as Silver Bullets
I see this a lot in our industry. Many vendors love to push traditional on-premise PBX units, like Avaya IP Office systems or similar, as these come with built-in hooks -- service contracts, maintenance agreements, and the like -- which guarantee their ability to keep a foot in the door. There are numerous reasons I dislike this approach for phone service these days, chiefly that vendor lock-in, the up-front hardware investment, and the lack of the go-anywhere flexibility cloud service provides.
There are other specifics I could get into, but those three are the main items of interest I use as pushback when clients talk to PBX-centric vendors. Unless you have a compelling business reason to go the PBX route, cloud hosted VoIP is (usually) your better option.
7. Know Your VoIP Quality Metrics: Bandwidth, Latency, Loss, Jitter
Too many clients on the hunt for VoIP are stuck in a tunnel-vision discussion about their internet needs centered on a single metric: bandwidth. While important, this metric tells only one part of the story. Simply put, you can have a pipe that is quite fat (like a 40Mbps connection from your cell provider), but that doesn't mean it's necessarily fast. And this is where many well-intentioned people get burned with VoIP.
Here are the key metrics for internet and VoIP quality which you need to know. Bandwidth is the size of the pipe in each direction (download/upload). Latency is how long a packet takes to make the round trip, usually measured in milliseconds. Loss is the percentage of packets that never arrive at the other end. Jitter is the variation in latency from packet to packet, which is particularly brutal on voice quality.
There is an excellent primer on some of these terms with deeper explanation available on DSLReports.com.
And if you are interested in testing your WAN connection quality before getting VoIP, RingCentral has a great tool available for putting your line(s) to the test. They also offer a capacity test tool to see how many VoIP connections your current setup can handle at various quality levels.
Bandwidth, which is the metric most ISPs sell their services upon, tells only part of the VoIP story. The term relates to how fat of a pipeline you are being given in two directions (upload/download) for your needs. But having a fat pipe doesn't always equate to a fast pipe. (Image Source: Coolnerds.com)
After reading the above definitions, your goal should be clear: stay away from cellular/satellite ISP providers and opt for cable/fiber service where possible.
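To make the jitter metric concrete: it can be approximated as the average change between consecutive ping round-trip times. A small Python sketch, with RTT samples invented for illustration:

def mean_jitter(rtts_ms: list[float]) -> float:
    """Jitter approximated as the average absolute delta between consecutive RTTs."""
    deltas = [abs(b - a) for a, b in zip(rtts_ms, rtts_ms[1:])]
    return sum(deltas) / len(deltas)

# Two lines with the same ~30ms average latency -- only one is VoIP-friendly:
print(mean_jitter([29, 31, 30, 30, 29, 31]))  # ~1.2ms: calls will sound clean
print(mean_jitter([10, 55, 12, 60, 8, 35]))   # ~43ms: expect robotic, choppy audio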
6. Wired Always Beats Wi-Fi, But If You Must...
The highest quality VoIP backbones are always built on solid, Cat6-wired networks from the core routing/switching equipment out to the actual endpoints (desk phones, soft phones, etc). This is because wired connectivity has a level of assured delivery and quality guarantees that we just can't achieve with Wi-Fi-based VoIP yet.
I know situations will arise where it's not possible, or financially feasible, to get wired connectivity back to all desk phones. Or some workers have no choice but to use Wi-Fi off devices out in a warehouse or similar setting. If you've got to go off Wi-Fi, you can still make the best of a bad situation.
As I said, wired is king... but Wi-Fi can be made to work decently well with the right equipment -- business class WAPs with solid coverage and no dead spots.
5. VLANs Have NOTHING to do with VoIP Quality
There is a long held belief on the internet, coming from somewhere in the trenches of geekdom, that VLANs are some saving grace and panacea when it comes to VoIP performance. It's as if any mention of VoIP issues always circles back to the notion of "just use VLANs." This is absolutely WRONG, and couldn't be more short-sighted or off base.
VLANs are strictly a SECURITY related function of network design. Many people mis-associate the usage of separate subnets for data and voice networks with VLANing, but this is merely because many intelligent firewalls combine both aspects into one operation. The usage of separate subnets is the key part of any segmentation you want to build into your network -- NOT the VLANs.
You don't have to take my word for it. Discussions like this and this online do a great job at having other experts explain the nuances of why VLANs are ineffective for performance improvement or QoS needs.
The usage of strict VLANs unnecessarily complicates your network design with zero added benefit. Only if your DHCP backbone forces you into a combined VLAN/subnetting segmentation mentality should you be using VLANs (as do the Meraki MX series firewalls we usually employ in VoIP scenarios).
4. It's UPLOAD, Not DOWNLOAD Speeds, That Kill Cloud VoIP
Most internet providers offer service in an asymmetrical manner these days, aside from dedicated pipes in the form of expensive fiber or metro ethernet. While these asymmetrical lines are very cost effective most of the time, they are dogged by the reality that download speeds are almost always FAR greater than upload speeds.
So take a typical Comcast coax cable line, for example. Comcast offers various speed levels, such as 16 down / 3 up, 25 down / 10 up, and 50 down / 10 up price points, to name a few. As you can see, upload speeds are never as generous as download speeds.
And this is exactly why I raise the point. Your download speed is unlikely to be a factor in poor VoIP performance, as download bandwidth from the big players is usually very generous even on low-end plans.
WAN pipes like DSL lines, which we are sadly stuck with as a backup pipe at our own office, are pegged at nominal speeds like 6 down / 1.5 up. If you are sharing a single WAN line for both regular computer/server data needs and your VoIP connections, then you are at very high risk for VoIP quality issues on such small lines.
The logic here is pretty easy to remember: be more watchful of the UPLOAD you are getting from an ISP; download speeds will rarely if ever come to be an issue.
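Here's a rough way to see why upload is the choke point. Assuming a ballpark figure of ~100Kbps per call (G.711 voice plus packet overhead) and reserving a share of the pipe for normal data traffic -- both assumptions of mine, not provider specs -- a small Python sketch:

def max_concurrent_calls(upload_kbps: int, per_call_kbps: int = 100,
                         voice_share: float = 0.7) -> int:
    """Concurrent calls an upload pipe can carry, leaving 30% headroom for data."""
    return int(upload_kbps * voice_share // per_call_kbps)

print(max_concurrent_calls(1500))   # the 6/1.5 DSL line: ~10 calls, best case
print(max_concurrent_calls(10000))  # a 50/10 coax line: ~70 calls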
3. Go With Cat6 Wiring When Possible, And Always Use PoE Switches
Core wiring infrastructure is critical to great VoIP performance. As I said earlier, you're only as good as your weakest link. In many situations, customers are fighting with old wiring and failing switches, and calling me curious as to why their VoIP experience is sub optimal.
Without getting too technical, VoIP is a VERY delicate transmission on your network, and traffic degradation or fluctuation due to poor wiring or cheap switches/routers shows very quickly. In some cases, we have had to come in and re-wire a customer network with fresh Cat6 and brand new gigabit switches to solve the problem. Many times, we catch this up front, before a client is even allowed to make a move to VoIP, so as to preemptively avoid a bad experience.
If a wiring contractor is offering to install lesser Cat5 or Cat5e, just say no. The extra cost of going Cat6/Cat6a is very minimal these days, and will not only guarantee a cleaner experience on cloud VoIP, but will also future proof your network for 10-gigabit ethernet, which will likely be standard on devices in 6-10 years.
Going with PoE-enabled switches is a no brainer these days. It allows for the delivery of data and power over a single line (Cat6 usually) as opposed to forcing desk phones to use AC adapters plus a data line. Filtered power is guaranteed to all desk phones, and AC adapter hassles are completely avoided. We don't roll out VoIP without it these days. (Image Source: CableOrganizer.com)
Lastly, if you are replacing your wiring, it's usually a good time to address putting in new quality switches. Don't go with the cheap stuff here, but there's likewise no need to go top shelf either. We like to go with Meraki switches when possible, but due to their higher cost, great substitutes are Cisco Small Business SG200/SG300 series units, or NETGEAR's ProSafe business line. And don't skip on the PoE (power over ethernet)!
The reason why using PoE to power your endpoint phones is critical is twofold. First, it allows you to ensure you are passing clean, filtered power to all phones since the switches should have power supplied from a secure power source like a UPS. Second, you are reducing the points of failure on the phones since you don't have to worry about staff unplugging AC adapters, kicking them, damaging them, or using up AC ports below desks.
PoE is almost a necessity if going with VoIP these days. The number of problems we have seen on non-PoE deployments is not worth the headache.
One last word on switches: if financially feasible, get separate dedicated switches for your VoIP phones and for data connections to computers. This is especially key on large deployments, and we make it a requirement for most big VoIP rollouts we handle. The nice part is that you can apply separate management/QoS policies to the switch ports of a given side en masse, without worrying about affecting the other side of your network. Plus, you can easily use subnet segmentation controlled from the upstream port (in our case, usually a Meraki firewall) to hand out dedicated addressing for that side of the network.
This makes voice/data separation dirt easy and super manageable without much of the headache that traditional per-port VLAN mapping used to entail.
2. Two WAN Pipes Are Always Better Than One
Going with cloud hosted VoIP brings up the danger that we are potentially relying on the stability of a single pipeline back to the internet for our voice traffic. Pipes get cut, outages occur, and all of this is a common fact of life. Even dedicated circuits with supposed uptime policies/SLAs get hit -- and I've seen this countless times over the last few years.
Which is partially why I don't put all my eggs in one basket, even in the case of expensive fiber lines with lengthy SLA language. Going with dual WAN pipes, even from "best effort" carriers, is usually better than relying on the trusted word of a single carrier.
Our standard approach to dual WAN lines is to seek out two coax providers that offer great bang-for-the-buck service and, crucially, bring in service over two separate networks. NEVER settle for two distinct pipes from a single carrier -- like Comcast, for example, which we've seen done on some installs. If the provider as a whole is experiencing issues with its coax backbone in the area, both of your supposedly separate lines will be toast.
So double up with separate lines from different providers. Comcast as your primary and maybe RCN as your secondary, or AT&T U-verse as your primary and WOW cable as your secondary. Whatever combination you choose, just be sure to size your bandwidth needs properly, as mentioned earlier.
The likelihood of two carriers going down at the same time is super slim, and probably only going to happen in the face of a major disaster, like a tornado hitting a property. But at that point, you've got bigger issues on your hands than worrying about internet uptime.
The other bonus of dual pipes? As I discuss in further detail in my number 1 point below, dual pipes can be used to provide load balancing for your entire network. Provided you are employing a smart firewall/router that can apply such logic, traffic load can be spread across two lines simultaneously to take advantage of the additional bandwidth afforded by a second pipe.
In the case of a single line going down, your entire network then goes into a single-WAN operation until the other pipe comes back up. Then, you can go back to leveraging both connections at once. A crafty, neat feature which is well worth investigating. We use it at our own office on a Meraki MX60, and at many client sites, and it rocks.
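For the curious, the failover logic a dual-WAN unit applies boils down to a simple health-check loop. Here's a rough Python sketch of the idea -- the gateway addresses are hypothetical, the ping flags are Linux-style, and a real appliance like the MX60 does all of this natively in firmware:

    import subprocess
    import time

    # Hypothetical carrier gateways for the two pipes.
    WAN_LINKS = {"primary": "203.0.113.1", "secondary": "198.51.100.1"}

    def link_is_up(gateway: str) -> bool:
        """One ping with a 2-second timeout; any reply counts as 'up'."""
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", gateway],
            stdout=subprocess.DEVNULL,
        )
        return result.returncode == 0

    while True:
        alive = [name for name, gw in WAN_LINKS.items() if link_is_up(gw)]
        if len(alive) == 2:
            print("Both pipes healthy -- balancing traffic across both.")
        elif alive:
            print(f"Only {alive[0]} is up -- single-WAN mode until the other returns.")
        else:
            print("Both pipes down -- bigger problems than uptime now.")
        time.sleep(30)  # re-check every half minute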
One word of caution: stay away from cellular/satellite service as a primary or secondary line. VoIP, as mentioned before, does NOT like the latency/jitter problems with these wireless carrier services, and your VoIP quality will suffer greatly because of it.
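If you're unsure whether a given line is VoIP-worthy, even a crude probe will tell you a lot. The following is a sketch only: the target host is a placeholder (probe your VoIP provider's edge instead), the output parsing assumes Linux-style ping formatting, and the 150 ms / 30 ms thresholds are just commonly cited rules of thumb, not hard limits:

    import re
    import statistics
    import subprocess

    TARGET = "8.8.8.8"  # placeholder -- test against your provider's edge

    # Fire 20 pings and scrape the round-trip times from the output.
    out = subprocess.run(
        ["ping", "-c", "20", TARGET], capture_output=True, text=True
    ).stdout
    rtts = [float(m) for m in re.findall(r"time=([\d.]+)", out)]

    avg = statistics.mean(rtts)
    # Jitter approximated as the mean gap between consecutive RTTs.
    jitter = statistics.mean(abs(a - b) for a, b in zip(rtts, rtts[1:]))

    print(f"avg latency {avg:.1f} ms, jitter {jitter:.1f} ms")
    # Rough rule of thumb: sustained latency much above 150 ms, or
    # jitter above ~30 ms, and call quality will audibly suffer.

Run this over a cellular or satellite link next to a decent coax line and the difference will speak for itself.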
1. Don't Skimp On the Firewall -- AKA the Brains of Your Network!
Not all routers/firewalls are created equal. Anyone who tries to tell you otherwise clearly doesn't have real world experience deploying high density, high stability VoIP environments. That $100 off-the-shelf unit from Best Buy or Staples likely won't do the trick. And yes, I know many cloud hosted VoIP companies advertise some of these cruddy models on their websites. But we've lost count of the number of times we've had to replace them with business class units.
Even searching for a "business class firewall" in general won't yield the right results without combing through what the innards of the devices actually offer. Here are the definite items you should require of any new firewall that will manage the backbone of a cloud hosted VoIP network:
While the above won't guarantee you don't get stuck with an incompatible or non-functional firewall when it comes to cloud VoIP, these are tried and true criteria we have come to rely on over the years.
In general, our company stands behind the reliability and stability that the Meraki MX-series firewalls in our field deployments have proven. We trust these units with our cloud hosted VoIP rollouts so much that we have settled on them almost exclusively for nearly every client situation. In fact, if you want to come onboard as a managed support customer with our company, we won't take you on without one of these boxes at your office.
The Meraki MX64W, shown above, is a firewall we have battle tested time and time again with excellent results handling cloud hosted VoIP. We've tried Sonicwalls, Cisco ASAs, Netgear units, and everything in between, and keep coming back to Meraki's MX line of devices. They just work. What more could we ask for?
Many people have had pretty decent luck with Sonicwall, but after a messy support experience during a stressful deployment on a TZ-series router last year, we are steering clear of the devices now. I know full well it's a situation that could be isolated to us, but we have little reason to move away from the rock-solid nature of Meraki's boxes.
Regardless of which direction you go with your router, just remember: DO NOT SKIMP HERE! A cheap firewall usually yields cheap results.
Cloud Hosted VoIP Works Well When Planned For Accordingly
Like with any new technology change at your business, proper planning is key and downright critical in many ways. Jumping the gun and ordering service from the likes of a RingCentral or CallTower Hosted Skype for Business offering without due diligence is a recipe for disaster.
And trust me, we have seen it countless times out in the wild. Just like you can't push the best race car in the world to its peak performance if the road beneath it stinks, cloud hosted VoIP is very much a similar beast.
It loves low latency. It expects redundant, dual WAN pipes. It excels when delivered over quality Cat6 lines and matching switch equipment. And it downright thrives when managed with intelligent, VoIP-aware firewalls such as Meraki MX-series boxes.
Is there a cookie cutter approach to delivering the sole "answer" for quality cloud hosted VoIP? Absolutely not. Each office and scenario will require its own dedicated plan for filling in the blanks.
Hopefully the above best practices can give you the tools necessary to make the right decisions when investing in cloud hosted VoIP. For most companies, they only have one solid chance to do VoIP right -- make it count on the first try.
Photo Credit: Gazlast/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
I owe Sony a lot of credit. It wasn't the original Xbox and the young Xbox Live service that gave me my first taste of console online gaming. It was my beloved PS2 that connected me to the early adopters skating around Tony Hawk Pro Skater 3, and wannabe commandos practicing flanks together in the original SOCOM.
With an add-on Ethernet adapter and LAN cable strung halfway around my house, the PS2 allowed me to engage in an activity that gamers on most modern consoles take for granted. Getting online with PS2 games was part test of patience and part geekdom experiment. PlayStation Network wouldn't launch for another four years or so, and one console later -- on the PS2, you were truly on your own when it came to navigating online gaming.
But I didn't care too much. The joy of just skating with others in THPS3 or partaking in intense team battles in SOCOM was something I had never done before, outside of split-screen match ups in a single room, fighting for tube TV real estate. Sony was rightfully dipping its toes into console online gaming, however haphazardly, after seeing that SegaNet couldn't keep the ill-fated Dreamcast afloat, as novel as it was. Xbox Live it was not, but pains aside, it sure was a blast.
Getting online with Tony Hawk Pro Skater 3 (above) on the PS2 wasn't easy by today's standards. But for 2001-era online console gaming, it was a novel experience worth the trouble. PS2 introduced me to online gaming on consoles, but Xbox Live took the concept and has trounced Sony ever since. (Image Source: GiantBomb)
So fast forward to 2015, and I can finally say I've leapt into the modern console generation faster than at any point in previous console wars. I went Xbox One, and I don't have any qualms about my decision. No regrets, either. While many around me decided that PS4 was their console of choice, I just couldn't justify going back to Sony after such a great 7+ years with my Xbox 360.
From the known quantity and gold standard for online gaming, Xbox Live, to exclusives I just care more about (namely Halo), PS4 didn't get me excited in enough areas to pull me back into Sony land. Before the online trolls pounce on me for picking a camp, let me run through the justifications for sticking with Xbox this go around.
9. Price
Sony may have had the lead over Xbox One at launch and for a decent time after with the PS4, but the tables have effectively turned. Microsoft has ditched the requirement for Kinect to come with every Xbox One, and as such, pricing is more aggressive than ever now. While PS4 bundles with games like Arkham Knight or The Last of Us are retailing for $400 USD, Microsoft offers Assassin's Creed Unity and Master Chief Collection bundles for a nice $50 USD cheaper.
And for gamers who want the fatter 1TB edition Xbox One (w/ Halo MCC), that edition takes the same $400 price space that PS4 occupies with a 500GB console and a single game.
As if the above deals weren't sweet enough, Microsoft is even offering a SECOND free game with a new Xbox One at no extra cost through June 27. That's two free games, once all is said and done, with a 500GB console for $350 USD before tax. Generous is one way you can put it.
Pound for pound, Microsoft has better value on the table across the board against the PS4. While I got my Xbox One via extra special pricing through some coupon and credit action I had accumulated, Microsoft's console is still the better deal for the average shopper.
8. External Storage Capability
While the PS4 allows you to upgrade the internal hard drive with a larger unit, it's a touchy process involving many of the things that make computer users cringe: backing up data, manually replacing the drive, reinstalling OS software, and so on. I personally don't care to mess around with that on my game console, even though I'm a professional geek by day.
With the Xbox One, I can pick up any off the shelf external USB 3.0 hard drive and plug it right in. That's it. The Xbox One handles the rest, and I've got a massive increase in storage space for game saves, game installs, and all the other files that quickly consume internal HDDs on consoles.
This is great because I can take advantage of the benefits of massive storage space without fretting about bricking my console with a botched hard drive upgrade step. Microsoft was smart in bringing this capability to Xbox One so early in the console's life.
7. Aggressive Console Firmware Update Cycle
While Sony has been trickling out new features for the PS4 via firmware updates, Microsoft has unleashed a torrent of updates on the competing Xbox One in the last year and a half. From functionality like Snap to refined TV watching down to interface updates, Xbox One has been gushing with improvements each month -- easily more than any release cycle seen on the Xbox 360, or even the original Xbox before that.
Some people may attribute that to an argument that Xbox One was not as refined or polished as the PS4 at launch, but it's a subjective statement at best. The fact that Microsoft (like Sony) is committed to bringing monthly improvements to its console is reassuring that it's not only listening to fans, but proving it's standing behind its product for the long haul.
If you're unconvinced by the massive update list the Xbox One already enjoys, just take a look at the lengthy changelog Microsoft hosts. One can argue it trounces the PS4 changelog in both size and breadth over a nearly identical timeframe.
And big changes are still on the horizon, with news and video of the upcoming Xbox One Fall Update 2015 just being released. Microsoft's tossing us a revamped home UI, Cortana integration, and very likely the inception of Universal Apps -- something unique to Xbox due to its unified Windows 10 backbone. I'll touch more on that benefit further down.
6. (I Think) One of the Best Controllers Around
I'm still torn on whether I prefer the Xbox 360 controller more, by a hair, but the Xbox One controller follows the same style and design choices that provided me so much joy on last generation's 360. After years on a PS2 controller in DualShock land, I can comfortably say that for my larger hands, the Xbox controller is a few notches more comfortable.
And it's not only comfort that does it for me. I prefer the offset thumbsticks opposing each other, with a D-pad separating the two. The current, and last few, DualShock controllers continue to feel way too... utilitarian, for lack of a better term. Sony's controller isn't bad, but it's stubby in the wrong ways, which makes long gaming sessions just not as easy on the hands as they are with the Xbox One controller.
(Image Source: TechRadar)
One thing I also never understood with the PS4 controller is the inclusion of the touchpad. It has no true outstanding real world application in a game yet, and much like these Reddit commenters, I'm convinced it's currently nothing more than a gimmick. Sony may prove me wrong, but almost two years in, the still-too-new-to-develop-for excuse won't hold for much longer.
The Xbox 360 controller was a huge update to the chicken sized original Xbox controller, which I never particularly adored. With Xbox One, Microsoft kept all the best aspects of the 360 iteration and added in a few new touches. But it's still one of the most comfortable controllers I have put my hands on.
It's not too surprising, then, that VR gaming yuppie Oculus Rift chose the Xbox One controller for its first major offering to consumers.
5. HDMI-In Port and Live TV DVR Likely Coming Soon
One year ago, I rallied against Xbox One for missing one major opportunity: live TV DVR capability. It's still the elephant in the room, for reasons we may never know, but I'd be quite shocked if Microsoft weren't preparing to roll this into a system update in the next half year or so.
The writing is all but on the wall by now. On top of the bevy of TV/OneGuide updates that have hit the Xbox One in the last year and a half, Microsoft finally brought live TV viewing to Xbox One for US buyers. While it doesn't feature any kind of TiVO style functionality, which is what I am dearly waiting for, it's not far fetched to believe this is the next logical step in crafting the Xbox One as the one-stop-shop media center to replace all lesser devices in your home theater.
And why wouldn't Microsoft bring us live TV DVR capability? It only makes sense -- especially since Microsoft has billed the Xbox One as a gaming+media hub since its very inception. Filling that final piece of the puzzle will give Microsoft the bragging rights it wants: to be the unified experience in your living room; the only electronic you will need for any and every entertainment medium from the comfort of a couch.
The new 1TB Xbox One only lends further credence to this notion, as the original 500GB console was getting too cramped for most people's liking. While 500GB was a tight fit, 1TB is quite nice, and the option for external storage to fill any media hog's needs is icing on the cake if TV DVR is truly on the way. All we need now is a proper CableCard tuner for Xbox One; you can read my thoughts on that aspect in a previous piece.
And even if we don't end up seeing native CableCard support on Xbox One soon, the native HDMI-In port on the rear of the console will still allow for streaming from a cable DVR in the interim. I actually stream my TiVo into my Xbox One in this manner right now, and it works for the most part. I haven't tied OneGuide into my setup yet, since I don't have Kinect and opted to buy IR blaster cabling to get the job done, but doing so will only make the experience that much cleaner.
A unified media and gaming experience under one device? I like that a lot, and Sony doesn't seem interested in covering the same media bases on the PS4. The Xbox One was a no brainer in this regard.
4. More Exclusive Games I Care About
I'd almost argue that the PS3 had a better exclusives lineup against the Xbox 360 than what the PS4 puts up against the Xbox One. Aside from Uncharted and Killzone, there's not much that excites me on the PS4 side. Even Sony half-admitted this when Sony CE CEO Andrew House referenced the company's tough time landing solid exclusives and its "sparse" first party lineup.
Xbox One has numerous exclusives, some of which I already purchased. Titanfall is an awesome titan action game I am just getting into. Sunset Overdrive is wildly addicting once you get into it (especially if you're a Jet Set Radio fan from the Xbox/Dreamcast days). And it goes without saying, as a Halo fanatic, that Halo 5 and Halo MCC are excellent additions to the series' legacy.
(Image Source: Xbox Wire)
Add in the fact that Xbox One is getting the Metal Gear Solid treatment with MGS5 and has access to numerous cross-platform titles like the Call of Duty series and most sports titles, and there's little reason I would consider a PS4 from a games perspective. Of course, this is completely subjective and relative to your taste in games, but for me, Sony has little concrete to offer over an Xbox One.
MGS was one of the few remaining things making me regret not getting a PS3 (due to MGS4), but I wouldn't be surprised if even that game got wrapped into a dual-pack with MGS5 at some point on Xbox One. Assassin's Creed, a series I've grown newly fond of (Black Flag got me hooked), is also solidly cross platform, with few signs of that changing.
And for one of my favorite PS2-era games, SOCOM, there's no evidence so far that it will be making any kind of return on the PS4. I was holding out for a revival, but it seems Sony has no appetite for bringing me back to my original online console gaming roots.
PlayStation's grip on the exclusives I truly can't live without is long gone. Since the 360, Xbox has held that crown for me.
3. Universal Apps
With Microsoft unifying the code base of all Windows devices under Windows 10 this summer/fall, Xbox One is one of the lesser-discussed beneficiaries. Apps on consoles have always been a love-hate affair for most, since the development effort for them is usually steep, and the approval policies for making it into a console's app selection are considerably tougher than on mobile or desktop devices.
That's about to change for the better. Reports are saying that the Universal App change for Xbox One will reap real-world rewards early on with "thousands" of new apps coming to Xbox One upon launch. While Microsoft will still treat Xbox One app approval with a stricter wand than its other platforms, the notion that developers can opt to create a single unified app and have the potential to serve Xbox One customers if they please is a pretty big deal.
Write once, deliver everywhere. At least that's the mantra Microsoft is offering up for developers to jump into the Universal App scene on Windows 10. A Windows 10 laptop app can run on tablets the same way it can, theoretically, run on Xbox One as well. This could make or break the Xbox One app scene in the long haul. (Image Source: VRWorld)
Serving Xbox One users won't be a discussion of "it's too much work" for developers; it will merely be another checkmark on their supported devices list in the Windows 10 ecosystem (given Microsoft's final approval, mind you). While this won't translate into things like Office 2016 being usable on Xbox anytime soon, we can realistically expect a plethora of new age media apps to hit: video/music streaming apps, news apps, gaming apps, and plenty of in-between stuff.
This also means that modern cross-platform experiences will be made possible, like game saves, for example, being consistent between Xbox One and a Windows 10 tablet. Or the excellent YouTube app, MetroTube, allowing you to start a video on your Windows 10 laptop, pause it, and finish watching from that exact spot on your Xbox One. The possibilities are limitless.
Either way, Universal Apps and the Windows 10 transition for Xbox One can only be positive for the console as a whole. A unified app store is something only Microsoft is bringing to fruition in such scope, and for Windows users, it's a win-win situation.
2. Xbox Live
Even though I got my first taste of online console gaming on the PS2, once I tasted Xbox Live on the original Xbox, there was no comparison. Consistent friends lists, messaging, and seamless access into online capabilities for any network-capable game on the console. That was a hard to beat proposition which I quickly grew to appreciate back in the original Xbox days.
PlayStation Network has had a decent track record, but it's a distant second to Xbox Live. Sony's debacle with how long its 2011 PSN outage lasted (23 days total) was embarrassing, and it makes me question the technical quality of the network backbone behind the service. Even TIME Magazine penned the million dollar question: is it time for gamers to abandon PSN over its meandering uptime issues?
Microsoft is no angel with Xbox Live, don't get me wrong. But given that Xbox Live got its start back in 2002 and PSN only entered the fray in 2006, Microsoft's overall track record has been quite a bit better over a longer timeframe, with no major month-long outages to its name. Never say never, but Xbox Live is a solid, stable gaming service that is well worth the $60/year (note: I get my XBL subscription cards online for only $45 or so each year; hunt around and you can find the same).
The tight integration with the Friends system, Party sessions, game chat, and the other bells and whistles that Xbox Live advertises makes it an easy sell for the gamer who loves playing with friends. While I consider myself a casual online gamer now, preferring random Halo sessions to drawn out nights with XBL friends, the platform works the way I would want it to. Even if PSN were completely on-par with Xbox Live, it would have to offer something compelling to draw me over.
That, at least for now, is something Sony doesn't offer -- a compelling reason to jump ship.
1. Xbox 360 Backwards Compatibility
Microsoft dropped a huge bombshell at E3 this year: Xbox One will finally be able to play Xbox 360 games. While the list starts small, at about 100 titles by this Christmas, it will swell as time goes on, just as it did when Xbox 360 gained support for original Xbox games. And that's a great thing for gamers who are heavily invested in 360 titles (like I am).
For the skeptics: yes, there are a few gotchas, and I'll be the first to shed light on them. First, publishers need to OK every single title that will be available via backwards compatibility. Second, Microsoft will need to "work its magic" to enable the functionality for any title the publishers approve. But even given those two caveats, we can still expect a large portion of 360 titles to make their way onto the Xbox One.
I can see some hesitation among some publishers, like Activision with yearly moneymakers such as Call of Duty, in bringing older releases over. Paul Thurrott insinuated as much on the Windows Weekly vidcast last week, and I agree with him on this sentiment to a degree. But even so, I'd rather get a sizable portion of the 360 library ported over to the Xbox One than not see any of it.
One of the biggest worries I had when looking at the Xbox One was: what the heck do I do with my sizable Xbox 360 game collection? Set the console aside and leave those games to collect dust alongside it? Sell them off and forget 'em entirely? There are some I still wish to replay, or go back and finish achievements on, and that finally seems to be becoming a reality on Xbox One. Hopefully, publishers will play nice and turn this functionality on en masse for their titles.
I can't see anything but goodwill coming back from gamers: it eases the decision to transition to an Xbox One without the fear of ditching an older game collection, and it offers the option to keep buying into the franchises they adored from last generation.
Early reports are already coming in that the feature, surprise-surprise, works just as advertised with few issues seen so far. I want to see how big blockbusters from 360 do, however, and not just games I don't care too much about.
Sony has something that seems to compete nearly directly with Microsoft's backwards compatibility feature: its paid PlayStation Now service. But it comes at a price -- not necessarily an expensive one, but not free like Microsoft's solution either. PS Now runs $20/month for month-to-month subscribers, or you can buy 3-month terms at $45 per term. I'll give Sony credit that its catalog is considerably larger than what Xbox One backwards compatibility currently offers, but that disparity will close as time goes on.
Any way you slice it, no one saw the feature coming at E3 this year, so the announcement truly hit it out of the park for hesitant 360 holdouts like myself. I'll be keeping that 360 game collection after all, thank you.
The Xbox One Value Proposition: Gaming + Media Nirvana
Unlike some that see a console as a pure-play device for mere gaming, in the year 2015, a console in the home theater plays a much bigger role. Unless you prefer using a Roku for Netflix, an Apple TV for music, a tablet with HDMI for YouTube, a gaming console for games, and other single-function devices, I'm fully of the mentality that unifying under a single device that can (almost) do it all is a rather neat value proposition.
Luckily, Xbox One replaces all of the above single-use devices, except for the cable DVR/TiVO. And as I mentioned above, I would not be shocked if that missing link was added into the Xbox One's arsenal in the near future. $350 for a console that can do what numerous devices previously were needed for in the past? I'm all ears.
Microsoft's got some work to do still. Xbox One is a great console that is much more valuable than a PS4 to me personally, but it's not perfect. That Xbox 360 backwards compatibility list needs to grow tenfold. Live TV DVR capability and CableCard support is still sorely lacking. And it's still not clear how much of a win Universal Apps will be for early adopters of the console.
But with that said, Xbox One still gets a lot right. A game selection that has the right mix of exclusives, genres, and cross-platform titles for a casual gamer like me. Solid media-center integration that is only on the up and up. And hands down the best gaming network to tie it all together into a seamless experience.
It's not too late for Sony to turn the tables on Microsoft. Until that happens, I'm darn happy I got an Xbox One.
Less than a week ago, young Dylann Roof, a self-described hardline racist, took nine wonderful lives in Charleston, SC. This was an individual who, through a since-unveiled manifesto, had shaped his worldview so narrowly that he irreversibly joined the corrupted fringes of society.
But what really killed nine innocent people here? As much as our President wishes to believe it so, a gun wasn't the real cause of Roof's killing spree; it was an accessory to murder. Nor was the Confederate flag; this was merely hijacked window dressing to glorify one man's twisted reality. With as much debate currently focusing on these non-issues, the real causes which bred a monster by all definitions are sadly diluted from discussion.
The real issue which few seem to be questioning today: What exactly happened to Roof's support system? While the news stories seem to be stitching a clearer picture together, it's already evident that Dylann Roof had little in the form of a stable family life. The few acquaintances which Roof actually had contact with were distant drinking buddies at best. He had no positive fallback in his spiraling life, and found resonance in some of the worst gutters of the internet stained in race hatred.
Recovered photos of Roof posted online show an individual lacking any soul or sense of humanity. It's almost as if the devil sucked the emotion out of Roof and left behind a shell whose sole purpose was to destroy life. This was not an overnight occurrence; more likely, it was the systematic devolution of an individual.
Support Systems of Hate Fill Roof's Void
Roof lacked any real set of friends, and his family situation was severely fractured by a divorce that took away Dylann's last semblance of familial stability. And as has been proven time and again, people in such circumstances usually find something to fill that void in their lives.
Inner city youth fall back on vicious gangs for their support structure. Disaffected youth in regions overwhelmed by religious fanaticism turn towards the likes of Islamic State or HAMAS. Dylann Roof chose the comfort of racist vestiges of the internet instead.
While the messages and belief systems of gangs and online hate groups differ in substance, they share much common ground in how they prey on impressionable victims. Both march to a never ending drum of victimhood. Both point fingers at incessant scapegoats as the justification for their die hard beliefs. And both see themselves in constant war with enemies both literal and symbolic.
The American flag is flown at numerous KKK and related hate group rallies every year, as seen above. And it's public knowledge that the state flags of Florida, Alabama, and Mississippi all have varying creative reference to the Confederate battle flag. Where do we draw the line on censoring endless symbols of American heritage? Knee jerk politics are comforting in times of despair, but they bring few long term solutions to the table in the Roof debate. (Image Source: BBC)
Roof's descent into years of enthrallment with white power nostalgia has come to light all over the web. Numerous photos show Dylann gazing emptily into the camera while wearing a jacket adorned with patches of defunct Rhodesia and apartheid-era South Africa. His recent day trips seem to have centered around Civil War era plantations, slave mannequins, and beachfronts where he scrawled the codeword "1488," an allusion to the white power movement at large.
Dylann's dedication to his newfound support structure and belief system didn't stop at mere adoration for a fringe cause. His morals quickly became twisted and corrupt after spending so much time dousing himself in these bowels of the internet. So much so, that his own manifesto outlines the ultimatum he set out to fulfill:
I have no choice. I am not in the position to, alone, go into the ghetto and fight. I chose Charleston because it is most historic city in my state, and at one time had the highest ratio of blacks to Whites in the country. We have no skinheads, no real KKK, no one doing anything but talking on the internet. Well someone has to have the bravery to take it to the real world, and I guess that has to be me.
It's fairly clear that Roof's infatuation with racial purity and white supremacy bred a vicious cycle in Dylann. Exposure to one-sided fringe media around the time of the Trayvon Martin situation in 2012 grew into outright militancy for a cause any average citizen would consider nothing less than outcast.
But without a clear moral compass as a guiding beacon, and no familial structure to provide that clarity, Roof consumed himself with the lowest common denominator available -- all at his fingertips, in the comfort of his isolated bedroom.
Yet it's important we look at all the other failed pieces of the puzzle that very much contributed to Roof's ability to commit such a heinous crime. Why was the father of this individual providing his depressed son with a handgun for his birthday? And what about Roof's proven terrible history with prescription drugs like Suboxone? A fellow classmate even went on the record to say that he used drugs "heavily." Very much a "pill popper," the same student claimed.
Someone as depressed as Dylann, drunk on white supremacy teachings, popping drugs like a 1970s rock star, and without a family structure to provide a safety net for his out-of-control life spiral? Pinning blame on guns or Confederate flags is downright absurd, and short sighted at very best.
Is Internet Censorship the Answer?
Dylann Roof's fractured family unknowingly left him to his own downward spiral, and he sank into a void consumed with blind hatred and racist pride. The silver bullet answer to what could have saved Dylann from this dastardly path is easy to speculate on, but near impossible to pinpoint.
Is the real answer here to censor and close down hate-filled ghettos of the internet? Absolutely not. As I argued in the case against censoring Islamic State and its extremist propaganda, the survival of a just society is pinned on exposing the worst offenders of immorality and injustice. If the vermin of society are able to sway select portions of humanity in their beliefs, then we need poster examples of what evil and hatred looks like to fight the good fight. You can only defeat an enemy which you can identify.
Years of drugs and family neglect, compounded with endless hours combing race-hate internet sites, bred a lifeless militant on a hellbent mission. The result, Dylann Roof, carried out the promise laid out in his manifesto. Failed societal support structures in Roof's life were the true cause of his moral unhinging, not lax gun laws or the Confederate flag. (Image Source: Radar Online)
Terrorist, gang member, or race warrior -- all have distinguishable traits, cultures, and support systems that follow consistent patterns for easy identification. As such, we have an obligation to use this information to proactively identify today's potential next Roofs, and to actively work at preventing another collapse in positive support structures like the one in Roof's case.
But the most vulnerable in society, those easily coaxed toward discredited beliefs, need to be caught before they are allowed to descend into moral oblivion. Dylann Roof had no life vest to cling to. His family was non-existent for him. He was distant and off-putting towards any semblance of friends. And the criminal justice system, which came into contact with him on numerous occasions, also failed to identify the growing monster behind an innocent face.
So on whom can we, or should we, pin this blame? It's safe to say the blame should be shared in many directions: the family, for letting Dylann slip away; the father, for purchasing a weapon for a potential menace to society; and potentially many others who may have, could have, noticed a disaffected soul crying out for meaning in life.
But debates about flags and gun laws have little to do with explaining the Roof situation -- or more importantly, preventing his spiritual successor.
Microsoft made it pretty clear: Windows 10 Technical Preview for Phones is still meant for serious technical testers only. If I've learned anything after one full week running the preview release on my daily Lumia 925, it's that this OS is far from ready for primetime. But that's not necessarily a bad thing, per se.
I decided to write up some thoughts about Windows 10 Technical Preview for Phones 10051 before rolling my phone back to Windows Phone 8.1. Yes, willingly, I spent the last work week using my daily Lumia 925 on preview build 10051, only the second release Microsoft has publicly doled out to the community.
The first preview release was extremely underwhelming, only including a handful of phone models, many of which are either extremely new or used by a minority of Windows Phone users. But with 10051, Microsoft finally opened the gates to the rest of us (well, most of the rest) and I decided to play guinea pig.
As a relatively unabashed Windows Phone fan since I publicly and unapologetically ditched Android in late 2013, I figured I had little to lose here. If I'm going to be running Windows 10 for Phone at some point this year, I might as well get some familiarity with what's coming, right?
That was the thought. And I'm not going to hold the release's bugs against Microsoft because, as they warned, I shouldn't have chosen to use this on my daily phone but did so anyway. My opinion of the OS as it stands right now takes that into account, and I'm merely offering up the areas where the OS shines, where it doesn't yet, and most importantly: what the preview hints at for the biggest changes coming to Windows Phone as a platform.
The Install Process
Seeing as I was already running the latest, greatest version of Windows Phone 8.1, moving up to the Windows 10 for Phone preview wasn't that difficult at all. I downloaded the new Windows Insider app which replaces the former Preview for Developers app, and after a reboot or two, was able to see the phone preparing the download for the new test OS release.
Little did I know, I actually received Windows Phone 8.1.2 (or 8.1 GDR2) on my Lumia 925 first, before build 10051 of the preview was loaded on. I'm not entirely sure why this happened, but many people seem to be reporting the same thing. Paul Thurrott has an excellent walkthrough of 8.1.2 if you're interested in the last big hurrah for Windows Phone 8.1.
Once I had Windows Phone 8.1.2, the update process began again, and the bulk of the update loaded in the first go-around, which took about an hour from start to finish. A smaller, secondary update came after the first major install was done -- a Configuration Update of some sort. I'm not sure what it pushed down, but it was quick and seamless.
Lack of Cell Data: Why I Had to Clean Install
One of the most frustrating initial aspects holding back my adventure on Windows 10 Technical Preview for Phones was cellular data, which went dead in the water after a single reboot post-installation. It was as if my phone had no idea how to connect to T-Mobile for data. Regular texting and calling seemed OK, but LTE, for all it was worth, seemed to be gone entirely.
I scoured the Windows Central forums for hours to see if anyone else had encountered the same mess, and numerous threads referenced the issue. There were a few tricks that worked for some, namely creating a fake APN to trick the phone and then deleting it, but for me it was fruitless.
I took other suggestions to wipe the phone and reinstall the preview build, this time refusing to use my backup to restore from (for app data, settings, etc). No go. I tried a few more hard resets, thinking that it was maybe a fluke.
After losing most of my Saturday pulling my hair out over the lack of data, I finally took another user's advice and went all the way back to 8.1 on a fresh install, THEN upgraded cleanly to 10051 without choosing to bring back any backed up data.
For one reason or another, that last attempt did the trick. It seems there were some deep-down APN or data settings that just refused to work properly without coming off a clean 8.1 install and moving directly up to 10051. Regardless, it was late Saturday afternoon by the time I had my cell data issue under control.
I spent the better part of Saturday evening and some of Sunday morning finalizing the way I wanted things, so I could start my work week on Windows 10 Technical Preview for Phones 10051.
The weeklong experiment has come to an end, and here's what I love -- and don't -- about the next iteration for Windows on the handset.
The Bright Spots: Action Center; Universal Apps; and Familiarity
One of the biggest things that Windows 10 for Phones has going for it so far? Keeping the best of what made Windows Phone 8.1 unique and special over that of Android and iPhone. I had a fear that Microsoft was going to make some changes to Windows 10 for Phones that would be too far away from what I grew to love about Windows Phone. Luckily, most of the former's guts are present, with some welcome changes in other areas.
By far the biggest positive differentiating aspect about Windows 10 for Phones is its inherent design around the concept of Live Tiles, an ever-present feature spanning all Windows interfaces since Windows 8. While it got a lot of heat on the desktop, it's absolutely refreshing on a cell phone, and is so much easier on the eyes, and the fingers, compared to the drivel that Android/iPhone consider "app organization."
Not too much about that concept has changed on Win 10 for Phones. The option to have even more tile space is still present, a holdover from Win Phone 8.1, which allows for greater tile density and sizing options. Not all apps support the bevy of sizes Microsoft allows for, but it's only a matter of time until this spans the entire ecosystem of Universal Apps that will soon become ubiquitous over the Windows world.
The Windows 10 for Phones Start Screen interface, complete with Live Tiles, is still front and center in this iteration. Windows Phone 8.1 introduced the option to have even more tile sizing options, and that facet is still present in the preview release. One lacking feature? Full tile transparency, which was so cool in Win Phone 8.1. I'm not entirely sure why Microsoft decided to remove the option, as I grew to love it.
This whole concept of Universal Apps is still a bit fuzzy, as there's not much of it present yet in Windows 10 for Phones. The biggest new examples are the Outlook Mail and Calendar apps, along with the new Maps app, which will be sharing center stage with the upcoming Office Touch apps.
The new Outlook apps definitely expand on the feature set which was a tad limited with the older integrated email/calendaring from within Windows Phone 8.1, but they are currently plagued by speed issues and a nasty bug which, for some reason, forces you into the Calendar app even though you intended to go into a new mail item from the Action Center. I'm hoping that's a temporary issue.
The formatting options and email fidelity within the new Outlook Mail app are awesome, and finally bring the rich email experience on par with what I used to enjoy in the Gmail app on my old Galaxy S3 back in my Android days. However, autocorrect seems busted in this build when typing emails, so mistakes on outgoing mail have been far more frequent over the last week.
Outlook Calendar is leagues better than the old Windows Phone 8.1 calendaring app, especially because proper shared calendar support for email accounts is present. And the lush UI for Outlook Calendar in the preview release is much more appealing than the flatter, drab design in Windows Phone 8.1. Speed, as with Outlook Mail, is still a problem, and needs to be addressed before this OS goes live.
One area Windows Phone 8.1 introduced was the Action Center. Well, with Windows 10 for Phones, we now have three rows of four icons each, for a total of 12 editable buttons you can customize in your Action Center. The default Action Center view is a flattened look, similar to Windows Phone 8.1, but you can expand/collapse on demand to bring out the full set of options.
Action Center in Windows 10 for Phones builds upon what Windows Phone 8.1 introduced. Now, we have a full three rows of settings options which can be edited, bringing out all of the common settings we could ask for in a single swipe from the top. New text messages, emails, calls, etc still stack in the lower part of the Action Center as before, too.
The brand new in-line reply option, which lets you respond to text messages, for example, without having to leave your current app, is extremely useful. I found myself replying to texts much more quickly without losing my place in my news apps or RSS feeds, which was a common annoyance previously. It's a bit unwieldy at times, however, and doesn't always respond to the touch sequence normally used to stage a response. I'll attribute that to bugs in the preview release.
There is a new Alarms app in Windows 10 for Phones which features a new world clock view, along with a neat new timer, stopwatch, and a deeper array of alarm options. It's a quirky thing to be excited about, but I really like where the Alarms app is heading in the next release.
I did try out Project Spartan, the new web browser taking over for IE Mobile, and was fairly impressed with the rendering speed on most sites. The problem? The app is still way too buggy, and Microsoft seems to place a lot of reliance on reverting back to IE in situations where, I guess, it predicts Spartan wouldn't be able to load or render right. I wish Spartan were more fully usable as a daily browser to get an idea of what it can truly do, but as they claim, it's still early.
As a very rough preview, the app seems to be coming along nicely, and offers a glimpse of how Windows on mobile will make further strides in bringing rendering equality to Windows Phone devices.
While still extremely buggy and slow, the new Maps app (a Universal App now) combines the previous feature set of HERE Maps and HERE Drive into a common single experience. Microsoft now uses imagery and data from Bing Maps and the Nokia HERE database to power the new Maps app experience. I find it refreshing to be able to rely on a single app going forward, which means a common UI and one place to go for any mapping or navigation needs.
I never liked that mapping and driving were divided between the various HERE apps. They worked, but it was clunky to have to remember which app controlled which functions -- one aspect of Google Maps I missed. Well, that soon seems to be a thing of the past.
The Not So Great: Slowness; GPS Bugs; Lack of Office Touch
Windows 10 for Phones 10051 is, in essence, a technical development preview. As such, it was billed as buggy, experimental, and not even close to ready for primetime. For the most part, I have to agree with those warnings. After one full week on the build, numerous things I grew accustomed to on Windows Phone 8.1 -- a fast camera start time, OneDrive camera uploads, swift autocorrect across all typing interfaces (in 10051, email lacks it completely) -- were broken or faulty.
I'll start with perhaps the most frustrating issue I had with 10051: the Maps app. While I adore where MS is heading with the unified experience, the app is far from complete or functional yet. When I had to rely on it for numerous client trips during the week, I frequently had to tell the app where my location was, because it couldn't render my Lumia 925's GPS readings properly.
Another related issue with the Maps app was an overall buggy experience with real time driving navigation. For example, the app would sometimes just refuse to have the live map follow my true position, even though the actual distance to my destination would properly shrink.
Likewise, on numerous trips the cursor denoting my car's location was all over the board, at times duplicated off the road I was traveling on. Expected for a preview, I agree, but it still caused me much pain during some important trips I had to make.
I happen to rely heavily on my camera app during the workweek, taking pictures of new client sites we are preparing projects for, or of what was done at completed jobs. I downright love the fact that my Lumia 925 has a dedicated camera button (newer Windows Phone handsets are dropping this... why?), and the new camera app seems to come up noticeably slower than before.
In addition, the native camera app (not Lumia Camera) has some strange bug where most of the live camera screen gets cut off into black upon entry, showing just a copycat three-box replication of the camera's live view in part of the screen. Again, most likely a bug, but annoying that I couldn't test out much of the new camera app experience.
It's hard to tell if Microsoft is including much of the speed improvements that some of the latest Lumias got with the Denim firmware, but here's hoping my 925 can get some of it once Windows 10 for Phones goes officially live.
Another perplexing aspect of Windows 10 for Phones 10051 is its peculiar choices around UI placement and navigation area bloat, for lack of a better phrase. On Windows Phone 8.1, the space wasted on menu areas and title bars was minimal, but in this release, big offenders like the new People app completely lose the balance between core content space and "supporting" area space.
For example, take a look at the new People app and its choice of space designs:
If I had to guess, I would say a mere 50-60 percent of the screen shows actual content, with the rest wasted on an enlarged search bar, a large header denoting the spot of the address book you are in, and a completely nonsensical bloated lower area. On phones with smaller screens, like my Lumia 925, why would Microsoft give away so much real estate to nothing?
It would be nice if there were an option for an expanded view, as now seems to be the case in the People app, or a collapsed, denser view akin to what Gmail offers its web interface users. I still have great eyesight and don't need such large text or navigation areas; I would much rather have more at-a-glance data on my screen at once so I don't have to scroll as much.
Another feature I rely on heavily in Windows Phone 8.1 is the divided email count between accounts, personal and work, on my lock screen. Windows 10 for Phones seems to have this feature busted, as the counts refused to work the way I remember them. I thought I adjusted my settings to have it how I liked it previously, but it never functioned right.
I'm confident this will be usable once again once it hits final form, and here's also hoping Microsoft will allow Outlook to have separated inbox tiles on the Start Screen like we do on 8.1. Currently, Outlook mail on 10051 is a single tile that doesn't even show live mail counts, let alone counts per inbox. This is a BIG feature that needs to be kept.
Similarly, Windows 10 for Phones has other odd UI bugs, namely with overall icon sizing and some text sizing issues, too. For example, some notices that slide down from the top of the phone are downright disproportional in their text size choices: the main title of the message will be hugely oversized, with supporting text below that is far smaller. It may be a bug or temporary, but it's far from the elegant, clean look I would expect on Windows Phone.
Another new feature in Windows 10 for Phones is the unique "trackpoint" cursor you can use in text areas to move around within words -- a task that was otherwise a chore of "whack a mole" taps to get the correct placement. While the idea itself is novel and welcome, the actual execution of the trackpoint feature is still strange.
Many times, when trying to use the trackpoint, I found that it either didn't respond at all, or took far more lateral finger movement than seems warranted to initiate the up/down or left/right positioning. I would expect the cursor to move slowly at first, with more motion causing a faster jump, but this isn't the case. Moving your finger just slightly in the direction you want, or much further out, gets the same result, which just doesn't feel as intuitive as it should.
As a longtime ThinkPad user, now on a T420s, the trackpoint is something I'm well accustomed to. Microsoft should take some cues from how Lenovo implements the trackpoint on its laptops, as the function should be similarly smooth in operation on a smaller phone screen.
This Lumia 925 has been my primary daily phone for work and personal needs since late 2013. It's probably the best smartphone I've ever used. I'm considering moving to a BLU Win HD LTE, as Microsoft doesn't have any excellent flagship Windows Phone devices on the roadmap... yet. How long do we have to wait for a proper Lumia 925 successor, Redmond?
Overall speed and responsiveness of the preview release is a mixed bag. At times, my 925 behaves on par with, or a bit better than, 8.1, but at many other times, apps sit "Loading..." incessantly, or just downright crash and exit. Cortana on this release is a big culprit; I couldn't use it very much without it closing out on initial searches.
Bluetooth functionality was also a bit awkward on the preview release. In my car, there were numerous times when I would get choppy audio on Pandora, or a Lync call would stutter; at other times, Bluetooth would show as connected but no audio would flow through. The only way to overcome this was to completely shut off Bluetooth on the phone and turn it back on for a reconnection.
Will Windows 10 for Phones Win Over Android/iOS Converts?
I think the next release of Windows on phones will be an interesting mix of the familiar positives from Windows Phone 8.1, with a breath of fresh air in the right areas. Those who don't care for Live Tiles or the Start Screen won't have much better feelings about the new OS, but for the rest of us who enjoy reprieve from the usual drivel of the Android/iOS UI, Windows 10 for Phones is going to be a solid upgrade.
Microsoft needs to patch up the areas of Windows 10 for Phones that will matter most on final release. First off, the concept of Universal Apps needs to come together full circle, namely with the release of the expected flagship apps including Office Touch and Outlook Mail/Calendar. And Microsoft shouldn't expect to merely match the level of quality of these apps that exist on competing platforms -- Windows 10 for Phones needs to have the best experience. Best on Windows should still mean something.
The expected set of bugs and sluggishness needs obvious attention as well. As a daily use OS, the release is far from what most people can swallow. As a technical expert, I was willing to swallow a week of rough functionality to experience the bleeding edge. But I really want to see Microsoft release more stable, feature-complete builds that showcase what users can truly expect upon release.
While I'm moving back to Windows Phone 8.1 for daily usage for the time being, I'll be on the lookout for the next (more stable) builds of Windows 10 for Phones.
Here's hoping Universal Apps and a common experience across all Windows devices will win more Android/iOS holdouts over. But to compete better in user share, Microsoft needs to differentiate the benefits of Windows 10 for Phones over the competition. That's where this great platform will truly be able to grow organically beyond its niche user base.
A few months back, there was a very frank question posed to Slashdot readers, which I stumbled upon just by chance during some Googling. It was a pretty simple question: for the non-coders out there, especially ones taking advantage of open source software, why aren't more of you contributing back to the open source community?
A legit, honest conundrum that is likely true for most who use such software.
Way back in my high school years, I used to do some light contributing to the Mozilla browser project, filing the minor bugs I found here and there in what they still call "nightlies." I just don't have the time or energy anymore to circle back and return that goodwill, which is unfortunate but honestly the case. So this open ended question on Slashdot intrigued me, namely to see where the rest of the crowd stands on the issue.
The responses were pretty far ranging, but most interesting were those from individuals who actually claimed to have tried giving back, like myself. For one reason or another (or many), most of these disaffected contributors all paint differing pictures that share a common theme: the open source community is anything but open, as its name implies.
The discussion as a whole was pretty eye-opening, especially due to its relatively one-sided nature.
For example, one user, thisisauniqueid, penned this response to the original poster:
"I estimate I have reported over 3000 bugs over the years across maybe 80 different open source projects. I would say that 5 percent of the bugs I have reported have ever been fixed intentionally by the developers. Some of the bugs have become obsolete or 'accidentally fixed' with subsequent code changes; some have been marked WONTFIX with a range of justifications; but the vast majority have been ignored... Some projects like Fedora close most of my bug reports after the bugs expire a couple of releases into the future. I'm not quite sure why I bother..."
Another user, ledow, shared similar feelings:
"The larger projects do attract an attitude of kinds. I used to contribute to a large open-source game but when all my feature-patches (actual working patches, with code, that I'd be playing the game with for months) were pushed, they were disparaged to oblivion. Why would anyone want that?"
Similarly, a user by the tag of nine-times had this to say of open source:
"I can say that I've been in situations where I submitted a bug and had it ignored or else told that it wasn't a priority and the developer didn't care. I've offered feedback on ways that I thought the software could be improved, and was essentially told, 'If you want that done, write it yourself. I'm just here to scratch my own itch.' ... I've encountered the attitude that if you're not a programmer that can contribute code, you should butt out of the conversation."
And finally, a user going merely by the tag Anonymous Coward, sums it up like this:
"As a career technical writer, I once tried to help out a few open source projects by improving their universally bad documentation. In all cases, my contributions were belittled, and often far worse than that, eliciting scorn and disdain from the 'l33t programmers' who thought I was just wasting repo storage and bandwidth. This was something I did on my own time, to improve projects for the benefits of others, for no money. As a result, it didn't take me long to say "fuck it" and leave those open source projects to wallow in their own filth. They're little more than a cult, and if you don't conform to the leaders' idea of what a contrib should be and do, you're not welcome."
The recurring theme of the responses was pretty concrete: those who tried their best to help open source were shunned or pushed away in one way or another. And it's not like Slashdot attracts any kind of trolling anti-open source following -- quite the opposite is actually true.
Let's not chalk this up to a problem of a "few bad apples", since this isn't the first or last place to exhibit such feelings towards the open source community.
Linux systemd creator: Open source world is "sick"
The creator of the systemd Linux system management software, Lennart Poettering, had his festering feelings published as part of a piece on The Register late last year. For those who don't know, systemd is one of the leading centralized system management platforms available for inclusion within Linux distributions, and as such, is one of the most widely used alternative Linux init systems around.
The commentary by Lennart was aimed squarely at Linus Torvalds, the anointed pope of sorts for the wider Linux community, who is well known to be rage-prone when it comes to his vision for Linux kernel code purity. This is the same man who publicly lambasted Kay Sievers, another systemd developer, referring to him as a "f*cking prima donna" in a rant on coding opinion.
And as if to outdo himself, Linus is the same one who, back in 2012, gave nVidia a widely covered middle finger while calling it the "single worst company" that the Linux community has had to deal with. Never mind the ironic fact that nVidia is one of a number of large corporations providing the Linux community with considerable corporate-sponsored development talent.
But the Linux community is now getting public blowback from heavyweights like Lennart, who claims that the open source world is "quite a sick place to be in" and attributes much of that problem to the culture that Linus Torvalds perpetuates.
On Linus, Lennart was blunt: "A fish rots from the head down."
Lennart continued by saying "If you are a newcomer to Linux, either grow a really thick skin. Or run away, it's not a friendly place to be in... It is sad that it is that way, but it certainly is."
Poettering's comments are directly in line with what I saw over on the original Slashdot discussion thread -- feelings of disenchantment, disillusion, and elitism shared by numerous past contributors. Do Linus and his antics speak to the open source community as a whole?
Absolutely not. There is plenty of great community-driven enthusiasm in many parts of open source today, as even Lennart alludes to about his fellow systemd developers.
Linus Torvalds, the godfather of Linux as we know it, is known to be a confrontational and divisive leader of the Linux open source community. You can read one sample exchange Torvalds had with a code contributor named Mauro Chehab, which is nothing less than an idealistic, cuss-filled temper tantrum. If open source is to change for the better, that change needs to start at the top. (Image Source: The Register)
But even as much as the Linux community claims itself to be the sum of all its parts, rather than just the sum of one or a few, it can't be ignored that negative leaders like Linus help contribute to an aura of exclusivity rather than inclusivity.
For as much as open source clamors for ideals that reject class structure, in some areas it's falling victim to the unintended consequences of its own decentralized leadership.
Is this a problem of too many cooks in the kitchen? Absolutely not, as Linux and other large projects have proven thus far with vast developer bases and contributor structures. Rather, large personalities have managed to fill the void that open source innately creates, and therein lies the place where negativity is allowed to fester.
Right at the top -- just as many others and I firmly believe is the case with communities like Linux today.
But the real question still remains: how do these communities reclaim focus and deflate insular personalities that rule their roosts? That's an answer that may take quite some time to unfold. One of open source's greatest traits -- slow methodical evolution -- is its own greatest enemy in this particular regard.
How Purist Idealism Breaks the Back of Open Source Progress
I've written about this aspect of the open source community before, in an article that highlighted how one German city was moving back to MS Office from OpenOffice. I attribute this particular ongoing headache for the open source community to a groupthink mentality, namely its near-incessant drive for code and design purity.
Even if it's at the cost of overall progression, elegance, better feature sets, or improved compatibility with other standards. In the world of OSS, from what I can tell, sticking to the high road is the preferred path even if it means never reaching your intended destination.
There's no better example of this dilemma than the saga that continues on as LibreOffice, but began as OpenOffice years ago. (NOTE: If you're interested in the full backstory to the OO/LO schism, some commentary on Reddit lays it out fairly well). It's designed to be a modern, free alternative to the heavyweight incumbent from Microsoft's camp that most know already.
Here's something most people don't widely realize: LibreOffice, if you take into account the fact that it's the descendant of what used to be OpenOffice, which itself was born out of StarOffice, is actually an older office suite than Microsoft Office.
That's absolutely right. LibreOffice's great grandfather StarOffice came out in 1985 while MS Office didn't rear its head until the end of 1990.
While debatable, many (myself included) naturally ask: why hasn't LibreOffice progressed further and faster than where it stands today? Why, all these years later, is the suite still merely trying to catch up to Microsoft Office on the basics? LibreOffice seems stuck in a perpetual game of catch-up on cross-suite document fidelity, formatting compatibility, and other seemingly basic matters after all this time.
For example, LibreOffice is only now jumping on the browser-based office suite bandwagon, while Microsoft's Office Online has been out in some fashion since 2010, and Google Docs has been around for years longer.
In contrast, for as evil as Microsoft is claimed to be, it has gifted the world with genuinely creative and useful additions to the core Office suite at a record pace.
Sway is a downright eye-grabbing alternative to PowerPoint for presenting content to crowds. OneNote is becoming the de facto multi-platform solution for digital inking and note taking. LibreOffice, meanwhile, has no answer to Outlook or Project, with little discussion of bringing anything to the table. Lync (now Skype for Business) is another fabulous IM/VoIP/video product my company FireLogic uses daily, to which LibreOffice, and open source in general, has little response.
It's true that LibreOffice and related competitors bring some bits and pieces of Microsoft's powerhouse to the table. But in terms of a total package, LibreOffice just isn't there. It's why well-publicized experiments with big-scale rollouts of open source are sometimes met with unfortunate reversals of fortune -- take Munich, Germany's backtrack to Windows and Office as just one example.
Open source advocates claim that the fruits of their efforts bring greater variety and choice to consumers, but is this notion really true? Would a business professional or college student, barring financial reasoning, truly choose LibreOffice over MS Office for the functional or other advantages it brings to the table?
The underlying code that makes up modern LibreOffice (formerly OpenOffice, formerly StarOffice) has been in development for even longer than Microsoft Office. So why does LibreOffice continue to look and behave like an Office 2003 clone, a decade removed? Many of open source's largest projects share the same issues: a holistic obsession with code purity, along with vast decentralized oversight that elevates blind equality over measured progress.
Likewise, how many graphics professionals would choose to ditch their Adobe products in favor of GIMP or Paint.Net? The same argument can be directed at those who work with AutoCAD, music development, and video editing.
While open source has given us numerous positive examples of where it can foster results (Linux, Firefox, Thunderbird, VLC, etc), its successes are still rather muted in contrast to the admitted plethora of options that exist in any commercial software vertical.
I'm not by any means a blind believer in the notion that straight usage figures are the sole judge of a platform's success, but something must be said as to the penetration (or lack thereof) of a software title into the hands of the greater masses.
Unlike closed-source software development, which doesn't promote endless debates about code purity and design, the hardcore who make up the deepest trenches of open source (take Linus Torvalds) are relentlessly obsessed with minutiae that obscure reality.
Nit-picking lines of code. Calling out developers who don't share their vision. And the list goes on. Their obsession with perfection does little more than blind them to what could truly improve more important facets like end-user UI or documentation -- just two examples of areas that consistently suffer in open source projects.
To these purists, the end goal isn't nearly as important as the sheer laser focus on the journey that gets them there. Wherever "there" may be.
The Linux Tax: Why Linux Isn't Really Free
If you've ever been curious as to who exactly makes up this big pie of Linux developers, who better to answer that question than the Linux Foundation itself. And it makes no attempt to hide the truth about Linux and its developer community.
According to the Foundation, less than one-fifth of the total development braintrust dedicated to building Linux consists of individual developers. By far, the overwhelming majority of Linux development is contributed by "paid guns" employed by large corporations with one vested interest or another in the progress of the core product; in this case, Linux.
There's nothing wrong with this mentality or approach, per se. As they say, what's good for the goose, is good for the gander, and if keeping Linux moving necessitates outside corporate development time donations of sorts, then so be it.
But doesn't this taint the image and ongoing message that Linux is the greatest achievement of a global open source community?
Therein lies my issue with how the Linux community paints the project as this massive "coming together" of developers of all walks. Surely, there is a large number of indie developers giving back to the cause, as shown on the graph below. But big companies with even bigger interests of their own, like Google with its ChromeOS/Android efforts (Linux derivatives) and Red Hat (commercial Linux distribution), are proving to be large players in the grander Linux army.
As some of the commentary from Slashdot above alluded to, there is an unwritten belief in open source that much of what gets contributed to one project or another is the result of individualistic wants by certain parties.
The understanding goes: if you wish to devote the needed time and energy to make your mark and get the features you care most about included, then so be it.
It goes without saying, then, that someone is footing the bill for these development efforts. Developers on corporate time are devoting slivers of their paid employment to donate braintrust back into Linux, as the Linux Foundation's own numbers show.
And another truth in life is that nothing is truly free. You may not be paying for it, but the money for these efforts must come from somewhere. What we call "free Linux" in the form of Ubuntu, Fedora, and openSUSE is actually the summation of efforts built upon the backs of true volunteers AND, just as importantly, corporate "paid guns", as the raw numbers prove.
While a thriving independent community still makes up a large portion of efforts on Linux, it's clear that the vast majority of contributions are coming from companies with natural vested commercial interests, all of which benefit in their own manner from the additions they offer back to the Linux code base. The fact that so many "paid guns" make up the Linux dev community reinforces the theory of Linux effectively being supported by the "Linux tax". (Image Source: ArsTechnica)
How do we, as consumers and businesses, then end up paying for the results of these subsidized efforts? In the form of an indirect tax. Let's call it the Linux Tax. It's small, nominal, and likely spread across the vast product lines that companies like Google and nVidia and Samsung have on the market, but we're paying for it -- even if it's not explicitly discussed or called out on line items within corporate P&L statements.
The Linux Tax has reared its head in other ways over the last decade, for example, with OEM Linux systems placed on the market at higher price points than similar Windows counterparts. Some have made the argument that much-hated "bloatware" (which I also despise) has driven Windows system prices down so heavily that selling Linux machines doesn't actually save OEMs any money.
But other, substantive arguments have been raised as well, like the fact that Linux still has a perplexingly tough time with hardware compatibility.
In fact, Wikipedia has an entire section dedicated just to the issues with wireless driver compatibility that continue to plague Linux. Another section highlights the mess that is the Linux audio API ecosystem, which is so convoluted that some developers won't touch Linux audio development with a ten-foot pole.
Much of this driver mess swings right back to the discussion of why OEMs just can't sell Linux systems en masse profitably. OEMs need to use different parts and configurations for Linux-loaded systems, which raises part costs in many cases and also increases behind-the-scenes configuration and support time on the OEM's side.
As a business owner myself, I can definitely see how OEMs can't justify dedicating large, disproportionate swathes of time to supporting a very small percentage of their consumer base.
I think this is a valid point which many Linux purists refuse to admit when questioning pricing tactics of OEMs and why Linux, ironically, ends up costing us as consumers more.
Linux is free, they say, and therefore my system should be cheaper. As we've seen, this has time and again not been the case. And for this dynamic to change, the Linux community needs to erase its lines in the sand and come together to clean up the compatibility mess that plagues hardware running Linux.
Where Does Open Source Go From Here?
I'm not one to say that proprietary, commercial software development is without its flaws. Microsoft and Apple are perfect examples of companies which are constantly playing catchup in the realms of security patches, feature flaws, and other botched rollouts (take Apple's cruddy iOS Maps app debacle, for example).
But success in the world of open source seems to be a rarity, if you look plainly at the raw metrics. For every breakout hit like Linux or Apache or LibreOffice, there are on average five failed projects that simply close up shop. It's one of the dark secrets of open source in general that many have absolutely no clue about.
Directions on Microsoft VP, Wes Miller, shared the above on Twitter not too long ago. It's a funny take on what makes much of open source so unappealing to the general populace: convoluted design, broken feature sets, and development that focuses on the needs of the "technorati" over the average user. If open source is to stay relevant, it needs to overhaul its perception problem with the public. (Image Source: TechRepublic)
High profile open source projects have failed on the community's watch. Last year, we saw the biggest cross platform encryption project (TrueCrypt) close up shop due to its own admission that its software wasn't secure and ended up recommending a wholesale move by its users to an unlikely foe: Microsoft's BitLocker.
The stats come from a study done by Charles Schweik and Robert English, which can be picked up on Amazon in hardcover. They dissected the life and death of 174,333 open source projects hosted on SourceForge between 1998 and 2005. What did they find out?
A lot of the findings of the Schweik and English study can easily be seen in the largest, most visible projects in open source. Linux? LibreOffice? Their user bases tend to include large numbers of their own developers, naturally, and they also receive very comfortable outside funding, either via direct donations or via "paid guns" whom companies employ to contribute back.
But likewise, as the study shows, Linux's large developer count doesn't have much bearing on how much continued success it may see. Naturally, it's plausible the tide may very well turn in the other direction.
I guess it all depends on the definition of success that Linux will consider for itself. Is overall increased desktop OS penetration important, or merely treading water and keeping pace with what foothold it enjoys today? That's up to the Linux community at large to decide.
On the whole, I think open source as a community needs to clean up a few things if it is to increase success rates, community engagement, and overall strides in what it sets out to achieve. Here are just a few aspects which I see as common roadblocks to open source software seeing greater adoption from outsiders.
Open source should take notes on governance and leadership from the corporate sector. Being the initial creator of an open source project should not entitle you to a lifetime of membership in the pseudo-CEO's chair. Take Linus Torvalds' ongoing position at the top of key decision making in Linux development. Sure, he's contributed plenty to one of the most successful open source projects to date, and no one can take that away from him. But his continued antagonism and negativity (as clearly displayed above) is only dividing, not uniting, the Linux community at large. Supposedly "neutral" mouthpiece organizations, like the Linux Foundation, should drop their commitment to innocence and neutrality and promote common sense leadership change at the top.
Metrics for success are ambiguous, not clearly defined, and rarely enforced. No one knows where Linux development, for example, is headed. It's absolutely true. A response to a question about a roadmap on LinuxQuestions.org ended with one user simply saying there "is no Linux Kernel Roadmap", and even a discussion on StackExchange concluded that all major knowledge about the Linux kernel's future is disseminated solely on Linux mailing lists. What major commercial software effort would be allowed to simply coast on a plane of ideal goals, with no driving agreement on where it's headed? That's in essence how Linux kernel development is handled. Linux has zero chance at unseating Windows at the top if its mere goal is to coast incessantly (into obscurity).
Development on new features takes disproportionate priority over fixing known flaws. Just look at how some of the bug contributors mentioned earlier in this article were treated and you have to wonder. Yet, it makes perfect sense: without some kind of motivation, developers have little incentive to trawl the trove of older bugs plaguing projects. "Fix it yourself" is the answer from some open source developers, hitting home the notion that self-interest may play too large a role in why open source projects go in the directions they do.
Open source, on average, has an incestuous focus on merely what its own hardcore, technical users want. If any open source project is to attract swathes of new users, which we presume is the ideal end goal, then why is it so common for its results to cater so heavily to the "technorati", as I call them? For as much hate as they get, Apple and Microsoft have wrapped software packages in interfaces and features that users can relate to. Linux distros are guilty of exactly the opposite, save for Ubuntu and a few other user-focused distros. This trend needs to end for open source to regain its adoption consciousness.
A focus on universal standards, common core features, and less "forking" should be a bigger priority. While despised by some, there's a reason why American politics are centered around two major parties. Larger, unified groups get more accomplished than numerous fractured, small ones. Even if they have common sense beliefs, minority parties like the Greens and Libertarians will indefinitely enjoy small, marginal stakes in the overall direction of the country. Open source efforts should think similarly. Forking has done little to increase latest-release adoption in the Android sphere, for example, and has created an Android mod scene which caters solely to the most dedicated, technical folk. It's a big reason why I dumped Android for good and moved to Windows Phone -- and haven't looked back.
With a few changes, is there any large reason why open source can't become a major force in software beyond the exceptions like Apache or Linux? Certainly not. But a big part of why open source is seemingly running in place, making marginal strides here and there, can be attributed back to some of the biggest problems it allowed to fester on its own watch.
Leaders like Torvalds that are out of touch. Endless forking of forks that unnecessarily pollute the waters with more variants than anyone cares to use. And newcomer difficulty levels, both on the end-user and contributor sides, that keep adoption rates artificially low indefinitely. All issues that, if properly addressed, could clean up the image that open source carries.
Unfortunately, I don't see the status quo changing much very soon. The community that makes up open source today is fairly entrenched, and the driving forces to inspire change from within won't come that quickly. These communities are fairly insulated, empowered by widespread groupthink among their most hardcore followers, and have no outside stakeholders they need to answer to.
I really want to see open source succeed on a much grander scale than it ever has. But much of what open source represents is a mere collection of little ships, sailing endlessly, without an end in sight or a destination to reach.
If you don't know where you're heading, how do you know you've gotten anywhere?
That's a good question -- and one that the open source community just doesn't know how to answer succinctly, yet.
Photo Credit: rvlsoft/Shutterstock
There's no need to ask for a show of hands. To get a sense of how long the Windows RT hate-train is, you can just spend a few minutes Googling. A few weeks ago, when Microsoft let slip that official Windows RT devices, like the Surface 2, were not getting Windows 10 in any proper shape, the anti-RT chorus cheered that it had finally been vindicated.
Stories like this one which adorned The Verge planted their flags pretty clearly: "Windows RT is officially dead".
Admittedly, I've been one of the few lone champions of the intriguing slimmed-down OS this side of the interwebs. I outlined my favorable position on Windows RT in a lengthy op/ed about one year ago, hitting on numerous aspects of why RT (and Windows on ARM, in general) is a fantastic idea that has been sputtering due to execution, not capability.
The app model that just recently came to fruition as Universal Apps (formerly Modern UI apps) was a false start in its day, and it's something Windows 8 as a whole could have used during launch season back in 2012.
But even so, we've got customer deployments out in the wild built upon Windows RT-based Surface tablets which replaced x86 laptops, and believe it or not, these clients are still raving about the devices to colleagues. Their extreme portability, full-featured access to native Windows RDP, very good peripheral compatibility, and splendid battery life make them much more desirable than a traditional laptop in many, but not all, cases.
Windows RT: Dead in Name Only
It's well known that sensational news is great news, especially for today's tech media. But while most of them were busy jumping on the "RT is dead" bandwagon, many of them failed to do the proper due diligence to actually figure out what Microsoft really meant when they discussed the end of Windows RT a few weeks back.
So for the sake of setting the record straight for the majority of the copy-paste tech media, here's a breakdown on how they got the news wrong, and what actually is the case for the future of RT.
What The Tech Media Reported
Microsoft Kills Off Windows RT (Geeky Gadgets)
RIP Windows RT: Microsoft murders ARM Surface, Nokia tablets (The Register)
What Microsoft Actually Said
No Windows 10 Update for Surface RT (ZDNet)
Microsoft reiterates support for ARM amid reports of Windows RT’s death (GeekWire)
That's it? That's what all the hubbub was actually about? Sadly, yes. The tech media had a fun time playing with semantics to toss more gas onto the Windows RT firestorm it loved to stoke.
The long and short of what is happening with Windows RT is pretty darn simple:
Nowhere did Microsoft say that all ARM devices will not be getting Windows 10. Nowhere did Microsoft say that the strides made in the "Modern UI" approach to eradicating the x86 desktop were going away.
In fact, just as recently as the revived WinHEC 2015 conference, Microsoft confirmed that Windows 10 SKUs will be fully supported on small gadgets and gizmos as well as phones and small tablets:
(Image Source: AnandTech)
And for those doubtful about Microsoft's commitment to ARM-based devices, one of the cheapest small PCs around, the Raspberry Pi 2, has already been confirmed to be getting full Windows 10 treatment for free, too.
For the uninitiated, the Raspberry Pi 2 is a mini PC featuring a lowly 900MHz ARM CPU with only 1GB of RAM. For Microsoft to dedicate an official blog post of its own to the news, this whole "ARM thing" must be less of an experiment and more of a long-term vision that is just warming up.
On top of Microsoft's Raspberry Pi 2 boasting, some news sites have uncovered benchmarks culled from a graphics testing website which highlight the specs of a yet-unknown big-screen Windows (10?) tablet that runs on a quad-core ARM chip, supports DirectX 11, and has a 2560 x 1440 display at 10.1 inches.
Is it a prototype Surface 4 ARM device? Or perhaps a new Lumia ARM tablet? Or even another OEM's unnamed tablet? We can't say. But for me, it's hard not to connect the dots and come to some limited conclusions about Windows on ARM keeping pace or picking up with Windows 10.
How Windows RT Gave Windows 10 Mobile Its Mojo
For all the commentary out there claiming that Windows 10 is going to be exactly that which Windows RT wasn't, it's quite interesting how much common ground the two operating systems share from a structural and UI perspective. In fact, I'm fully convinced that Windows RT's most important guts and aspects are direct inspirations for much of what Windows 10 Mobile/IoT is shaping up to be.
The commonalities are numerous: Windows RT and Windows 10 Mobile/IoT share the same ARM underpinnings, the same Universal App (formerly Modern UI) model, and the same clean break from the legacy x86 desktop.
So, once again, I have to ask: what's so different about Windows 10 Mobile from Windows RT? They are descendants of one very common lineage, separated merely by time and generation. The most defining features of what made Windows RT, well, Windows RT, are present and kicking in Windows 10 Mobile and IoT.
Some claim that Microsoft kicking Surface and Surface 2 to the wayside is evidence of no future for larger Windows on ARM devices. But as Paul Thurrott insinuates, and I agree, there is nothing even close to confirming this is the case.
Per a recent article going into some post-RT commentary, Paul put it plainly:
What if you wish to get a Windows 10 Mobile device with a screen bigger than 7.9 inches? Again, a guess: I think Microsoft will let PC makers sell such a device. I just don’t think they’re going to sell that version of Windows at retail.
Could this lend further credence to an upcoming scene of new large Windows 10 ARM tablets or convertibles? Who knows. It's too early to say, honestly.
The clear takeaway here is beyond debate: Windows 10 Mobile shares far, far more in common with Windows RT than many in the media wish to admit.
And that, in my opinion, is a great win for the former RT, even if its name is being retired and it never gained much steam in its own right.
There's a reason we are in a cultural, military, and cyber messaging war with a twisted group called IS -- still widely called ISIS, its old name. It doesn't call itself HISIS; they don't claim to represent Hinduism. Nor BISIS; they aren't Buddhist radicals. CRISIS and JISIS would also be incorrect, since they aren't Christian or Jewish radicals, either.
They may not represent mainstream Islam, but they wholeheartedly believe in their evil calling to establish a global Islamic caliphate based on a radical Sharia ideology -- affectionately and simply called the Islamic State.
This diseased breed of Muslim fanaticism shares a common core with the likes of Boko Haram in northern Africa, and the present day Taliban in lawless Pakistani and Afghani regions. Peaceful followers of Islam claim that these groups don't represent them -- but if the world is to understand this, we need to see the symptoms of their hatred on display, especially in the connected cyber world.
But this isn't merely a far-away problem, one that is relegated to those caught in the crossfire of a self-contained civil war. Man hasn't seen such inhumane scum sweep into power since the time of the Nazis, which one should remember was only slightly more than a half century ago.
Instead of concentration camps slaughtering Jews en masse, IS is sweeping the Middle East in an effort to rid the planet of Christians and Jews, and ironically, without missing a beat, fellow Shiite Muslims and the Yazidi minority as well.
Which brings up a great question about the silence surrounding our modern-day Holocaust. Where are the righteous Christians calling for the destruction of IS? Where are the righteous Jews, coming to the defense of Christians and Yazidis being slaughtered town by town? And the same can be asked of the rest of the global community, especially those refusing to commit resources to the battle against this modern evil.
The silence is deafening, sadly.
The savages of IS practice modern day slavery against Christian and Yazidi women in Iraq and Syria, shown above in this photo from an Iraqi news website. Islamic State even went so far as to release a publication outlining their Islamic-based justifications for slavery of women captured in holy war. Were it not for social media, the uninformed would have believed slavery was extinct. (Image Source: IraqiNews.com)
The above point on the lack of a groundswell of outcry from the civilized world has been a common daily theme on the radio show of Michael Savage in the United States. He has been one of the few in the media championing the discussion on IS, and specifically why the world has been repeating the same mistakes that were made in pacifying Hitler during his rise to power. "It's not our war" seems to be the passive justification coming from most corners of the world, geographically and politically disconnected from the effects of IS (for now, at least).
A lot of people feel rightly insulated from the nightmarish parts of the world falling under IS control. But one weapon available to both sides, the internet, is a tool which some people believe should be censored in favor of one side. The savagery and barbarism of IS, some say, is too extreme to be allowed to float around on channels like YouTube or Twitter.
But I don't share this view. Let the IS propaganda flow, I say. Let the world see what they are all about. For open and free flowing information is the only way for the insulated world to truly awake to the twisted, sadistic truth IS represents.
I've always held such a belief, and in fact, was joined in this opinion by Mathew Ingram of GigaOm a few days back. He penned a fantastic piece which dove deeply into the reasons why YouTube and other social media sites shouldn't be taking down IS imagery and media.
Aside from the usual central belief that the preservation of free speech shouldn't be left to the discretion of private firms, he properly affirmed the notion that mankind has a concrete right to be informed -- even when what's on display rises to the scale of evil currently seen in the beheadings and mass slayings by IS warriors.
In a telling response to a question Ingram posed on Twitter, one respondent intelligently quipped:
During the Holocaust, many Germans claimed they "didn't know" the horrors of the death camps. With ISIS, we know. (@sylviebarak)
Her response couldn't be closer to the truth. In the age of the internet, with an enemy determined to propagandize its cause through this digital medium, people of the free world have little excuse for hiding behind naivety.
The Real Reasoning Behind Free Speech Protections
Many of us in the West like to think that free speech was a concept created in the effort to merely protect political opinion, freedom of thought, and other idealisms at the heart of a free society at large. But the founding fathers of America had a different innate vision for why free speech protections were needed: to protect the most offensive forms of speech. The kind currently emanating from the hot zones under IS control.
Joshua Goldberg referenced this correct, nuanced definition of free speech in an excellent post on Thought Catalog:
If freedom of speech only protects popular speech, then it is not free speech at all. Freedom of speech is intended to protect the most unpopular and irresponsible forms of speech – the only kinds of speech that people would actually want to be censored.
We consider the extremism flowing out of IS propaganda to be an "unpopular" and "unwanted" kind of speech, but we need to be mindful that limiting such speech in an effort to cleanse social media won't really have the effects most do-gooders believe.
The country of Jordan has stepped up blistering airstrikes against Islamic State rats and vowed to go after them "wherever they are", after one of its captured pilots was burned alive in barbaric tactics not seen since medieval times. It took a horrific act against one of its own to turn public sentiment in their country. What will it take for the rest of the world to get serious? (Image Source: DailyMail)
Think about it. Islamic State, for as long as the internet is free and open, will always have viable channels to disseminate its propaganda: its own websites, forums, chat rooms, and other dark corners of solitude for distributing media in the effort to attract new followers and believers.
But the ones who don't follow the news closely, those pinned to social media like Twitter and Facebook, are the crowd we should be more concerned about informing regarding IS and its atrocities. Vile, inhumane, and disgusting, yes -- but a lack of information in the free world is exactly what IS is hoping for. Secrecy of its horrors, and apathy among the general distant populace, is the key to IS continuing its rampage with impunity.
These throwback vermin relish keeping their horrors secret from greater society, preaching primarily to impressionable, disaffected Muslims in regions of high unemployment with nowhere else to turn but Sharia. Just as Hitler's forces perfected mass murder under the guise of normalcy, these cockroaches have no lesser goal than planting their flag on the White House one day if given the chance.
The free world vowed that the horrors of the Holocaust would never be allowed to repeat themselves. "Never Again" became the unofficial slogan of the pact the free world committed to in forever preventing another such wide-scale atrocity.
But the sad truth is that IS is succeeding in its goals to wipe out "apostates" and those who oppose its torturous reign. A UN report released last year put the toll at about 24,000 Iraqi civilians killed at their hands -- in just the first eight months of 2014, mind you.
And the map of territory that this marauding religious army covers continues to expand as well, as documented by the BBC online.
I fully believe that knowledge is power when it comes to arming the free world with intellectual ammo. Hitler expanded his ideology and control at a time when there was no such thing as the internet, social media, or Twitter hashtags. Documentaries like Night Will Fall, recently aired on HBO, place on full display the horrors and atrocities committed at the hands of the Nazi regime. And so we said, Never Again.
If we are to uphold this promise to the free world, we need to preserve the notion of uninhibited free speech, even if it places on display the worst of humankind, currently in the name of IS. It took a heinous killing of one of its countrymen for the people of Jordan to realize the scope of what humanity is up against in the fight with IS. That nation and its people finally awoke in unison.
Islamic State severely punishes those who do not adhere to its strict interpretation of allowed speech under Sharia. Yet it takes full liberty in using free speech to propagandize its cause, as displayed in its puppeteering of hostage John Cantlie in slick documentary-esque videos, as per above. We may not agree with them, but putting IS free speech on open display is the only way for the world to understand their cancerous cause. (Image Source: YouTube)
What will be the 'Jordan moment' for the rest of the world? How many more documented horrors need to be passed over social media for the likes of Obama, Hollande, and Merkel to stand up for what is right?
That's a good question. Until it happens, unfettered social media exposure of Islamic State is the best concerted effort we can hope for to accelerate the destruction of the largest threat to mankind since the Third Reich.
If Islamic State doesn't represent mainstream Islam, where are the righteous Muslims in this telling moment in history? Here's hoping the power and reach of social media in Muslim countries will thrust this conundrum into prime focus.
Until the righteous Muslims, in particular, stand up with unity against Islamic State, the YouTubes and Twitters of the world need a constant flow of IS media to prove a point: the only way to overcome pure evil is to put it on display for all the world to openly see.
When scoping out new servers for customers, we usually look towards Dell, as their boxes have the right mix of price, performance, expandability, and quality that we strive for. RAID card options these days are fairly plentiful, with our sweet spot usually ending up on the PERC H700 series cards that Dell preinstalls with its midrange to higher end PowerEdge server offerings.
But recently we were forced into using one of its lower-end RAID cards, the H200 PCIe offering. This internal card was one of the few dedicated RAID options certified to work in a refurbished server we had to put back into production, a Dell R210 1U rack unit. The specs looked fine and dandy in nearly all respects, except for one omission I like to avoid: the lack of a dedicated battery-backed flash cache.
This is clearly denoted on Dell's full PERC rundown site. You can dive into the benefits of BBF in this excellent Dell whitepaper. For the uninitiated, a battery-backed flash cache on a RAID card allows for near-bulletproof data safety in the face of power failure, while increasing the performance of RAID arrays considerably.
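To make the performance half of that tradeoff concrete, here's a minimal Python sketch of my own (not anything Dell or LSI ships) that times a buffered write against one forced to stable storage with fsync. The buffered path is fast precisely because the data sits in volatile cache on the way to the platter, and that window of un-flushed data is exactly what a battery-backed cache protects during a power cut:

```python
import os
import time

def timed_write(path, data, durable):
    """Write a payload; optionally force it out of volatile caches with fsync."""
    start = time.perf_counter()
    with open(path, "wb") as f:
        f.write(data)
        f.flush()                 # push Python's buffer to the OS
        if durable:
            os.fsync(f.fileno())  # ask the OS (and drive) to commit to media
    return time.perf_counter() - start

payload = os.urandom(64 * 1024 * 1024)  # 64 MB of incompressible test data
print("buffered (cache-backed): %.3fs" % timed_write("cache_test.bin", payload, False))
print("flushed (durable):       %.3fs" % timed_write("cache_test.bin", payload, True))
```

The gap between those two numbers is, roughly speaking, the speed you give up whenever no cache is allowed to absorb writes.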
I didn't think going with an H200 for an R210 refurbishment would be the end of the world. Even lacking a dedicated cache unit on the card, I knew full well that disk drives in a RAID have their own onboard cache that works pretty well on its own. Heck, most drives made in the last ten years have some form of dedicated cache space that provides some of the performance benefits of writing to cache before writing to the disk.
The advent of SSDs has rendered the need for caching mechanisms almost extinct, but the reality in the server world is that spinning disks still make up the majority of current and new server installations. Until SSD prices come down to the cost-per-GB levels of SATA or SAS spinning disks, spinning hard drives will remain the bread and butter of server storage for the foreseeable future.
And this described the exact situation we were in for this customer's Dell R210 rebuild. We paired two (rather awesome) Seagate 600GB 15K SAS drives in a RAID 1 on this H200 controller so that we could pass a single volume up to Windows Server 2012 R2. Our storage needs for this client were rather low, so employing Windows Storage Spaces was a moot point for this rollout. It also doesn't help that the R210 only has a pair of 3.5-inch internal drive bays, not the 4-8 hot-swap bays we usually have at our disposal on T4xx series Dell servers.
The problem? After we loaded on a fresh copy of Windows Server 2012 R2, things just seemed strange. Slow, as a matter of fact. Too slow for comfort. All of the internal hardware on this server tested out just fine. The hard drives were brand new, along with the RAID card. Even the SAS cable we used was a brand new Startech mini SAS unit, freshly unwrapped from the factory, and otherwise testing out fine.
It seems we weren't alone in seeing sluggishness with default settings on the PERC H200 card. The Dell forums had numerous posts like this describing issues with speed, and the fine folks at Spiceworks were seeing the same problems. After some Googling, I even found sites like this one by Blackhat Research that detailed extensive testing done on these lower-end RAID cards, proving our suspicions true.
Most people recommend that these low-end H200/H300 cards be replaced outright with H700 and above level cards, but this wasn't an option for us. The Dell R210 is rather limited in what RAID options it will work with. And seeing how many other parts we doled out money for on this particular server, going back would have proved to be a mess and a half.
What Is Dell Doing to Cripple the Performance on These Cards?
I didn't want to take suspicion at face value without some investigation of our own. While collective hearsay is rather convincing, I wanted to see for myself what was going on here.
The culprit at the heart of these performance issues seems to be Dell's boneheaded policy of disabling the native disk cache on SAS drives, even though they seem to leave it enabled for SATA disks as a matter of default configuration for H300 level controllers.
Blackhat Research pinned a nice screenshot of this language in a Dell user manual:
Since we are using the H200 card on our R210, it wouldn't matter if we had SATA drives in our RAID array. They were going to disable the cache regardless.
Dell is going out of its way here to make the lives of server admins rather hellish in situations where low-end RAID cards are in use. Selling such lower-end RAID cards without BBF is not my issue -- cutting costs on lower-spec'ed hardware is the name of the game.
But to assume that those installing these servers aren't educated enough to be using battery backup units in any way? That's a bit presumptuous on its part, and honestly, anyone installing servers into a production environment without proper battery backup power to the core system likely shouldn't be in the driver's seat.
I know very well it can happen, and likely does happen (I've seen it firsthand at client sites), but then again, you can assume lots of things that may or may not be done. A car may be purchased with AWD, and a car manufacturer may disable the functionality if the vehicle is sold in Arizona, assuming that the driver will never encounter snow. Is that potential reality something that should be used to decide configuration for all users of the product? Of course not.
So Dell disabling the cache on hard drives for users of its low end RAID cards is a bit of an overstepping of boundaries in my opinion. One that has likely caused, judging by what I am reading, legions of PERC purchasers to completely dump their product for something else due to an incoherent decision by Dell's server team.
I decided to do a little testing of my own in an effort to resurrect this server rebuild and not have to move to a plan B. All firmware was updated on all ends -- server BIOS, RAID card firmware, as well as the firmware on the hard drives. We made sure the latest PERC driver for this card was pulled down onto the server (which happened to come from Windows Update).
Prior to enabling the local cache of the underlying disks, here are the results of a single run of CrystalDiskMark 3.0.3:
Not terrible numbers, but I knew something was up, and seeing how everyone on forums was up in arms over this H200 controller, turning that cache policy back on was likely a requirement at this point for workable performance.
I ended up using LSI's MegaRAID manager utility to get the cache enabled on the card, and here are the numbers we saw after a full server reboot:
The biggest change after the cache was enabled? Look at the sequential write figure: 86.11 MB/s with caching, compared to the pre-caching figure of 52.42 MB/s. That works out to (86.11 - 52.42) / 52.42, or more than a 64 percent improvement -- pretty darn big if you ask me.
The pair of 4K write tests experienced decent boosts, as did the sequential read number ever so slightly, which I'm not sure can be directly attributed to the cache enablement (but I may be wrong). The 512K write test went up slightly, too. All in all, flipping a single switch improved my numbers across the board.
Is Dell crippling these cards by turning off the native HDD cache? Most definitely, and the numbers above prove it.
How Can You Easily Rectify This Problem?
While I used the LSI MegaRAID manager utility (from the maker of the actual card that Dell rebrands), you can use the more kosher Dell OpenManage Server Administrator utility that is tailor-made for overseeing Dell servers. It's a free download for any supported Dell PowerEdge unit (nearly every one in existence), and it only takes a few minutes to install.
After going into the utility, navigate down into the virtual disk(s) in question, and you will most likely find the Disk Cache Policy set to "Disabled" (if it's an H200, most definitely). Flip the selection to "Enabled", hit save, and reboot your box to ensure the change takes effect.
Once your server comes back up, you will have the performance improvements that local cache can provide. While not as powerful in whole as a dedicated BBF like on the H700/H800 RAID cards from Dell, this is as good as it's going to get.
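If you manage more than a couple of boxes and would rather script the change than click through the GUI, OpenManage also ships a command line (omconfig/omreport). A rough Python wrapper follows; the controller and vdisk IDs here are hypothetical, and the exact changepolicy syntax is an assumption you should verify against your OMSA version's documentation before trusting it:

```python
import subprocess

# Hypothetical IDs -- enumerate yours first with: omreport storage vdisk
CONTROLLER = "0"
VDISK = "0"

# Assumed OMSA CLI arguments for flipping Disk Cache Policy; confirm with
# `omconfig storage vdisk -?` on your own server before relying on this.
cmd = [
    "omconfig", "storage", "vdisk",
    "action=changepolicy",
    "controller=" + CONTROLLER,
    "vdisk=" + VDISK,
    "diskcachepolicy=enabled",
]

result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout or result.stderr)
```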
Run your own tests to see what kind of results you get. CrystalDiskMark is a great utility that can measure disk drive performance on any Windows system, including servers, and is a good baseline tool for telling you what kind of numbers you are getting from your drives.
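And if you just need a quick sanity check without installing anything, even a crude sequential write timer will expose a cache policy change of this magnitude. Here's a rough Python sketch of my own -- it only loosely approximates CrystalDiskMark's sequential write test, nothing more:

```python
import os
import time

CHUNK = 1024 * 1024          # write in 1 MB blocks
TOTAL = 256 * 1024 * 1024    # 256 MB total -- bump this up on faster arrays

def seq_write_mbps(path):
    """Time a large sequential write, fsync included so caches can't flatter it."""
    buf = os.urandom(CHUNK)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(TOTAL // CHUNK):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    os.remove(path)
    return (TOTAL / (1024 * 1024)) / elapsed

print("sequential write: %.2f MB/s" % seq_write_mbps("seqtest.bin"))
```

Run it once with the Disk Cache Policy disabled and once with it enabled, and the before/after delta should tell the same story my CrystalDiskMark numbers did.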
Should Dell continue to cripple future generations of low end RAID cards by doing this to customers? I hope not. The safety of your server data is up to you as the customer in the end, and if you choose not to use a proper battery backup power source like an APC or Tripp Lite unit, then so be it.
But having to dig through forums to figure out why my H200 card is behaving abnormally slowly, through no misconfiguration of my own, is downright troubling.
Here's hoping someone from Dell is reading the legions of forum posts outlining this dire situation and changes its stance on newer RAID cards for future servers. Trimming features off your RAID cards is acceptable practice, but not when you begin forcibly disabling otherwise default functionality on disk drives that offer caching.
Long story short? Stick to the more capable Dell RAID cards, like the PERC H700 series units, when possible. You can then avoid this roundabout fix altogether, and you've got yourself a higher quality, more capable RAID card anyway.
Photo Credit: nito/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Most of us have hopefully managed to get off the sinking ship that was Windows XP. As recent a memory as that has become, a new end of life is rearing its head, and it's approaching fast for those who haven't started planning for it. Microsoft's Windows Server 2003, a solid server operating system that's now about eleven and a half years old, is heading for complete extinction in just under 300 days. Microsoft has a fashionable countdown timer already ticking.
Seeing as we just finished our second server migration in a single week (a personal record so far), sharing some of the finer aspects of how we are streamlining these transitions seems like a timely fit. This braindump of sorts is a collection of best practices that we are routinely following for our own customers, and they seem to be serving us well so far.
The best practices below all assume that you have gone through a full inventory of your current servers, taking note how many servers are still in production and what ongoing workloads they support. If you don't know where you stand, you have no idea where you're heading -- so stop reading here and start getting a grasp on your current server layout. I'm going to pen a fuller piece about how to inventory and plan a server move that addresses all of the non-technical criteria.
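In the meantime, once you've collected that inventory, even a tiny script can tell you where to focus first. Here's a minimal sketch assuming a hypothetical inventory.csv with hostname, os, role, and in_production columns -- adjust the names to however you actually track your fleet:

```python
import csv
from collections import Counter

# Hypothetical columns: hostname, os, role, in_production (yes/no)
with open("inventory.csv", newline="") as f:
    rows = list(csv.DictReader(f))

legacy = [r for r in rows
          if "2003" in r["os"] and r["in_production"].strip().lower() == "yes"]

print("Servers still running Windows Server 2003: %d" % len(legacy))
for role, count in Counter(r["role"] for r in legacy).most_common():
    print("  %-20s %d" % (role, count))
```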
Microsoft has put together a fairly good four-step first party guide that you can follow on their Server 2003 EOL website, but as you can expect, it's chock full of soft sells on numerous products you may or may not need, so take it with the usual grain of salt and get an expert involved if necessary.
Assuming you have a solid inventory, a plan for replacement, and the hardware to get the job done, here's a rundown of some of the things saving us hours of frustration.
There's Nothing Wrong with Re-Using Servers -- In Some Cases
Need to re-deploy another physical server after ditching 2003? Refurbishing existing servers for use in your upgraded environment is not sinful, as some traditional MSPs or IT consulting firms may make it out to be. Many of the voices always pushing "buy new!" are the ones used to making fat margins on expensive server purchases, so follow the money trail when being baited into a brand new server that may not be necessary.
The last server I just finished deploying was a re-purposed Dell PowerEdge R210 1u rack mount server that was previously being underutilized as a mere Windows 7 development sandbox. With only a few years of age, and no true production workload wear (this is a lower end box, but it was used as anything but a server), the box was a perfect fit for the 20-person office it would end up supporting for AD, file shares, print serving, and other light needs.
We didn't just jump to conclusions on OK'ing the box to be placed back into production, mind you. Re-use of the server was wholly contingent upon the unit passing all initial underlying diagnostics of the existing hardware, and upon passing, getting numerous parts upgrades.
For this particular Dell R210, we ended up installing the max amount of RAM it allows (16GB), the second fastest CPU it could take (a quad core Intel Xeon X3470), dual brand new Seagate 600GB 15K SAS hard drives, and a new Dell H200 Perc RAID controller to handle the disks. A copy of Windows Server 2012 R2 was also purchased for the upgrade.
We also picked up a spare power supply for the unit to have on hand in case the old one dies, since the unit is no longer under warranty. Having a spare HDD on hand doesn't hurt, either, for those planning a similar move. You don't have to rely on manufacturer warranty support if you can roll your own spares, and the two most likely parts to fail on any server are arguably the PSU and HDDs.
Still have a good server that has useful life left? Refurbish it! Consultants pushing new servers blindly usually have fat margins backing up their intentions. We overhauled a Dell R210 for a 20-person office for less than half the cost of a brand new box. Proper stress testing and diagnosis before deciding to go this route are critical. (Image Source: Dell)
Instead of spending upwards of $5000-$6000 on a proper new Dell PowerEdge T420 server, this customer spent about $500 on refurbishment labor and another $2000 or so in parts. In the end, we saved thousands on what I found to be unnecessary hardware.
We also did a similar upgrade on an HP Proliant DL360e just a week prior. Second matching CPU installed, RAM increased, brand new Samsung 850 Pro SSDs put into a RAID 1, Windows Server 2012 R2 Standard, and a couple of extra fans. We took a capital outlay that would have been no less than $5K and turned it into a $2K overhaul.
Want to go the extra mile with the refurbished system and extend fan life? On all of our overhauls, we lubricate all of the server fans with a few drops of sewing machine oil. You can read about how great of a cheap lease on life this is per this TechRepublic blog post. A $5 bottle at Ace Hardware has lubricated dozens of servers and still has years of oil left.
One last key: it's super important to ensure you are using the right software to diagnose the system's internals once all parts are installed. On server overhauls, we run diagnostics before the system is approved for an overhaul, and again after all new parts go in. Unlike a workstation, where we can often afford downtime due to a bad part, a server doesn't have this kind of leeway.
Almost every server maker out there has custom software for testing its boxes. Our favorite server OEM, Dell, has excellent utilities under the guise of Dell Diagnostics that can be loaded onto a DVD or USB stick and run in a "Live CD" style. Only after all tests pass with flying colors is a box allowed to go back into production.
In addition, we always stress test servers days before they are meant to be placed back into production with a free tool by JAM Software called HeavyLoad. It does the equivalent of red-lining a car for as many hours as you wish. We usually stress a server for 6-10 hours before giving it a green stamp of being ready for workloads again.
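HeavyLoad is a point-and-click Windows tool, but if you want a scriptable stand-in for just the CPU-burn portion of that routine, a throwaway sketch of my own like the one below will peg every core for a set number of hours. Note that it exercises nothing else -- no RAM or disk load -- so it's a supplement to, not a replacement for, a proper burn-in:

```python
import multiprocessing
import time

BURN_HOURS = 6  # the low end of the 6-10 hour window we use

def burn(stop_at):
    """Spin on floating-point math until the deadline to keep one core pegged."""
    x = 1.0001
    while time.time() < stop_at:
        x = (x * x) % 1e9 + 1.0001

if __name__ == "__main__":
    stop_at = time.time() + BURN_HOURS * 3600
    workers = [multiprocessing.Process(target=burn, args=(stop_at,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

Watch temperatures and fan behavior while it runs; a box that can't hold a full-core burn for hours has no business going back into production.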
In another related scenario last year, we had a client with dual Dell PowerEdge 2900 servers in production. We refurbished both, and kept one running as the production unit on Windows Server 2012, with the second clone kept in the server cage as a hot spare and parts depot. It was a rock solid plan that is still humming away nicely, one year later nearly to the day.
We have numerous clients running such refurbished servers today and they are extremely happy not only with the results, but also with the money they saved.
Move to Windows Server 2012 R2 Unless You Have Specific Reasons You Can't
I've talked about this notion so many times before, it feels like I'm beating a dead horse. But it's an important part of planning for any new server that will be in production for the next 5-7 or so years for your organization, so it's not something that should be swept under the rug.
Short of a special software or technical constraint, there is absolutely zero reason you should be installing servers running Windows Server 2008 R2 these days; it holds no advantages on new servers going forward. We've put Windows Server 2012 R2 and Windows Server 2008 R2 through extensive paces already in live client environments, and the former is leagues ahead of the latter in stability, performance, resource usage, and numerous other areas, especially related to Hyper-V, clustering, and related functions.
Need further reason to stay away from Windows Server 2008? Seeing as we are already halfway through 2014, you would be doing yourself a disservice: Microsoft is cutting support for all flavors of 2008 by January of 2020. That's just over five years away -- too close for comfort for any server going into production today.
Server 2012 R2 is getting support through January of 2023, which is much more workable in terms of wiggle room if we need to go over a five year deployment on this next go-around, with room to spare.
At FireLogic, any new server going up is Windows Server 2012 R2 by default, and we will be anxiously waiting to see what is around the corner as Windows Server releases seem to have a track record lately of only getting markedly better.
A quick note on licensing for Windows Server 2012 R2: do know that a single Standard copy of the product covers three full instances. This means one physical (host) instance for the bare metal server itself (which, per the license terms, should be doing little more than running Hyper-V) and two fully licensed VMs at no extra charge via Hyper-V off the same box.
It's an awesome fringe benefit and we take advantage of it often to spin up VMs for things like RDS (Remote Desktop Services). You can read about the benefits of Server 2012 licensing in a great post by Microsoft MVP Aidan Finn.
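For the curious, lighting up one of those included VMs takes only a couple of lines of PowerShell on the host. Here's a minimal sketch -- the switch name, VM name, and paths are hypothetical placeholders to adjust for your own environment:

    # One-time: bind a virtual switch to the physical NIC (adapter alias is an assumption)
    New-VMSwitch -Name "External" -NetAdapterName "Ethernet"
    # Create and start a VM for, say, an RDS role ("RDS01" is a made-up name)
    New-VM -Name "RDS01" -MemoryStartupBytes 4GB -NewVHDPath "D:\VMs\RDS01.vhdx" -NewVHDSizeBytes 80GB -SwitchName "External"
    Start-VM -Name "RDS01"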
Ditch the SANs: Storage Spaces Is a Workable, Cheaper Alternative
Like clockwork, most organizations with large storage needs, or an intent to do things like clustering, are listening to the vendors beating the SAN (Storage Area Network) drum near incessantly. If the clock were turned back just four to five years, I could see the justification in doing so. But it's 2014, and Microsoft now lets you roll your own SAN in Server 2012 and up with a feature I've blogged about before: Windows Storage Spaces.
I recently heard a stat from an industry storage expert that half or more of the SANs on the market run Windows behind the scenes anyway, so what's so special about their fancy hardware that justifies the high price tags? I'm having a hard time seeing the benefits, and as such, am no longer recommending SANs to clients as a first-line option. Unless there's a good reason Storage Spaces can't do the job, we're not buying the SAN line any longer.
Think Storage Spaces isn't capable of large production workloads yet? The Windows Release team replaced eight full racks of SANs with cheaper, plentiful DAS attached to Windows Server 2012 R2 boxes, using this production network to pass upwards of 720PB of data on a weekly basis. They cut their cost/TB by 33 percent, and ended up tripling their previous storage capacity. While far larger in scale than anything small-midsize businesses would be doing, this just shows how scalable and cost effective Storage Spaces actually is. (Image Source: Aidan Finn)
The premise is very simple. Tie sets of JBOD, roll-your-own DAS (direct attached storage -- SATA/SAS) drives sitting in standard servers running Server 2012 R2 into storage pools, which can then be carved into Storage Spaces. These are nothing more than fancy replicated, fault tolerant sets of DAS drives that can scale out storage without sacrificing the performance or reliability of traditional SANs.
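If you're curious what "rolling your own" actually looks like, here's a minimal PowerShell sketch on Server 2012 R2. The pool and disk names are hypothetical, and you'd tune the resiliency setting to your disk count:

    # Grab every local disk that is eligible for pooling
    $disks = Get-PhysicalDisk -CanPool $true
    # Create a pool on the local Storage Spaces subsystem
    $sub = Get-StorageSubSystem -FriendlyName "*Storage Spaces*"
    New-StoragePool -FriendlyName "Pool1" -StorageSubSystemUniqueId $sub.UniqueId -PhysicalDisks $disks
    # Carve out a mirrored space, then initialize and format it
    New-VirtualDisk -StoragePoolFriendlyName "Pool1" -FriendlyName "Data1" -ResiliencySettingName Mirror -UseMaximumSize
    Get-VirtualDisk -FriendlyName "Data1" | Initialize-Disk -PassThru |
        New-Partition -AssignDriveLetter -UseMaximumSize | Format-Volume -FileSystem ReFS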
Coupled with Microsoft's next-generation file system, ReFS, Storage Spaces are highly reliable, highly scalable, and, from everything I am reading, future-proofed, since Microsoft is supporting the technology for the long haul.
While Storage Spaces aren't bootable volumes yet, this will change with time, probably rendering the need for RAID cards also a moot point by then, as I questioned in a previous in-depth article on Storage Spaces.
You can read about Microsoft's own internal cost savings and tribulations in a post-SAN world for their Windows Release team, which has far greater data storage needs than any business I consult with.
Clean Out AD/DNS For All References to Dead Domain Controllers
This nasty thorn of an issue was something I had to rectify on a client server replacement just this week. When domain controllers die, they may go to server heaven, but their remnants stay alive and well, wreaking havoc within Active Directory and DNS. It's important to ensure these dead remnants are cleansed before introducing a new Windows Server 2012 R2 server into the fold, as you will have an uphill battle otherwise.
In a current Windows Server 2003 environment, you can easily find your complete list of active and dead domain controllers via some simple commands or GUI-based clicks. Match this list against what you actually still have running, and if there are discrepancies, it's time to investigate whether any of the dead boxes were handling any of your FSMO (Flexible Single Master Operation) roles. A simple way to view which boxes control FSMO in your domain can be found in this article.
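If you'd rather skip the GUI, two long-standing command line tools get you the same answers from any DC with the admin tools loaded. The first shows which DCs currently hold the five FSMO roles; the second lists every DC object that AD still knows about, dead or alive:

    netdom query fsmo
    dsquery server -forest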
While it's a potentially dangerous operation, if any dead boxes are shown as controlling any FSMO roles, you need to go through and seize those roles back onto an active AD controller (NOT onto a potential new Windows Server 2012 R2 box). The steps to handle this are outlined here.
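For reference, the seizure itself is an ntdsutil affair run from a healthy surviving DC. A rough sketch of the prompt sequence follows -- DC01 is a hypothetical server name, the exact verbs differ slightly between OS versions, and each seize command asks for confirmation and attempts a graceful transfer first:

    ntdsutil
    roles
    connections
    connect to server DC01
    quit
    seize PDC
    seize RID master
    seize infrastructure master
    seize schema master
    seize domain naming master
    quit
    quit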
In most small/midsize organizations we support, the FSMO roles are held by a single server, and we can easily transition them over to the new Windows Server 2012 R2 instance once it has been promoted as a domain controller.
Be sure that you also do a metadata cleanup of AD for all references to the old dead DCs, and finally, clean out your DNS manually for any leftover references as well -- this includes fine-tooth combing ALL forward and reverse lookup zones for leftover records. Even a single remaining entry pointing to a dead box could cause messes you want to avoid down the road.
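On a 2003-era domain, the metadata cleanup is another trip through ntdsutil. A condensed sketch of the prompt flow (again, DC01 stands in for a live, healthy DC, and the numbers you select depend on what the list commands show -- pick the dead DC's entry at the "select server" step):

    ntdsutil
    metadata cleanup
    connections
    connect to server DC01
    quit
    select operation target
    list domains
    select domain 0
    list sites
    select site 0
    list servers in site
    select server 1
    quit
    remove selected server
    quit
    quit

Newer AD tooling (2008 R2 and up, from what I've seen) will perform this cleanup for you when you delete the dead DC's computer object, but on legacy domains the manual route above is the safe bet.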
Once you have a fully clean FSMO role structure sitting on a healthy DC, you can initiate a proper formal role transfer over to the Windows Server 2012 R2 box after you have promoted the system properly through Server Manager. The Canadian IT Pro blog has an excellent illustrated guide on handling this.
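Once the new box is a DC and everything replicates cleanly, the formal transfer is a one-liner from the 2012 R2 side using the AD PowerShell module (NEWDC01 being a hypothetical name for the new server):

    # Transfers all five FSMO roles to the new DC in one shot
    Move-ADDirectoryServerOperationMasterRole -Identity "NEWDC01" -OperationMasterRole SchemaMaster,DomainNamingMaster,PDCEmulator,RIDMaster,InfrastructureMaster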
Remember: your network is only as strong as its weakest link, and in most respects that is the core of AD and DNS. A clean AD is a happy AD.
Getting "Access is Denied" Errors on New Windows Server 2012 R2 DC Promotion? Quick Fix
I lost hours on a recent migration from a Windows Server 2003 R2 box to a Windows Server 2012 R2 server due to this Access is Denied error. After cleaning out a few things which I thought may have been causing Server Manager's integrated ADPREP on 2012 R2 to bomb, I finally tracked down the culprit.
The fix? The Windows Server 2003 R2 server had registry issues, in that a key was not granting the proper permissions to the LOCAL SERVICE account on that box. Prior to the adjustment, the registry key in question gave read/write access only to the domain and enterprise admins, which is fruitless for what ADPREP wants to see in a full domain controller promotion of a Windows Server 2012 R2 box.
The full fix is described on this blog post, and don't mind the references to Windows client systems -- the information is accurate and fully applies to Windows Server 2003 and likely Windows Server 2008 as well, depending on what your old server runs.
Other problems that could lead to this nasty error include having multiple IP addresses assigned to a single NIC on a domain controller (not kosher in general); using an account that isn't an Enterprise Admin or Schema Admin to perform the promotion; and having the new Windows Server 2012 R2 server pointing its DNS requests at something other than a primary Windows DNS server, likely your old DC itself.
Follow Best Practices When Configuring 2012 R2 for DNS
DNS is far and away one of the most misconfigured, maligned, and misunderstood entities that make up a Windows network. If I had a dime for every time I had to clean up DNS in a customer network due to misconfiguration... you know the rest.
You can read my full in-depth post on how DNS should look inside a company domain, which walks through the main points to take away.
Take some time to ensure those best practices are followed when setting up your Windows Server 2012 R2 DNS, because an improperly configured network will cause you endless headache. Trust me -- I've been knee deep in numerous server cleanups in the last few years where DNS misconfiguration sent dozens of hours of troubleshooting down the drain.
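To give just a taste -- this is a generic sketch, not a stand-in for the full post -- two of the settings I find botched most often can be corrected in seconds with PowerShell on 2012 R2 (the addresses and interface alias here are hypothetical):

    # On the DC/DNS server: forward unresolved external queries to a public resolver
    Set-DnsServerForwarder -IPAddress 8.8.8.8, 8.8.4.4
    # On domain members: point DNS at the internal DC only, never at the ISP's servers
    Set-DnsClientServerAddress -InterfaceAlias "Ethernet" -ServerAddresses 10.0.0.10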
Use the Chance to Implement Resiliency Best Practices: Dual NICs, PSUs, RAID, etc
A server update is the perfect chance to implement the kinds of things I wrote about in a piece earlier last week outlining what good backbone resiliency looks like on critical servers and network components. If you are purchasing new equipment outright, there is no reason at all you shouldn't be spec'ing out your system(s) with at least the following criticals:
- Dual NICs, ideally teamed for failover
- Dual hot-swappable power supplies
- Hardware RAID (at minimum RAID 1) across all volumes
Most of the items on the above list aren't going to increase costs that much more. For the amount of productivity loss and headache that not having them will cost otherwise, the expense up front is well worth it in my eyes and that of my customers.
Promote, Test, Demote, Raise: 4 Keys to a Successful Domain Controller Replacement
PTDR is a goofy acronym, but it represents the four key items we usually follow when implementing new domain controllers into an environment and wiping away the vestiges of the legacy domain levels. These steps ensure that you aren't removing any old servers before the functionality of the new replacements is fully tested.
It goes something like this:
1. Promote: join the new Windows Server 2012 R2 box to the domain and promote it to a domain controller.
2. Test: verify replication, DNS, logons, and FSMO placement against the new DC before touching anything legacy (see the quick checks below).
3. Demote: gracefully demote the old domain controllers and pull them from the environment.
4. Raise: once no legacy DCs remain, raise the domain and forest functional levels to take advantage of the new feature sets.
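For the Test step in particular, a couple of built-in tools give a quick read on the new DC's health before anything legacy gets demoted (run from an elevated prompt on the new DC):

    dcdiag /v
    repadmin /replsummary
    repadmin /showrepl

dcdiag runs the full battery of domain controller health tests, while the two repadmin commands summarize replication status across all DCs and show per-partner detail, respectively.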
Some of the enhancements made in the Windows Server 2012/2012 R2 functional levels are described in an excellent WindowsITPro post.
Virtualize Up Internally, or Better Yet, Virtualize Out via the Cloud
It's no secret that virtualizing is an easy way to reduce the need for physical hardware. We tend to prefer Microsoft's Hyper-V, mostly in the form of the Hyper-V role baked into the full Windows Server 2012 R2 Standard edition. Even installed as a role, Hyper-V is a true type 1 hypervisor under the hood, and it allows us to host VMs alongside domain controllers for companies that don't have money for extra separate servers and don't want to move VMs into the cloud.
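If you're enabling the role on a full Server 2012 R2 install, it's a single line of PowerShell (expect a reboot):

    Install-WindowsFeature -Name Hyper-V -IncludeManagementTools -Restart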
The other option, which we haven't adopted that widely with clients yet, is Microsoft's answer to VMware ESXi: Hyper-V Server 2012 R2, a completely free type 1 hypervisor that lives natively on the bare metal. You heard right -- it's totally free, doesn't cost a dime, and is not feature-crippled in any way. Do note that the sole purpose of this edition of Windows Server is to host Hyper-V virtual machines. It can't run any other roles or workloads.
But Hyper-V Server 2012 R2 benefits from being lean, mean, and extremely stable -- not to mention near bullet proof when it comes to security. Aside from hosting VMs, it does next to nothing and therefore has less than a quarter of the traditional Windows dependencies running actively on it.
And as shocking as it may sound, Hyper-V Server 2012 R2 is one of the only editions of Windows that actually doesn't need any kind of antivirus running on it. No joke. Microsoft MVP Aidan Finn explains why in detail.
For many organizations, especially ones either looking to move from expensive VMWare rollouts, or considering going down that path, Hyper-V Server 2012 R2 is an awesome option. It receives full support from Microsoft in terms of Windows Updates (there are relatively few needed each month, but they do come out) and follows the same exact support lifecycle policy as Windows Server 2012 R2 Standard, which has a sunset of Jan 2023.
A lot of customers are curious as to what exact areas they can offload by using a cloud IaaS provider like Azure. This neat infographic explains it fairly well. Microsoft handles the networking backbone, storage arrays, servers themselves, virtualization hypervisor, and two items not shown -- the maintenance and geo-redundancy of the instances, too. Not as nice as what SaaS offers, but if you need your own full blown Windows Server instances, IaaS on Azure is as clean as it gets. (Image Source: TechNet Blogs)
While keeping VMs internally is a great first-party option, and one getting more mature by the day, we are actively recommending to many clients that moving their VM needs off to services like Azure or Rackspace is a much better bet, especially for organizations with no formal IT staff and no MSP to fall back on. These platforms take care of the hardware maintenance, resiliency, geo-redundancy, and numerous other aspects that are tough to handle in-house with limited resources.
For example, we were at a turning point with a ticket broker who needed either to replace 4-6 aging standalone boxes with new ones, or to move them to the cloud as VMs. We ended up making the full move to Azure, with the former boxes being converted into IaaS VMs. With the ability to create virtual networks that can be linked back to your office, we tied those Azure machines back over a VPN tunneled by a Meraki MX60 firewall. After stabilizing the VPN tunnel to Azure, the broker hasn't looked back.
When deciding on whether to go cloud or stay on premise, there are numerous decision points to consider. We help customers wade through these regularly. But you can follow the advice I penned in an article from about a year ago that went over the core criteria to use when outlining an intended path.
Evaluate What Internal Needs Can Be Offloaded to SaaS
Spinning up extra servers just to continue the age-old mess of "hosting your own" services, whether on-prem via Hyper-V or in the cloud on Azure IaaS, is just plain silly. SaaS offerings of every shape and size are trimming the number of items that are truly reliant on servers, meaning you should be doing your homework on what could be offloaded into more cost-effective, less maintenance-hungry options.
For example, it's a no-brainer these days that Office 365, in the form of Exchange Online, is the best bet for hosting business email. Businesses used to host Exchange, their own spam filters, and archiving services -- all of which needed convoluted, expensive licensing to work and operate. Office 365 brings it all together under one easy to use SaaS service which is always running the latest and greatest software from Microsoft. I haven't been shy about how much my company loves Office 365, especially for email.
Another item that can likely be evaluated as a replacement for traditional file shares is SharePoint Online. Our own company made the switch last year and is nearing the one-year mark on the product. While we still keep a small subset of files on-premise (client PC backups; too large to keep in the cloud), the bulk of our day to day client-facing documentation and supporting files now lives in SharePoint Online, accessible from OneDrive for Business, the web browser, and numerous mobile devices. It's an awesome alternative to hosting file shares, if its limitations work with your business needs.
And for many companies, hosting their own phone server or PBX for VoIP needs has always been a necessity. But in this day and age, why continue to perpetuate this nightmare when offerings like CallTower Hosted Lync exist? This hybrid UCaaS (Unified Communications as a Service) solution and VoIP telephone offering from CallTower brings the best of a modern phone system together with the huge benefits of Lync in a single package. We have been on this platform ourselves, along with a number of clients, for a few months now and cannot imagine going back to the status quo.
Still hosting your own PBX, plus paying for GoToMeeting/Webex, and maintaining PRIs or SIP trunks? CallTower has a UCaaS solution called Hosted Lync which integrates all of the above under a simple, cheap cloud-hosted PBX umbrella that runs over any standard WAN connection(s). I moved FireLogic onto this platform, and fully believe this is the future of UC as I can see it. (Image Source: CallTower)
Hosting your own servers has its benefits, but it also comes with its fair share of pains that can, as shown above, be avoided by using cost effective alternative SaaS offerings. We almost always offload capable needs to SaaS platforms where possible these days for customers as it continues to make business sense from a price, redundancy, and maintenance perspective.
Moving File Shares? Use the Chance to Clean House!
Most organizations don't have good formal policies for de-duplicating data, or at the least, ensuring that file shares are properly maintained and not flooded with needless data. If you haven't skimmed your file shares recently, a server move to Windows Server 2012 R2 is the perfect chance to evaluate what exists and what can go.
In some cases, we won't let clients move data onto clean volumes of a new Windows Server 2012 R2 server until we have been able to sit down with them and verify that old garbage data has been culled and cleaned out. Garbage in, garbage out. Don't perpetuate a bad situation any longer than needed. A server upgrade is ripe timing to force action here.
Is a Server Migration Over Your Head? Hire an Expert
Moving between servers is something warranted once every 5-7 years for most companies, and especially for smaller entities without IT staff, handling such a move in-house is rarely a good idea. Companies like mine are handling these situations at least once a month now. We've (almost) seen it all, been there, and done that -- not to brag, but to say that professionals usually know what they are doing, since we do this week in, week out.
The number of times we've been called in for an SOS on server or email migrations gone south is staggering. And by the time we get that call, it's generally too late to reverse course and start from scratch the right way. Customers hate emergency labor fees, but honestly, the best way to avoid them is to avoid the mess in the first place!
Not only can a professional help with the actual technical aspects of a Windows Server 2003 to Windows Server 2012 R2 server move, but also help evaluate potential options for slimming down internal needs and offloading as much as possible to SaaS or IaaS in the cloud. Increasingly, it is also important that organizations under PCI and HIPAA compliance umbrellas consider the finer aspects of end-to-end security such as encryption at-rest, in-transit, and other related areas. A professional can help piece such intricate needs into a proven, workable solution.
With Windows Server 2003 reaching its end of support in under 300 days at this point, if you haven't started thinking about moving off that old server yet, the time to start is right about now. Seeing as organizations such as healthcare offices and those that handle credit card payments have much to lose by failing HIPAA or PCI compliance, for example, there's no reason to procrastinate any longer.
Have any other best practices to share when moving off Server 2003? Share them below!
Photo Credit: dotshock/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
A few weeks back, I finally had a really good chance at stress testing our company's still-fresh hosted Lync solution from CallTower. Merging calls. Transferring calls. Starting ad-hoc conference calls with clients. All the while IM'ing my internal staff and fellow clients, and checking voicemails that were coming through as MP3s in my email along with associated text transcriptions.
If you think I was sitting at the comfort of my desk with the power of a desk phone at my side, you guessed wrong. I was nearly 900 miles away from our home base in Park Ridge, out in the beautiful city of Stamford, CT helping clean up a messy VoIP rollout for a customer who needed some dire help.
And much of what I was doing on Lync was over a throttled T-Mobile data tether from my Windows Phone 8.1 Lumia to my Thinkpad. Thanks to my own fault of eating through my monthly data too fast, I was stuck at a lovely 150K up/down, roughly -- a connection just about 3x the speed of dial up.
Pitiful, I agree, but as you'll see, I didn't need as much juice as I was expecting. Lync performed like a champ under conditions that would most likely have caused Skype or Google Hangouts to choke.
My trials with Lync on the road never missed a beat. Customers had no idea I was on a soft phone, and they likewise had little clue I was calling them from a hotel room while enjoying some Boston Market for dinner. Even with a substandard data feed, Lync was able to dynamically adjust the codecs it was using to match the bandwidth available.
After now spending the last few months using cloud-hosted Lync with full PSTN dial tone and phone capabilities enabled, I'm pretty convinced of one thing. If Alexander Graham Bell were to invent the telephone today, it would probably look and feel a lot like Lync.
For all intents and purposes, most traditional VoIP phone services aren't meeting the heavy demands of today's needs out of the box. Hence all the "fancy" expensive PBX systems needing tack-ons and add-ons galore, leading to a Frankenstein of a solution before all is said and done.
I'm also going to go out on a limb and say that desk phones, in my humble opinion, are a dying facet of old-style telephony. Yes, they may still be popular in office settings, and many will refuse to switch to a softphone driven future. But in just a few months on full blown "Enterprise" style cloud-hosted Lync, I'm convinced that a softphone not only drives better productivity and saves massive time, but allows for a smoother, more seamless experience for that magic word: Unified Communications.
I haven't done any scientific testing, but informally, the amount of time I used to lose daily copying down and re-typing numbers to dial into a desk phone or cell phone was criminal. And it's something you just don't realize until you are offered an alternative. Like most people, I used to think that softphones were a fad that only the trailblazers were using. Real people still used desk phones, was my misconception.
If your only time with Lync has been through the version that Microsoft bundles with Office 365, you've been deprived of the full blown experience -- as my company and I had been, until recently.
Little did I know, Lync is actually built from the ground up for advanced call handling and conferencing capabilities. But since my company FireLogic was using the version of Lync 95% of us are exposed to, the Office 365 edition (Lync Online), my perception of what the tool actually had under the hood was skewed for some time.
Office 365 Lync Without Dial Tone Is Crippled
Lync Online, which comes standard with most Office 365 plans, is a very capable and quality product. I spoke highly of the service slightly over a year ago in a formal review I penned. But my review was squarely looking at Lync Online as a mere alternative to GoToMeeting/Webex and like offerings, with the addition of IM/voice/video chat for internal needs. I didn't even touch on the missing piece of the puzzle: PSTN-enabled dial tone phone service, aka regular phone capability.
A few months back I wrote about how Microsoft could easily put a choke hold on the cloud VoIP market if it finally turned on dial tone within Lync Online. I know very well that Microsoft already offers a semblance of this functionality in its consumer Skype product, but due to the number of limitations within Skype, I publicly made the case for why Skype should be permanently killed off in place of Lync. For anyone that has used the full capabilities of each product, you can easily tell that Skype is a child's toy compared to the power that Lync brings to the table. There's no comparing the two, I'm sorry to say.
Lync has been rising in the charts in terms of adoption at a rather impressive pace. Lync as an IP PBX system ranks 3rd overall in the States in terms of number of seats being rolled out, and 11th overall worldwide. Recently, Gartner even gave Microsoft the lead in the Magic Quadrant space for UC in 2014, pulling ahead of powerhouse Cisco -- as shocking as that is to some.
That's right: Microsoft (through Lync) is finally considered a technology leader in the arena of UC and Enterprise VoIP. Hopefully that quells anyone's disbelief in Microsoft's stature and direction in this sector.
For as relatively young a competitor as Lync still is, its reach in the digital PBX arena is moving in a decisive upward direction, especially here in the States. I'll admit interest among my own client base is at rather high levels, with fresh Office 365 converts needing a phone system overhaul leading the pack.
Lync is game changer in that it allows us to break down the silos around having to choose to do conferencing either "Webex style" or "conference bridge style". In the Lync world, a one-on-one phone call can transition into a multi-party call, dive into a desktop sharing conference, and back to a simple phone call, all with just a few clicks. It's truly how seamless UC was meant to work. (Image Source: TechNet Blogs)
As I touched on in my previous article, Microsoft has created a marketing mess in how it advertises Lync to the masses. Trust me on this one; as someone who has to take weekly phone calls from prospective Office 365 customers, many of which are looking to use Lync Online as their phone system, I'm all too often the bearer of bad news. "Sorry, but the Lync that comes with Office 365 can do everything but that one last thing: make and take phone calls."
I don't blame customers for being so confused about what Office 365 does and doesn't offer. This is Microsoft's mess, and partners like myself are constantly playing expectation-check with clients. For starters, it doesn't help that the official master comparison chart for Office 365 for Business plans officially alludes to auto-attendant and PBX-replacement capabilities in the E3/E4 plans.
But alas, like many things in Microsoft land, the fine print is where the actual truth lives:
"Lync Server 2013 Plus CAL requires a customer to purchase and deploy an on-premises Lync Server to enhance or replace traditional PBX systems."
Until recently, I've had to bear the bad news to customers yearning to ditch their old PBXs in favor of Lync. For all the marketing bliss they read online about how large companies were enjoying this thing called "unified communications" in a Lync world, Microsoft was subtly saying that as a small business, they were just not the right fit for such a solution. This in and of itself is a load of crock, and I was determined to find a better answer for my own company and my clients.
I thought to myself: how could this many smaller organizations be interested in cloud-hosted Lync as their VoIP PBX, with Microsoft just turning a blind eye? It's hard to believe Microsoft isn't hearing the same upswell of interest through its call centers, or that my company is some anomaly here in the suburban Chicago area.
I've wanted to get my company onto Lync since about late 2012, when I was convinced that Google Voice just wasn't meeting my growing company's needs. I started on a journey, scouring for any solutions out there that could provide a cloud-hosted Lync PBX that I could use for my own phone system.
After a few months of discussions with some of the providers solely offering this back then, I had nothing but a sour taste in my mouth. There was a lot of hot air in the hosted Lync market in late 2012. I'm tasked with comparing vendors for my clients all the time, and can easily wade through the BS pushed by sales reps. To say the least, between high prices and feature gotchas, I wasn't at all impressed.
As such, we settled on going with RingCentral for our cloud PBX system. By no means am I implying that RingCentral hasn't served us well. It's a very powerful, cloud hosted PBX that we have in place at organizations both small and large. The amount of control that the service provides you, the customer, in terms of self-administration is far and beyond the best in the business as far as I know, just a step ahead of my other favorite, 8x8.
For organizations looking purely for rock-solid VoIP desk phone service, RingCentral or 8x8 are excellent bets. But for companies looking for the holy grail of VoIP connectivity -- the notion of unified communications we keep hearing about -- RingCentral or 8x8 won't fill all of your needs. Traditional cloud-hosted VoIP, for lack of a better label, is too disconnected an entity for the ways many are looking to tie voice, IM, video, web conferencing, and presence into platforms like Outlook and SharePoint.
That's because, all too often, organizations handle the situation with a very last-decade mentality.
When the IT industry was still nothing more than a wild west of proprietary technologies, with providers unwilling to tie into other services and proprietary standards ruling the day, we had no choice but to dole out our needs in a pure best-of-breed viewpoint. If one service couldn't do multiple things well, then piling on disparate, disjointed services was as good as it got. And so we settled on this status quo.
But there was nothing unified about such a communications approach. Vendor X had awesome desk phone service but couldn't tie into vendor Y's IM client, and it was compounded by standalone vendor Z's expensive tacked on conferencing service. Add in the complicated, expensive maintenance and hardware that a lot of these solutions have required, and you've got an octopus that needs constant care to merely keep afloat.
And for some of the other options on the market for unified communications platforms, like the Ciscos and Avayas of the world, there are endless promises of features galore, but with a big stipulation: all the bells and whistles need dedicated PBX servers, appliances, SIP trunks, and other countless on-premise items that must be installed, licensed, supported, and replaced as they reach end of life.
This has been the status quo for many organizations, for they all believed that's just the way it was.
As a big proponent of the cloud-hosted PBX approach, and knowing firsthand what RingCentral and 8x8 can do, I was convinced the same could be done with Lync to reach my own company's unified communications nirvana.
After years of hunting, I can finally say that CallTower Hosted Lync is exactly what I had been yearning for.
Is Lync as a Cloud PBX Possible Without A Server On-Prem?
Prior to running into CallTower, a number of high-end Lync consulting firms nearly convinced me that using Lync as your phone system was a pipe dream without an on-prem appliance or server, SIP trunks, and related telco complexity. In my mind, all the hot air seemed justified, partially due to the troubles I previously had finding quality hosted Lync providers.
The vendors pushing expensive on-prem plans for Lync used this very fact to one-up their own selling points. "Some have tried it, but it just can't be done reliably." That was the mantra I heard time and time again. When you hear the same thing over and over from different vendors, you sometimes take things at face value.
A slightly simplified approach was pushed by a company called Sangoma, which builds what is in essence a "Lync server in a box." It's an all-inclusive appliance that combines the separate hardware components of a traditional Lync rollout into a pretty rackable unit which seemingly any IT person could mount and set up.
The drawback? A nearly $12K price tag, and from what I know, this does not even include the necessary Lync licensing to light your users up. While this may be a feasible approach for midsize organizations, no small business I know would ever jump on this server-lite path.
Prior to finding my sweet spot with CallTower, I came across a provider which offered almost everything I was looking for. It's a UK firm called Fuse Collaboration, and they were the first ones that were actually very approachable, knowledgeable, and dedicated to a fully cloud-hosted Lync Voice product.
However, after discussions with their team, I found out they had one big gaping hole in their offering: no ability to use pure Lync SIP desk phones over an ethernet line without a USB connection back to a PC. PoE-driven desk phones are near ubiquitous for any organization that requires traditional phones, so I just didn't like the sound of this nagging requirement of being plugged into a PC at all times.
I couldn't settle on a service that couldn't function on traditional desk phones that weren't tied to PCs over USB. Knowing that any Lync solution needs to go head to head against the likes of RingCentral and 8x8 in terms of simplicity, such a limitation would be a non-starter for many clients.
If I'm going to go with a provider to dogfood the service and present the solution to my own clients, I couldn't be hamstrung by such a feature gap. After some further prowling, I came across CallTower and their hosted Lync solution.
Did I need a full Lync server at my office? Nope. How about a Session Border Controller? Nada. SIP trunks? Zilch. To my surprise, I didn't even have to tweak any firewall or QoS settings on our office Meraki firewall in any way. Even the test Polycom CX600 desk phones we had shipped from CallTower plugged right into a PoE LAN connection and fired up, logged in, and were functional with dial tone within 15 mins. I was duly impressed on all ends.
I've been using the service on a combination of desk phones, Lync client on the computer, and Lync on the mobile phone side without any major hitches. And the service works equally well even on cruddy internet connections. Doing a string of customer calls from a Starbucks a few weeks back on my cheap $20 headset and Lync on a Thinkpad wasn't any worse than being on a $300 desk phone. In many ways, it was better.
Barring a few voice quality blips at the height of the lunch hour rush at the coffee shop, the stability and reliability of the connection on calls was among the best I have seen from any soft phone client I have put my hands on. I'll go into more on that in a bit.
Long story short? Anyone who claims that a cloud-hosted Lync PBX isn't possible without on-prem hardware or servers is selling you a proverbial can of snake oil. I've personally confirmed this can be done, and already have customers on the product as well. Given a quality provider such as CallTower, it's most definitely a reliable PBX and unified communications solution.
Who the Heck is CallTower? Can I Trust My Phone System With Them?
You're probably thinking what I was when I had first heard of CallTower. Who are these guys? How come I never heard of them in all the IT circles I converse with?
I have no hesitations in admitting that I actually came across CallTower on a thread from the SpiceWorks forums. In my recent hunt for a cloud-hosted Lync vendor that was outside of the few previous underwhelming providers I came across, I stumbled upon a thread where another IT pro was discussing his ongoing good experience with onboarding to CallTower's Hosted Lync solution.
And I generally hold the opinions posted on SpiceWorks in fairly high regard. Unlike some other half-baked online IT forums, this one is full of IT pros such as myself, knee deep in handling customers or their own IT department needs. These guys don't namedrop just for the sake of it, so it was a genuine recommendation that carried weight.
One thing led to another, and I happened to stumble across the contact info for one of the VPs at CallTower. A day later, we were already on a call and all of my questions were being answered with considerable technical detail, and CallTower's solution overall was looking more attractive by the minute.
This was the Lync solution I had been looking for -- for almost two years. And to my surprise, CallTower was not some new entrant to the VoIP arena, either. The company started out in 2002, first offering a pure Cisco VoIP solution stack, but has since branched out into a fully featured hosted Lync PBX offering.
For those curious, yes, they can offer cloud-hosted Cisco PBX if you are interested. They can even do hybrid tie-in to existing on-premise installations of Cisco PBX products.
The Lync 2013 soft phone client makes ad-hoc conference calls feel natural; not disjointed like most other services are, where you're forced to offer up conference bridges just to host more than 3 people at once. With CT Lync, I can conference in more callers with the click of a button and keep the conversation flowing, saving time and hassle. (Image Source: TechNet Blogs)
Within a week, I had two trial CX600 Polycom Lync phones sent over to our office, and CallTower provided me with two fully functional demo accounts that had Lync dial tone enabled. I put the product through its paces and unlike most previous situations where the actual experience was considerably underwhelming, the expectations CallTower put into place were actually on par with what I saw.
I even went so far to use the demo account to make a majority of my customer-facing phone calls, both on my Lumia 925 and on my Lync 2013 PC client.
This is where I started to become convinced that, compared to the Lync soft client, a traditional desk phone was looking more and more obsolete. In fact, after seeing how powerful Lync on the desktop was with regard to phone capabilities, even a rather full featured phone like the CX600 felt like a step backwards for me.
For those with the preconceived notion, like I previously held, that they could never imagine using a phone without an actual phone, I beg you to read on and hear me out. You don't know what you're missing.
Cloud-Hosted Lync With Dial Tone: There Is Life After Extensions
For years, we've always viewed business phone service in the lens of extensions. Jack is an extension. John another extension. The conference phone was an extension, as was any other phone at the organization.
In plain terms, if you had to call someone, you almost always viewed that other party as an extension to be dialed. Sure, in some cases, you were able to dial the DID (direct inward dial) of that person, but the premise was still the same: there was an innate lack of humanity in how phone service has always been structured.
Cloud-hosted PSTN Lync, such as what CallTower provides, turns this 20th century viewpoint of phone service on its head. People are people once again, not extensions or just numbers. Placing a call to John in accounting is just that -- double clicking on John in Lync, opting for a voice or video call, and connecting with your coworker in the same way you would if you walked over to his desk.
Communication between internal users is reverted back to a seamless, personal experience -- not one mired in extension hell. I'm not saying you are forced to get rid of the notion of extensions for external callers. But for internal needs, extensions are a thing of the past.
Need to bring another party into the call? Simple. No need to fumble for the guidebook to your desk phone in order to three way call with someone else; just conference in that extra person right within Lync and search for them by, you guessed it, their name. If they aren't at their desk, their cell phones will be rung over the Lync mobile app and/or a traditional cell call and you can continue on with your conversation between parties.
Just because Lync is a Microsoft product, don't mistake it for being a Windows-only solution. Android, iOS, Mac all have excellent first party apps available from Microsoft for download. You shouldn't have to pick and choose which staff will be able to use a UC platform merely due to their device preference. (Image Source: TechNet Blogs)
I can check on the presence of my coworkers within Outlook or my Lync client; if someone is on a call, they are automatically denoted as such on my screen. Previously, I could do this with my Cisco desk phones, but the setup was limited by the number of open lines that could be tagged with presence for other parties, with further limits imposed by RingCentral.
In reality, only fancy receptionist phones with expensive "side carriages" that had bevies of extension presence indicators could mirror this functionality. Now, you can view detailed presence on hundreds or thousands of people in a mere glance.
And voicemails that I get over Lync come in over email as MP3 files, which are also transcribed reasonably well, so I can get an idea of who called and what they had to say before even having to listen to the voicemail.
You can see how this notion of unified communications is finally a reality once you have dial tone enabled within Lync. Before, with RingCentral and Office 365, we were able to do almost everything in a single service, except for making/taking traditional phone calls. RingCentral has served us well in the phone arena, but this duality has kept our dreams of unified communications disjointed and irregular.
That's solely because Microsoft has yet to deliver on turning on dial tone within Lync Online, the standard Lync option that comes with Office 365 plans.
Microsoft made some commotion in Lync IT pro circles earlier this year, back when Lync Conference 2014 was held, by alluding in numerous venues to the possibility that full enterprise voice capability would be coming to Lync Online. But with each month that passes, those "in the know" keep pointing to solid rumblings that what actually launches won't be what most are hoping for.
Microsoft made it fairly clear at Lync Conference 2014 that it will be turning on limited facets of PSTN voice capability within Lync Online, but much of what exists in full blown Lync 2013, like desk phone support, persistent chat, and integrated voicemail will likely not be hitting Lync Online. So Microsoft is moreso shooting for a Lync Voice-lite style of service, which will slowly have features turned on as time goes on.
How long this drawn out catch-up game will take, no one knows for sure. I couldn't wait for Microsoft to get its act together over a multi year timeframe. Our company, and clients, have solid needs for UC right now.
Which is why I threw up my arms, stopped waiting, and decided to look toward the hosted Lync route through CallTower.
The service itself is rather straightforward in how CallTower approaches hosting the product. You are free to bring your own Exchange environment -- whether that be Office 365, another hosted Exchange service, or even your own internal server system. Their hosted Lync 2013 infrastructure merely sits alongside your existing Exchange service, providing user accounts for Lync that have the same login name as your users already use.
You are free to have them provide the licensing needed for Lync (which is rather complex; blame Microsoft for this) or you can bring your own licenses. The most common scenario here is Office 365 users that are licensed for the E3 or E4 levels of cloud licensing. Having E3 licenses will subsidize your Lync license costs considerably, while E4 licenses will provide the best discounts from CallTower for their hosted Lync service. This is because of the higher Lync Enterprise Voice CALs that these Office 365 plans entail.
Have Office 365 Small Business or Midsize Business? These plans don't carry the level of Lync licensing needed, which is just another reason I hesitate to recommend them to clients. If you are potentially looking at tying in a hosted Lync provider like CallTower, just move straight into the Enterprise plans right off the bat.
But I'm not interested in drowning readers in licensing hell. What kinds of neat things can you do with dial tone enabled Lync? Here's a sampling of some of the scenarios I have taken advantage of in the last few weeks:
- Transferring and merging live calls with a couple of clicks
- Spinning up ad-hoc conference calls with clients, then dropping back down to a one-on-one call
- Taking and placing full PSTN calls from the Lync mobile app on my Lumia while away from the office
- Receiving voicemails in email as MP3s, complete with text transcriptions
- Watching coworkers' presence right inside Outlook and the Lync client
And keep in mind that the above list is merely a sampling of what dial tone in Lync enables, as we get it from CallTower. All of the classic Lync functionality, the stuff that comes with Lync Online, is present and usable at the same time. IM, HD video chat, file sharing, OneNote sharing, etc. -- it's all available and works as advertised on the CallTower hosted solution.
One of the big things I want to make clear, since some people may be curious, is how natural the above phone functions feel on a soft phone like Lync. I refuse to put a desk phone on my desk again. And none of my staff are using desk phones anymore, aside from our core reception phone, which I am on the fence about keeping. The part of me that wishes to get rid of that final desk phone is slowly taking hold; we'll see where it ends up.
Since we switched our company from Google Apps to Office 365 late last year, I've been a big proponent of looking at UC solutions from the perspective of ecosystems. The traditional "best of breed" approach, where providers and solutions are stacked one on top of the other without cohesion, is not an ecosystem-centric stance. While the company deploying all of these disparate systems may be winning the technology war, or the "feel good" checklist, the actual users tasked with using this myriad of services end up suffering.
And when end users have to climb a mountain just to take advantage of new UC technology, adoption falls, and these solutions billed as nirvanas bomb flat on their face.
We see it in the field all the time. It's nothing new. In the drive to piece together an ideal nirvana of solutions, the end result is one of low end-user adoption, because the experience at hand is sub-optimal. The natural ebb and flow of usability takes a backseat to penny pinching or using competing systems for one-off purposes. The VoIP desk phone provider ends up being different from the conferencing vendor, which is also different from the internal IM/video chat client.
And this is exactly where a solution like CallTower Lync drives the notion home. Workers don't have to learn multiple systems and jump through hoops between vendor interfaces. You're in the same tool for making calls, sending IMs, launching video chats, hosting conference calls, and getting your voicemails.
Next to the Office suite, Lync could very well become one of your most used daily staples if you were to replace the PBX wholesale in exchange for a Lync-driven solution. It's extremely powerful when implemented right, and I'm seeing this first hand at our company.
When Comparing Costs, You Must View Cloud Lync PBX In The Proper Lens
Since many decision makers are still looking at phone service as a disconnected, standalone entity, they aren't accurately portraying their apples-to-apples cost comparisons as a result. It's that Vendor X, Vendor Y, Vendor Z dilemma that I alluded to above, with gotchas across line items that makes objective comparison tough, if not impossible.
If you're contrasting the costs of going with a hosted Lync provider like CallTower against the likes of a RingCentral or 8x8, you need to keep in mind two important aspects: what services Lync can fully replace, and what you won't need to be purchasing/installing if going with a soft-client approach to Lync enabled voice.
Take for example what an average, traditional office may do for their various needs:
Phone Service: They could be looking to keep things insourced and opt for an on-prem PBX, or go with a pure cloud-hosted service like RingCentral or 8x8, which eliminates the servers and PRIs. Either approach requires desk phones in most cases, as the soft phone clients from these services aren't as mature as what I find in Lync.
IM/voice/video chat: Some phone services come with a limited solution in this arena, or companies may opt to buy something commercial for this aspect. Others may do as we did, and run RingCentral side by side with Lync or Google Chat for these intra-company communication needs. It works, but furthers the vendor-on-vendor dilemma of stacked services.
Web Conferencing / Conference Bridge Lines: Everyone knows that the likes of GoToMeeting and Webex are not cheap. If you are looking to equip a subset of your staff, or worse, your entire staff, you'll be ponying up large extra sums each month for these a la carte components that are becoming ever more important for companies, especially with the rapid rise of telecommuting.
It's pretty evident that comparing RingCentral vs hosted Lync is not an apples to apples comparison. Nor is hosted Lync vs GoToMeeting. Because Lync, in a CallTower scenario, ideally replaces all of these separate services altogether, you need to be judging your cost savings on a broader spectrum.
Lync takes the place of the PBX or cloud-VoIP provider. It also renders your web conferencing service like GoToMeeting null and void. If you had traditional infrastructure supporting your phone system, like a PRI or SIP trunks, you can wave those goodbye as well. And it goes without saying that you can ditch any standalone IM platforms -- Skype, Google Chat, Yahoo Messenger, Facetime, etc.
It's not surprising that Microsoft has taken the lead from Cisco in Gartner's Magic Quadrant for UC in 2014. From solid desk phone connectivity that ties back into strong desktop/mobile apps, the seamless experience afforded is in my opinion leagues ahead of the disjointed experiences being pushed by Cisco, Avaya, and others. (Image Source: CallTower)
When taking into account all of the above unnecessary third party services that can be trimmed away, the overall cost savings from moving to a Lync system can be very drastic and real. Especially if your company is going to ditch the desk phone approach and use the pure Lync client for phone needs; for large organizations that would have otherwise dumped $100-200 per desk phone, this is another massive capital expenditure avoided.
So the million dollar question is: how much does CallTower Lync cost our company? While I won't pin an exact number on it, since we got some special promos when we signed up, I will say that you can expect it to run around $30 USD per month per user, give or take some.
Each situation is different, and pricing is affected by the number of users you have, whether each person needs full Enterprise voice capabilities, and whether you have licensing for Lync Voice through Office 365 E3/E4 service levels. If we had been able to bring E4 licenses from Office 365, we could have trimmed our monthly bill down even further.
If you're taking on a prospective move to Lync and viewing your situation in the proper holistic sense, as I outlined above, then you'll be much more aware of the savings as opposed to someone who may not be realizing all of the fringe costs they are trimming off.
How Does Lync Compare to Skype and Google Hangouts in VoIP Capability?
I mentioned earlier that Lync as a full blown UC solution, especially in the dial tone arena, is a few steps ahead of what Google Hangouts and Skype offer. I am pinning Lync against these other two powerhouses since they are the faces of soft phone VoIP today for all intents and purposes, from what I hear and see being talked about. So it's only fair to peg them in a comparison to Lync.
And indeed, I see Skype and Hangouts being used in business scenarios all the time. Usually relegated to pure video/IM web conferencing, but I do have a few clients who swear by Skype for cheap dial tone service, for example. It's quite clunky as a phone solution, they admit, but the cheap price is worth the hurdles to many people I guess.
I admit that I personally used to use Skype for some phone calls, like when I wanted to do a long 800# call on my headset, and the service works pretty well. The intentions behind what Skype tries to offer isn't my issue with the product -- it's the lackluster execution, arbitrary limits, and lack of inbound number porting, to name a few.
Hangouts is a service that is undergoing a personal identity crisis more than anything right now. After Google announced it would be killing off standalone Google Voice a year back, and rolling those features slowly into Hangouts, the Google community has been a bit flustered with how this "transition period" is being handled.
And the stupid requirement Google had, which I railed against for a while, that forced everyone who wanted to use Hangouts to have a Google+ account has finally been dropped. I wonder why it took so long. Another great improvement has been the bundling of Hangouts into the core Google Apps suite and, in turn, getting SLA backing for the first time in its history.
As of this writing (9-28-2014), here's an objective comparison of Lync (via CallTower), Skype, and Google Hangouts across call handling, conference limits, desk phone support, and compliance:
Without a doubt, CallTower Lync takes the cake in call handling capabilities. Skype offers some basic call handling functions, but in Lync you can natively (and easily) transfer, merge, forward, and take advantage of an auto attendant. While Google Voice billed itself as a stripped down phone system, in reality, its call handling functions are limited to forwarding only.
When it comes to functional user limits on things like screen sharing and conference calls, Lync blows away the competitors pretty easily. Google Hangouts recently increased its voice, video, and screen sharing limits to 15 people but this is no match for the 100 users that can sit together on a Lync session. Might I add, this mix of 100 can be any combination of Lync VoIP or dial in callers.
Desk phone support in CallTower Lync again beats the competition, especially if you go with the CX600 from Polycom, which is not only one of the best Lync phones out there but a very solid VoIP desk set overall. Unlike Polycom's popular IP-300 series SIP units, it doesn't feel overly cheap, which surprised me.
Don't get me started on Skype's handset selection. Skype used to offer an awesome analog phone adapter called the FREETALK Connect Me box, but for some odd-headed reason, discontinued selling this device. You used to be able to use any standard analog cordless phone system with Skype via this adapter, but not anymore. Now you're left choosing between one cheap, substandard Skype-certified handset or another. Skype clearly either doesn't want desk phone users on its service anymore, or just wants to make higher margins on first party devices it sells. Pick your poison.
One item I didn't touch on above: CallTower Lync has 24/7 USA based phone support for its product. Since Hangouts was rolled into Google Apps recently, it also receives 24/7 support, but Google uses questionable outsourced support centers that have given me troubles in the past. Skype, as a freemium product at heart, doesn't have any phone support, just bona fide (and still pretty good) user support forums online.
The final area I want to touch on is HIPAA compliance, something that got an Oklahoma, USA physician in trouble recently when caught using Skype to do telemedicine with patients. To make a long story short, Skype and Google Hangouts (and FaceTime, for that matter) are absolutely NOT HIPAA-compliant solutions for usage in healthcare situations. Due to the stringent encryption, auditing controls, and infrastructure security implemented in Lync, it happens to be one of the few products I know of that meet HIPAA compliance regulations.
You can read at length what I had to say about using Skype and the like in HIPAA situations in a previous article from last year.
The Areas Where Hosted Lync Still Suffers
While CallTower Lync is all around a very solid solution, nothing is perfect. After spending the last few months on the system, using it for my day-to-day client and internal needs, I've been able to pin down the aspects where Lync still has rough edges.
One of the most glaring, gaping holes in Lync has to be a complete lack of faxing support. There is absolutely nothing in Lync that can either send or receive faxes, and it's unfortunate, because coming from RingCentral, I was quite used to having a very solid virtual faxing experience that did the job.
I know faxing is becoming an antiquated need, but we do use it a few times a month. Most notably, when testing customer fax machines at their offices or sending in signed paperwork to a few select vendors. We still have our RingCentral account, so we are utilizing faxing through there, but it would be great for CallTower to fill this gap -- or for Microsoft to just build something into Lync natively, which is the ideal option.
Another nagging bug that we didn't see on our RingCentral system is a very nuanced one that only arises if you are trying to establish a three way call, where the third caller being conferenced in is an 800 support line. If that support line uses a phone tree for initial menu navigation, you will not be able to pass commands to the support line (or third caller) phone menu.
I had too many three-way calls with clients and support lines go awry before CallTower support confirmed this is a known Lync system bug with no immediate fix. The simple workaround is to establish the support line as the first caller and then dial in the client as the third caller, or place the initial call on hold, place a separate call, and use Lync to merge the calls together. Simple enough, but something we had to find out the hard way.
While we've been told it's coming, CallTower does not yet offer automated single sign-on (SSO) capability for their hosted Lync product. This means that while your users will have the same email address as Office 365 (or another Exchange system), their ability to use the same password for Lync service depends on them changing their passwords to match. It works well enough, and it doesn't bother us as much since we are smaller, but with Microsoft offering DirSync between on-prem AD and Office 365 for a while now, this is a need that will only grow.
I'm also a bit perplexed as to how Lync mobile behaves in some situations on my Windows Phone 8.1 device. Sometimes, I get notifications for new IMs that I can jump straight into without issue. But sometimes, I tap on the notification from the Lync message, and the app just loads and loads incessantly until I have to back out and go into my Conversations area manually. Similar oddities hit phone calls that come in while my device is screen locked. I don't think these are CallTower problems, but rather snags that the Lync team needs to iron out fully.
I've heard that Lync 2015 is likely around the corner, and here's hoping these oddities get full attention before the next iteration launches. CallTower claims to be testing initial versions of the product already, which also eases my fears of them keeping customers on old versions of Lync for too long. Software age creep doesn't look to be an issue at CallTower, unlike what I see at a lot of Hosted Exchange providers (whose names I will withhold).
Lync for Mac 2011 exists, and yes, it gets the job done for the most part. But compared to Lync 2013 for the PC, the app is far too buggy, crashes too often, and is plagued by too much "yesteryear syndrome" in terms of features. Rest assured, though, as Microsoft is on the brink of releasing a new Office for Mac edition any week now. (Image Source: TechNet Blogs)
Since I am coming from RingCentral, which is super customer-friendly when it comes to the level of control offered in self-administration, this is one glaring area I find to be lacking in CallTower's solution. Don't get me wrong -- any change needed can be made with a short phone call to their 100 percent USA-based support team, with wait times in the mere minutes range even during the middle of the day.
I just wish CT would offer customers more fine-grained, web-based control over auto attendant rules, hold music, greetings, and other items which RingCentral has front and center on its online portal. Not a deal killer, but a "nice to have" for an infrastructure geek like myself.
And I alluded to it above, but I want to make it very clear that Lync 2011 for Mac is far from being as polished as the Lync 2013 client for the PC. One of our internal staff members is a diehard Mac user, and I hear from him constantly the troubles he runs into while on conferences, trying to screen share, and using other simple functions. It's not a secret that a new Office for Mac is coming out this year, but it can't come soon enough for our Mac friends.
CallTower Lync Succeeds Where Other UC Platforms Have Failed
The very definition of unified communications is bringing together a cohesive, seamless experience for all aspects of a user's communication needs. Whether it be telephony, conferencing, voicemail, presence, or IM -- these necessities should feel like mere extensions of each other, not distinct functions siloed in different barns. Too many solutions billing themselves as "UC" meet the criteria at the lowest common denominator possible.
CallTower's Hosted Lync product, in contrast, has truly been a breath of fresh air. The expectations established up front have held up in real-world testing. And as a company already on Office 365, used to what Lync can do, bringing in PSTN dial tone unleashed a whole other realm of possibilities that wasn't realized in pure Lync Online.
Is Lync a virtual PBX and UC solution for every single company replacing an aging on-prem PBX? Absolutely not. There has to be both an innate need for softphone-driven communication and the ability to achieve the right mix of licensing that Lync requires. Organizations looking for a traditional desk phone pure-play via a virtual cloud PBX provider, with no intention of utilizing Lync, are better off looking at RC, 8x8, or CallTower's Hosted Cisco solution.
But if you've got a workforce already on Office 365 or Exchange, and are looking to tie in the rich conferencing capabilities of Lync in a phone system solution, CallTower Lync is a natural fit. Compared to what it would cost to piecemeal out the voice and PBX functions, bringing it under the same roof has proven to be a great cost saver for us at FireLogic.
A few other clients who decided to jump on board CT's Lync offering have been overwhelmingly positive in their feedback to us, so I know we aren't the only ones having positive experiences with the firm.
If there's one thing I do wish, it's that CallTower had brought this solution to market years ago. I know a lot of clients that have invested hefty sums into first-party PBXes and internal Lync servers due to the sheer lack of any great hosted Lync PBX provider.
For those companies stuck using Skype or Hangouts for conferencing or virtual softphone PSTN dial tone: they have no idea what they're missing.
Image Credit: ND Johnston/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA, and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
As overused as it may be, the old mantra still holds definitively true: you're only as strong as your weakest link. This goes for sports teams, business divisions, vehicles, and most anything else in life where multiple links make up the entity at large. It shouldn't be surprising that IT systems and networks follow the same logic.
Yet this very notion is what causes me to cringe when I'm in discussions with new and existing clients. There is a large disconnect among average small business owners as to which technical improvements will actually lead to better stability and resiliency -- all encompassing what they truly care about: uptime.
I'm often on the listening end as customers share the joys of all the new hardware they've decided to invest money in, yet they're misguided as to how much of an effect these purchases will have on resiliency (aka uptime). Fleets of new Mac computers or Surface tablets. Shiny new desk phones. Or perhaps expensive new servers with fancy bleeding-edge Xeon processors and gobs of memory.
While worthy investments, and definitely mission-critical for many businesses, these purchases on their own do next to nothing to meet the resiliency needs of most modern small businesses. Resiliency is the ability of complex systems to stay operational in the face of disaster or outages. In the context of the average small business, this entails the aggregate whole of the core systems that power the IT backbone, such as internet connectivity, server connectivity, application hosting, and more.
A $1500 Mac is as crippled as a $250 Chromebook when your office server is down due to failed hard drives. Office 365 is great, except for when your only internet connection is down for half a day and no one can get email internally. And that fancy $5K server is a massive paperweight if the network equipment it relies on has ethernet ports that fail in the middle of a busy day.
Before I get into the ways that we can enhance the resiliency of our IT systems, I want to shed light on the numerous examples of what this concept looks like in everyday life. Simple analogies make even tough IT concepts easy to understand.
Resiliency in the Real World -- How Does it Look?
As an infrastructure geek, resiliency and redundancy are things I'm thinking about before comparing any of the finer specs on major IT systems for clients, like networking equipment, servers, cloud IaaS, or similar undertakings. But the "average joe" doesn't always realize the how or what behind solving these needs -- it's my job to spec, compare, and present a digestible solution in plain terms.
Resilient systems, as I mentioned previously, are far from being relegated merely to modern IT. The concept is blatantly prevalent in things we come across every day, even though we may not realize it. Since I live and work in Park Ridge, a very close suburb of Chicago, IL (USA), examples are not hard to come by.
I happened to attend a Microsoft seminar in Chicago today, and had to take a drive over to a parking garage so I could hop on a CTA train, part of our local transit system for the city. On my way there, I passed numerous intersections with stoplights. These intersection stoplight systems are built for resiliency in that each intersection has multiple sets of stoplights for a given direction. Two sets are fairly common; major intersections may even carry three sets per direction. One failing still keeps traffic moving.
The gatekeeper at the parking garage was an entry ticket system, so that one can pay upon exit. There were two such machines on separate lanes, and as luck would have it, one was down and out with a cone blocking that lane -- so all cars were passing through a single ticket machine this morning. The larger system as a whole hit a snag, but continued operating in a delayed but functional manner.
The CTA train system in Chicago, while having numerous faults with its financials and politics, is fairly intelligently built from an engineering viewpoint. Each train line on the system has at least two tracks, one going "inward" towards the city center, and one going "outward" to the extremities of the city.
During normal operation the tracks are used concurrently for two-way passenger flow, but when problems like accidents or outages hit a track, both directions' trains can pass back and forth on a single track for smaller stretches at a time. Slow, but still resilient in the truest sense.
Ever travel on the Chunnel that connects France and England? It features the longest undersea section of any tunnel in the world, and is likewise one of the biggest resiliency engineering achievements of modern times. It's powered by redundant power sources, and features a triple-tunnel design -- two for normal operation, with a third emergency backup in times of crisis. An otherwise dangerous passage is made extremely safe via rigorous resiliency planning. (Image Source: BBC)
The CTA system straddles the famous Kennedy expressway that heads into the inner city, which is another system built for resiliency in time of need. When repairs close down lanes, traffic still flows. And heavily congested parts of the expressway feature "express lanes" which are usually used to carry overflow traffic in the direction heavily needed during certain times (generally inward during the morning, and outward from the city center in the afternoon). If disaster strikes or other issues close down too many normal lanes, these "express lanes" can be used to provide additional capacity. Again, resilient by design.
Numerous other complex systems in our daily lives, such as airplanes, follow very similar engineering logic. Due to the inherent dangers of having single points of failure while 30,000 feet in the air, planes almost always now have core backbone aspects either duplicated or even triplicated. Core infrastructure needs on planes, like power systems, air systems, and hydraulics, have two to three layers of redundancy for unmatched resiliency in the face of failure during life-threatening situations.
Not surprisingly, the expensive engineering might that goes into every commercial airplane has led, in part, to flying being one of the safest ways to travel.
How does all of this play into modern IT platforms and systems, like company servers and networks and communications services? Resiliency planning, no matter what end form it takes, still provides similar end results.
When it comes to the kinds of small and midsize businesses we support, these results are in the form of superior uptime; reduced data loss; lower yearly maintenance fees due to a reduction in emergency labor costs; and above all, less lost productivity. These are the things that almost any small business owner truly cares about when it comes to their IT operational status.
If IT systems are only as strong as their weakest links, what do these links actually look like? Here are the ones we are routinely evaluating during upgrade cycles with customers.
Dual WANs: Resiliency for Internet Connectivity
With as many businesses moving to the cloud as is the case today, especially among our client base, the weak link in a full cloud application stack is clearly your ability to have a working internet line during the business day. Some business owners place too much reliance on half-hearted promises from providers of things like fiber or bonded T1/T3 lines, and believe that an ISP's promise of uptime alone is worth betting their business revenue on.
Coincidence or not, time and time again I find that these same pipes that promise unmatched uptime are the ones hitting longer downtime spans compared to their less-costly counterparts like business coax. T1 lines that are down for whole mornings, or fiber runs that were mistakenly cut. Whatever the excuse, placing too much trust in a single ISP of any sort is a bad plan for cloud-heavy businesses.
The alternative? Stop wasting money on premium lines like sole fiber or T3 and instead build your internet backbone in a resilient manner using dual cost-effective options such as business coax and metro ethernet lines. Used in conjunction, two such connections can easily run a 20-40 person office for a figure in the upper hundreds of dollars range instead of the thousands, with better uptime overall and speeds near equal to their premium counterparts (barring fiber at the higher tiers, which is out of the question for most small businesses anyway).
Meraki's firewalls are one of the best investments a company can make to enhance resiliency with dual WAN pipes, increase security through native filtering, and simplify network management. They are the only network equipment provider to fully allow for cloud management across all their devices. They are, in turn, easy to install and just as simple to support. I'll recommend them over Cisco ASAs and Sonicwalls any day of the week. (Image Source: Meraki)
We're routinely putting these dual WAN pipes into play on our firewall boxes of choice, which are units from Meraki's MX series of devices. A sub 25-person office can invest in a Meraki MX60W for just about $500 before licensing, and they can leverage intelligent internet connection bonding/failover, AutoVPN capabilities between branches, pipe-level malware filtering, and built in 802.11N wireless. Not to mention that Meraki's support is one of the best in the business from our experience.
For establishments that require 24/7 operation and cannot have any downtime, such as hospitals, we are even going so far as to implement high availability between multiple firewalls, which adds complexity but achieves near-100 percent levels of internet and connectivity uptime. You do have to purchase matching sets of the firewall you want to use in HA mode, but Meraki doesn't ask you to buy dual licenses -- making HA a cost-effective option for something that used to be viable only for the enterprise.
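If you're curious what "intelligent failover" boils down to conceptually, here's a minimal Python sketch of the health-check loop any dual-WAN device runs in some form. This is purely illustrative -- it is not how a Meraki MX works internally, and the gateway addresses are made-up documentation IPs:

import subprocess
import time

# Hypothetical gateways for the two WAN links (made-up documentation addresses)
WAN_LINKS = [
    {"name": "WAN1 (business coax)", "gateway": "203.0.113.1"},
    {"name": "WAN2 (metro ethernet)", "gateway": "198.51.100.1"},
]

def link_is_up(gateway):
    # One ping probe with a 2-second wait; '-c'/'-W' are the Linux/macOS flags
    # (Windows would use '-n 1 -w 2000' instead)
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", gateway],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def pick_active_link():
    # Prefer the primary link; fail over to the next healthy one
    for link in WAN_LINKS:
        if link_is_up(link["gateway"]):
            return link
    return None  # both pipes down -- this is where an HA firewall pair earns its keep

while True:
    active = pick_active_link()
    print("Routing over:", active["name"] if active else "NOTHING -- all WANs down")
    time.sleep(10)  # re-probe every 10 seconds

Real firewalls do this at the packet level with much smarter probes, but the principle is the same: watch both pipes constantly, and route over whatever is healthy.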
NIC Teaming: Resiliency Between Network Components
Most new servers these days come with at least two NIC ports on the back. For higher-end boxes, 3-4 NICs aren't that uncommon. And what do most customers do with every port but number 1? Nothing, to be precise.
While implementing resiliency in the form of NIC teaming used to be a touchy and daunting affair, back in the Server 2003/2008 days, Microsoft has brought the technology to the masses. Instead of placing reliance on the NIC makers and their sometimes-pitiful drivers, Microsoft now offers NIC teaming capability as part of the core Server 2012 and 2012 R2 experience. It's a mere click-click-click affair and it has worked flawlessly on each install we've done.
The core purpose of NIC teaming is to remove single points of failure in the network communications path to and from servers, and even between switches and firewalls. It's not feasible for most small businesses to implement multiple switches per network segment, but there is no excuse now NOT to be using NIC teaming in Server 2012 R2.
NIC teaming used to be a messy affair on servers. You needed special drivers, complex configuration, and the right mix of luck. That's all a thing of the past. Every new server we deploy now is set up for NIC teaming by default, as Server 2012 R2 has it baked into the core OS -- no special crazy drivers required. As shown above, you can opt to connect both NICs on a server to the same switch, but for the less cost-sensitive, going across two switches is ideal, as it further increases resiliency in the face of switch failure. (Image Source: TechNet Building Clouds Blog)
The form of NIC teaming on Server 2012 R2 which we love using the most, Switch Independent mode, allows the feature to be used with any dumb (unmanaged) switch and still function as it was meant to. When two (or more) NIC links are active and connected from a given server, they work in a load-balanced manner, providing additional bandwidth to and from the server.
If a link failure happens, either from a bad ethernet cable or switch port (or a downed switch, in the case of multiple independent switches), the server can actively route all traffic over the remaining working link or links. I won't go into the technical nitty gritty, as you can read about it in a great blog post, but the feature is probably one of my favorite aspects of Server 2012 (it's hard to pick just one).
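To make the concept concrete, here's a toy Python model of the idea -- not Microsoft's actual implementation, just an illustration: hash each traffic flow across whichever team members are currently up, and redistribute when one dies.

import hashlib

class NicTeam:
    # Toy model of Switch Independent teaming: balance flows across
    # live team members, and absorb a dead link without dropping traffic
    def __init__(self, nics):
        self.link_up = {nic: True for nic in nics}

    def fail(self, nic):
        self.link_up[nic] = False  # bad cable, dead switch port, downed switch

    def pick_nic(self, flow_id):
        live = [n for n, up in self.link_up.items() if up]
        if not live:
            raise RuntimeError("all team members are down")
        # Hash the flow so one conversation sticks to one NIC (keeps packets in order)
        digest = int(hashlib.md5(flow_id.encode()).hexdigest(), 16)
        return live[digest % len(live)]

team = NicTeam(["NIC1", "NIC2"])
print(team.pick_nic("10.0.0.5:445"))  # flows are spread across both links
team.fail("NIC1")                     # yank a cable...
print(team.pick_nic("10.0.0.5:445"))  # ...and the same flow rides NIC2 instead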
If you have decent managed switches at your disposal, you can create bonded links using multiple ports with commonplace technology known as LAG (link aggregation). The implementation and technology behind LAG varies from switch maker to switch maker, but the core concept is the same as NIC teaming in Server 2012: creating resiliency through links that can fail over when a single link goes down. We don't implement this often at small businesses, but it's out there and available on numerous sub-$400 switches, like models from Cisco Small Business, Netgear's PROSAFE series, and Zyxel's lineup.
RAID & Windows Storage Spaces: Resiliency for your Data
Redundant Array of Independent Disks (RAID) is by far one of the most important aspects of resiliency a company should be using to keep uptime on servers and data sets at a maximum. It's proven technology, relatively inexpensive, and extremely mature. There are numerous levels of RAID available, all of which serve a different operational purpose, but RAID 1 is by far our favorite flavor to implement at the SMB level due to its simplicity and high reliability, especially when paired with SAS or SSD drives.
The concept of RAID 1 is very simple. Take two or more identical disks, employ a storage controller that can manage duplication of data between them (a RAID card, as it's called), and you've got resiliency-in-a-box for the most part. You can set up numerous storage volumes, such as one for Windows, one for file serving, another for long-term storage, etc. More and more, we're tying high-end SSDs together in RAID 1 "pairs" -- one for Windows and Hyper-V, and another solely for data storage. This tag team has worked well on numerous server rollouts for us.
Many IT pros balk when they see me mention RAID 1, but they forget the main reason I love using it, as I discussed at length in a previous article. In case of disaster on a RAID 1 volume, I can easily pull a disk out of the array and recover data in next to no time, since the disk is not tied into a complex RAID set that can only be read off the original controller card series employed in the server. The security that comes with this failsafe is well worth the small shortcomings of RAID 1, like slightly lesser performance.
If a disk in a RAID setup fails, we merely go in and replace the failed drive, the array rebuilds, and we are back in business. No expensive downtime. No costly repairs. It's a tried and true method of keeping critical systems humming, and for some time, this was the only method we would rely on for such resiliency at the hard disk level.
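For those who want the mechanics spelled out, here's a toy sketch of the RAID 1 idea in Python -- obviously not a real storage driver, just the mirroring logic in miniature:

class Raid1Mirror:
    # Toy RAID 1: every write lands on both disks; reads succeed as long
    # as at least one mirror member survives
    def __init__(self):
        self.disks = [{}, {}]          # two "disks" as block -> data maps
        self.healthy = [True, True]

    def write(self, block, data):
        for i, disk in enumerate(self.disks):
            if self.healthy[i]:
                disk[block] = data     # the controller duplicates the write

    def read(self, block):
        for i, disk in enumerate(self.disks):
            if self.healthy[i]:
                return disk[block]     # any surviving disk can serve the read
        raise IOError("both mirror members have failed")

    def replace_disk(self, dead):
        # Swap in a blank drive; the rebuild is a straight copy from the survivor
        self.disks[dead] = dict(self.disks[1 - dead])
        self.healthy[dead] = True

array = Raid1Mirror()
array.write("block0", "payroll.xlsx")
array.healthy[0] = False               # disk 0 dies mid-day
print(array.read("block0"))            # data is still served -- no downtime
array.replace_disk(0)                  # hot-swap, rebuild, back to full mirror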
Who needs an expensive SAN? Windows Server 2012 and 2012 R2 offer Storage Spaces, a cool new technology that lets you build your own resilient storage backbone out of inexpensive drives. We've been using it as an alternative to RAID for our primary client file backup data at our office, and it hasn't suffered a hiccup in nearly a year. We opted to use Microsoft's new ReFS instead of NTFS for the filesystem, and this affords increased reliability and resiliency of the underlying data. Microsoft got this new technology pretty right. (Image Source: TechNet)
Recently, Microsoft has introduced a very interesting, and so far very stable, alternative to using large RAID arrays for data storage (and SANs, for those businesses that have investigated them) called Storage Spaces. It has been baked into Server 2012 and Server 2012 R2, and our own internal production server has been using it for critical data storage for most of the last year.
Storage Spaces is a build-your-own alternative to expensive high-end RAID arrays and SANs (code word for nasty expensive storage banks that enterprises buy) which you can piece together with nothing more than servers, loads of standard SAS/SATA/SSD disks (you CAN mix and match), and Windows Server 2012 and up.
I won't dive into the finer points of Storage Spaces here; you can read my thoughts in a previous piece on this excellent new technology and why I think it's nearly ready as a prime alternative to RAID for data storage needs.
I got a lot of emails from readers after that article asking whether Windows itself can be installed on a Storage Space, and at least for now, the answer is no. With the improvements being made to SS and to ReFS, the underlying replacement for NTFS, I would not be shocked if this becomes possible soon.
When that happens, hopefully in the next release of Windows Server, we may be ditching RAID entirely. For now, know that there are two great options for introducing enterprise-class resiliency to your small business IT systems.
Dual Power Supplies: Resiliency for Server Power
This is considered a premium option on servers, but one that we almost always recommend these days for any physical boxes we install. Our favorite go-to servers are Dell PowerEdge boxes, as they are relatively cost-friendly and provide premium features like dual power supplies. Coupled with a 3 or 5 year warranty, we can have replacements in hand for any downed PSUs in a matter of a day at most, with no strings attached.
Most cheap servers employ the use of a single standard power supply, just like what any standard desktop or laptop has. But if you are running your company's primary applications or data from a single or few critical servers, is praying for the best and hoping a PSU doesn't blow out really a contingency plan? I think not.
Dual PSUs on a server serve the same purpose as dual network cards: in the face of one failing, you don't have to incur damaging downtime or, worse, eat the costs of emergency repair labor. A downed power supply merely alerts you via noise or flashing lights and a software alert, and a spare can be purchased and installed with ease; no downtime required. Running a server without dual PSUs is manageable, but only if you have a hot spare replacement on hand and ready to go, which most businesses trying to save a buck don't. (Image Source: StorageReview.com)
A good PowerEdge or similar server with dual slide-in power supplies will raise the price tag of the box by a good 10-20 percent, but it's almost surely money well spent. Not only are the units easier to replace when they go down, which reduces labor costs, but downtime is prevented because a server can stay operational when a single power supply fails out of the blue.
And don't tell me your business doesn't care about downtime. It's likely that you just don't know what an hour of lost productivity actually costs your business. The Ponemon Institute, a fairly well-regarded independent IT research tank, pegs the most recent average cost of per-minute downtime at $7,900. This figure is obviously skewed by the number of enterprises factored into it, but you get the picture. The calculations to run your own downtime figures are shown here.
So is skimping on the extra couple hundred bucks on that new server truly worth it? Run your numbers and then get back to me.
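To make running those numbers easy, here's a rough back-of-the-envelope calculator. The figures plugged in below are hypothetical, so substitute your own payroll and revenue assumptions:

def downtime_cost_per_hour(employees, avg_hourly_wage, revenue_per_hour, productivity_hit=1.0):
    # Rough hourly outage cost: idled labor plus lost revenue;
    # productivity_hit is the fraction of work that actually stops (0 to 1)
    labor = employees * avg_hourly_wage * productivity_hit
    revenue = revenue_per_hour * productivity_hit
    return labor + revenue

# Hypothetical 25-person office: $30/hr average wage, $1,200/hr revenue,
# and an outage that halts 80 percent of normal work
cost = downtime_cost_per_hour(25, 30.0, 1200.0, productivity_hit=0.8)
print("~${:,.0f} per hour of downtime".format(cost))  # ~$1,560

Against that, a few-hundred-dollar premium for dual PSUs pays for itself in well under an hour of avoided outage.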
Virtualize your Servers: Resiliency For Production Workloads
The argument for purchasing a new server for every extra need your business may have -- whether it be an additional file server, application server, devops environment, print server, etc -- is antiquated. While there is something to be said for keeping one or two solid servers on-premises to power your workloads, we consult with too many clients that are stuck in a 1990s mentality that a server a day keeps the doctor away.
Just the opposite, in fact. Physical servers not only are raw capital outlays that must be purchased and likely amortized, but they have loads of other needs that balloon their operating costs over 3-5 year lifecycles. Electricity isn't free. Licensing isn't free. Security software isn't free. Backup, maintenance, configuration, and replacement just add to this mess. If you saw the total figures for what each physical server ran your company over the last three years, you'd be quite surprised.
Instead of building out with extra boxes, we should be in a mindset of building up: consolidating virtualized workloads onto beefier servers. Our favorite hypervisor for getting this done is Hyper-V, the free hypervisor Microsoft has bundled into every copy of Windows Server since 2008. We've got numerous clients in the field using Hyper-V on Server 2012 (and 2012 R2) who are running Remote Desktop Services servers, print servers, application servers, and much more.
The other big nicety? Every copy of Windows Server 2012 and 2012 R2 Standard allows for up to two virtual server instances to be loaded and fully licensed. That's right: you only need to purchase one copy of Server 2012/2012 R2 and you have out-of-the-box rights to run two VMs on top of it, for a total of three instances of Windows Server at your disposal. That's a deal if I've ever seen one, and we take advantage of it for nearly every customer that gets a new server.
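Purely as a sketch of that licensing math -- assuming a single host and the two-VMs-per-license rule described above (real licensing also factors in CPU sockets and host counts, so treat this as a simplification):

import math

def standard_licenses_needed(vm_count):
    # One Server 2012/R2 Standard license covers the physical host
    # (running only the Hyper-V role) plus up to two Windows Server VMs,
    # so the VM count drives the license count
    return math.ceil(vm_count / 2)

for vms in (1, 2, 3, 5):
    print(vms, "VMs ->", standard_licenses_needed(vms), "Standard license(s)")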
We have a client right now that has a separate 2003 Terminal Services box, a 2008 primary AD/file server and a Windows 8.1 "phone server" that is hosting their PBX software. We will be consolidating all of that onto a single repurposed IBM server running 2012 R2 and HyperV hosting the virtualized aspects.
The ultimate aspect of resiliency through Hyper-V is the functionality known as Hyper-V Replica which serves the same purpose as its name implies: replicate virtual machines between physical hosts across your office network, or even across branches. Unparalleled uptime matched with universal portability of virtual machines means your server is never tied to a physical box like we used to think with the notion of the "back closet server." (Image Source: Microsoft Press)
Where's the resiliency in hosting servers virtually? The benefits are endless. Virtual Machines are hardware-agnostic, which means a Dell PowerEdge can spin up a VM, which can then be passed off to an IBM server, and the chain can continue as and when necessary. The power of NIC teaming can be taken advantage of, along with other benefits of Hyper-V software defined networking.
For those with large, mission-critical workloads that run 24/7 and need very high availability, you can employ out-of-the-box features that Hyper-V offers, like clustering, where numerous virtual machines provide a load-balanced VM set for a given purpose. VMs can be live-migrated between servers, so you don't even have to take a server down for migration, and end users are not impacted in any way.
While Microsoft has yet to publicly release the technology, I have heard slivers of rumors from IT pros with connections at Microsoft that Azure is preparing to accept live-migration workloads from on-premises servers, which means you could ideally transition your workloads into and off of Azure on the fly. This will be very neat once released, and I'm anxiously awaiting the announcement.
In the End, Resiliency is Both Technology And Methodology
Just like money alone won't solve all ills, nor will whiz-bang technology. A dual-WAN capable Meraki firewall is only as powerful as the planning put into how two internet lines will work in unison and the subsequent configuration implemented. And the same goes for anything else, like a fancy dual PSU server or switches capable of LAG between one another, keeping in mind the upstream/downstream devices at each part of this logical "chain".
Resiliency through redundantly designed systems can only be achieved when coupled with proper planning, and more succinctly, a methodology for the level of resiliency a small business is aiming to achieve through technology. Before purchasing ANY kind of solution, a business owner should be asking questions like: what does an hour of downtime actually cost us? Which systems are truly mission-critical? And how much resiliency are we willing to pay for up front?
Without answers to the above, toiling over the joys of all the fancy equipment out there is fruitless as a long-term approach. Piecing together one-off solutions that don't fit the agreed-upon needs of the organization may increase short-term resiliency, but it raises long-term costs through premature replacement and the associated downtime, to name a few ills we've seen come from lackluster planning.
Better yet, instead of having to retrofit resiliency into systems later, spend the extra money up front on equipment that offers the option to enable resiliency down the road. Don't need that Meraki firewall's dual-WAN capability yet because you only have one internet connection at the office? So what -- buy the better firewall now; enabling the option later will take a mere couple of hours to configure and test, with zero downtime to normal operation.
Making sensible decisions up front during upgrade cycles, especially when planning out server and networking purchases, can go a long way in saving time, energy, and expense down the road, either from replacement needed or from outage costs both in productivity and emergency labor.
If I had a dime for every time I've had to tell a customer, while explaining emergency labor costs, "I told you so," I just very well may be rich by now. Or close to it.
Image Credit: frank_peters/ Shutterstock
Of all the mobile platforms out there, Windows Phone 8.1 was the absolute last option I ever thought I would land upon. I had a burning hatred for Windows on the mobile side, seeing that I was forced into using a Pocket PC 6700 (Windows Mobile 5) years ago while working for a former employer. To say that experience soured my opinion of Windows Mobile is an understatement.
Frankly, and I don't care what the diehards say, Windows for phones prior to Windows Phone 8 should very well be erased from memory for anyone who had to deal with it. From unintuitive interface design to kludgy touchscreen navigation with a stylus, it was a Picasso of a mobile OS for exactly all the wrong reasons.
When some of my technology minded friends (who are now Android heads, go figure) attempted to get me to take another look at Windows Mobile a few years back, I absolutely refused. After my PPC 6700 nightmare of years before, I made a conscious decision to never wade into Windows Mobile territory ever again. And for many years, I upheld my promise.
But as we very well know, Windows Phone 8.1 is a vastly different beast from what Windows Mobile of the mid 2000s exemplified. The terrible battery life was gone. The interface matched the ebb and flow of Windows 8, which I actually find intuitive. And the app experience finally resembled clean design that a mobile phone deserves -- not Windows desktop apps miniaturized for a 4" screen as was previously the case.
My first (and formerly promised, last) experience with the Windows smartphone platform was back in 2006 on a Pocket PC 6700 device. This pile of cr*p ran a terrible excuse for a mobile OS, Windows Mobile 5.0. It was a pseudo-Windows XP interface with the stability of Windows 98. Yes, using Task Manager to kill frozen apps was a near daily necessity. I vowed never to touch Windows on a phone again. Luckily I broke that promise and gave Microsoft a second chance. (Image Source: The Unwired)
So after giving Android a try for a few years, sporting a Galaxy S2 and finally a Galaxy S3, I gave Microsoft the benefit of the doubt and dove head first into rogue OS territory. Since I am on T-Mobile in the US, my best option late last year was a Nokia Lumia 925 that came with Windows Phone 8 in stock form.
Since then, I have already updated to Windows Phone 8.1 via the Developer Preview program and have been on the new OS for a few months at this point. More thoughts on the latest iteration of the OS follow below.
Yet I am certain one big mental hurdle needs addressing before I can go into the things I love (and the few I don't) about Windows Phone 8.1.
Why the heck would I switch from Android to what some call a lesser platform, namely in popularity and selection of apps? Am I crazy?
Android, iPhone, BlackBerry: Been There, Done That
I'm not here to give Android the brunt of my criticism when it comes to mobile OSes. It happens to be the mobile OS I was on previously, for roughly two years, but it isn't by any means the sole prior platform I've got experience with. The last thing I need is the BlackBerry or iPhone army pegging me with "why didn't you try our camp?"
At this point, since my last six months on Windows Phone, I can proudly say that I've given all the major mobile OSes more than a fair shake. OK, barring BB 10 -- but I think we can all agree BB 10 has a much larger hill to climb than Windows Phone 8.1 in terms of mindshare. So let's shelve that platform for now.
But as far as the major options are concerned, I was a BlackBerry faithful (and a quite happy one, too) up until my final BB handset, the Torch 9810. I always lauded BB as having some of the best battery life of any phone I've worked with. It was hard not to chuckle when my Android friends complained about having to plug in every 8-12 hours to charge up.
BlackBerry in the waning OS 7 days was nothing excellent, but it met my needs. I was never an app-aholic, so I was used to a truncated app selection. My biggest needs were always a solid typing experience and a clean form factor, and I was in love with the physical keyboards on BB devices -- the slider keyboards on many Android devices of the 2009-2011 timeframe were quite cruddy in my opinion.
Once word started rumbling that BB 10 was taking over soon, I decided to jump ship and give Android a try. I started off on a Galaxy S2, and then when the girlfriend needed to upgrade phones, she took my S2 and I moved myself to an S3. Both phones were exceptional in many areas, the biggest one being app selection.
But Androids have two nasty thorns in their side which many enthusiasts gloss over. The first is their propensity for subpar battery life unless you opt for one of the gargantuan handsets like a Note 3. Not everyone wishes to carry around a phablet just to achieve day-long battery life. I think Google has a long way to go in optimizing battery usage in the Android system, because it seems ages behind, especially compared to what I saw on BlackBerry OS 5/6/7 (which happens to be heavily based around Java as well!)
The other, more nagging thorn is the mess I saw first hand when it came to OS version selection, namely in the form of both carrier-driven OS updates and third-party ROMs. I think the open source nature of Android is excellent in theory -- but it fails in the real world, running into many of the same harder-than-it-should-be frustrations as the Linux platform does on computers.
Take my experience on the Galaxy S3, for example. I was routinely multiple versions behind what Google was showcasing in press events for the latest Android releases, because as they claim, the carriers were holding up versions. OK, fair enough, I know they aren't the nicest parties when it comes to keeping us up to date.
But I routinely updated my old BlackBerry devices on my own using official OS releases from other carriers without issue. It usually took me a mere 20 minutes with a USB cable, the OS release image, and a helper tool to load on the OS. There was no special magic behind it, as is seemingly needed on the Android side.
The same process on the Android side is a far bigger mess, and I don't care what the ROM community says at this point. Their disconnect from the "average user" (I am an IT pro, but I have zero interest in becoming an Android ROM enthusiast) was fully on display in my trials and tribulations in not only trying to find a good ROM, but worse, trying to install it myself.
Since I am a huge fan of Wi-Fi calling and cannot live without it at this point (an area where the iPhone has, until recently, lacked, I should mention), finding a proper S3 ROM that doesn't strip out Wi-Fi calling is a major pain in the rear. ROM developers claim that Samsung wraps Wi-Fi calling into its TouchWiz underpinnings, meaning that a ROM developer has to either layer on top of it or remove TouchWiz entirely.
It's all dev speak to me; the end user side of me doesn't care to wade through the intricacies of this arcane technical discussion for Android enthusiasts.
I settled on going with a ROM called Dandroid, a clean rip of stock Android that didn't remove Wi-Fi calling. Loading it onto my phone was a whole other story! I had to wade through leagues of half-baked instructions spread across numerous websites on how to unlock my phone, install recovery software, and on and on. After over 10 hours of research and two near-"bricking" instances (code word for destroying your handset), I ended up with a workable Dandroid installation.
After thinking I had learned the process well enough to do my own upgrades, I was quite wrong. As Android versions underneath the ROM continued to move up, there was discussion that users needed to update their radio files and other critical pieces kept separate from Dandroid, and this required more fine-tuned labor. I dropped my endeavors with Android ROMs and decided to stay put on that release.
For a platform that prides itself on openness and ease of customization, it couldn't be this hard, I was convinced. But when my Android enthusiast friends who are also IT pros had just as much of a struggle getting a ROM onto my girlfriend's S2, I knew I wasn't alone. Even the experts were having trouble, and that's a bad sign.
For as large as the Android community supposedly is, does it make rational sense that it could be that difficult to install and maintain a custom ROM on your phone? If it took me, an IT pro no less, ten hours to take my first stab, how could this community expect the average user to ever take advantage of the supposed endless wonders of Android ROMs?
I don't think the majority will, or do. So they are forever forced into using stock ROMs full of bloatware you can't remove (like Zynga and Dropbox, the few nasties I recall refusing to uninstall on my stock S3). In turn, Android will continue to get a bad rap for battery life on most handsets, aside from beasts like the Note series that compensate for poor software battery utilization with monstrous battery sizes.
Just like the Linux community before them, the Android ROM community is full of those who are too entrenched in the wonders and joys of coding and modding, with a complete disconnect from the user base they will never be able to reach: the average user, the beginning IT enthusiast, the teenager who hears about all the joys of Android ROMs.
I own a Google Nexus 10 tablet and the OS situation on that device is excellent. I don't have to worry about ROMs to stay up to date. But it's sad that you have to pick between Nexus and every-other-device in order to make a decision about your OS updates for the future.
Google's best asset (Android) happens to be its own worst enemy in my eyes, and the company needs to seriously clean up the mess surrounding its ecosystem. Until then, I'm staying far away from the Android ecosystem. Its promises of openness and flexibility are only skin deep -- unless you're willing to stay tunnel-visioned on the Nexus line.
On the Android side, I always had to make a decision: handset freedom (Galaxy line) or easy OS update capability (Nexus line). Microsoft doesn't force me into such boxes with Windows Phone. The Lumia line of devices comes in an awesome variety of shapes, sizes, and colors, and nearly all of them get similar access to the latest edition of WP through the Developer Preview program. Goodbye, Android ROM hell, I won't miss you. (Image Source: WinSupersite)
As for the iPhone, I spent a good one-month stint on an iPhone 4 a few years back in between phones. I did it more as an experiment to see what I was possibly missing on the Apple side. Seemingly, it wasn't much. Of all the smartphones I've used, Apple's iPhone was one of the most lackluster experiences to date.
There was a complete lack of multitasking; the screen was way too small for a touch keyboard (I have fat fingers); and the battery life was nothing amazing.
Apple was a holdout on Wi-Fi calling until very recently, so this was a big item lacking in my eyes. I also felt that the OS ran a bit sluggish on the device. I don't recall what OS version the device was on, but it was noticeably slow in many respects, like switching to and from apps. Perhaps I should have given a 4S or 5 a try, but I don't see much progress on the iPhone side to warrant another trial, especially with how happy I am on Windows Phone now.
For all the accolades Apple gets about masterful interface design, I truly felt that iPhone was (and still is) stuck in the past with its interface. Rows and rows of bland icons (now slightly enhanced with folders I should add) don't do much in the face of what Android and especially Windows Phone are doing for their UIs.
As is pretty clear by now, my settling on Windows Phone 8.1 as my favorite mobile OS so far isn't a conclusion reached through a lack of awareness or trials on the other platforms. I've touched them all in some form or fashion, and while Android met most of my needs, I've since been even happier on Windows Phone.
The Windows Phone App Gap: Becoming Irrelevant From Multiple Angles
The biggest, and most widespread, complaint against Windows Phone continues to be a constant harping on the fact that Win Phone doesn't have the million-app counts that Google's and Apple's ecosystems enjoy. While there is something to be said for the most popular apps that people enjoy on other platforms daily, after a certain point this becomes more a battle of the numbers and less one of a detrimental user experience due to a lack of ample selection.
Why do I say this? Because even as app store counts continue to balloon on the Apple and Android sides, this isn't translating into a related rise in the average number of apps used per person. In fact, there is almost zero documented rise in the average used-app count per person, according to the latest numbers from Nielsen.
According to Nielsen's research, back in Q4 2011 the average number of apps used per month per person was a mere 23.2. Moving ahead to Q4 2012, it rose minimally to 26.5. Fast forward to Q4 2013 and we are nearly standing still, at a similar 26.8. A statistical brick wall, if you ask me.
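Run the year-over-year math on those figures and the brick wall is plain to see -- double-digit growth one year, then essentially flat:

# Nielsen's average apps used per person per month, from the figures above
usage = [("Q4 2011", 23.2), ("Q4 2012", 26.5), ("Q4 2013", 26.8)]

for (prev_q, prev_v), (cur_q, cur_v) in zip(usage, usage[1:]):
    growth = (cur_v - prev_v) / prev_v * 100
    print("{} -> {}: {:+.1f}%".format(prev_q, cur_q, growth))

# Output:
# Q4 2011 -> Q4 2012: +14.2%
# Q4 2012 -> Q4 2013: +1.1%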
What's the takeaway here? As app stores on Android and iPhone continue to get bloated, this isn't translating into a user base that is moving their app counts forward in step. The notion of an "upper limit" in terms of how many apps people realistically download and consume is looking like a truth at this point.
So for those that refuse to give Windows Phone a chance merely due to its sub-200K app count: so what? For all the hubbub about the competing app stores bursting at the seams, most people are never going to take advantage of a large majority of these apps regardless of how many you toss at them. And this was something I've believed for a few years already.
For those who are still concerned their favorite apps won't be available on a Windows Phone 8.1 device, let's have a look (as of July 14, 2014) at some of the most popular apps on iPhones/Androids and see whether they are in the Windows Store already.
It's pretty clear that when it comes to the biggest, most popular apps, Windows Phone has you covered. For the few official apps that don't exist (namely due to Google's obstinance), awesome alternatives are around that work just as well, if not better -- MetroTube, namely, is an app that blows away Google's own YouTube apps.
Sure, some of the niche games out there which may be riding the popularity charts on a given week may be missing, but this gap will be closing further as time goes on.
Am I missing any apps I used to have on my Android phone? The only biggie in particular is RingCentral, as the company is being quite stubborn about releasing a Windows Phone app. However, that frustration has been quelled in large part as my company has shifted its mobile VoIP needs to hosted Lync from CallTower, and Microsoft's Lync experience on mobile devices is far more full-featured than what RC offers, with cross-device unified messaging, VoIP, video chat, and more.
The purists may balk at the fact that there are no official Google Maps or YouTube apps for the Windows Phone still, but as I listed above, the alternatives are out there and are just as capable. Shame on Google for having such an anti-Win Phone policy; this isn't Microsoft's fault.
It's quite ironic, though, that Microsoft serves the Android side with official apps for nearly every piece of its ecosystem (Lync, OWA, OneDrive, and the list goes on).
So, Where Does Windows Phone Excel?
I'm quite firmly a believer that a mobile experience does not solely start and stop at its app count. Nielsen's latest findings just solidify my notions here; we have a selection of apps we love dearly and use often, but above that, most of us aren't cycling through adventurous app lists as much as the Apple/Google camps would like us to believe.
I guess I'm a part of that "boring" majority. I'm in email numerous times a day. Lync gets quite a bit of usage for customer phone calls and intra-staff messaging needs. GasBuddy gets me my latest gas prices locally. Pandora gets me music when I'm on the go. And NextGen Reader is a near addiction during spare time; I'm a news junkie and RSS feeds are easily browsed through this excellent app.
There are a couple of other common apps I'm in often, like my Maps app, OneDrive for docs and photos, and OneNote when I need to access my digital notes, but I'm not nearly drowning in apps like some others I know. I guess I'm more of a mobile fundamentalist when it comes to my needs. The things I use are used heavily, but I don't truly care about browsing the Windows Store weekly for new apps.
My mobile needs center around my daily life as someone who runs a busy and growing managed IT services company. I need to stay in touch with staff members first and foremost, as well as family and friends. I don't have too much time to frolic like some may on their phones during the day, fumbling between Facebook and Twitter and whatnot.
I also hate gaming on phones; I'd much rather get on my Xbox 360 and play a few rounds in Halo than bear through gaming on my small Lumia 925 screen. That's just me, though. So I won't comment on mobile gaming since I am no authority in this arena.
Think iPhones or Androids have customizable home screens? Windows Phone 8.1 blows them both out of the water. The combination of being able to size your icon tiles, along with per-app Live Tile capabilities baked into many apps, means that you can truly fine-tune your phone to your exact liking. Gone are the rows/columns of bland app icons you may be used to. This is truly a tweaker's dream, and no two Windows Phones will ever be the same as a result. Conformity is so 2011.
In many ways, Windows Phone 8.1 is a breath of fresh air for me. My history with smartphones has been marred by experiences that used roughly the same UI combination with slightly different coats of paint. BlackBerry always used static rows of icons. iPhone without a doubt had the same. And my previous smartphone, the Galaxy S3, furthered this same boxed thinking, albeit with a twist: the advancement of widgets, of which I used only one (the clock).
Windows Phone tosses that thinking out the window. Instead of dividing your icons into separate home screens or hiding them in folders, Microsoft extends the same concept that exists on Windows 8.x tablets/desktops -- you guessed it -- Live Tiles.
Live Tiles offer many advantages over the traditional, bland icon approach of all the other ecosystems. First off, Live Tiles can be resized into numerous "tiled" sizes, ranging from very small, to larger boxes, to some that stretch nearly the entire width of your phone into a rectangular shape. The shape options depend on the developer of the given app you are resizing; most of Microsoft's first-party affairs take advantage of all size options, as a rule of thumb.
They also double as what Android's separate widgets provide (again, on a per-app basis) by automatically flipping into informational cards. My weather tile, as shown in my actual current Start screen combination above, for example, happens to switch between two screens. One shows me the current day's weather at large, and the other brings up a quick three-day at-a-glance view.
This combination functionality not only saves on screen real estate (as Android's widget system requires an app for the program, and then a separate widget for live info) but also allows for excellent customization of the home screen. The combination of tile placement, size, and transparency over a background image of your choice is by far one of the slickest UI decisions I have seen on a phone.
And the animations Microsoft baked into the OS, from scrolling through the tiles to going in and out of apps, are elegant yet not overly flashy. Just the right mix of modern muscle and minimalist design.
Coming from Android, I must say that being able to swipe up/down to get to all of my apps, and to see status info across all of my Live Tiles at a single glance, is a big benefit. I downright hated the limitations of Android's home screens. Icons could not be resized, and you had to work with multiple screens, like the iPhone, in order to list all your apps out. As such, I was always making concessions about which apps I could have on my main screen, as I didn't want to scroll between screens all the time. It was painful, and seemed so old-school to me.
The transparency feature for background images is only partially effective so far. This is because not all app developers have turned on transparency for their apps yet. You can see how culprits such as Lync, Skype, eBay and others on my home screen above are blocking view of the background. But with each passing week, developers are releasing updates that turn this function on.
When I had my Android powered Galaxy S3, I always loved the powerful notification center it offered. When I originally got my Lumia 925, it had Windows Phone 8 which never had this functionality baked in. Now, with the advent of Windows Phone 8.1, Microsoft has made one of the largest and most welcome changes to the UI on the platform and given most of us what we were wishing for.
The notification center in WP 8.1 isn't anything out of this world, but it's functional enough to get me what I need in a quick manner. I am constantly using it now to check on new emails and texts and to shut off/on my Bluetooth and WiFi as necessary. Placement of the options compared to Android is a bit different, but it works well in the end. All of the same things I was able to do on Android are found here.
Windows Phone 8 got a bad rap for not including a notification center. That oversight is a thing of the past. WP 8.1 now has a very capable drop-down notification area that displays everything you would expect -- quick links to common settings, latest calls/emails/texts, and other pertinent info. It's simple, easy to access, and most of all gets the job done. (Image Source: Windows Phone Central)
I make no effort to hide the fact that I am in love with T-Mobile's WiFi Calling functionality offered on most capable handsets. The iPhone has always been a holdout in this area, and that's one of the numerous reasons I always refused to get one. As a technician and business owner who is constantly onsite with clients, where coverage is not always stellar, being able to take calls/texts over WiFi is a tremendous value add that you can't get (from what I know) on Verizon, AT&T, or Sprint.
And while Windows Phone 8 did include WiFi calling/texting, it had a nasty bug which I could never figure out: for some reason, visual voicemails never wanted to pass through when WiFi calling was on. I would either have to switch over to regular cell signal or wait until I was out of WiFi coverage so that the voicemails could stream in over 4G/LTE. A pain in the butt, especially in bad coverage zones, like the Sears Tower in Chicago where cell service doesn't reach above the 60th floor or so.
Windows Phone 8.1 seems to have fixed WiFi Calling for the most part. Not only do calls hold up slightly better when WiFi bandwidth gets choppy, but I can finally check my voicemails immediately with WiFi Calling enabled. I'm not sure who to thank for this improvement, but I'm sure both Microsoft and T-Mobile had something to do with it. As a WiFi Calling addict, this was one of my biggest gripes about WP 8.
I also want to touch on battery life briefly, as this is an area where my Android experience was rather pitiful. Having WiFi calling on my Galaxy phones was nice, as it helped me extend my battery quite a bit, to nearly a full day when the batteries were new. But getting through a completely full day, with WiFi on, Pandora playing, Lync messages being sent, and email pounding my phone was almost a miracle.
I consider a full day to run from the moment I wake up and remove my phone from the charger to the moment I go to sleep and plug it back in. My days usually start at 6-7am and finish up around 11pm to midnight. That's quite a bit longer than what most consider regular workday hours.
My benchmark for battery life has always been my time on BlackBerry phones, in the OS 5/6/7 days up through the Torch 9810. I would easily get to 10-11pm at night and still have 30-40 percent battery life. And this was on HARSH days with lots of activity. I will caveat this by saying that I did not use as many apps as I do now, but Pandora, email, WiFi, texting, and calling were all fairly similar. The old BB devices just rocked in this area. Their propensity to crash on their Java backbone was another story, though.
But back to battery life. Windows Phone 8 was already giving me about a full day of battery, and Windows Phone 8.1 is squeezing slightly better life out of my Lumia 925. I would say battery life has increased about 10-15 percent since WP 8. This all depends on usage during the day of course, as I also just installed a Bluetooth radio system in my car which I use for all phone calls on the road, and I know this much: BT is not friendly to batteries.
Yet I am very happy with the kind of battery life I am seeing on the Lumia 925 after installing WP 8.1. Perhaps a few times a week I have to plug in early, around 5-6pm, due to extremely heavy usage -- loads of calls over cell signal and Bluetooth, all the while using Bing Maps to route my courses to client sites (relying on GPS, no less). I never remember my Galaxy S3 treating me as well in this area. Let me clarify: the S3 was good, but never great.
Another area that really pissed me off on the Android side was the double standard when it came to email apps. Want to use Gmail on your Android? Of course, Google treats you like royalty with its better-than-sliced-bread Gmail app. But looking to connect your Office 365 account or POP account? Good luck with the awful native "Email" app. It is something you would expect on a Windows Mobile 5 device! I used Office 365 email on Android for a little while through its native email app, but found it to be utterly subpar and nothing like the Gmail app experience.
Yes, there are plenty of alternatives out there in the Play Store, like the fabulous Touchdown, but come on -- really? I have to rely on third party apps to get basic functionality like Exchange-based email working properly, with the features it deserves? I know Google has no love left for Microsoft, but its reluctance to build a better non-Gmail email experience on Android is frustrating. And part of the reason I waved them goodbye.
Already using a Windows 8.x tablet or computer? The up and coming concept of Universal Apps means just what it says: purchase/download an app once, and you have it immediately across all connected devices with a common usage experience. Apple has been hinting at such possibilities on its iOS/Mac platforms, but Microsoft is actually making it happen end to end. The unified Live Tile experience is already present, but it will take some time for developers to catch up on the app end. (Image Source: WinSupersite)
Windows Phone, like iPhone, doesn't treat non-Microsoft email services with the disdain Google shows on its phones. My experience with my Gmail account is as slick as it is with Office 365 and my Outlook.com account. Same interface, same options, and a unified experience overall. There are some aspects I enjoyed about the Gmail app experience on Android, but seeing as that was exclusive to Gmail, I'd rather have a fairly powerful experience that is consistent across all of my various email accounts.
Google's assumption that everyone will be using Gmail or Google Apps is a mistaken one.
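That assumption is ultimately a protocol story, and it's easy to see in miniature. The sketch below, a rough Python illustration rather than anything Windows Phone itself runs, shows how one piece of standard IMAP code can treat two different providers identically; the hostnames are the providers' commonly published IMAP endpoints and the credentials are obvious placeholders.

# One client, many providers: the same standard-protocol code path for
# every account is what a unified mail experience boils down to.
import imaplib

ACCOUNTS = [
    ("imap.gmail.com", "me@gmail.com"),          # commonly published IMAP
    ("imap-mail.outlook.com", "me@outlook.com"), # endpoints; verify yours
]

for host, user in ACCOUNTS:
    with imaplib.IMAP4_SSL(host) as conn:
        conn.login(user, "app-password")  # placeholder credentials
        status, data = conn.select("INBOX", readonly=True)
        print(user, "messages in inbox:", data[0].decode())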
Another aspect I find severely lacking on the Android side is a fast, accurate camera and app experience. The biggest issue with the camera on the S3 was its very slow entry into the camera app and the long wait between photos. That is a thing of the past on my Lumia 925. I happen to use the Nokia Camera Beta app, which is blazing fast (the stock app is quite good, but this one is by far the best), with minimal delay on app entry and time between shots almost as short as that of an entry-level DSLR. Microsoft wasn't kidding when it said it wanted to provide the best mobile camera experience on WP devices.
I use my Lumia now for all client site surveys -- when we need to install a new network or upgrade a server room, to name a few scenarios. I can take easy panoramic shots, shoot video, or take traditional pics. Nokia Camera Beta is quite slick in its interface and I find it considerably more feature-filled than what I had on my Galaxy S3. I also find using the dedicated camera button on the Lumia 925 to be much more fluid than tapping on the touchscreen to take photos, as I was doing on the S3.
I had numerous problems with ISO levels and blur in my S3 shots, problems which are also nearly gone on my Lumia 925. Most shots are crisp, have proper color levels, and the image stabilization is actually darn accurate in most cases. Lumia cameras should be considered the bar for mobile photography.
Cortana is something which I admittedly don't use as much as some others probably do, but the service, just like Siri when it was released, is rapidly improving by the week. At initial launch, my tests with Cortana were just decent. It was missing some of my words and spitting out too many plain text Bing searches when it "gave up" on providing a contextual response. Now the service is becoming more accurate, and the number of searches returned with proper context is inching upwards.
Cortana isn't perfect by any means, but she's learning quickly thanks to the crowdsourcing Microsoft is pushing through the Azure service that powers her. I'm personally letting others continue to guinea pig the service, but I can imagine that Cortana will have the reliability of Google Now within a half year to a year at this rate, perhaps sooner. She's useful for so much more than just searches, and you can dig much deeper into her capabilities in this great video.
It's also hard not to be excited about the future of Universal Apps across Windows 8.x devices. Microsoft let loose at BUILD 2014 that, going forward, development effort for Windows Store apps is becoming unified so that Windows tablets/PCs, phones, and Xbox can all share a common code base and therefore have apps that truly flow across all devices with ease.
Curious about what fruits the Universal Apps model will bring Windows Phone? Office Touch, the Windows Store edition of Office coming soon, will be one of the first heavy hitting collections of such apps for Windows devices. Windows Phone 8.1 users will soon have nearly as powerful of an Office experience as desktop users do -- beating out even the iPad, some say. The future of Windows-powered devices looks pretty bright. (Image Source: WinSupersite)
Think about it in plain terms like this. You may use a Windows 8.1 PC at work. You could also have a Windows 8.1 tablet or computer at home. And say you also pick up a Windows Phone device. When this concept of Universal Apps starts coming together, you will be able to make a single purchase of an app on your work computer, and it will instantly stream onto your other devices without any extra effort, all controlled through your sign-in with a Microsoft Account.
As more apps continue to shift into the Windows Store model, and away from the legacy desktop approach, this functionality will start to have much more benefit for end users entrenched in the Windows 8.x ecosystem. For now, we still have a large distinction and separation between Windows Store apps for the desktop/tablet side and that of Windows Phone, with Xbox being on its own as well, but that future will be changing soon as Universal Apps become the standard.
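As a loose structural analogy (Microsoft's actual tooling is C#/XAML in Visual Studio; this Python sketch is only meant to show the shape of the idea), a Universal App is a shared core of logic with thin per-device "heads" on top:

# A loose analogy for the Universal Apps model: one shared core of app
# logic, with thin device-family "heads" layered over it. Not Microsoft's
# actual tooling -- just the structural idea.

def shared_core(query):
    # Logic written once and shipped to every device family.
    return "results for " + repr(query)

def phone_head(query):
    # Phone head: compact UI wrapped around the same core.
    print("[phone]", shared_core(query))

def tablet_head(query):
    # Tablet/PC head: roomier layout, identical underlying behavior.
    print("[tablet]", shared_core(query))

phone_head("universal apps")
tablet_head("universal apps")

Buy or build the core once, and every head gets the same behavior for free; that's the promise in a nutshell.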
Office Touch will likely be one of the first major flagship app suites to carry this model, and yes, it will presumably be available on Windows Phone the same day it hits Windows 8.x tablets and PCs. Here's hoping that getting serious work done on your phone will finally not be a second rate experience.
Finally, I want to touch on the overall stability of Windows Phone 8.1. Of all the mobile OSes I have touched, this has to be by far the most rock solid, crash-proof platform I have used. The worst of all were my BlackBerry OS 5/6/7 devices, which constantly hung up on Java exceptions and the like.
Once in a blue moon I have to restart my phone because of an odd issue with an app, but this seems induced by rare app crashes rather than general OS instability. Either way, it happens far less than what I experienced on BlackBerry, Android, or even my short stint on an iPhone 4 (which later iOS releases seemed to be pressing to its limits).
If you have any predispositions about the stability of Windows Phone based upon your experience with buggy Windows releases, like Windows 98 or XP or Vista, do yourself a favor and wipe those preconceptions away. Win Phone 8.1 is a completely different animal in almost every respect, and my daily usage with it for the last few months has proven that even to myself -- one of the biggest former doubters of the Windows Phone ecosystem.
Windows Phone 8.1: A Great Mobile OS, But Not Perfect
There's no such thing as perfection when it comes to fluid, changing software like a mobile OS. Windows Phone 8.1 is no exception. While I find it to be the best mobile experience I've personally used to date, it has its fair share of rough edges that people should know about.
As I continue to use my Lumia 925 more heavily on Bluetooth in my car, I am noticing the huge hit my battery takes when the radio is turned on. Bluetooth 4.0 LE is supposed to be one of the features that lands in the final version of Windows Phone 8.1, but it doesn't seem to be included yet. I am sure many Bluetooth users would heavily appreciate this energy-saving flavor of BT, especially mobile warriors like myself. Here's hoping it makes it into the final release from Nokia, along with the upcoming Lumia Cyan firmware update.
While I do think that Microsoft's Bing Maps and Nokia's HERE Maps apps are quite good, I sometimes wish I had Google Maps back, especially in situations where the mapping technology takes me to some bizarre spot instead of the address I am trying to reach. This has happened twice already in the last few weeks. One was my attempt to get to an animal shelter in our area (it took me to a destination about 1 mile off course) and the other was a client address that led me to a spot along a forest preserve. The technology is great when it works, but mistaken addresses seem to be a nagging problem on one of every 10-12 trips I take. I'm sure this could be resolved through some increased development effort on Microsoft's end.
Lync 2013 on Windows Phone runs great for the most part, but it still confuses my staff and me as to why and when it decides to show me as going offline. It's interesting: as soon as a message is sent my way, it will "wake" me up on the mobile app, show me as idle, and the message will come through. But the bigger gripe I have is that Lync calls I push to Bluetooth in the car only work about half the time.
The other half of the time, the call just continues on in silence and the other party doesn't know what is going on. This could be more of a Lync bug, but the Bluetooth aspect could be a fault of Win Phone 8.1. Hard to say, but I wanted to alert potential buyers who are heavy Lync users. Since we switched our company onto hosted Lync through CallTower, our usage of Lync calls/chats has ballooned past traditional phone calls and texts. I really hope Microsoft's Lync team can fine tune the mobile apps so we can rely on them for all client needs without resorting back to the native phone interface.
I also rely on the official Podcasts app quite a bit when traveling. When I've had enough of Pandora music, I like to pop on technical podcasts like Windows Weekly or Mike Tech Show and listen away. The app has a tendency to auto-stream some shows behind the scenes, but the files don't always play properly; I sometimes have to force them to re-stream multiple times before they work. Again, likely a work in progress, but annoying.
My final gripe happens to be centered around my phone of choice, the Lumia 925. The decision NOT to offer the handset in the dizzying array of cool colors that other units have gotten (920, 1520, etc) is beyond me. As such, I went out and got a yellow outer case for my phone in an attempt to jazz it up over the standard silver/white it carries. There are no great colored cases on the market for the 925, so I'm stuck with a half baked aftermarket case that is already on its last legs.
I also have to scold Microsoft for choosing not to include either a removable battery or an SD card slot. Not that I am using SD cards anymore, but I am sure others out there would love to keep using the slot. The lack of a removable battery, though, is something that nearly turned me away from the Lumia 925, as I have always sworn by having a removable battery that I could replace when it died (another area where I dislike Apple's approach).
But I gave in, biting my lip. Here's hoping Microsoft brings back self-replaceable batteries to future Lumia models, as I am sure many people love purchasing replacement batteries for their phones. I did so heavily on my BlackBerry and Android phones, and to me it is a minor, but still strong, selling point.
The Media Is Changing Their Tune on Windows Phone, Finally
It seems that I'm not the only one finally being bullish on what Windows Phone 8.1 brings to the table. Numerous media outlets, even some that scorned most previous versions of Microsoft's mobile platform, seem to be coming full circle and actually praising the new OS.
The Verge wrote a detailed, lengthy writeup on the new WP 8.1 OS, giving it an official score of 8.0/10 (users are pinning it at a higher 9.1/10, ironically) with especially high praise to the excellent Live Tile home screen UI.
Engadget had similarly high praise for the OS, giving it a score of 85 out of 100 with kind words for the new action center, swipe keyboard, and the potential being shown in the form of Cortana.
And ArsTechnica's headline for its formal in-depth piece on WP 8.1 speaks for itself: "Windows Phone 8.1 review: A magnificent smartphone platform".
Even Matt Miller over at ZDNet outlines how far Windows Phone 8.1 has come over all previous iterations, claiming that "Windows Phone 8.1 is compelling, so stop dreaming of Android". He goes on even further, adding that, "Developers continue to roll out apps for Windows Phone and the stability of the platform can't be beat. Roll on Windows Phone, roll on".
I think Windows Phone is an ecosystem that is rapidly improving, finally distancing itself from its image as an OS on the fringe and becoming more of a mainstream offering, thanks to increased developer interest and a high quality handset selection. I couldn't be happier with my time on the Lumia 925 since late last year, and Windows Phone 8.1 has furthered my comfort on the platform.
Should iPhone and Android users consider giving Windows Phone a try? I'll leave that up to you. I myself was a huge skeptic until I dipped my toes, and admittedly find it hard to go back to Android.
Over six months in at this point, all those flashy Galaxy and iPhone devices don't tempt me anymore. I'm already playing with the thought of upgrading to the phablet Lumia 1525 when it hits T-Mobile here.
Windows Phone 8.1 has lived up to my expectations for the most part, and proves that you can have leagues of customization without sacrificing device quality, OS stability, or feature sets. If app counts are your sole measure of the value a Windows Phone can bring, you may be slightly disappointed.
But if you're like most of the population that is App'ed-out at this point (myself included), Windows Phone 8.1 is a friendly, functional OS that will likely surprise even the biggest Android/iPhone lovers.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
For many organizations of 100 users or fewer, a trend is on the rise that is either dismantling formal IT departments entirely or trimming them down to bare minimum levels. Many in the IT industry wouldn't notice it because, of course, they likely work for such a department themselves. It's hard to have an objective viewpoint when you're part of the status quo.
As an outside consultant who works with a variety of organizations small and large, I see my clients and their support structures through a different lens. Being an outsider here has its advantages, namely in being able to see many of these IT departments for what they are.
While many times we're merely called in to help manage high stakes projects, like email system transitions or network upgrades, we're also fielding an increasingly common question: "Can you just take over our IT for us?"
The driving forces behind business owners looking to dismantle their internal tech departments are as varied as the Heinz product catalog. Some are looking to drive down support costs from bloated staff counts. Some are disenchanted by the level of service they are getting.
And many others complain that their tech people don't seem to be aligned with business needs; they're busy pushing their own last-decade agendas and are content to stick to a "No, sir" answer on cost-cutting trends like cloud IaaS/PaaS and other similar initiatives.
This is a tough article for me to write in many ways. I have good friends and colleagues who work in large, formal IT departments in the public and private sectors. And I personally used to be an IT staffer for the very high school district I graduated from.
Without a doubt I don't regret my time spent in an enterprise IT setting. It taught me a lot of what I know today, both in the people and technical skills that gave me the confidence to branch out on my own and form my managed services company, FireLogic. It was a wonderful four years that allowed me to grow and advance as an IT pro.
But that doesn't mean I wasn't aware of the numerous problems which plague large, static IT departments. Groupthink is a big problem afflicting much of K-12 IT (though it's hardly the only segment of IT suffering from it). I dedicated an entire piece to some of the symptoms of said groupthink, namely the tunnel-vision drive towards 1:1 computing.
While it's an end goal I fully agree with, the path towards getting there is fraught with IT and administrative leadership too busy picking technology that can sugarcoat press releases, not tech that truly meets the needs of students and teachers alike for the long term.
But the problems affecting formal IT departments go much further than issues at the highest decision making levels. The IT department as an institution -- one most organizations feel married to without any recourse for divorce or reform -- is one half of the equation. The other half, I feel, is the condescending attitude that permeates many (but not all) who call themselves "the IT guy".
I hate furthering stereotypes, but there is something to be said for why most non-technical people have come to picture the average IT person as something like what's seen below.
(Image Source: ITPro.co.uk)
I'm not here to attack the dress code of the iconic IT geek; I find it rather differentiating. The negative stigma associated with the "IT guy" at your company comes from a mixture of some or all of these traits:
I hate painting the whole field of IT workers with one brush, because I know there are many great individuals at large organizations trying to personify the opposite of the downtrodden, back-closet IT person. But the stereotype for IT departments didn't arise overnight, nor did it happen because the above descriptions were isolated incidents.
Based on the number of organizations I've consulted for, and the number of times I've seen the above problems at play, I know many IT departments are still plagued with these ills. It's tough for a lot of them to admit it. Looking inwards objectively is extremely hard to do. Making improvements based on those findings is even tougher.
I wasn't surprised to find that my opinions were shared by others in my own industry. David Hansson, a founder at Basecamp (previously better known as 37signals), penned a piece simply called The end of the IT department. By far one of the best snippets from his op/ed is:
The problem with IT departments seems to be that they're set up as a forced internal vendor. From the start, they have a monopoly on the "computer problem" -- such monopolies have a tendency to produce the customer service you'd expect from the US Postal Service. The IT department has all the power, they're not going anywhere (at least not in the short term), and their customers are seen as mindless peons. There’s no feedback loop for improvement.
His thoughts echo much of what I mentioned earlier in this piece. IT departments increasingly represent a necessary evil serving users who have no choice in the quality of service they receive. David's reference to the US Postal Service instantly brings to mind the iconic poster boy for everything the USPS stands for: Seinfeld's Newman. You can read up on the iconic portrayal of the USPS in Seinfeld episodes on the unofficial Seinfeld Wiki.
Newman for the USPS is very much a metaphor for what the angry, back-closet "IT guy" represents at IT departments which are seen in a similarly negative light.
David went on to note that "IT job security is often dependent on making things hard, slow, and complex". Regrettably, he's right on the money with that remark. The kind of aging, archaic technology my company encounters at companies with sprawling IT departments boggles my mind sometimes. When we come in as outsiders to propose alternatives that are cheaper to maintain and easier to operate, many times we're met with silence or evil glares from the IT folks.
That kind of resistance to positive change -- to embracing things like the cloud, VoIP, virtualization, and other similar advancements -- is another reason why formal IT departments are driving themselves towards irrelevance. Many of the worst offenders of the above critiques are the ones focused solely on keeping their complex "machine" oiled and sputtering away. Business as usual, you could say.
Legacy Exchange servers that could easily be converted to Office 365 or Google Apps. Creaky application servers that could very well be virtualized on platforms like Azure or Amazon AWS. Even perpetually out-of-space file servers that could be tossed right into Azure, onto NAS boxes, or up into SharePoint Online keep chugging away thanks to the fervent, frequent attention of a preoccupied IT department whose time could be better spent.
Focusing on business objectives, end user training, or converting the department itself into an actual profit center would serve an organization ten times better than indefinitely wasting resources on menial IT tasks better suited to the 1990s.
The famous US sitcom Seinfeld featured a character called Newman, who portrayed the introverted superiority and subsequent plight of a postal worker in countless memorable episodes. He's a direct metaphor for how some of today's IT workers are giving their departments a bad rap: a culture of entitlement, a view of users as mindless cogs in the wheel, and a refusal to embrace modern cloud technologies. (Image Source: Weasel Zippers)
The narrow-minded dilemma of IT departments viewed harshly by the users they support comes from a twisted belief that modernization will force them into obscurity. A notion that every server decommissioned means one more tick on the job-loss clock; that each extra cloud service a company employs will lead to more internal IT layoffs. And this belief extends to any facet of moving IT responsibilities away from internal staff and out to vendors or managed service providers (MSPs).
While I'm not here to claim that every organization can be run without a formal IT department, there are indeed many who could very easily make do. A decade ago, anything an organization wanted to store, automate, or analyze having to do with a computer needed to be somehow managed internally. Today, the plethora of options surrounding XaaS (anything as a service) has unchained organizations from the reliance on doing everything themselves through capital expenditures.
It's not that IT departments cannot innately succeed. They very well can in many situations. But if there is one thing that is killing their reputation faster than anything else, it comes down to one word: trust.
The Trust Factor: How IT Staffers Are Kindling Their Own Extinction
Trust is a value that takes a long time to build, but can be easily broken in a very short period. Even an IT department that had a good relationship with the users it supports for the last 15 years could bust its image as a key company asset in short order. And this usually stems from those who make up the department -- the "face" of IT for an organization, whether it be two or twenty people.
Computerworld penned a piece tackling a similar angle on whether IT departments are on their way out. The article's author, Steve Ragan, actually fueled his op/ed with content submitted by a Twitter user who responded to a very simple question: "What's a dumb practice in IT that needs to go away?".
The response, of which I will republish only a sliver here, was as follows:
[One] big problem is the arrogance that emanates from IT departments that they're somehow important to the organization. I believe that was the case 15 years ago, when people would hire you for simply knowing HTML [or knowing how to use FrontPage], but now the credit card and cloud provider phenomenon has allowed businesses to say "f*** you IT that charges me 10 times as much money, takes 20 times longer and still gets my requirements wrong".
The bolded statements are my emphasis, of course. But the meat of the discussion connects back to a core concept I keep returning to: IT being its own worst enemy. Can a business or organization survive without a formal IT department in the modern era, where servers are being swept away into the cloud and XaaS is the answer for offloading nearly anything IT related these days?
IT workers aren't stupid. They have seen the developments in manufacturing, auto work, coding, and other sectors where offshoring and outsourcing are viable business alternatives. And rather foolishly, they have been fighting a simmering battle with the organizations that rely on them. Simply put, it's a war full of FUD, aimed at making it seem nearly impossible to outsource any facet of operations due to overblown security, privacy, financial, and other concerns.
Not all of the pushback is hot air, mind you. I've penned pieces about some of the legitimate issues to be contemplated when, for example, deciding whether to move major systems into the cloud.
But time and time again, for at least the organizations we are being asked to come in and consult with, the IT department is either partially or fully taken over by a team of No-Men. "No, you can't use the cloud for file storage". "No, you can't move your phone system into the cloud". "No, you can't virtualize servers in a cloud-hosted IaaS environment". "You don't understand... if you want it working, I HAVE TO MANAGE IT MYSELF".
Just as David Hansson alluded to in his brilliant piece, these legacy IT crusaders are hell bent on keeping all IT operations as complex, on-premise, and difficult to transition as possible. This is the same reason we see things like one-man knowledge silos (key IT people who refuse to share critical passwords and info with others); a reluctance to cross-train others on specialized technical systems; and an attitude that treats anything with the word "managed" or "cloud" in its description as heresy.
He remarks on the backwards mentality engulfing many IT departments today:
If the Exchange Server didn't require two people to babysit it at all times, that would mean two friends out of work. Of course using hosted Gmail is a bad idea! It’s the same forces and mechanics that slowly turned unions from a force of progress (proper working conditions for all!) to a force of stagnation (only Jack can move the conference chairs, Joe is the only guy who can fix the microphone).
The negativity surrounding the relationship between end users and the IT staff is a vicious circle, one that can be encapsulated in a theory I call the "Circle of Distrust".
It's a festering loop because each set of individuals involved feeds off the other, fostering a continual downward spiral in relations.
The IT department naturally views users as an unfortunate symptom of the "beast" they have to maintain. That complex, expensive assembly line of an IT infrastructure at their disposal provides them with wonderful employment, but without end users, such systems are meaningless. Besides, why would an organization invest in complex systems if it didn't get business value from them?
And so if end users are to achieve business value for the organization, they have to man the wheel for their respective aspect of the complex machine. This is the part some of today's IT staffers dread. But it's a funny dilemma, actually -- sometimes an outward result of the very Leviathan that these IT staffers pushed for, purchased, and subsequently implemented.
When this or that feature doesn't work, or a particular server needs to be rebooted for the umpteenth time, it tends to always be seen in the light of users being too needy, too impatient, or not knowing enough. To the IT staffer I've described above, end users become the worst part of their job.
"If only IT could be without end users...". I'm sure this is a wish many ponder at night.
The flipside of this dilemma then grows from the friction caused by an IT department unwilling or unable to support end users in a proper manner. End users and execs end up losing faith, or trust, in their own IT department, and the downward spiral begins. A shift to BYOD starts to take hold. Cloud providers are picked without the knowledge of the IT people. End users begin finding answers to their own problems, even if the solutions aren't in line with what "IT wanted".
The Circle of Distrust envelops the very relationship that should be fostering competitive advantage.
Is it any surprise, then, that the BYOD era has walloped IT so fast and so hard? The "No, sir" generation has spawned a symptom of rogue IT exemplified by BYOD users. The ones using their personal tablets in meetings. The ones connecting their personal smartphones to company email and leaving corporate-issued cell phones in a desk drawer. And the same ones yearning for SaaS alternatives to the complex, slow, and error-prone solutions offered up by internal IT.
Many tech news pundits like to point the finger for the BYOD problem solely at end users trying to get around formal infrastructure. That notion of rogue super-users would be believable if the statistics showed the problem to be far smaller in scope and scale. But BYOD has become an epidemic, not a scattered problem that can be solved through rounds of whack-a-mole.
The above graphic, which can be found on everything from shirts to mousepads aimed at help desk staffers, exemplifies everything that is wrong with the modern self-infatuated tech department. The end of the IT department may very well be hastened by its own doings. (Image Source: Zazzle.com)
Those IT departments that spent the last decade saying no to the cloud, XaaS, and capable third party devices are now paying the price, in some ways, for pushing the age-old policy of "what we choose is all you can use, period". In many ways, the consumerization of IT and the prevalence of the cloud over the last half decade have shown average users that perhaps the IT department doesn't always know best.
And this is exactly where the divide between the tech pros of yesteryear and those that will thrive into the future will continue to grow. Rod Trent of WindowsITPro calls it the age of Cloud IT. Jeffrey Snover elaborates further, and believes it involves the clear evolution of the "click next" technology professional into something best described as Differentiated IT.
Whatever you wish to call the next flavor of cherished IT worker, one thing is invariably clear: the status quo of the archaic tunnel-vision IT pro is nearing its end.
Can The Cloud and MSPs Dismantle the IT Dept?
ComputerWeekly very frankly thinks that the end of the IT dept is situated right smack in the cloud. Basecamp's founder David Hansson has similar conclusions about how SaaS is turning a reliance towards internal IT on its head.
Even InfoWorld's J. Peter Bruzzese isn't afraid to tell it how it is: there may be no future in on-premises IT sometime soon.
Bruzzese says of traditional IT:
The shift to a full cloud-based infrastructure-as-a-service (IaaS) or platform-as-a-service (PaaS) model is imminent. Yes, in the next five years there'll be hybrid and convergence solutions teed up across the board to provide a transition from on-premises to cloud, but ultimately there'll be little on-premise IT left for admins.
Could all of these industry voices be wrong? It's naive at best to think everyone is colluding against the powers that be within traditional IT. The interesting part about most of these voices, including my own, is that we are well acquainted with both sides of the on-prem and cloud arenas. And from everything I can gather, at least from the above pieces, the cloud is a "fad" that is anything but -- and it's not going away quickly or quietly.
The other factor allowing organizations to chip away at, or entirely dismantle, their IT departments is managed services providers (MSPs). Just do some Googling and you can find outfits willing to take over any number of traditional IT operations, usually for a fraction of what an entire internal team of IT staffers costs.
From disaster recovery, to cloud-hosted PBX solutions, to file servers in the cloud -- the options are plentiful, and pricing in many of these arenas is extremely competitive as providers fight to carve out their share of these up-and-coming offerings. Due to the differentiated expertise they bring to the table, along with the economies of scale they can build around their offerings, it is becoming increasingly hard to argue that insourcing is the better option.
My own company, which started off merely doing computer repair for local area residents, has now come full circle and specializes in providing outsourced IT for small and midsize businesses. Just two years ago it seemed like no one was interested in such a concept -- their computer broke and they just wanted it fixed. Fast forward to today, and customers are calling us nearly every week to see how we can either partially or fully take over IT operations for them.
Organizations I speak with are, in many ways, not only fed up with the "No, sir" attitude coming out of their own IT departments, but finally realizing that they don't have to endure it any longer. While larger, static organizations have a tougher time overhauling how they handle IT, small and midsize organizations are leveraging combinations of the cloud and MSPs to solve their previous ills. And in most cases, offloading IT externally is saving money, reducing headache, and delivering solid results.
Is the cloud being its own inadvertent referee in some ways? Most definitely. It's not only bringing price points down on commoditized IT, but it's also leveling the playing field and allowing organizations to dole out IT operations to parties that can provide better service at lower cost with less staff overhead. What's not to like?
In the end, it's a win-win for everyone involved. Except the "Click-Next" IT pro.
Photo Credit: Ahturner/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
As we enjoy the restful Memorial Day weekend here in the States, I finally have a chance to relax from the hustle and bustle of tech consulting life. In my leisurely net browsing, I came across an interesting conundrum raised in an ArsTechnica article by Peter Bright, titled simply "Lync 2013 is everything that Skype should be. Why do they both exist?"
It's a very intriguing question that led me to think a bit deeper about this admittedly two-faced strategy coming out of Microsoft. This is especially true for me because I've had the chance to get knee deep into Lync since we ditched Google Apps in favor of Office 365 last year.
Lync has been a refreshing and very capable internal communications client that puts Google Hangouts to shame, and also trounces Skype in many areas that most people don't know about. After investing many hours in Hangouts and Skype alike, I won't lie: once you go Lync, there's no going back. It's not perfect, but then again, Skype and Hangouts aren't shining IM platforms, either.
But there's a ballooning elephant in the room which Microsoft doesn't want to tackle head on. There's Skype. And there's Lync. Does it make practical sense to continue pushing two different, yet interoperable, VoIP platforms? I can't imagine I'm the only one who has quit using Skype because Lync fully supersedes everything Skype does.
Both the Lync and Skype clients and related ecosystems exist today, with little talk from Microsoft on what the long term future holds. The company is working diligently on completing the Skype-Lync network integration, but this only addresses connectivity between the two platforms.
It's fair to say that there is little reason that two ecosystems need to exist for the long run. As I'll show, Lync indeed does everything Skype does, and brings a lot more to the table as well.
The sole acceptable argument in favor of keeping Skype alive -- that there should be a "consumer" play in the form of Skype and a "business" play in Lync -- loses steam in my eyes the further Microsoft consolidates services into Office 365. Keeping two distinct brands alive that serve relatively the same end purpose becomes nonsensical as time goes on.
It's why we have OneDrive (personal) and OneDrive for Business. It's why there's Office 365 Home Premium and Office 365 for Business. And it's the same reason Windows 8.1 and Windows 8.1 Pro/Enterprise exist.
But look at Lync and Skype, and the lines become very, very gray. Microsoft calls Skype the VoIP client for personal needs, but then offers dedicated landing pages on how to use Skype in your business. Likewise, I'm told that Lync is meant for work, but Microsoft is busily bridging the Lync-Skype ecosystems for full IM/voice/video chat interoperability. In the end, it's a branding message that is turning into a crockpot of use cases, with no proper delineation either way anymore.
Peter Bright of ArsTechnica pitched the very question that got me thinking: why do we still need Skype when Lync already exists? Peter's pitch for getting rid of Skype is quite valid in numerous areas. Not only does Lync carry functional advantages over Skype in numerous valuable aspects, but it's also wrapped in an arguably cleaner UI, and doesn't rely almost exclusively on closed protocols/standards like Skype does.
Yes, I know full well that Lync's Mac client is complete garbage right now. But Microsoft has made it clear it is going to make things right on the Mac side come the next Office for Mac release, hitting sometime this summer or fall.
The comments section of Peter's article is what truly piqued my interest in providing a response to his piece, and in expanding on the case for why killing Skype would be a very smart move.
Some people hit back at Peter, claiming, among other things, that there are numerous political issues holding back the unification of Skype and Lync. Others made the case that there is an infrastructure divide within Microsoft in terms of how its corporate services (Office 365/Azure/etc) and consumer offerings (Outlook.com/Skype/etc) are doled out and supported.
And yes, there was even mention of Lync's convoluted pricing scheme with its staggered CAL functionality/price points, which I've long argued need to come down. I can't even count the amount of time I've lost trying to coach clients on the nuances between the Lync Online editions and the on-prem CAL licensing (no less than 5 flavors between the two modalities, which is pathetic, I fully admit).
But licensing cleanup is something Microsoft could easily handle if it had the internal push to do so. Windows 7 had six distinct SKUs; Windows 8 came out with only three (not counting RT). And Office, at least on the subscription side of the suite, now delivers a common application set for the most part. There's little reason Microsoft couldn't clean up this picture in a potential unified Lync/Skype world, for whatever prosumer options it would offer on a paid basis.
But the real answer to the question raised by Peter goes much deeper in my opinion. After taking some time to dig into the overlaps and differences between Skype and Lync, and investigate some of the surrounding circumstances going on with both products, I tend to agree with him at face value.
If I were in charge of Skype and Lync, I'd be cutting Skype out of the equation entirely, albeit gradually. Am I crazy?
Perhaps not as much as some may think.
Skype vs Lync: The Functional Limitations of Skype
The Microsoft Skype team boasted in April 2013 that its service now rolls through about 2 billion minutes of connectivity between users per day. That's without a doubt a large number, and one of the reasons why Skype is by far the most widely used VoIP chat client on the planet.
But most widely used doesn't necessarily translate into best. Let's not make that mistaken mental leap. Skype has a substantial market lead not because it offers revolutionary features or capabilities, but because of the obvious licensing and targeting decisions Microsoft has made (so far) with the way it markets and sells Lync.
Yes, I know full well Lync is primarily a business-oriented offering. But who says this will be the case indefinitely? The licensing bureaucrats at Microsoft could slay the licensing demon associated with Lync quite easily if they really wanted to -- if there were a compelling business case to do so. And I say there definitely is.
What if there actually was a consumer edition of Lync with a similar feature set as its paid offering, with some frills missing? Would you really choose Skype over Lync? I'd argue that the answer is no -- given, of course, that you're actually aware of all the limitations Skype brings to the table.
For starters, let's have a look at the number of participants you can have in a group chat on Skype and Lync. Group chats used to be a Premium feature; Microsoft very recently unlocked them and now allows up to 10 people on a single Skype chat.
In stark contrast, I can easily do video/voice/IM/screen sharing with 250 people at a time in Lync Online in Office 365. And that limit, as has been rumored at some recent Lync events, will be going up to 1000 to match the on-premises edition of Lync (called Lync Server 2013). For those who say Lync's infrastructure can't scale, I think the numbers speak for themselves.
Skype's desktop sharing capabilities are also rather poor by Lync standards, being limited to a single party at a time. With Lync I can broadcast a screen share (along with take-control rights to any attendee) to the entire group of up to 250 people.
People will bounce back with the obvious claim: well, Lync is a paid service and Skype is free. And they're absolutely right. But there's little reason Microsoft couldn't consolidate its VoIP needs into Lync and offer a free consumer edition, a prosumer edition, and then the standard business editions, with the rich PSTN (business telephony) options baked in at the higher end of the spectrum.
If the full Lync experience can be had for as little as $5/month right now in the Small Business edition of Office 365, how much of the cost is truly associated with Lync? Keep in mind this price tag also affords a 50GB email inbox, spam filtering, 1TB of OneDrive for Business storage, SharePoint access, and other frills. In the Office 365 realm, Microsoft's cost-per-user on Lync must be very, very low.
I'd argue that Lync could easily be spun off into a consumer offering without the hefty licensing that Microsoft pushes onto on-prem business users. Lync is caught up in the draconian licensing department at Microsoft, which still holds too much control over the company's products. As I mentioned earlier, a clear business case for bringing Lync into the consumer arena could easily slim down and eliminate the complex licensing which exists today for the product.
Screen sharing and group chats aren't the only areas where Lync has factual leverage over Skype. Another awesome feature that Skype has never offered is whiteboarding within chats. While this is nice for standard mouse/keyboard users, it really shines in situations where you can leverage touch (or better yet, stylus input) on modern Windows 8.1 devices.
Being able to share thoughts on a virtual whiteboard that can be saved and shared after a group chat is powerful, and its absence severely holds Skype group chats back from being anything more than bona-fide multi-way conference video calls. And if you think Skype's the only one that can freely handle group video calling, Oovoo is already doing 12-party video calls. Skype's so 2005 at this point.
Lync also offers an integrated way to share OneNote notebooks during group chats, which participants can mark up and collaborate on in a seamless fashion during the meeting. Sure, you can open up OneDrive and share out a notebook from there, but it's one more place you have to step into, and attendees all have to be led in the proper direction to gain access. Lync brings the functionality into the chat experience, which is powerful for less technical attendees.
Although Skype recently upped video chat to allow for 10 attendees, Lync still blows it away by offering up to 250 person chats, with 1000 person limits in the works already. Perfect for allowing teachers to webcast with students, among other modern use case scenarios. Add in the capabilities of doing whiteboarding, OneNote sharing, streamlined PowerPoint presentations, and per-app screen sharing, and Lync easily trounces Skype in almost every respect -- all at up to full 1080p HD. Peter Bright was right: what exactly does Skype offer that Lync doesn't? (Image Source: Microsoft)
It's also important to note that Lync has a huge leg up in the security department. For example, a doctor can't legally use Skype to talk with a patient because the product carries zero form of HIPAA compliance. Healthcare in general has to treat Skype with a hands-off approach due to HIPAA regulations. Even though Skype boasts internal encryption, its infrastructure and supporting technologies aren't tested and held to the high standards that Lync is.
All aspects of Lync are fully encrypted as well, but the product takes it one step further by extending HIPAA compliance guarantees from Microsoft in the form of a standard BAA (Business Associate Agreement). This is big for people who want to use video/IM/voice chat with patients or colleagues in the healthcare arena. And healthcare isn't the only vertical heavily regulated when it comes to VoIP. Financial services, the public sector (government), and other industries are in the same boat, needing something more than what Skype offers.
It's also worth mentioning that Skype's approach to codecs has been of a closed, proprietary nature until recently in comparison to where Lync is heading. Skype's audio needs are served by SILK which, while being pursued as an open standard in the form of Opus, is still a codec which is primarily only used in the Skype world. Skype made the move to H.264 for HD video only as well, with non-HD calls still being handled by VP8 as far as I'm aware.
Lync 2013 made the open H.264 its default video codec and supports numerous open audio codecs out of the box, like G.711 and G.722, which are already widely adopted in a plethora of VoIP applications. It's nice to see Skype shifting towards a more standardized codec ecosystem, but for all intents and purposes, Lync 2013 is already there when it comes to utilizing industry standards. As a result, Lync is interfacing better than ever with competing video system and app producers like Cisco, Polycom, and others.
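For a sense of why standard codecs matter to anyone planning a VoIP rollout, the back-of-napkin bandwidth math is simple. The figures below are the codecs' nominal payload bitrates and the textbook 20 ms packetization interval -- illustrative defaults, not Lync-specific guarantees.

# Estimating on-wire bandwidth per call leg for the open audio codecs.
# Payload bitrates are nominal; the packet interval is a common default.
CODECS = {"G.711": 64000, "G.722": 64000}  # payload bits per second
PACKET_INTERVAL = 0.020                    # 20 ms of audio per RTP packet
OVERHEAD_BYTES = 20 + 8 + 12               # IPv4 + UDP + RTP headers

for name, payload_bps in CODECS.items():
    packets_per_sec = 1 / PACKET_INTERVAL           # 50 packets per second
    overhead_bps = OVERHEAD_BYTES * 8 * packets_per_sec
    total_kbps = (payload_bps + overhead_bps) / 1000
    print(name, "~%.0f kbps on the wire per call leg" % total_kbps)

Roughly 80 kbps per leg either way; the point is that with open codecs, every vendor's gear can do this same math and interoperate.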
Developers in the Skype community have also rightfully thrown up their arms since Skype has effectively shut down all access to its previous API which enabled a wide variety of excellent apps to thrive off the Skype ecosystem. Almost all third party apps that depend on the Skype API are either being forced out, or are on their way out (Skype extended a reprieve to some "integral" app areas, like call recording). Things like MP3 Skype Recorder will soon be a thing of the past when Skype shuts its API doors for good, with little insight as to what will come to replace these excellent third party enhancements natively.
The reasoning behind the API closure? Beyond making broad claims of "evolving the user experience" and improving the infrastructure behind Skype, nothing too concrete was shared. Lync, on the other hand, has a widely available API set and SDK which see no signs of slowing down or being shut out from the developer community. I have to question the long term viability of Skype with next to no future third-party development effort providing value-add for the ecosystem.
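To give a flavor of how open the Lync side is by comparison: Lync 2013 exposes UCWA, a REST-and-JSON web API, whose entry point is a simple autodiscovery request. Here's a minimal Python sketch of just that first step, assuming a hypothetical domain and omitting the authentication dance that follows.

# Minimal sketch of Lync's UCWA autodiscovery step. The domain is a
# hypothetical placeholder; error handling and OAuth sign-in are omitted.
import requests

domain = "example.com"  # hypothetical Lync-enabled domain

resp = requests.get("https://lyncdiscover." + domain, timeout=10)
resp.raise_for_status()

# The discovery document is JSON whose "_links" point at further resources.
for name, link in resp.json().get("_links", {}).items():
    print(name, "->", link.get("href"))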
Finally, while the rest of the internet world is actively making inroads towards IPv6, Skype is stuck in an IPv4 bubble with not much public mention of progress in this area. As you may imagine, Lync has had full IPv6 support since the launch of Lync 2013 (and subsequently, the version that runs Lync Online). This topic may not be of much interest to regular consumers, but to the technical community, a lack of formal IPv6 support plans indicates that internal development at Skype may be leveling off to some extent, at least in my opinion.
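If you're curious, checking whether a service even publishes IPv6 addresses takes one DNS lookup for AAAA records. A quick Python sketch -- the hostnames are illustrative placeholders, not claims about either service's actual DNS footprint:

# Does a hostname resolve over IPv6 (i.e., publish AAAA records)?
import socket

def has_ipv6(host, port=443):
    try:
        return len(socket.getaddrinfo(host, port, socket.AF_INET6)) > 0
    except socket.gaierror:
        return False

for host in ("lyncdiscover.example.com", "www.skype.com"):
    print(host, "IPv6:", "yes" if has_ipv6(host) else "no")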
Besides, can you recall any major new features or functionality that Skype has introduced in the last year or two? Beyond the expected introductions of apps for Windows 8.x devices and Xbox One, and the tearing down of the Premium subscription walls, Skype hasn't been blazing any notable functionality trails like we've seen with Lync.
Lync has been getting roughly quarterly releases for new features, including things like spell check, tabbed messaging, and other noteworthy additions.
Microsoft's actions with Skype may be talking much louder than any words it has been letting through the grapevine. Could Skype indeed be slowly but surely shifting into the sunset?
PSTN (Telephony) in Skype: Nice Feature, But Too Many Rough Edges
One of the big areas that is going to, in my opinion, elevate the usefulness of Lync above what Skype currently has to offer is in the arena of traditional telephony support. PSTN, as the telco nerds like to call it. I already made the argument two months back that PSTN-enabled Lync Online is going to be the tipping point for making VoIP viable for the true masses. It may need to trample over Skype in order to get there.
But wait, doesn't Skype already offer VoIP service connected back to traditional phone lines? Yes, it does, but it's been a feature hobbled by numerous limitations which most people don't know about. I was personally very interested in using Skype with a home phone adapter as a replacement for my now-dead OBITalk adapter, which was laid to rest when Google shut down third-party Google Voice calling (via the now-stripped XMPP standard).
However, there's a lot lacking in Skype's PSTN (public switched telephone network) offering. Here are just a few of the biggest caveats about treating Skype as a traditional phone endpoint:
Unless something major changes inside Skype, I don't see them retreating from some of these positions anytime soon. The honest feeling I get is that Skype treats PSTN connectivity like something to be handled with a ten foot pole: we'll grudgingly offer it, but don't expect anything spectacular.
Their continued stance against external number porting alone is reason enough to stay away from using Skype as a primetime telephony option. Mind you, Google Voice has allowed number porting for some time, and even upstart provider Ooma supports porting. Skype's resistance to porting numbers leads me to believe they are holding back for an eventual reality, such as a potential Skype transition into Lync.
It's not tough to connect the dots on the changing landscape that will soon entangle Lync Online with full blown PSTN. Brian Riggs posted an excellent piece on his takeaways from the Lync Conference 2014, namely on how Microsoft is going to establish phone service within Lync Online. While it's already evident that full blown "enterprise voice" isn't coming to Lync Online in the short term, PSTN connectivity will be hitting the service -- and ostensibly, eventually allowing Lync to swallow what Skype provides with a better feature set, better pricing, and much wider reach (potentially touching all Skype and Office 365 users, which is massive).
Voice-enabled Lync has been a tough find in the cloud so far. I've written previously about the poor selection of providers on the market who have been offering hosted Lync voice for business use. Very recently, we have started working with a rather good provider, CallTower, which aims to fill this massive gap and the results thus far have been promising.
You can light up PSTN voice on desk phones, conference phones, and of course the Lync software client, both on the desktop and on mobile devices, to make outgoing and take incoming phone calls. Yes, it works just like Skype, but so much better; it's truly what Skype should have been when it comes to telephony.
But CallTower is dedicated to offering cloud hosted Lync voice for businesses and as far as I know, it has no plans to extend anything for the residential sector. Which is a real shame because the technology they offer is what Microsoft should be striving to bring to the masses in Lync Online someday.
While Microsoft continues to take its sweet time in offering full voice in Lync Online, companies like CallTower have filled the void for clients in offering full PSTN voice support in a cloud hosted Lync offering. The service does everything I wish Microsoft could have done by now: provide a dial tone within the Lync soft phone across all devices; provide a dial tone for traditional VoIP desk phones; and offer interoperability with Skype so one can use Lync as their sole unified communications app. Although CallTower's service targets businesses for now, this is likely a good preview for what a consumer offering on Lync could entail. (Image Source: CallTower)
Some people commenting on the Ars Technica article claimed that it would be tough for Microsoft to re-establish the relationships in the telco sphere to provide Lync what it already offers Skype. How so, I say? It's already been let loose that Microsoft is formally working with the likes of AT&T and Verizon to light up PSTN voice in Lync Online.
If Microsoft could converge what it already has in the form of partnerships for Skype phone calling, which seems to have a global reach thus far, then it could consolidate its efforts into a new consumer offering in Lync Online that could potentially blow away the Skype counterpart. Lync already has full support for e911, traditional SIP interfacing with VoIP phones, and a soon-to-be-public relationship base with telcos for the PSTN connectivity.
Likewise, some commenters claimed that scaling Lync out for a potential consumer offering, and a takeover of Skype's user base, would strain the infrastructure inside Microsoft's datacenters. Complete crock, I say. In dealing with Office 365 on a daily basis, and seeing how we can spin up 50-100 person (and larger) organizations on Lync Online in a half hour or less, it's hard to believe the underlying ecosystem couldn't scale to the levels of what Skype or Outlook.com support today.
Between the massive server deployments Microsoft maintains for Office 365, SharePoint Online, Azure, and all of its other cloud services, I find it hard to believe that Lync couldn't be taken to a grander scale with minimal effort. The backbone is there; the real challenge will be migrating users off Skype and into Lync, along with all the related difficulties of shifting a global user base. Lync apps already exist on all of the major platforms. With the flick of a switch, Microsoft could likely let consumer-level accounts sign into Lync without any major re-engineering of the core backbone.
I wouldn't be shocked if all that heavy lifting is being prepared and tested under the guise of the Skype-Lync interoperability project. This would be the opportune time to bridge the Skype-Lync gaps even further than are being publicly admitted to.
I see little reason a post-Skype future couldn't work well if Microsoft fixed the nagging limitations that Skype refuses to address to this day. And with most of the braintrust and development time being spent on Lync these days anyway, shifting focus to a multi-faceted Lync approach for consumers and businesses would make perfect sense.
Microsoft Has Consolidated Product Lines Many Times Before
Does Skype have a future at Microsoft if Lync clearly has the functional, developmental, and other advantages under its belt? I'd have to argue no. As much as I like Skype, and have used it for numerous purposes in my personal and professional life, I'll fully admit that exposure to Lync has spoiled me in many ways.
No, Lync's not perfect by any means, but it's making vast strides release-over-release, especially in contrast to Skype's trickle of a development pace at this point in its life.
So here's what I'm getting at: it's time for Microsoft to can Skype and roll its user base into Lync for good. Roll it into the Office 365/Lync ecosystem and offer a free edition, with premium features for paid subscribers of a-la-carte functionality like home phone service. Even offering a full-blown edition that comes with Office 365 Home Premium wouldn't be too far off track. Isn't that what Microsoft has already done successfully with Office, and is preparing for Windows as well? It only makes logical sense to continue down this subscription-based model, tying more pieces of the Microsoft ecosystem into it.
As it stands, Skype continues to be a pariah of the Microsoft family. It doesn't get much prominent placement in Microsoft marketing these days, aside from snippets mentioning the free Skype calling minutes included with Office 365 Home Premium and other special promos. It seems like even Redmond has little certainty about how it wants to continue pursuing Skype.
Today, Microsoft still plays a mysterious balancing act between Skype for consumers and Lync for businesses. Yet the two platforms will soon be interconnected with one another. Is there a reason to keep two distinct software ecosystems alive when Lync does everything Skype can do, and much more? It's time for Microsoft to consolidate onto the better client, Lync, once and for all. (Image Source: Telcospinner)
So why not just kill it and get this prolonged death march over with? Skype is merely doing enough to stay afloat, not actually get ahead. With fierce competition starting to arise from the camps of Viber, Oovoo, and Google Hangouts, Microsoft needs to spice up the consumer VoIP realm and not continue to rest on its laurels, as Skype is clearly doing.
If you're thinking that such a move is unprecedented in Microsoft history, guess again. Microsoft has consolidated overlapping product lines more times than one can count on one hand, and the results have generally been for the better.
Here's just a sampling of the instances where they killed off duplicative products with success:
Outlook Express (XP), Windows Mail (Vista), and Windows Live Mail all replaced by Outlook and the Windows 8.x Mail app. Microsoft has a long history of dabbling in half-baked email clients. I'm not the biggest fan of Outlook proper myself (I use OWA for my day-to-day needs) but it does, for better or worse, power the world's largest email deployments. And I've got legions of small business and even residential customers who swear by it. OE died with XP; Windows Mail was a loner that came and went with Vista; and the replacement for both of these, Windows Live Mail, hasn't been updated since 2012 and still carries the Live branding, which is pretty much dead. Microsoft consolidated all of these products into either full Outlook for business users or the new Windows Store Mail app that comes pre-installed with every Windows 8.x device. As Paul Thurrott unveiled a short time ago, Outlook is getting the full touch treatment for Windows 8.x, so everything could eventually come full circle and be wrapped into Outlook for desktop and touch users alike.
Microsoft Works gave way to regular Office. I always hated Microsoft Works. It was full of me-too Office equivalents that did a decent job but pushed their own file formats, which made life a living hell when it came to cross-OS compatibility. Microsoft realized that there was no place left for Works, and consolidated its productivity software efforts into Office. No one in their right mind is clamoring for Works to come back.
The Courier tablet was killed in place of Surface. Microsoft publicly admitted to killing Courier in 2010, shortly after the iPad came out, and you shouldn't be so naive as to think that this had nothing to do with work starting on the Surface. Just look at all the concept art showing what Courier was meant to do, and it's clear that Surface Pro has met all of those needs, especially in the focus on stylus-driven input. We didn't need two half-baked efforts competing against the iPad, and hence, Microsoft took Courier behind the barn in favor of its shinier, more capable cousin, Surface Pro.
Live Mesh killed off; rolled into OneDrive. This consolidation wasn't surprising in the least. OneDrive (formerly SkyDrive) was doing everything that Live Mesh offered, barring some niche functionality few used. Today, OneDrive continues to thrive and Microsoft is actually placing it front and center in marketing efforts, something Live Mesh never received. It rode off into the abyss with as much fanfare as it received when it was introduced.
Custom domain support in Outlook.com killed, and users asked to move to Office 365. Last August I wrote a well-received piece outlining how you could use Outlook.com as a free custom domain email host, since Google Apps cut that functionality out a while back. How short-lived that was. Microsoft recently trimmed this functionality out, and is gradually moving all users of this service onto Office 365 for Business. It makes complete sense, as Office 365 provides much richer domain capabilities and actual phone support for when issues arise -- which is nice because I was on the receiving end of countless emails each month from users looking for free help on a product Microsoft wouldn't take phone calls on. How ironic.
Windows Small Business Server no more. Lots of small business consultants I know loved to install Windows SBS for customers. I was not one of those who fell in love with SBS. The product was in a similar spot to where Skype is today: surrounded by better offerings from Microsoft, with Redmond merely trying to figure out how to cut the umbilical cord without creating too much backlash. SBS bit the dust, and while 2012 Essentials is a half-hearted replacement, it lacks the key features that actually differentiated SBS, like integrated Exchange and SharePoint, and carries other nagging limits. Essentials 2012 was pretty much Microsoft easing small businesses toward Server 2012 Standard, or the proper move to Office 365 (which is likely their best bet, anyway).
MSN TV killed and integrated into Xbox 360/One. This was a perfect example of a service that came way before its time. MSN TV (formerly WebTV) was supposed to provide a rich internet experience in the home theater. Which, as we all know, is now easily accomplished by the Xbox 360 and Xbox One by way of Internet Explorer. Play some Halo 4 and answer some emails, you know, if you really need to on the couch.
InfoPath closes up shop; functionality likely merging into SharePoint. While InfoPath carried a dedicated following of developers who loved creating forms with almost zero coding know-how necessary, all good duplicative things must come to an end. With Microsoft putting all of its online workflow and document organization focus into SharePoint, it isn't shocking that InfoPath had to bite the dust. Seeing that you can now place Access Apps into SharePoint Online with relative ease, plus the other improvements Microsoft is adding to SP on a consistent basis, much of what InfoPath did will likely be replicated on SP fairly soon.
And I can guarantee you there are plenty more where those came from. Microsoft has a good track record of noticing where its efforts overlap, and killing off the logical weak links in its product lines.
Will Skype be the next one on the chopping block?
Unless there's something Peter Bright and I are missing, it's time for Microsoft to double down on Lync and bid farewell to Skype. It was good while it lasted, but in all honesty, Skype, you're in a prime position to be replaced by Redmond's own natural selection.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA, and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council, which shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
While Apple and Google are fighting a FUD war for the hearts and minds of K-12 campuses, there's one area of education that neither has been able to penetrate with success: higher ed. Specifically, I'm referring to the conglomerate of colleges and universities across the US (and likely abroad).
That's because for all their love in the media, tablets have yet to prove their worth when it comes to deep research and content manipulation in the classroom. Real student work comes in the form of content creation, not consumption -- the latter being the area Google and Apple are endlessly infatuated with.
To this end, I wasn't entirely shocked to see Computerworld publish a story questioning whether iPads were making the grade on campuses today. The article featured numbers from a long-running survey by Ball State's Institute for Mobile Media Research, which found that, according to the latest figures, only 29 percent of students own a tablet today.
Advertising professor and director of the above institute, Michael Hanley, called tablets out similarly to how I've done numerous times before: "Tablets are for entertainment purposes, not for writing papers and doing class projects -- key components of higher education".
He went on to say that, "Tablets are fine for reading material and accessing digital files, but for any type of coursework requiring sophisticated design, image manipulation or production, tablets fall far short. Of the 140 students in my classes this year, none used a tablet in class for academic purposes".
This trend of college students thumbing their noses at tablets has been reported in other studies as well. Deloitte found that overwhelming majorities of college kids today own computers and smartphones (more than 80 percent for each category, in fact) and tablets are considered redundant for what they offer.
Louis Ramirez, of Dealnews.com, summed it up rather well. "Unless you’re shooting for a degree in Angry Birds, tablets are a horrible back-to-school purchase. You can’t write a 10-page research paper with an iPad".
The Traditional Tablet Dilemma in the Classroom
I didn't need formal study figures to come up with my own conclusions based on personal experience. In between my busy days working with customers, I'm actually a student myself once again. While most of my graduate school classes are taken online from the comfort of home, I do opt to take some of the tougher classes (like math-driven ones) in a traditional manner. Today's webcasting software, which lets me watch class recordings, isn't up to the levels of what we consider normal on, say, YouTube or over Lync, but that's a gripe for another day.
Nevertheless, I had the chance to take a rigorous financial planning course in-person aimed at IT professionals a few quarters back. The class consisted of about 15-20 people on a weekly basis. Of those, I would say about 50 percent chose to use some kind of digital device to "assist" during the course of note taking. Me being one of them.
Of that 50 percent, I would say that well less than half were on iPads; the rest had laptops or hybrids. I saw one Surface Pro user happily using OneNote, and I joined him with my Thinkpad X230 Tablet in slate mode. No complaints there.
It's no surprise that traditional tablets are failing to gain critical mass on college campuses. Studies continue to prove what has been believed for years already: that non-convertible tablets like the iPad are a complete nonstarter when anything more than rudimentary text or data entry is required. Until more tablet makers realize their shortcomings, and move in the direction of the Surface with keyboard/mouse input, tablets will be taking a backseat to laptops for the indefinite future. (Image Source: SurfPK)
But what were the iPad users doing during class? Besides fumbling around in Facebook or the web browser, the few that actually used the device for some purpose related to class merely had copies of the PowerPoint slides opened on their tablets. In essence, they sat there and thumbed through the same exact thing that was being shown on the projector. An interesting approach to dual screen functionality, I'd say.
I will be honest and admit that one individual attempted to take typed notes on his iPad (via a third-party keyboard), but after trudging along in this manner and trying to keep up with the complex equations the professor was displaying, he gave up. The paper notebook came out and the iPad didn't show up again.
It was rather interesting to sit by and watch, as both a fellow student and observer. All the while, I had no qualms about taking snippets of the PPT slides and inserting them into my OneNote notebook. I had the freedom and flexibility of what a traditional notebook affords, with the digital organization and powerful search that OneNote brings to the table.
I'm not here to pitch the benefits of OneNote. The program sells itself in my eyes. I was just surprised that a company with as many big ideas and consumer wins over the last decade as Apple still hasn't figured out what serious education users are looking for in a device.
I admit my Thinkpad X230T won't win any fashion awards next to an iPad or Surface Pro. But it got a heck of a job done while I personally watched the iPad get reduced to the simplest task any computer could replicate with ease: displaying PPT slides with no contextual note-taking input.
And college isn't the only place I've seen iPads have a tough time. When I used to work IT for a public high school district in Illinois, I was knee-deep in watching segments of our own district, and other nearby districts, try their hand at traditional tablets. From the nightmares of managing these devices, to students and educators constantly fighting the limitations of the platform, I was perplexed at why no one ever took a step back to ask: is this really the direction we should be heading?
Needless to say, my former employer made a smart move and adopted Chromebooks district-wide, but the voices pushing them in the iPad direction were strong and unrelenting. I still think going Surface would have been a step up, but the cost of those devices admittedly comes at a premium for a taxpayer-funded high school environment.
I'm sure I can't be alone in my observations. If the numbers above hold their own, then tablets in their most traditional form are missing the grade in higher education rather miserably.
Consumption-Focused Tablets Provide Little Value for Higher Ed
Apple and Google continue to count their app store ecosystem titles by the thousands, yet they fail to realize this has little relevance on the college campus today. The younger generation enrolled at universities and colleges is well aware that if they are going to choose a device on which to spend countless hours, it very well should be one that is versatile for both the classroom and the dorm room.
The iPads, Nexuses, and Xooms of the world are elegant and crafty electronics platforms for watching Netflix, browsing Facebook, and perhaps tuning into Pandora. But where they excel in consumption-minded activities, they are merely along for the ride when it comes to complex content creation.
I urge you to spare me the mentions of all the third-party input devices for the iPad, like keyboards and complex, battery-driven add-on covers that aim to fill this gap. I'll put it very succinctly: if Apple wanted to be serious about tablets in the higher education sphere, it would have taken steps similar to what the Surface Pro already has. I bet a large part of its hesitation is that it can't afford to stymie MacBook sales any further for fear of pushing its own laptops into irrelevancy. I see the business acumen behind it, but those who really want to replace their laptop with an iPad couldn't care less.
Even the launch of Office for iPad doesn't solve many of the innate roadblocks that finger-input-only tablets present, like the lack of mousing ability to easily rearrange content en masse, manipulate groupings of cells in Excel, or even cut and paste items between PowerPoint slides. While Office for iPad was written with finger limitations in mind, almost two months after its launch, I have yet to encounter any iPad-toting clients who have opted to ditch their primary computer.
The amount of intimacy that a digitized stylus tablet provides is truly unrealized until you give one a try yourself. From easily converting complex drawn-out equations (shown above) to allowing for instant searchability for handwritten notes, modern tools like OneNote 2013 on Windows 8.1 make the iPad version look like child's play. I'd be hard pressed to find many that wouldn't fall in love with a digitized stylus device over an iPad after trying it for themselves. (Image Source: OneNote Blog)
Most honest document-warriors (business pros and higher ed students alike) see traditional tablets for what they really are: complementary devices -- not full laptop replacements in any way. The Surface Pro is the first true tablet hybrid that fits both bills equally well, and many colleges are choosing this route over the iPad in larger numbers each year.
Tom Grissom (Ph.D.) published a lengthy writeup on his education-focused foray into usage of Surface Pro at Eastern Illinois University, and summed it up pretty well: "The Surface Pro is what the iPad should have been". The rest of the writeup is a very intriguing look into how an inking-capable tablet like the Surface Pro can really be embedded into a high stakes learning environment. It's definitely a recommended read for any University or K-12 administrator interested in honest insight on what direction their campus or district should head with tablets.
I understand very well that the iPad grew into a hit on the back of its ease of use when it comes to slurping up content, from games to movies to web browsing. But this absolutely doesn't translate into a classroom-friendly device that empowers students in higher education to dump their computers.
As the numbers above prove, college students continue to entrust their last-generation laptops with most campus tasks. And until other tablet makers realize their fumbles, most devices (other than exceptions like the Surface Pro) will be relegated to stocking-stuffer status post-graduation.
The key argument against tablets still stands: a laptop can do everything a tablet can, but not the other way around. So why would college students invest in three devices (laptop, smartphone, tablet) when two suffice just fine?
How Digitized, Hybrid Tablets Work Where the iPad Fails
Students are saying traditional tablets don't fit their needs for data-driven coursework.
Professors are saying these devices also lack the critical intimacy that can only be replicated by convertibles with true inking and traditional input (mouse/keyboard).
Are these users missing the point of regular tablets? Or are they uncovering a glaring hole with what traditional tablets bring to the table? I'm led to believe the latter is increasingly holding true.
As a graduate student right now, and a recent undergrad, I've been heavily reliant on a more traditional tablet PC (you know, the ones we wrote off in the late 2000s?) for my studies. Previously a Thinkpad X41 Tablet, and now my day-to-day X230 Tablet, which doubles as my primary work PC. The only downside to these devices is the extra heft they bring and the good-but-not-great battery life. But they have all the ports I could want and, most importantly, offer me access to full OneNote with digitized stylus input -- the killer feature for any tablet that hopes to succeed in an educational setting.
Without a digitized stylus screen, tablets are, in the succinct words of Shark Tank's Kevin O'Leary, "Dead to me".
When is the last time you tried to take serious notes on an iPad where there was a need to not only input text, but also complex charts and figures that were being drawn on a whiteboard? You probably haven't.
As I exemplified with my own tribulations in an IT finance course at DePaul, the iPad was never designed for such situations. No one in their right mind is going to try and "finger paint" charts and graphs with much detail or ease. And copying/pasting content from other sources into pages for markup? If you've got time to burn, go right ahead and fool around -- but I can get these same tasks done in less than a quarter of the time on my Thinkpad X230 with a digitized stylus. And the end results look great.
A similar argument can be made for the ability to use a traditional keyboard and mouse with a tablet. Look at any of the reports from campuses that decided to go Surface over iPad and the arguments are similar time and time again. Keyboard capability. Mouse input. Digitized stylus. USB ports. And the benefits go on.
For the uninitiated, there are two primary types of styli for tablets today. Most tablets without a Wacom digitizer, like an iPad or Surface RT, use a "soft touch" stylus, which is nothing more than a more precise imitation of a regular human finger. It takes messy notes and has little precision advantage over a regular finger. Devices like the Surface Pro and Thinkpad X230T contain a Wacom digitizer, which enables fine-point styli that are FAR better for any kind of note taking, drawing, signing, and other activities where detail and "pen-like" capability are key. I would never buy an iPad since a digitized convertible tablet with OneNote 2013 has spoiled me. (Image Source: SoftPro.de)
I know very well hell would freeze over before Apple would consider admitting defeat and opt to really turn the iPad from a toy into a purpose-driven work and school device. It's a shame, though, because if it weren't for Apple's stubborn stance in the computing arena it revolutionized, perhaps it could go after these segments of society that have clearly said no to its novel device.
Until that time comes, I see higher education flocking toward alternative choices like the noteworthy Surface Pro 2, which gives them everything they are looking for.
Tom Grissom, who I mentioned earlier, ran a full fledged 30 day trial on the Surface Pro as his primary educational tool in the college classroom. "I think I have demonstrated over the past 30 days that the Surface Pro is a fantastic choice for teachers, students, and others that want a full-featured tablet device with few compromises".
Instead of continuing to try and put the square peg in the round hole with iPads, I'm hoping others in education will start seeing the benefits that digitized stylus and traditional mouse/keyboard input can offer in the modern, connected classroom.
In the end, it's not about the device in your hand -- it's about the experience it provides.
If we could get over that mental roadblock, education could start getting serious about the 21st century classroom, and stop wasting money on computing approaches that just don't provide classroom value.
Microsoft's timing on a blog post released today, provocatively titled "Outlook Web App provides more efficient calendar delegation and management than Gmail," is rather ironic. That's because I was already gathering thoughts on the areas where this tool is still lacking and needs improvement. So while Microsoft is busy tooting its own horn, I'm going to turn up the heat a bit for a reality check on the part of Office 365 I spend the most time in daily: OWA.
Don't get me wrong -- I absolutely love OWA in Office 365 and have been using it primetime since my IT company ditched Google Apps late last year. But it's not without its rough edges.
Yes, you heard me right: I refuse to use Outlook 2013 day to day. It's a great tool, but it's just not my schtick. I've been using Gmail in the web browser since high school. I then worked for that same high school district while in college, where all staff were given Google Apps in the browser only. And when I launched my company FireLogic in 2010, we went full Google Apps and, you guessed right, I was a browser user 100 percent of the time.
I know very well Outlook 2013 is a fair amount more powerful, but I am a minimalist at heart. I love having the same experience across any machine I use as long as I have a web browser. And the fact that Outlook Web App gets new features before Outlook client now by default justifies my opinion that Microsoft is aiming for feature parity between OWA and Outlook client within the next 5-7 years, give or take a few. Microsoft's vision has been clear even before Office Web Apps became Office Online: the cloud is first priority, and desktop apps will soon transition into a supporting role.
Regardless of whether you agree with me on OWA's shift into primetime focus, I know many users out there rely on it as their primary email interface day to day. Heck, I've got numerous clients who have even refused to purchase newer Office licensing and merely leverage OWA in Office 365 like I do. I don't see that as a bad thing.
The problem is that, for as much as Microsoft loves to toot its own horn about all the new goodies coming to OWA, there are more than a few thorny points which irk me quite a bit. Things I would have expected to be cleaned up already to bring OWA on par with the Outlook client. But even mighty Redmond, the company that sculpted what corporate email has become today, still has some work to do.
As a full time power Outlook Web App user, from an IT pro's perspective, here are the areas I think Microsoft should consider tackling before getting too sidetracked in more nuanced features.
Lync Chat in the Browser: Still a Sad Afterthought
Of all the features Office 365 brings to the table, my favorite has to be Lync Online, hands down. I already raved about how we've ditched GoToMeeting altogether in favor of Lync for web meetings, and since we moved away from Google Apps, it has become our de-facto IM/voice/video chat platform as well. Our team has it equipped on computers as well as our smartphones, and it works (relatively) well. The situation on iPhone and Mac is still a bit pitiful (which Microsoft hopes to squash with the next Office for Mac release this year), but that's a story for another day.
Lync within Outlook Web App is almost non-existent, in my opinion. It shows up in name only, to be more accurate. And it's a bit surprising, seeing how much Microsoft wants to move its products into the browser and away from platform-specific clients. So I'm relegated to using the full Lync 2013 app on my computer, which works great since I'm on Windows -- but my Mac colleague on staff is stuck with that sad excuse for a piece of software, Lync for Mac 2011. To say that it s*cks is a bit of an understatement.
Google has had rich chat of all flavors built into its browser client since the late 2000s. As the above comparison shows, Lync chat in the browser for OWA today is more cruddy than the first release of AOL Instant Messenger. While Lync blows away Google Hangouts in formal meetings, ad-hoc IM and chat in the browser is still Google's game. What gives, Microsoft?
Which brings me to my next point: if Microsoft brought a feature-rich Lync capability back into the web browser, it wouldn't have to fight the uphill battle of feature disparity between Windows and Mac users... and don't get me started on the trickle-down that occurs when Lync Web App users join formal meetings. If you think I'm kidding, just have a look at Microsoft's official client comparison table for all of the various flavors of Lync software. It's a dizzying tit-for-tat feature spar between the various editions, with Lync 2013 for Windows sitting at the top, and it goes south from there.
Is it really that difficult to bring the full Lync experience into the browser? Google does it for Hangouts and Chat, and I even had the chance to experience a browser-based session over BlueJeans during my recent video chat with a Colorado State University night course about lessons learned from the Healthcare.gov mess. They have a terrific browser-based product, I must say, which had a few small hiccups in terms of screen sharing but, for the most part, was as stable as what I get in Lync meetings. And they didn't need an entire separate app; all I needed was a browser plugin for Chrome. I was truly impressed.
And not only should Outlook Web App be able to launch and host online meetings via Lync like BlueJeans can, but also offer the kind of contact and IM integration that I was used to on Google Apps. Of all the things I had to suck up when we moved off Google, the toughest to swallow was the lack of any formal live contact list and rich IM chat within the same browser window. Sure, you can dig into your address book under the People tab and see who is online and converse with them, but the number of steps this takes is far more than anyone would be comfortable with.
It not only breaks your workflow since you have to leave your email screen, but the experience is extremely primitive... so much so that I couldn't describe it in words. You have to see it for yourself. See above.
Oh OWA Calendar... The Limitations You Carry
While the calendar feature in OWA is pretty darn good at face value, Outlook client still has a lot more features in this area. And this is probably where Outlook client has the most feature disparity with OWA.
First off, I applaud Microsoft for introducing calendar search for OWA just a few months back, but one of the biggest glaring deficiencies with this new function? You can't search through shared calendars using it! Outlook 2013 handles this like a champ, but OWA is currently limited to searching only your primary (personal) calendar. This is great for one-man shows who don't work with shared calendaring, but at our company, we live and die by our shared team calendars. So calendar search is only helpful for about 20 percent of the instances I am relying on search for.
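Until OWA's search reaches shared calendars, one workaround is to query the shared calendar directly over Exchange Web Services, which Office 365 exposes. Below is a rough sketch using the third-party exchangelib Python library; the addresses and search term are hypothetical, and it assumes your account already has delegate rights on the shared mailbox:

```python
from exchangelib import Credentials, Account, DELEGATE

# Hypothetical credentials and shared mailbox -- substitute your own
creds = Credentials("me@example.com", "password")
shared = Account("teamcalendar@example.com", credentials=creds,
                 autodiscover=True, access_type=DELEGATE)

# Search the shared calendar by subject -- the very thing OWA can't do today
for appt in shared.calendar.filter(subject__contains="quarterly review"):
    print(appt.start, appt.end, appt.subject)
```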
Calendar printing is also currently very weak in OWA compared to Outlook 2013. I actually just had to set up full Outlook for a new Office 365 customer for this very reason. They had very specific sizing and options they wanted for their printed calendars for office staff, and OWA (we found out) provides only meager options in this regard. Google doesn't have a big advantage in this area, though Google Calendar does offer size options for printed text. Outlook's printing options are what Microsoft should strive to offer in OWA.
While OWA calendar does have Lync meeting scheduling integration, it seems rather rudimentary for OWA users. For example, I can click on a button that says "Online Meeting Settings" but the option doesn't provide any adjustable options for lobby or presenter control. It merely re-states what I already expected to be the defaults for my online Lync meeting. A bit nonsensical if you ask me. Check out how feature rich scheduling is for Lync within Outlook 2013 and this makes OWA look like a toy in comparison.
There are also a bevy of other areas where OWA is still lacking, like no "schedule" view that Outlook 2013 offers. While I don't use it, I know of health offices that do take advantage of this feature. And there is an absolute lack of any integration of tasks into calendar, rendering the feature useless unless you are willing to dig into the dedicated tasks area in the email screen. Outlook 2013 treats tasks like first class citizens as part of your daily work schedule.
For what OWA calendar is, I like it. But there are parts of me that tug at just moving into full Outlook some days.
Tasks: A Segregated Second Class Citizen in OWA
I'm shocked that Microsoft hasn't made more strides toward making tasks easier to use in OWA. Because if there is one universal gripe from former Google Apps customers who move to Office 365, it's that OWA's support for tasks is almost non-existent.
For starters, in order to get into the Tasks screen, you have to find the easily overlooked Tasks button that sits all the way in the lower left-hand corner of ONLY the email portion of OWA. As far as I know, you cannot access tasks from any other screen online. It's so out of the way that most people never even see it.
Another big gripe is the fact that you cannot assign tasks through OWA. Yes, this awesome functionality, which I have to launch Outlook 2013 just to use, doesn't exist in OWA. I can create assigned tasks in Outlook 2013 and save them for viewing online -- but they cannot be edited in any way. See below for how idiotic a limitation that happens to be.
Do you leverage assigned tasks at your organization? Don't try using Outlook Web App to create or edit them; you can't do either currently. Microsoft lets you view these tasks, but beyond that they are merely static items in the web interface. A severe limitation of an otherwise awesome function which Google hasn't gotten right in Google Apps yet.
A lot of users who come from regular Outlook are also used to having their tasks straddle the side of the screen. As far as I can tell, in OWA right now you can't have tasks show on any part of your email or calendar screens. You have to manually click back into the Tasks section (which is notoriously hidden in a corner, a choice I don't understand) and view them on a dedicated screen.
I really wish I could leverage tasks on a more consistent basis, but until Microsoft makes strides in bringing Tasks on par with what Outlook has, I'm going to be sticking to my tried-and-true notepad for one-off reminders. Having to toggle into an entirely separate section of OWA is just a bit too much to bear.
Shared Contacts Not Possible and Other Deficiencies
My company leverages shared contacts in a rather roundabout manner. This is because not only has Microsoft not figured out how to make shared contacts natively available on a smartphone via ActiveSync, but Google is in the same boat. So we get around the limitation by using a secondary dummy Exchange Online mailbox that acts as our contact dumping ground. This allows us to load everyone's smartphone in the field with contacts and have them get updated on the fly as our office manager changes them, and we can map out locations quick and dirty with our phones.
But accessing those shared contacts in OWA is one of the biggest pains in the rear. There is no native way to view that secondary contact list within the People interface of your own account. Unlike Outlook 2013, where you can simply map additional contact lists, OWA forces us to use the "open another mailbox" feature, and only then can we access these contacts. A goofy limitation that means there is no easy way to use shared contacts in OWA. Only a small subset of my contact list lives in my own personal account; we keep all company contacts centralized so that we don't have to duplicate or triplicate our efforts in managing them.
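If you just need to get at that centralized contact store programmatically (say, for an export or a quick audit), EWS will happily read a secondary mailbox's contacts folder even though OWA makes it painful. Here's a minimal sketch, again using the third-party exchangelib library, with hypothetical addresses and the assumption that you have delegate access to the dummy mailbox:

```python
from exchangelib import Credentials, Account, DELEGATE

creds = Credentials("me@example.com", "password")
# The "dummy" mailbox acting as the company-wide contact dumping ground
store = Account("contacts@example.com", credentials=creds,
                autodiscover=True, access_type=DELEGATE)

# Dump every shared contact, guarding against entries without email addresses
for contact in store.contacts.all():
    emails = [e.email for e in (getattr(contact, "email_addresses", None) or [])]
    print(contact.display_name, emails)
```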
Working with mass sets of contacts in OWA is also under-developed; as far as I can tell, bulk operations simply can't be done right now. You cannot select multiple records for deletion, nor can you move multiple contacts at once to different folders.
Not to mention, there are no optional views like Business Card, Card, or List views that Outlook 2013 has. This means that if I want to make any mass searches across our contacts database, I have to go into full Outlook and use the power of the desktop client. Why, Microsoft ... why?
The Little Stuff Matters, Too
Stupid little bugs also tend to perplex me within OWA, along with other oddities I've encountered but forgotten about at this point. Truth be told: OWA still has some catching up to do. As a power user and an IT pro, it didn't take me long to uncover OWA's shortcomings once we switched onto Office 365. I still don't have any regrets about using it, but these are some of the thorns I deal with on a daily basis.
Some people would just say switch to Outlook 2013. I'm a big optimist on OWA, I guess.
Outlook Web App Continues Growing Up
It's interesting to see where OWA is heading from Microsoft's perspective. I am intrigued to see that Microsoft is continually pumping money into bringing OWA up to par with Outlook, as was showcased just about a month ago with a blog post that outlined exciting changes coming. Of the three new features outlined, two of them really caught my eye -- Groups and the new attachment experience for OWA.
The third was a new feature called Clutter, which aims to mimic what Gmail is doing with its new tabs view, but in a different manner. This doesn't excite me much, as I don't consider anything left in my inbox non-important (yes, I'm an inbox clean freak). For true email-overload junkies, I can see this being a bit more to their liking.
Groups for OWA is an interesting feature that will allow us to have a unified collaboration experience between Yammer, SharePoint and OWA. Ideally, you will be able to interact in discussions with members of a group in a streamlined fashion in OWA without having to go back into Yammer. We're not using Yammer yet at FireLogic as we are too small to find much advantage from it (Lync is serving us just swell) so it's tough to say whether this would come in handy.
My favorite is the new attachment experience aiming to allow users to ditch traditional attachments and instead bring their files into the OneDrive folder immediately. In the screen below, you can see how a PowerPoint file has a cloud-y logo pasted onto it, which means the document is OneDrive-enabled and also can have permissions adjusted for users who will get access to the file. You will also be able to pluck files for attaching right from OneDrive (and SharePoint doc libraries, I presume?) using the rich attachment feature that will be launching with the upcoming edition of OWA.
The new attachments experience coming to OWA will finally bring it up to speed with what Gmail has had for some time. Namely, users will be able to attach items with OneDrive integration from the get-go, a process that not only establishes placement in the cloud for central editing, but also helps create the permissions needed to collaborate on the file. I think this is a great move in the right direction, and long overdue for OWA in light of OneDrive's presence in Office 365 for the last year and a half. No timetable yet, but Microsoft is promising a release in 2014.
I'm interested to see what further improvements the next updates for Exchange and Exchange Online will bring, as these tend to unlock new features for OWA as well. I'd love to see shared contacts support in OWA, as well as a tasks integration that doesn't stink. I don't think OWA is a bad web interface by any means; it just needs the polish that Google has placed into Gmail for the past decade or so.
I'm confident that we will get to where OWA needs to be. And yes, I do believe there will be a day when full Outlook is no longer the primary way of accessing Office 365 (Exchange Online) or other Exchange-powered email accounts. Seeing the direction that Office Online is moving really excites me, and I hope the continual improvements keep on coming.
But if Microsoft wants to wow its user base with new features like Groups and the new attachments experience, I think it needs to remember that the small stuff counts too. Those items which add up to a lot of daily annoyances, like the inability to do assigned tasks in OWA or the pathetic lack of any proper intra-browser Lync chat support, are inexcusable this late in the game. If they can't fix what's already broken (or missing) compared to Outlook 2013, how can they claim to be trouncing Gmail?
Wake up, Microsoft. There are a lot of users out there like myself who have ditched the Outlook client, or are ready to do so, and are wondering if they are making the right decision. If it's a cloud-first game now for Office 365, Outlook Web App deserves the features that we have become accustomed to in Gmail and Outlook 2013.
Image Credit: Syda Productions/Shutterstock
Government IT projects have a tendency to fall on their rear ends more often than not. After the miserable debacle that was Healthcare.gov last October, I made the case for why the larger-than-life public face of Obamacare had zero chance of succeeding in its original form. Fast forward six months, and after some contractor firings and a public about-face consisting of a "tech surge", the website is finally working at nominal levels.
That's not to say no one took the fall for this bungled IT project gone haywire. The former head of US HHS, Kathleen Sebelius, had no choice but to step down and take the hushed blame for the mess that unraveled under her command. Publicly, the story goes that she stepped down of her own volition. Behind the scenes, I highly doubt that's the entire story.
But I'm not as interested in the political ramifications of this failed project as I am in understanding the underlying issues and, more importantly, what we have learned and where we go from here.
To that end, I received an invite from Colorado State University associate professor John Hoxmeier, who asked if I could join his BUS 630 night course via online video feed, and I couldn't pass up the opportunity. I was asked to discuss some of my insights into what really caused the Healthcare.gov mess and what lessons we should take away to change course on future federal IT projects.
That session took place just about two weeks ago, and I thought it would only be appropriate to share some of the topics we discussed that night as a formal followup to my previous article on the Obamacare site.
If the Private Sector Can Do Massive IT Projects Right, Why Can't Government?
Representatives from one of the primary culprits in the Healthcare.gov mess, CGI, claimed last year that the website project was ultimately "challenging, unprecedented and complex." It was a nice, meandering way to admit that they clearly weren't cut out for the work, and had coasted on knowing the ropes of government IT contract bidding -- not technical project expertise -- which I will drill into later.
To put things in perspective, our government has paid more than double what CGI was originally budgeted to receive, close to $196 million so far, for its part in piecing together Healthcare.gov. And while Obama went on the offensive to defend the "tech surge" going into fixing the project, the private sector came in to show just how easily Silicon Valley could have built a working Healthcare.gov from the get-go, for chump change compared to what veteran contracting firms are invoicing the American public for.
HHS has already publicly admitted to dumping over $677 million into the Obamacare site (and that figure was only through October 2013, mind you), only to have signed up a purported 8 million people as of March 31, 2014. Some simple math tells us the site has come in at a cost of nearly $85 per signup. The likes of Amazon and Microsoft would have been lambasted in the private sector by now if their cloud ecosystems rode on such terrible metrics.
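The back-of-the-napkin math is simple enough to sanity-check yourself:

```python
total_spend = 677_000_000  # HHS-reported spend through October 2013, in dollars
signups = 8_000_000        # reported enrollments as of March 31, 2014

print(f"Cost per signup: ${total_spend / signups:.2f}")  # -> $84.63
```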
In contrast, take a peek at the simple yet effective Healthcare.gov clone website HealthSherpa.com. The site performs nearly everything that the front-facing part of Healthcare.gov does, including allowing people to easily window shop for plans by zip code, and dig into premium and deductible combinations depending on family size, income, PPO vs HMO, and other factors. It's arguably just as visually appealing as the government's website, just as simple to navigate, and downright easy to even sign up for a plan of your choice.
Meet the brains behind the Obamacare website alternative, HealthSherpa.com, that was built on a budget with far fewer zeros at the end. These coders from San Francisco built the Healthcare.gov-lite clone in just three days. Imagine if the US government tore down the draconian walls keeping Silicon Valley out of most public sector projects. Without deep change out of DC, public sector IT projects have very little incentive to succeed, let alone innovate. (Image Source: TheBlaze)
Before the internet hounds pounce on me: yes, I will admit that the site does not handle the other half of what the Obamacare site entails -- behind-the-scenes processing between numerous federal agencies and insurance companies. But that doesn't make the site any less worthy as a proof of concept.
If you think HealthSherpa took dozens of programmers, hundreds of man hours, and a large budget to construct, guess again. Three dudes in their twenties from San Francisco built the site in about three days on a shoestring budget of a couple hundred bucks. To say they could have single-handedly saved the government hundreds of millions of dollars is probably not too far off track.
And HealthSherpa isn't the only example of the private sector leading the innovation charge on a considerably smaller budget. The Sochi 2014 Olympics may be a fading memory at this point, but they laid claim to a massive IT undertaking that went off without any noticeable glitches.
I'm talking about Microsoft's handling of the live video streaming for NBC two months ago, which I covered in a previous story. Windows Azure did all the heavy lifting for nearly all of the American footage that was streamed for the global event, and the scale of the effort puts the Obamacare website to shame in every comparison.
Here's some food for thought: CGI wanted to take comfort in the notion of how unprecedented the Obamacare site was in terms of complexity. I'd say broadcasting the largest global sporting event for two weeks straight, with no room for error, is pretty darn complex, too. And Microsoft is far from a global media-streaming specialist, to boot.
I'm not a developer at heart, so I can't attest to the complexities of database design and integration at the kind of scale Healthcare.gov needs. Perhaps comparing the site to Azure's streaming of Sochi 2014 is in some ways unfair. But draw back to the amount of money and time that Healthcare.gov has had, not to mention the number of supposed technical minds at the helm of its development, and one must wonder how it could have crumbled so miserably under even the lightest realistic loads.
It's no secret that the private sector can deliver better results in shorter timespans with less money. Yet there is more under the hood causing government IT to fail so miserably time and time again.
US Government IT Contract Bidding: Flawed In Every Way
Our Founding Fathers put forth one of the greatest democracies the world has ever seen. The checks and balances afforded by the institutions of the US government are unparalleled in any other country today. So why can't this same system tackle IT projects in as efficient a manner as other proven facets of top-down governance?
Simply put, government contracting in America was molded for a time when projects entailed static infrastructure work like bridges, railroads, roadways, and highways -- not complex, fluid IT projects like the Obamacare site. Such last-century needs were fairly straightforward in complexity; easy to plan with waterfall project methodology; and most importantly, once finished, they didn't need anywhere near the kind of ongoing attention and inter-agency cooperation that IT ecosystems require.
Fast forward to today, and you can see how a bureaucracy carved out for a 20th century industrial mindset is abysmal at tackling 21st century IT projects. It also becomes a lot clearer why pitiful companies like CGI continue to get re-awarded massive projects even with terrible track records in the driver's seat.
Government IT sweethearts like CGI and QSSI don't hold much weight in expertise or execution compared to the bevy of firms that make up the best of Silicon Valley today.
But they do have a competency which government loves to bask in: red-tape navigation.
To get a feel for just how misguided the federal government bidding system is, I had the chance to talk about how this process works with a to-be-unnamed client of mine in the construction industry. They provide general construction services for various US government property from military bases to agency branch offices.
Take a fairly simple construction project, like a new building development or expansion of an existing property. The private sector has various guidelines it has to follow to comply with safety and regulations, but it's fairly straightforward and doesn't require vast amounts of endless paperwork.
This notion doesn't exist when handling similar work for the US government. A bundled mess of never-ending red tape must be complied with not only up front, pre-project, but consistently during the entire project until it's finished (and sometimes well after the project is done and signed off on). It's a workload the average construction foreman or engineer cannot even handle; it requires a dedicated position on staff at the organization I am referring to. Day after day, this person is solely responsible for responding to and submitting leagues of paperwork.
They have a sister company in town that also handles similar work, and they need two full-time people on staff to tackle this crud. It's nothing more than bureaucratic sludge going back and forth, drowning in complex codes and policies and triple checks that no private sector project would ever think of engaging in.
And we don't see skyscrapers, residential homes, or office space buildings falling down or crumbling in any appreciable numbers because of it. But the federal leviathan demands it, and won't work with contractors who aren't skilled in navigating this paperwork minefield with proficiency.
Most of this mountain of back and forth paperwork is nothing more than administrative drivel that confirms the same things many times over in different form and fashion on each go around. But that's the price you have to pay for winning government contracts.
And if you're wondering how CGI and QSSI, just to name a few, fell into the Healthcare.gov picture, it's because they are darn good at navigating all of this red tape. They don't have exceptional IT staffing expertise, project success, or innovation leadership to show for themselves.
They just know how to game a bidding system so thoroughly rigged in favor of patronage over results that Silicon Valley has little ability, or willingness, to provide a meaningful competitive alternative.
Do you think that tech heavyweights like Microsoft, Apple, or Google would have come anywhere close to botching the Obamacare site like the status quo did? Even the worst effort by any of Silicon Valley's best probably would have fared better than what the contractors hired by the government produced.
IT Contractors: Incentivized to Fail
You would think that a company awarded work for Healthcare.gov would have a winning reputation and track record to back up its bid. Guess again. One of the worst offenders that was fired from the project, CGI, has a trail of failure that has riddled its recent history with government IT work.
Here's just a sample of the projects uncovered by the press that CGI has largely fumbled in one way or another:
Canadian Firearms Information System (Project Result: Canned). The Canadian government scrapped this faltering project that was meant to serve as a national gun registry after numerous deadlines were missed by CGI, and the entire project itself was running severely over budget.
Hawaii Health Connector (Project Result: Delayed Launch; Numerous Technical Problems). This was Hawaii's state-run healthcare exchange built by CGI, and it has been fraught with technical issues and cost overruns since the project began. Not to mention, it launched a full two weeks after its deadline to go live.
Hawaii Dept of Taxation (Project Result: Overpayment for Work Never Completed). This massive overhaul led by CGI stunk of deliberate IT project oversight failure, with Hawaii paying CGI millions extra for work that was never even completed. A corporate consultant who was brought on by the Hawaii Dept of Taxation in 2008, James Bagnola, said of CGI's role in the Healthcare.gov disaster: "The morning I heard CGI was behind [Healthcare.gov], I said, my God, no wonder that thing doesn't work."
eHealth Ontario Diabetic Registry (Project Result: Canned). Another flop at the hands of CGI which was canned by the Canadian government in September 2012. The end result, if launched, would have provided a platform that was obsolete from the starting gate.
Vermont Health Connect (Project Result: Late Launch; Technical Problems). CGI concocted this state Exchange just like Hawaii's and numerous others around the nation. And just like the rest, there was a late launch, budget overruns, and numerous other promises that weren't delivered. Massive technical issues existed months after the site went live, and to make matters worse, hackers have broken into CGI's servers hosting the site on numerous occasions already. Very comforting news for Vermont residents.
Numerous other examples are available of how cruddy CGI has been in the federal contracting space, and I didn't even mention Massachusetts' severance of ties with CGI over its own failed state-run healthcare exchange. Different customers, similar end results. Piles of technical problems compounded by a lack of oversight and amateur attempts at proper security and stress testing where it was necessary.
But herein lies one of the secrets of being a powerhouse government IT contractor having the stature of a CGI: there's very little incentive to be on time with delivery or even to bid on projects honestly.
We saw it with plenty of other bungled federal IT disasters, including but not limited to the dead-end Virtual Case File system the FBI scrapped in exchange for its own home-grown Sentinel project. Contractors like CGI come in with too-good-to-be-true lowball bids on massive endeavors like Healthcare.gov, and then riddle the landscape with change orders that drive up cost.
One would think that the market would have punished CGI for its part in the abysmal Healthcare.gov launch last October. This was anything but the case. CGI actually rode a small bullish stock upswing just post-launch, but more glaring was its October 10 spike that continued rising steadily. Even as of May 2, their stock price remains nearly unchanged from its Oct 1 level during the Healthcare.gov crisis. With how federal IT contracting works today, it's clear that there's little incentive to do a great job -- failure is just as lucrative in this bizarre sector of IT. (Image Source: Washington Post)
There's no shocker, then, in finding out that CGI has already been paid well over $196 million for its pitiful role in the Healthcare.gov launch. In contrast to the smidge over $93 million contract CGI boasted about on its own website, you have to question how their part bloated to more than double original estimates.
To make matters worse, agency leaders like the recently resigned Kathleen Sebelius of HHS, especially ones with little insight or background in large IT rollouts, have zero leverage in reining in scope change requests or understanding start-to-finish oversight. As I argued in my class appearance for Colorado State University, leaders without even a whiff of IT project management experience (like Sebelius) had no place being the end of the line on the largest technical undertaking in US government history.
And the worst slap in the face to the American public footing the bill for this chronic mess was the admission by the White House that Obama had no knowledge of the site's woes prior to launch. Ignorance, incompetence, or just plain left in the dark?
I'll tell you this: if my Presidential legacy were hinging solely on one accomplishment like healthcare reform, I wouldn't have taken a rain check on every chance to get briefed on the status of the technical implementation supporting this push. It's unconscionable to believe that President Obama only learned of the mess waiting for the American public at the same time the country did. But you can draw your own conclusions on that end.
A Zoo Full of IT Contractors With No One In Charge
For lack of a better description, the supposed A-Team enlisted for the Healthcare.gov development effort is best characterized as something close to a zoo. This is befitting, seeing as not even Sebelius and HHS could provide us with an accurate top-down view of what the dev team looked like and who was responsible for what.
A reporter from NPR, rather creatively, decided to take it upon themselves to help draw up what the team effort looked like on the Healthcare.gov project, based purely on contractor testimony in Congress. It comes in the form of a simple hand-drawn mockup:
(Image Source: NPR)
The drawing depicts some of the main contractors involved and denotes what roles they played in this effort. CGI was in charge of the entire shopping experience on the site after an account was created. It was also responsible for front-end actions and confirmation emails, along with related data processing.
QSSI was the one at the helm for taking care of the data pipeline that interacted with numerous agencies such as HHS, IRS, Social Security Administration, etc. They also concocted the registration management tools used on the website.
Serco, another contractor with a smaller but rather laughable role, was in charge of processing applications for healthcare received in paper form. Mind you, their workers were expected to punch data into the same broken system that end users were struggling with for months.
So in essence, if you couldn't use the website, all you were doing was offloading your frustrations onto a data entry clerk on the government contracting payroll -- someone who merely had the ability to skip the front door (the registration system that forced end users to sign up prior to "window shopping," which some lawmakers initially targeted as a way for HHS and the White House to hide the true cost of these new plans).
In the end, who was playing quarterback for all of this? Well, no one really, and that job fell fruitlessly into the lap of a sub agency of the HHS (Centers for Medicare and Medicaid Services to be exact). It was an invisible title in name only, as the agency had next to no experience in managing such a co-mingled rollout involving so many hands.
It's safe to assume that modern software project management concepts like Agile Dev and User Stories were not in the vocab of the leadership at CMS.
Even the White House's own CTO, Todd Park, claimed in congressional testimony late last year that he had no knowledge of the scale or type of testing that was going on in the Healthcare.gov project prior to launch. Even if he was truthful in stating he was not involved pre-launch, that hardly excuses his total lack of involvement.
Again, the President's entire second term hinges on healthcare reform and this website, and your own dedicated in-house CTO is out and about handling business as usual months before the troubled launch? It's just hard to believe in some respects.
Another big problem with the way the pieces of Healthcare.gov were built? Modern style software development processes, known as Agile, were leveraged for the "front end" user experience -- everything that users see when visiting and working with the website. Yet this wasn't the case for other aspects of the project.
As Pat Morrell eloquently pointed out, the back end of the website wasn't so fortunate and was developed using that 20th century industrial approach I talked about earlier, known in project management circles as Waterfall methodology. Healthcare.gov had no place using a last-generation development approach on a modern software project as massive in scope and scale as this website.
Where do you pin the blame for such a monstrous failure? Kathleen Sebelius may have taken the public fall, but behind the scenes, it's very clear that this was nothing more than a game of government hot potato. No one set of hands ever needed to take possession of responsibility. For it was always "someone else's problem."
Tearing Down the Federal IT Leviathan: Piece by Broken Piece
There's a lot of blame to go around for the sour taste that Healthcare.gov left in everyone's mouth: contractors that weren't performing basic stress testing, a lack of clear central leadership at the reins, and a messy landscape of co-mingled Agile and Waterfall project development structures.
If the only place we can go from here is up, where the heck do we start?
Let's begin by simplifying the federal IT project contracting regulations system, which is by all accounts convoluted, overly complex, and favors large ineffective organizations like CGI by barring fair competition from outsiders who aren't well suited in red-tape navigation. Is the goal of government to keep its belly fat full of bureaucrats who justify their own existence by burying the American people in regulation hell?
I'd argue that it's time we ditch this failing contracting system in favor of one that is open to change, competitive on the free market, and one that tears down the immense barrier to entry that well qualified outside organizations can't currently overcome.
How about new standards in establishing proper accountability for undertakings like Healthcare.gov? Does it make any reasonable sense that roughly 47 contractors were involved in some fashion with the flawed site and not one of them could correctly establish themselves as the overarching technical project lead? Even though that duty ended up informally in the lap of CMS, a non-technical federal agency should have never been tasked with this unreasonable expectation.
Stress testing was another big focus of congressional hearings after the bungled launch. Numerous contractors testified that such testing was seemingly non-existent prior to launch, as the YouTube video on my previous article about this website mess alluded to. Even minimal, basic tasks attempted on the site were falling flat on their face mere days before launch on October 1. That's inexcusable, and goes back to a flawed implementation plan and timeline.
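For the non-technical reader wondering what even the bare minimum of stress testing entails, here's a minimal sketch in Python. Everything in it is hypothetical -- the staging URL, the page paths, and the user count are placeholders I made up -- but even a throwaway script of this caliber, run weeks before launch, would have surfaced the exact failures contractors later testified about:

    # Bare-minimum load smoke test: hit a few critical pages with
    # concurrent simulated users and count the failures. All names
    # below are placeholders, not Healthcare.gov specifics.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    BASE_URL = "https://staging.example.gov"   # hypothetical staging host
    PATHS = ["/", "/register", "/plans"]       # hypothetical critical pages
    SIMULATED_USERS = 50                       # trivially small for a national launch

    def hit(path):
        started = time.monotonic()
        try:
            with urllib.request.urlopen(BASE_URL + path, timeout=10) as resp:
                return resp.status, time.monotonic() - started
        except Exception as exc:               # timeouts, HTTP 5xx, DNS errors, etc.
            return exc.__class__.__name__, time.monotonic() - started

    with ThreadPoolExecutor(max_workers=SIMULATED_USERS) as pool:
        results = list(pool.map(hit, PATHS * SIMULATED_USERS))

    failures = [r for r in results if r[0] != 200]
    print(f"{len(results)} requests, {len(failures)} failures")

If a smoke test this crude can't come back clean, no amount of launch-day spin will save you.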
It's already well known that roughly 68 percent of all IT projects are doomed for failure, and a large majority of this is due to improper requirements planning and scope analysis up front before any contractors are even hired. It's safe to say that the US government, and namely HHS, botched this aspect of the Healthcare.gov planning phase.
Think the Obamacare site is the only example of government IT projects gone awry? Guess again. Compared to some other miserable examples of public sector IT endeavors, Healthcare.gov pales in many ways. The US Air Force ditched a monstrous ERP system before it ever went live, and the UK had its own Obamacare-esque IT failure with a scrapped NHS medical records platform that has been billed as the "largest IT project failure ever seen". (Image Source: Curiousmatic.com)
I've also got to ask one more time: what was the rush to get this flawed website launched on Oct 1? Seeing that the White House already has a failing grade on meeting Obamacare deadlines (a full 44 of 83 have been missed so far), why couldn't the braintrust at the White House decide to delay the launch, or at the least, stagger the launch in multiple phases? If Healthcare.gov couldn't handle a successful "Big Bang" approach to launching, it could have very well been protracted into a stepped approach.
For example, the site could have gone live on October 1 to start live testing of performance under public load, allowing people to merely window shop for a month or so. This would have given infrastructure engineers the time they needed to fix backend bugs, and allowed more time for the inter-agency data integrations that plagued the site for months on end. Would the White House really have garnered any larger of a black eye in light of all the other deadlines it has missed?
One must also wonder, as I mentioned, how Kathleen Sebelius was considered even qualified for such a critical role as the public face of this project. Microsoft has technical masterminds like Satya Nadella, and formerly Bill Gates, who have led the firm over the years. Amazon's got Jeff Bezos, another individual with a strong computer science background.
Even Google has Larry Page and Sergey Brin. Sebelius didn't have to be as technically astute as some of these other industry leaders, but was it appropriate for her to lead one of the biggest White House-led technical initiatives in recent memory with next to no IT project management skills or insight? She was a terrible pick, one almost inexcusably destined to fail.
If the White House was serious about spending taxpayer money wisely, it would have recruited the best and brightest to help lead the effort from, you guessed it, the private sector. You know, that private sector which can stream the Olympics nearly issue-free to a global population in real time? The one which allows the likes of websites such as eBay and Amazon to thrive and innovate? And the one which is officially taking over for NASA in bringing equipment, future spaceships, and people up into space?
Those that scream we can't get things of massive nature done without government red tape and oversight should wake up and realize that the private sector is innovating all around us, on minimal budgets, and without sacrificing end result quality. The above examples are just the tip of where this is evident today.
The public sector Leviathan ensuring the status quo on federal IT projects doesn't have to continue indefinitely. An informed public, empowered through an overdue congressional vote shakeup, could slowly but surely topple this headless monster constricting IT innovation at the federal levels.
As soon as we stop expecting IT project failure to be par for the course in Washington, perhaps (only) then can we get serious about handling public sector IT projects sensibly.
Photo Credit: Gajus/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net
Microsoft shocked the IT world this past week by making the cardinal mistake: releasing another XP patch after support officially ended. While I think Redmond makes a lot of mistakes, from licensing nightmares to marketing blunders, this particular move really irks me.
That's because it not only sets the wrong precedent, but it's a direct slap in the face to those fighting the good fight in helping eradicate XP. Specifically, IT pros like myself. As a consultant for my clients, I've been knee-deep in the conversations that Microsoft can't have directly with its customers. You know, the ones actually in the trenches -- not those just sitting in the comfort of their Redmond offices?
For the last year and a half, I have been personally toeing the sometimes uncomfortable party line in urging my clients to get off XP (much of that time unpaid, if Microsoft is curious). Seeing such doublespeak in action therefore really gets under my skin, no matter how critical this IE flaw may have been.
How can I continue to make the case for ditching XP if Microsoft is pulling the rug out from under my feet? One of the best arguments in my consulting playbook was the fact that there would be no more patches released for XP once April 8 came and went. Redmond helped curate this message with its respective partner community via numerous channels, including dedicated splash sites.
But it seems the pre-April 8 warnings only carried so much weight, and now IT consultants like myself have egg on our faces. For all the sit-down conversations we've had with clients we support, they're starting to call Microsoft's bluff after all.
While some in the blogosphere are calling this the right move, I have to fervently side with the likes of Brian Fagioli of BetaNews and Peter Bright. This was a terrible, terrible mistake, and I'm hoping whatever misguided execs called the shots on this audible wise up for the long-term message on XP.
This just-once-more XP patch sets a poor precedent, just like Obama's empty "Red Line" threat on Syria gave Putin the leverage he needed in his power play in Ukraine.
It's just like kids who play their parents because the parents throw around hollow threats and never follow through on grounding them the next time.
And it's no different than police turning a blind eye to enforcing laws after they go into effect.
XP's final Patch Tuesday came with a new popup warning users that support has ended. Perhaps Microsoft should have reworded this message to note that April 8 was the Almost End of Support date. The latest IE patch for XP users was nothing less than a slap in the face for those (such as myself) who have been vigorously fighting the good fight to eradicate XP. (Image Source: Microsoft)
If tough words don't follow with similarly tough actions, you might as well call such policies meaningless. People will always take the path of least resistance, and in this case, it comes in the form of sticking with the increasingly risky XP OS.
The above aren't the only reasons why this XP patch really ticks me off. There are numerous other ramifications to Microsoft's blunder here:
Future threats of support sunset dates will carry little weight. If Microsoft kowtowed to the XP holdouts this one extra time, how can they possibly carve out a position of enforcing lifecycle policies that actually mean something? Are IT pros supposed to take future support cutoffs with a grain of salt?
Vista's next on the chopping block for patches, and I don't know how I will be able to fight back against those who will say "But remember all those patches Microsoft released after XP support ended?" Short term idealism may come back to bite Microsoft and the exact professionals who are trying to do the right thing.
Criminals will continue building better and more numerous XP exploits. What sets this recent IE flaw apart from any of the other potential future exploits that may come out? If history holds true, these will only multiply, hit harder at the bevy of flaws that XP has, and cause just as much headache for XP holdouts. Is it Microsoft's job to keep coddling users of a 13+ year old OS? By this flawed mentality, it might as well keep making patches for Windows 2000.
Apple cuts off users of OS X after just 4-6 years and there are no hoots and howls for Apple to reverse course indefinitely on patches. This is the one time I will say Microsoft should follow Apple's lead.
IT admins who made plans around the expectation Microsoft would blink on XP support have been vindicated. There are likely more than a handful of administrators out there who made the case in their organizations that XP would be given an extended lifeline as necessary. It already happened once, they're chuckling, and if luck is on their side, it's bound to happen again.
Those still on XP have been given a false sense of security. Similar to what I said above, but on a wider scale, XP users of all stripes will sleep tighter knowing that Microsoft is willing to go against its own grain as we've seen. This is exactly the uphill battle IT pros like myself have been struggling to overcome in discussions.
Microsoft should keep in mind that its actions speak far louder than the years worth of words it has espoused.
The "just one more time" crowd has the ammunition it needs now. There is a vocal but sizable minority in the tech community that believes Microsoft should continue on such a path of supporting XP indefinitely. Now that Redmond has given them a glimmer of hope, watch this crowd hold tight to their argument on the next big zero day.
It gives IT pros fighting the good fight a bad name. I truly believe that getting the computing world into a post-XP era is the right thing to do on numerous fronts, the largest one being security. But as time goes on, and if Microsoft continues its policy of having mercy on XP users, the technical leaders out there beating the kill-XP drum will have increasingly less of a pedestal to stand on.
If Microsoft won't back up the very troops fighting on its behalf, how long before we lose faith in Redmond altogether?
To me, it's pretty clear that this last week's buckling of the no-more-patches mantra for XP really turned a former red line into a gray one; a line that shifts with the winds of the security flaws out in the wild.
Redmond claims in a blog post explaining its actions that this was truly a one-off move, and that XP is not going to get any more patch love from Microsoft.
Is it only me, or have we not heard this already?
Photo Credit: sukiyaki/Shutterstock
It's tough for me to get too excited about TVs these days. I'm past the glitz of the 3D craze. And "large" 60 and 70 plus inch screens are neat, but after enjoying a 114" viewing area thanks to my home projector the last few years, anything smaller pales in comparison.
Yet when I got to try out an 82" Perceptive Pixel touch TV at Microsoft's Chicago offices earlier today, I couldn't resist wanting one for my own condo or even office. It's that unique of a TV screen, and if and when it goes mainstream, it will completely change the way we view interactive entertainment displays.
My visit to Microsoft had nothing to do with this little-discussed burgeoning technology. In all honesty, I was there for a Microsoft Azure IaaS cloud training event. But my curious side just couldn't hold back with a monstrous 82" display hanging on the wall, clearly running Windows 8.1 from what I could tell. I remembered seeing these same kinds of units all over Fox News Channel recently, but never thought they would see the light of day outside of special-purpose scenarios.
Well, I was quite wrong. It's entirely fitting that Microsoft would have one of these puppies hanging in their general lobby, but after a colleague and I played with its various features, the wow factor just popped.
Let me tell you this: touch on a laptop or desktop screen is one thing. But at 82" with multi touch point capabilities in life-size manner, this monitor makes your desktop touchscreen look like child's play.
If you're looking for an all-out, in-depth review, this is not it. With only about 10-15 minutes of hands-on time with the device, it was hard to dig too deep. Part of the problem was that I had never experienced touch input on this kind of scale before.
I've worked with devices like SMART Boards when I used to work in IT for education, but even those expensive contraptions don't hold their own against the overall immersive quality of a Perceptive Pixel. The experience was smoother, cleaner, and touch optimized, more so than anything I've ever seen from a SMART Board in the past.
And don't ask me for any hardware specs, inputs, etc from this device. It was wall mounted, looked very expensive, and yet was fully unlocked so we could use Windows 8.1 to our heart's content on it. I was da*n well impressed with what I saw.
And no, not a single Microsoft rep came by to talk with us or show us how to use it. Luckily, we didn't need much help getting to know the device.
What is Perceptive Pixel (and Where Can I Buy One)?
I'll answer the easiest question first: in order to buy one of these massive touch screens, you need to get past some serious gatekeepers at Microsoft. The purchase page on the Perceptive Pixel website lays it out fairly bluntly by recommending you contact your Microsoft account rep, or email the PP team. No purchase links, no Amazon references, nada. This technology is still in its budding phases and I have a good hunch Microsoft only wants serious, hand-picked buyers mounting these on their premises.
Now that the bad news is out of the way, I'd love to explain some basics about this radical new touchscreen TV. Without going into too much history, Jeff Han founded the company from which the display takes its name. After his technology reached the technical world via a now-famous TED Talk, Microsoft picked up interest in his firm and ended up buying the company outright in July 2012.
There's no hiding that Microsoft had full intention of using the Perceptive Pixel technology to further the touch experience for the then-in-development Windows 8, and more importantly, give Microsoft a leg up in rethinking interactive displays on a bigger scale.
CNN and Fox News in the US are already leveraging Perceptive Pixel screens for their programming. Above is a screenshot from an Election 2012 breakdown of Florida's Presidential vote totals. While interactive TV network markups are an obvious scenario for these sets, I can envision healthcare, education, construction, and conference rooms benefitting just as much. (Image Source: Wired.com)
While I was hopeful that the slick screen came pre-built with the necessary computing hardware under the hood, right now at least, it seems this is not the case. The screen has the finesse necessary for excellent touch input capabilities, but it still relies on traditional (rather beefy) PC hardware. I can see why Microsoft is taking its time in perfecting this for the mass market; not everyone is like me and would be OK with a loaded workstation PC sitting in their home theater.
PC specs needed aside, the screen itself is downright gorgeous. No, it doesn't enjoy the benefit of having the slimmest borders around, but it's clear that this TV follows a professional display format meant for commercial settings. The one I played with was very heavy duty, with a solid frame and proper wall mount. But at 82", I couldn't care less about the fat around the edges.
My absolute favorite part of the screen engineering was the matte glass chosen for the unit. I am saddened by the path the TV industry is taking, churning out so many sets with glossy finishes. Those sets are only usable in situations with good light control and little to no natural light bleeding in. Otherwise, you're talking about glare hell. Hopefully some in the TV industry are reading this and take notice: matte looks much better!
My love for the matte glass used on this screen was put to the test pretty early on, as the unit was combating a nasty overdose of afternoon sunlight that was bleeding through the large office windows in the Microsoft lobby. As some of the original photos in this story show, there is very little effect on the image quality or color levels. I took all of the shots with none of the natural sunlight being blocked and my Lumia 925 camera compensated for the overexposure as best it could.
The screen seemingly comes with an active stylus that from what I could tell is battery powered. It has the heft of a solid metal pen but is well constructed and behaves just as expected. I was a bit confused as to what the rear end of the pen did, as I am used to writing on my Thinkpad X230T using an active stylus as well -- and the rear of the pen is a workable digital eraser.
This wasn't the case on the Perceptive Pixel pen. Not a deal breaker, and I'm sure Microsoft would take cues from Lenovo on this for any eventual consumer release.
Perceptive Pixel was awfully accurate as a touch screen, and it has a leg up on almost every other touch screen on the market in one big way: it has near unlimited touch points. Most consumer level devices like my Thinkpad convertible or desktop touchscreen all-in-ones max out at just a few touch points, which theoretically limits you to a single person's two handed input.
Perceptive Pixel was built with multi-user input in mind so you could have, for example, three friends up at the screen with you drawing on a OneNote page, marking up a map, or playing a party game.
An 82" Touchscreen Running Windows 8.1? No Other Device Can Compare
I'm pretty well entrenched in touch capability in my computing life now. I'm consistently inking in OneNote 2013 on my convertible Thinkpad and love it. My Lumia phone is full touch, and I'm eagerly awaiting what Windows Phone 8.1 has to offer. But the experience of using Windows 8.1 on an 82" screen is hard to compare to much of anything else on the market because, well, it's such a radically different feeling.
Interactive projection screens like what SMART has had on the market for years are outdone by Perceptive Pixel in almost every way. We used to have some of these SMART projection units in a few classrooms at the high school I handled IT for. While you would think kids and teachers would be all over this, the conceptual functionality of the tech, at least what I played with only a few years ago, was kind of pathetic.
While not as fluid as working within Modern UI (Start screen), Desktop mode in Windows 8.1 on such a large display isn't too bad. Marking up a Word or PDF file in such a manner wouldn't be far fetched. Here, my colleague took a stab at trying a few things with the stylus on Desktop.
First off, interactive projection systems like what SMART offers to K-12 education are fraught with technical limitations. You have to use special "markers" to interact with the displays - there is no such thing as natural touch on the cheaper, more prevalent units in K-12. And you cannot control the entire operating system experience with the devices either. Meaning you are limited to a controlled PowerPoint or other single-purpose applications. Touch was an afterthought; not an integral part of the experience.
It was not surprising, then, that the SMART Board was left unused in the classroom for months until summer time came along, and maintenance crews plucked the screen off the wall due to lack of uptake. I guess teachers and students found the technology just as immature as I did.
Perceptive Pixel solves all of the ills I experienced with these early "interactive" display technologies in K-12 education. First off, Windows 8.1 is the first OS that can leverage touch in such a natural manner on a large format display. Even the thought of installing last generation Windows 7 for such a device would kill the display's advantages fairly quickly. Windows prior to 8 (and 8.1) was never, ever meant to handle touch in the way a human expects to leverage it.
Navigating the unit was easy as pie, seeing as I've had some experience with Windows 8 already. I don't spend too much time in Modern UI yet, but the gestures for the Charms Bar, closing apps, and general navigation intuitively make sense on such a large screen. When you can use both of your hands as big input devices, the touch aspect is taken to a whole new level.
One of the first things I did was open up OneNote (in Modern UI) and begin drawing. The screen was extremely responsive, both to finger input and stylus actions. I prefer using the pen as it provides a level of detail and ease that comes naturally for me, but finger painting was not hard either. I can see many others preferring to use their fingers to mark up or draw on a Perceptive Pixel screen.
I explored numerous apps in Modern UI to get a feel for what it would be like to enjoy one of these expensive devices on a daily basis. The Bing Sports app worked just as well as it does on a smaller device. Flipboard was intuitive and relatively problem free as well. I even opened the familiar Cut the Rope game just for kicks. Seeing that big hungry frog respond to my actions was much more thrilling on 82".
The Maps app was also neat to explore, in ways that I never tried on my little laptop. Using my two hands, I was able to flip around and zoom in to my native Chicago areas of the map, even turning the map on a tilt to travel across the city in a poor-man's 3D perspective.
One of the coolest demos we tried on the screen was an app I forgot the name of, but it was focused on the education sector quite clearly. It was an interactive app that went over numerous aspects of the science behind a tree's leaves, roots, trunk, and other vitals. The user is able to zoom out from a full tree view, all the way down to the bacteria that sit on the tree's leaves.
The animations are seamless, and the graphics are animated in real time -- no canned demos here. It's akin to what you would see done on a TV newscast, but aimed clearly at students learning science in a classroom setting. The app could highlight different parts of the tree so that a teacher could show emphasis on a particular aspect, and have real time zoom and pan capabilities during explanation. This is the kind of tech I wish my teachers had when I was growing up.
The one feature I found missing from the Perceptive Pixel display, and that I couldn't easily gloss over, was a way to do live markup of any part of the screen, similar to how Fox News or CNN use Perceptive Pixel in their programming.
So for example, I could use the display to blow up a large map of Park Ridge, IL where my company resides. Maybe I wanted to draw live circles around all of the locations our technicians were visiting at that particular time, and other details. Right now, at least in its current form, I couldn't figure out how to do that.
Perhaps it doesn't exist in the software yet. And if so, Microsoft dearly needs to introduce such functionality. Why should newscasters get to have all the fun drawing on everything?
Such functionality would be so useful in numerous other scenarios. If a doctor had a Perceptive Pixel in their office, they could blow up X-rays of a patient and mark up spots of interest. Or a teacher in math class showing a homework document from the night before would be able to live draw right on the Word file as class went along.
Perhaps students in the class could be asked to come up to the display to draw out their answers, as I frequently had to do back in high school. The teacher could save the markup on the fly in digital format and share them with the class on a SharePoint or Google Site. Education seems to be a natural fit for this kind of device.
The Modern OneNote and Perceptive Pixel were a match made in heaven. Naturally, I couldn't resist trying it out. It works like a monster digital whiteboard should. You can use a simple two finger combo to pop out the program's pen and markup options. The sunlight was awfully strong in this shot, but the screen is still highly visible and crisp -- thanks to a well chosen matte glass on this screen!
The uses for such a detailed, precise display in numerous areas of life would be highly beneficial. My company is in the process of moving offices right now, and we are on the prowl for either a TV or projector for our new conference room. I now wish I could afford a Perceptive Pixel for the space. That would be killer.
Overall I didn't run into many bugs in the display's workability; at least not as many as I was expecting to find in such still-experimental technology. The biggest gripe I had was that we couldn't pinch zoom in OneNote Modern for some reason. I do it in OneNote 2013 (desktop app) all the time while taking grad school notes, but for some wild reason, modern OneNote refused to handle it properly. It could have very well been an oddity limited to OneNote.
My other gripe was in the functionality of the stylus and how its rear end didn't act like an eraser. Perhaps my Thinkpad has spoiled me. Still, Microsoft, I think this is a simple improvement that would make the lives of experienced touch Windows stylus users much easier if there ever comes a day when we can get our hands on these fanciful units.
Expensive Commercial Toy Today; Living Room Screen for Tomorrow?
Perhaps people will call me crazy for thinking such technology will end up in our living rooms at an affordable price one day. But is it really that impractical to believe? I remember when my family purchased its first flat screen LCD TV at the turn of the new century for over $4000 USD -- a mere 32" set!
And years ago, most people would have never thought that we would be seeing things like 3D or 4K definition in any form outside of big established movie theaters. How wrong we were once again.
While there is something to be said for the capabilities of the Kinect and how it has transformed kinetic input into modern gaming, I fully believe that large format touch also has its place. While it's serving in natural roles in the commercial and education settings right now, it's only a matter of time until the tech starts edging down in cost and into the hands of consumers.
Another reason I don't see this kind of tech as being that crazy is because we have already had market attempts at full blown computers in the living room, namely via Windows Media Center PCs in the mid 2000s. Those devices flopped because the software was immature and the experience was disjointed, but if Microsoft can find a way to load the necessary hardware right into the TV sets themselves, these could be the bona-fide living room sets we have been wishing for all along.
If Microsoft can successfully combine the capabilities of what the PixelSense offers (formerly the original Surface table) in a beautiful wall-mountable format like the Perceptive Pixel screen, I can easily see growing interest in such interactive technologies for the living room. From viewing collections of family photos to marking up recipes with your significant other, the possibilities are endless.
Part of the innate problem at such an early stage with visualizing what we could do with this tech in the living room is that real development brawn hasn't been tossed at the cause yet. Microsoft hasn't thought of Perceptive Pixel in a living room setting so far, and likewise, app developers for Windows haven't given us an inkling as to what sorts of things they could offer for this new segment of home theater display.
I think the writing is clearly on the wall, especially with Microsoft's new foray into tearing down any walls left between apps that hit traditional computers and mobile devices. Windows will be Windows will be Windows -- no matter what form factor we choose to consume it upon.
Tablets and small-format touchscreens are showing us how useful touch can be. But big displays like Perceptive Pixel will unlock a more intuitive experience, unleashing the true extent of the everyday potential in the still-newfound technology known as touch.
Aside from a lack of backwards compatibility with Xbox 360 games (which is being worked on as I write this), what's the other big reason I am holding off on a first gen Xbox One? A TV streaming & DVR experience that was much talked about in the buildup to launch, but has fallen short in reinventing the way we manage and consume TV content today.
When I first heard about Microsoft's Xbox One plans at E3, I was thinking the same thing so many others probably were: my Tivo (or cable box) days are numbered. But my lofty plans for a simplified entertainment center were quickly killed, when I learned that Microsoft had no plans on replacing your DVR, but merely piggybacking onto it.
It's rather disappointing in more ways than one, and I'm personally a bit confused as to why Microsoft settled on such a lackluster approach to Xbox One's TV capabilities. A recent story on Neowin framed this in exactly the right light: TV done right could easily be Microsoft's coup de grâce for the taking. I know very well I'm not the only one who would adore the idea of being able to combine the power of Tivo into my Xbox. Or am I just crazy and alone here?
The Xbox One has a beautiful TV channel guide layout and SmartGlass apps, among other navigation niceties to enhance the viewing experience. But its reliance on an external cable box or DVR makes the value proposition rather muted. Microsoft should be focused on how buyers can ditch their existing gear and centralize onto the Xbox One; not just piggyback the console onto legacy cable TV components. (Image Source: Microsoft)
Scott Stein of CNET sums up the issues with Xbox One's halfway attempt to play as a TV channel mediator. "The real problem here is that the Xbox One doesn't do anything magical with TV; it just allows pass-through, and split-screen app-viewing, and gameplay."
Scott goes on to say:
Microsoft hopes the Xbox One will add more robust DVR control and deeper cable access down the road. How soon, or how easy that is to enable, I have no idea. But I'm tempted to just yank the cable box out of the Xbox One until that day arrives.
My thoughts are fairly in line with Scott's so far. In fact, I'm purposely holding back on buying the Xbox One until Microsoft clarifies its vision for TV capabilities with the new console. As I previously wrote, I'm a casual gamer now who cares as much, or more, about the media consumption experience of my chosen home game console. And I'm itching to let my mom have my Tivo if Xbox One can do this right.
But until something drastic happens, I'm not sure if that's going to be possible. If I have to use my Tivo as a crutch for my Xbox One to consume TV content, I might as well just stick with my Xbox 360. Heck, I can get Titanfall on it still, so I'm plenty satisfied on the gaming end.
CableCard: The Answer to Giving TV Watchers What They Want
Lots of commentary on the web about this Xbox One TV/DVR dilemma centers on Microsoft's inability to negotiate for content streaming on the new console. That's understandable, and an uphill battle they can easily avoid getting into. While having fully internet-streamed a la carte TV channels through Xbox Live would be ideal, there is a middle ground most people forget already exists.
It's called a CableCard, and has been around in numerous forms for about a decade now. Yes, it's limited to the United States only right now, which creates a problem for overseas buyers, but global adoption of any TV content access standard is a mountain too high to climb right now.
But if anything, it would be a start -- and if Microsoft was able to lead the way in the States via a CableCard enabled Xbox One, it would surely give them leverage to force change in other parts of the world, albeit slowly.
I've never owned or rented a cable box personally, as I hate recurring rental fees for hardware and happen to think the Tivo interface blows away anything provided by Comcast. As such, I've been renting a CableCard from my cable provider for years. It's merely a locked-down PC Card, issued by the cable company, which you insert into the proper slot on your Tivo (or other CableCard device) and configure via the native device software menu.
In terms of complexity, there is none. It's a card that goes into its appropriate slot in a one-way fashion, and comes with all of the necessary unlocking bits to decode your paid cable channels natively on a DVR like a Tivo without the need or expense of a rented unit. I've never had one go bad on me, and the only time I've ever had to play with them is during initial installation. They just work -- and that's part of their beauty.
If Microsoft were to up the ante in the home theater wars and build a gen 2 Xbox One with a CableCard slot, it would allow people to benefit in numerous ways.
The reason I am holding such high hope on the potential for a CableCard introduction on Xbox One is because it is essentially a win-win for all sides. Microsoft gives buyers a huge reason to choose Xbox One over PS4. Buyers will be able to get access to all of their favorite channels they already love. And content behemoths (as much as I dislike them) won't have to get into nasty negotiations with Microsoft over streaming rights, and they can keep working with subscribers as they already do.
Most large cable companies already have CableCard programs in place and such a change would require next to no investment from their side. My local area has both Comcast and Wide Open West as cable providers, and each of them offers CableCard with just a simple phone call to support.
CableCard has allowed generations of Tivos and other devices to have nearly 1:1 access to cable channels like rented cable boxes do. How much would it seriously cost Microsoft to include such ports on future Xbox One hardware? Tivo paved the path that Microsoft should consider following. (Image Source: Engadget)
The only big change that would have to get introduced is a commitment from Microsoft to a hardware redesign of an upcoming Xbox One generation. But even this is something Microsoft is accustomed to. The Xbox 360 went through no less than 6 motherboard revisions during its history, and even the original Xbox saw a few technical revisions primarily aimed at preventing hackers from exploiting mods.
The Xbox 360 got an HDMI port and even Wi-Fi built in thanks to console revision updates. To say there isn't a precedent already set would be completely mistaken.
The other underpinnings necessary to make CableCard work properly are already in place on Xbox One. Fully encrypted HDMI? Check. Ethernet support for channel guide updates from Xbox Live? Check. And the same internet capabilities could appease cable providers, who would want the leverage to push CableCard updates via Xbox One firmware updates to close any newly discovered loopholes exploited by CableCard hackers.
It's not a question of if Microsoft can do it; it's a matter of whether they are serious about having Xbox One take the reins in the living room wars. CableCard support for native TV capabilities would give people looking for the all-inclusive home theater experience little reason to look elsewhere.
My Eventual Dream: A La Carte Channel Subscriptions via Xbox Live
One can dream. Even if it's a pipe dream. Microsoft was reportedly trying to work up its own Netflix-esque alternative a few years back, most likely to prepare for a grandiose Xbox One launch, which was shelved prematurely. Playing nice with the content big boys of Hollywood and its various backers is like dealing with a legal mafia. They make all the rules, and if you want in, you'd better be a yes man.
Even if it did work out, another Netflix alternative is not really what I would love to see come to fruition. On-demand media streaming is old hat already, and Netflix, Amazon, Hulu, and the rest do a darn good job at it. I'd much rather see Microsoft strive for the golden nugget: pure a la carte subscription options for traditional cable and broadcast TV channels.
You know, the stuff that hosts live sports, news access, and other content you just can't get via streaming apps until days or weeks after it airs?
Imagine if you could bring up your Xbox Live guide, and merely have a pick list of channels available that you could subscribe to piecemeal on a monthly basis, as needed? Just want to watch sports on ESPN or FSN? Or do you prefer news channels like Fox News, CNN, and MSNBC? Or even perhaps a little bit of both? That kind of choice, either down to the channel, or a genre of channels, is my eventual dream for subscription-based streaming TV.
No doubt, this would be a large fight for Microsoft to take on. Cable companies love the oligopoly they control now, and will fight tooth and nail against it. But Microsoft has an ace up its sleeve which cable companies can't deny: a large installed user base. And unlike Sony's base, it's one that is on the whole already willing to pay for Xbox Live on a yearly basis.
To put this number in perspective, Microsoft let loose that it had 46 million Xbox Live subscribers as of April 2014. With the launch of Xbox One, that number has no doubt got to be larger by now. While a good number likely replaced their Xbox 360s with Xbox Ones, there has to be a fair number of first time purchasers who picked up Xbox Ones and Live subscriptions to boot.
Cable companies would be hard pressed to turn their backs on such a potential revenue stream. And I look at it this way as well: many people are cutting the cable box cord as of late. If the cable companies were smart, they would realize that changing their draconian channel subscription options would stop the hemorrhage of viewers and actually retain large numbers of their base.
What's so radical about only paying for the channels we wish to watch? We pay for internet speeds based upon our consumption habits. Cell phone bills are charged in a similar manner. And all other utilities follow suit -- electricity, water, etc. Is it that crazy to believe that some kind of pay-for-what-you-consume plan couldn't work via Xbox Live?
None of the cable companies have yet shown solid reasons why it wouldn't work as the rest of what we subscribe to in our lives. And as such, I'm challenging Microsoft to stand up for the TV channel watching community (me included) and start fighting for the next generation living room.
The Xbox One ecosystem is only half the answer. We need a TV subscription revolution to go along, and it's clearly Microsoft's war for the taking now.
It's no secret that my company had its own internal usage relationship with Google Apps go sour in the last half year. As our mobility, security, and feature needs continued to grow, at least in my eyes, Google seemed too focused on appeasing education and other niche sectors. As such, they've been leaving healthcare and other business verticals behind.
Is Google Apps necessarily a bad product? Not by a long shot. I just see Office 365 as a slightly better fit: one that hugs our needs snugly, like a glove. And it's not just our company that has made the move from Google Apps to Office 365. To be honest, we get about 1-2 inquiries each week from customers who are looking to move in a similar direction.
If you're interested in reading about all of the reasons for my public flogging of Google Apps, please peruse my former piece that went into full detail about all the things that Office 365 resolved for us.
For this article, I wanted to focus on an important aspect of our move to Office 365, and that was our adoption of SharePoint Online as our sole document file server. I know, how passé for me to call it a file server as it represents everything that fixes what plagues traditional file servers and NASes.
Let's face it: file servers have been a necessary evil, not a nicety enabling collaboration and seamless access to data. They offer superior security and storage space, but this comes at the price of external access and coauthoring functionality. Corporate IT departments have had a band-aid known as VPN for some time now, but it falls short of being the panacea vendors like Cisco make it out to be.
I know this well -- I support these kinds of VPNs day to day. Their licensing is convoluted, they're drowning in client application bug hell, and most of all, bound by the performance bottlenecks on either the client or server end.
To be fair, file servers that are maintained well provide excellent return on investment. But in the small-midsize business realm I support, this is rarely the case. From little to no anti-malware protection, down to nonexistent patching routines. Too many organizations treat their file servers like a one-time purchase, when they are really long-term investments. Cloud storage, at least in theory, is meant to solve both of the above ills.
I previously wrote about how my company used to juggle two distinct file storage systems. We had Google Drive as our web-based cloud document platform, but its penetration didn't go much further than its Google Docs functionality. That's because Google has a love-hate relationship with any Office file that's not a Google Doc. Sure, you can upload it and store it on the service, but the bells and whistles end there. Want to edit it with others? It MUST be converted to Google's format.
And so we had to keep a crutch in place for everything else that had to stay in traditional Office formats, either due to customer requirements, complex formatting, or other reasons. That other device for us was a simple QNAP NAS box with 1.5TB of space.
We've recently replaced it with a spare Dell PowerEdge server, but the device has been relegated to hosting only those files which are not cost effective to move to the cloud yet: client PC backups and raw multimedia from training we do. But we don't have to touch those files very often. In the case of PC backups, they usually go into storage and stay put for 30 days until we can safely delete them. And our training recordings are also just backups, as we place most of the content onto our YouTube channel.
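Incidentally, that 30-day retention window on client PC backups is the kind of housekeeping a few lines of scripting can enforce. Below is a rough sketch only -- the share path is a made-up example, not anything from our production environment:

    # Illustrative 30-day retention sweep for aged-out PC backups.
    # The backup root below is a made-up example path.
    import time
    from pathlib import Path

    BACKUP_ROOT = Path("/srv/backups/pc-images")   # hypothetical share path
    MAX_AGE_SECONDS = 30 * 24 * 60 * 60            # 30 days

    cutoff = time.time() - MAX_AGE_SECONDS
    for item in BACKUP_ROOT.rglob("*"):
        if item.is_file() and item.stat().st_mtime < cutoff:
            item.unlink()                          # aged out; safe to delete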
I've been comparing all of the other players in this cloud storage market for some time now, as other clients have been asking for help making the move. We liked Google Drive's real-time collaboration functionality, but the way it treated non-Docs files was pretty pitiful. Dropbox for Business provides the best headroom for growth, but its starting monthly price is too much to swallow. And Box and Egnyte don't bring much more to the table besides bona fide cloud storage and sync; SharePoint Online offers a rich ecosystem that we can grow on.
For the purpose of running our day to day business needs, SharePoint Online has taken over for both Google Drive and our former NAS alike. We don't have to convert items to and from Google Docs anymore just to collaborate. We have as-good-or-better permissions in SharePoint compared to Google Drive. And search in SharePoint is disgustingly good, providing the accuracy and file previews that we were used to on Google Drive.
Was SharePoint a pick-up-and-go solution like Drive? Not quite. It definitely forced us to put some elbow grease into the initial setup, but the reward on the back end has been extremely gratifying in light of the daily pains we had with our GDrive-NAS hybrid scenario.
It may not be a solution for everyone, but it definitely fits the bill for us.
What the Heck is SharePoint Online?
First off, let me clarify the product I am referring to in this article. SharePoint now comes in two primary flavors, with some important differences between them: SharePoint Online (SPO), the cloud-hosted service sold standalone or as part of Office 365, and SharePoint Server 2013, the traditional on-premise product. Many people throw the names around interchangeably, but they are different beasts, and as such, I am not going to continue propagating that mistake.
For all intents and purposes, if your organization is looking at SharePoint Server 2013 on-premise, a lot of the contents of this article may not apply. SharePoint Online is first and foremost a cloud solution with tie-ins to Office Online, OneDrive, and other services that may or may not exist in the on-premise version of the product. Seeing as our company doesn't have much experience with the on-premise edition, I will only mention it in passing.
The areas where SharePoint Server 2013 excels over SPO include connecting multiple instances of the product in a server farm environment, storing large amounts of data without incurring cloud storage fees, and other enterprise-centric needs. Plain and simple, I have yet to run into a small or midsize business that was not better suited by the SPO edition of SharePoint.
Describing SharePoint Online in a single sentence is very tough to do, because like Google Drive, it serves multiple purposes in its latest iteration. It's a cloud file server (the focus of this piece). It's a content search hub. It can run public websites and internal intranets. It can help handle complex document workflows. You can even run Access databases on it.
A lot of people are interested in how it compares to Google Apps' offerings. I put it like this: SharePoint Online is a Google Drive and Google Sites mashup on considerable steroids.
I'll be completely honest and let it be known that I had nothing but contempt for SharePoint until the revised Office 365 suite launched in late January 2013. Not only was the cloud edition of the product too convoluted for most people, but the on-premise version was an even greater nightmare. It's not that it didn't provide the functionality people wanted; it was just a beast to set up and maintain over the long term.
And the biggest downfall of the prior editions? Most access to SharePoint data was relegated to the web browser, meaning that the Dropbox-style access people have grown accustomed to over the last half decade was only a pipe dream. For a service company like mine, which is out in the field more than half of the workweek, browser-only access to files was a non-starter.
I can finally work as I wish, in-browser or in Office 2013 -- or both at once. My entire company "file server" is synced via OneDrive for Business to my Thinkpad, and likewise, I can edit any files in a browser via Office Online apps. It's a nirvana that Google Drive almost afforded us, if it weren't for Google's distaste of traditional Office files. It's good to know you can have your cake and eat it too.
Microsoft released SkyDrive Pro last year, which has just recently been renamed OneDrive for Business. The tool not only allows you to access your personal "My Docs in the cloud" (which now starts at 25GB per user) but, more importantly for organizations moving their file servers to the cloud, it lets them sync document libraries.
You may be curious about what a document library is. That's a good question, because SharePoint organizes storage a bit differently than most other providers. Since SharePoint is built around a collaboration- and search-focused backbone, it requires more structure in how data is placed into the service.
Much of this hums behind the scenes and is never seen, but the basic concepts are important because too many people think that you can just dump data straight into SharePoint Online and make it work like a file server.
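To make those building blocks concrete before moving on, here is a minimal sketch in Python -- purely illustrative, with class names of my own invention rather than Microsoft's actual object model -- of the containers your data lives in on SPO:

    # Purely illustrative: a rough model of how SharePoint Online organizes
    # storage. The class names are mine, not Microsoft's actual object model.

    class DocumentLibrary:
        """The file-server-like container; folders and files live here."""
        def __init__(self, name):
            self.name = name
            self.items = []  # folders and files

    class Site:
        """A site holds one or more document libraries (plus lists and pages)."""
        def __init__(self, name):
            self.name = name
            self.libraries = []

    class SiteCollection:
        """A site collection groups related sites under one storage quota."""
        def __init__(self, name):
            self.name = name
            self.sites = []

    class Tenant:
        """Your whole account; site collections draw from pooled tenant storage."""
        def __init__(self, name):
            self.name = name
            self.site_collections = []

    # A "file server in the cloud" is really:
    # tenant -> site collection -> site -> document library -> folders/files
    tenant = Tenant("contoso")
    company_files = SiteCollection("company-files")
    ops = Site("operations")
    ops.libraries.append(DocumentLibrary("Client Records"))
    company_files.sites.append(ops)
    tenant.site_collections.append(company_files)

The takeaway: a document library, not your personal OneDrive space, is the unit that behaves like a file share.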
As you can see, there are some conceptual building blocks that have to be taken into account when building your SPO "file server in the cloud". While Microsoft advertises OneDrive for Business as being able to serve this purpose on its own outright, that is marketing that's slightly askew at best.
Microsoft doesn't bring this very point front and center: OneDrive for Business without document libraries cannot share large sets of files in a file-server-style environment. You know, the way we are used to working with shares, where big collections of files can be dumped into folders and, as long as someone else has permission to access the files, they can view or edit them?
OneDrive Without Document Libraries DOESN'T Replace the File Server
OneDrive for Business DOES allow you to share files, but unless you specify every single person individually on the files you are sharing, the items will not appear in their "Shared with Me" view. This is one of the first roadblocks I had to fight when I tried out SkyDrive Pro last year. There is not much information online that alludes to this, as Microsoft's marketing engine wrongly assumes that people no longer wish to think of their files in a server-centric way.
Our company usually works with clients on setting up the basics for getting document libraries separated, OneDrive configured, and all other aspects that pertain to a post-server lifestyle in the cloud. There are a lot of concepts to wrap your mind around, and I won't dive into them in detail here.
The biggest thing to remember is that OneDrive for Business, on its own, does NOT function as a file server in the cloud. That power is solely available through Document Libraries in SPO.
I'll go into other aspects of how SharePoint Online and OneDrive work together further down.
Cost: How do the Biggest Names Compare?
Anyone planning on ditching their file server is probably quite interested in the recurring cost of making such a move. Customers have been peppering me with this question for some time as well. For this article, I did some fact finding to consolidate my various notes and emails with clients on the topic into a definitive one-stop format. I created a master spreadsheet that outlines all major aspects of the top five contenders (in my opinion) in the cloud file server market.
I want to be clear about which edition of each product I used in the comparison. For SharePoint Online, I based my pricing on the standalone Plan 1 level for the sake of simplicity. I know that SharePoint Online is part of many Office 365 plans (we use E3 ourselves, which includes it), but bundle pricing would not be an accurate comparison against the others, since Office 365 provides a lot more than just cloud storage.
A similar decision was made for Google Apps; I did NOT account for any editions besides Business and Education, which are the two most prevalent. For Dropbox, I used the Business edition, since that is best suited to organizations moving off the free version. Box Business and Egnyte Office were similarly chosen, targeting small to midsize offices.
How do the different providers stack up? Here's my pricing breakdown that is valid as of March 23, 2014:
SharePoint Online clearly wins much of the pricing category against the others due to Microsoft's insistence on offering SPO as an a-la-carte product, not the all-you-can-eat buffet the other names represent (especially Dropbox for Business). The fact that Microsoft offers additional SharePoint Online tenant storage at only $0.20USD per gigabyte is a winning differentiator, as all the other providers insist on forcing "storage packages" down your throat.
Google is the most forgiving of the rest of the pack, but still not as friendly towards smaller businesses that want to start small and piecemeal their storage needs as they grow.
Dropbox for Business offers an attractive base storage plan that is truly unlimited, but you pay dearly for that bragging right: you have to bring at least 5 users onto the plan, at a cost of $15/person per month. That means you are guaranteed to spend no less than $900USD per year, even if your five-person office only needs 200GB of storage space. Not my kind of way of doing business, honestly.
Google also has an attractive offering, with pricing starting at $60 per user per year (if you pay by the month; yearly prepaid accounts get $10 off). They give each person 30GB of Drive storage space, but be mindful that this is deceptive, because Google shares that space between Drive, Picasa, and Gmail. So if you have a 20GB inbox and another 2GB of photos on Picasa, you're only left with the remainder (in that case, about 8GB).
Google forces storage upgrades in packs with varying pricing, the cheapest starting at $4 per month for 20GB extra. The per-gigabyte cost works out to the same $0.20 per gig per month that Microsoft charges on its piecemeal offering. Competitive, but I still think Microsoft's approach of allowing single-gigabyte upgrades is much friendlier for small organizations; the quick sketch below shows why the granularity matters.
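Here's the back-of-the-napkin math, using only the figures cited above (March 2014 prices; purely illustrative, not a pricing tool from either vendor):

    # Both charge ~$0.20/GB, but pack granularity changes what you pay.
    # Figures are from the March 2014 comparison above.

    SPO_PER_GB = 0.20         # USD per GB per month, in single-GB increments
    GOOGLE_PACK_GB = 20       # Google sells upgrades in 20GB packs...
    GOOGLE_PACK_PRICE = 4.00  # ...at $4 per pack per month

    def spo_extra_cost(gb_needed):
        """SPO lets you buy exactly what you need, one gigabyte at a time."""
        return gb_needed * SPO_PER_GB

    def google_extra_cost(gb_needed):
        """Google rounds you up to the next whole 20GB pack."""
        packs = -(-gb_needed // GOOGLE_PACK_GB)  # ceiling division
        return packs * GOOGLE_PACK_PRICE

    for need in (5, 25, 50):
        print("%dGB extra: SPO $%.2f/mo vs Google $%.2f/mo"
              % (need, spo_extra_cost(need), google_extra_cost(need)))
    # 5GB extra: SPO $1.00/mo vs Google $4.00/mo -- the single-GB advantage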
It's important to have a look at the sample monthly costs I stacked up across the board. These represent what typical smaller organizations would be spending on cloud storage with each provider. I didn't offer up multi-terabyte samples because they aren't representative of the organizations we work with. You can run those numbers for your own needs, but for the sake of comparison, I offered three different user counts along with varying storage needs.
The savings that SPO represents are fairly drastic across the board, and I would only assume them to be even greater if you pair your needs with an Office 365 plan, which discounts pricing for all included services appropriately. But seeing as I am not interested in comparing apples to oranges here, our comparison sticks with SharePoint Online Plan 1.
You can make your own conclusions before moving into any of these providers' clouds, but unless you have huge sums (1TB plus) of data to move up, SPO is going to be an attractive option for most companies. Otherwise, Dropbox for Business or Google Drive may be your better bet.
Calculating exact pricing per month for SharePoint Online is a bit tougher to do, since the way storage is handled can be confusing. Microsoft pools a baseline 10GB of space for your entire company, and then adds 500MB for each extra user on your account. Two users represent an extra gigabyte of space, and so on. This distinction matters when figuring out how much it would cost to move your data into their cloud; otherwise you may buy additional storage unnecessarily.
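To see that math in action, here's a small helper modeling the pooling rules as I've just described them (my own model, not an official Microsoft calculator):

    # Rough model of SPO pooled tenant storage: 10GB company baseline
    # plus 500MB per user. Not an official Microsoft calculator.

    def spo_tenant_pool_gb(users):
        """Included pooled storage before buying any extra gigabytes."""
        return 10 + users * 0.5

    def extra_storage_cost(data_gb, users, per_gb=0.20):
        """Monthly cost of add-on storage beyond the included pool."""
        shortfall = max(0, data_gb - spo_tenant_pool_gb(users))
        return shortfall * per_gb

    # A 20-person office moving 100GB of shares to the cloud:
    print(spo_tenant_pool_gb(20))       # 20.0 GB included
    print(extra_storage_cost(100, 20))  # (100 - 20) * $0.20 = $16.00/mo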
Features in SharePoint Online vs The Rest
Though I didn't draw it on the chart below, you can easily place an imaginary dividing line between Google Drive and Dropbox for Biz. I see SPO and Google Drive representing rich platforms that do much more than just store and sync files. The others are just that: cloud storage entities merely throwing oodles of raw datacenter HDD space at customers.
You may not be interested in being able to edit your Office files in a web browser, build an intranet, or collaborate with others in real time on your docs (not just "share" them in the traditional cloud sense). But as time goes on, I tend to find clients yearning for these functions more and more. The cloud storage arena isn't even that mature yet; I find it to be quite young, in fact. But I think Google Drive and SPO are leading the pack in the arena of functionality and expandability, in ways that Dropbox, Box and others just can't touch.
Here are some of my favorite aspects about SPO and what ultimately led me to the platform:
There's a lot that SPO brings to the table, and much of it ties back to Office 365, as you can tell. The two are made hand in hand for each other. I'm not saying that a company using SharePoint Online in its vanilla form, detached from 365, won't have a great experience. It's just that you get the icing on the cake when you tie it all together in 365-land.
One of the first things I was worried about in a post-Google Docs life was: how great is the coauthoring experience? I will admit that Google has a certain polish on their product which Office Online is still working towards. If your users will be 100 percent relying on Office Online apps for their coauthoring needs, they will run into relatively few speed bumps.
Things get a little buggy sometimes when people on desktop apps work simultaneously with Office Online users. I think this is still a work in progress for Microsoft. Word 2013, for example, plays quite well with Word Online when coauthoring in real time. Word 2013 relies on saves to exchange updates on the desktop side, while Word Online is saving and updating in near real time. I guess this is one of the downsides of desktop apps; their legacy underpinnings were not intended for a real-time coauthoring future.
Excel 2013 is by far the stickiest experience for us. For example, our office manager might have an Excel file from SPO open through OneDrive (say, our payroll calculations sheet), and if I try to open the same document in Excel 2013 on my computer, I can't make any edits until she closes it. It seems Excel 2013 is not up to par with the coauthoring changes in the other desktop apps; perhaps it's one of the tougher apps to get updated. The same limitation doesn't exist in Excel Online, naturally.
OneDrive for Business is a pretty feature-filled add-in that works just like the personal edition, but most importantly, it provides syncing of one's personal OneDrive area and allows for document library syncing. All of our company staff are allowed to sync the doc libraries they have permission for to their laptops and desktops. There is an option in SPO to disable OneDrive sync capability, but it's all or nothing: you can't disable it for individual users only. I'm hoping Microsoft changes that, as customers are asking about it too.
One notable missing piece in the OneDrive for Business family is a Mac client. My coworker, an avid Mac-only user, gives me grief about this every week, so it's something I want to make sure you know about. We have been getting around the limitation by having Mac users work in Office Online, but this is not always ideal. For example, there are many times when my staff are onsite and need to access client records from SPO in local-only mode because we're in the middle of working on network changes.
While some people can rely on always-accessible internet, as a tech consulting firm we're knee deep in situations where that is not the case. Having local Office access through OneDrive for Biz on my Thinkpad is a lifesaver, and I do feel bad for my Mac colleague. Luckily, Microsoft let loose at SharePoint Conference 2014 that this is changing:
There are solid rumors that the new Office for Mac 2015 will be hitting later this year, and I fully expect OneDrive for Business for Mac to launch with the new suite. The above screen was grabbed from a Microsoft employee who is already running an alpha version of the app. (Image Source: SharePoint Conf 2014)
When it comes to mobile and desktop app availability, Microsoft is probably in last place. And that's relatively surprising to me, especially the lack of a native client for Windows Phone 8 at this point. The other providers have covered both sides of the desktop and are covering most of the mobile sphere -- and in the case of Box, they have a mobile app for EVERY ecosystem, including BlackBerry. Impressive.
It's not a deal killer for our company, or me personally. We're a heavy Windows shop, as are our clients, and since 95 percent of SPO's capabilities can be handled in the web browser, that's a fairly solid cross-platform answer right now. But I know today's world is so app-driven, and this is not good enough. Microsoft's answer to the app dilemma has been "it's coming" for the last year or so, and I'm hoping there is some meat on the bone soon.
As a Windows Phone user, the app equation has been especially bittersweet coming from Microsoft itself. While there is a native OneDrive app for Windows Phone, it doesn't touch any documents or shared files in SPO or OneDrive for Business. And the "Office" app on Windows Phone lets me view recent documents I've opened on my desktop or web browser, but I can't browse my SPO folders in any legible way. It's an odd situation that just needs to be fixed.
I also want to touch briefly on permissions in SPO. If you are coming from a Windows Server based environment, rest assured that Microsoft did not drop any balls here. While best practice is still to assign users to groups and set permissions that way, you can set up per-user granular permissions on any file or folder in a document library. Read only, editor, limited viewing, and many more permission levels are at your disposal. There's not much I could do in a Windows file share environment that SPO cannot handle.
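As a concrete (and simplified) illustration, here's how I think about those grants. This is my own sketch, not SharePoint's actual API, though the level names below do exist in the product:

    # Simplified model of SPO-style permission grants. Not the real
    # SharePoint API; the level names below do exist in the product.

    PERMISSION_LEVELS = {"Full Control", "Edit", "Contribute", "Read"}

    class SecurableItem:
        """A file or folder in a document library that can carry grants."""
        def __init__(self, name):
            self.name = name
            self.grants = {}  # principal (user or group) -> level

        def grant(self, principal, level):
            if level not in PERMISSION_LEVELS:
                raise ValueError("Unknown permission level: " + level)
            self.grants[principal] = level

    payroll = SecurableItem("Payroll.xlsx")
    payroll.grant("Finance Team", "Edit")         # best practice: groups
    payroll.grant("auditor@contoso.com", "Read")  # possible: per-user

The real work happens in the browser UI, of course; the point is simply that group-based and per-user grants can coexist on the same item, just as they would on a Windows share.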
Search is also something that SPO handles with ease. Google Drive had excellent file crawling capabilities, and I expected nothing less from SPO. It definitely delivers! SPO search can crawl any data that is readable within a file, including metadata, and even lets you view file previews before opening an item if you're unsure it's what you were looking for. Since SPO runs almost exclusively on SQL Server behind the scenes, it's not surprising that data crawling comes so easily to it.
Which leads me to my next point regarding the use of SQL Server in SPO. While SPO is definitely a speedy product, I think it still has some room for better performance. Google Drive has a bit of extra pep when you put the two head to head, especially when opening files or performing searches.
To this effect, it's not far fetched to believe that Microsoft will implement its new SQL Server 2014 product to power SPO, which is already showing up to 30 times the performance of SQL Server 2012. Seeing that Microsoft is migrating SPO into Azure, I wouldn't be shocked if SQL Server 2014 was powering all or most of SPO by this time in 2015, or sooner.
Another aspect of such a reliance on SQL Server is that, right now, Microsoft is not yet encrypting all data in SharePoint Online -- at least not per what I can find publicly. While most of the rest of Office 365 already has full encryption for information in transit and at rest (email and Lync, notably), SPO is still awaiting such a security enhancement.
Microsoft's datacenters are pretty serious affairs, but this graphic puts the nuts and bolts in an entirely different light. You can view the in-depth walkthrough on how all this works from this SharePoint Conference 2014 session. The amount of redundancy and copies that exist for any of your data at any given time are impressive. (Image Source: SharePoint Conference 2014)
The problem with encrypting an entire ecosystem like this is that you have to utilize something called TDE (Transparent Data Encryption), which provides real-time encryption and decryption of data in SQL databases. Implementing such tech for SPO across the board must present a real strain on datacenter resources, but if Microsoft is serious about having SPO reach into the highly regulated sectors (HIPAA, government, financial, etc) they have to get full encryption implemented in SPO from end to end -- not just for endpoint communication into the platform.
On the uptime and support front, Microsoft offers the same guarantees as for the rest of Office 365: three nines of uptime (99.9 percent) and 24/7 phone support from their call centers. All of this is backed by a public SLA similar to the one Google offers for Google Apps. It's interesting to note that the other players make references to SLAs they offer some clients, but none of them publish their SLAs publicly. I'm not saying they have anything to hide, but I think they should give their level of transparency a once-over.
Limits in SharePoint Online & OneDrive for Business
No platform is without its intricacies, and SPO isn't any different. SPO has some limits in the way that it handles files, the storage space it doles out, and other aspects about its operation. Customers ask me about these all the time, so I'm bringing them front and center up against the other big providers.
SPO is a slightly different beast compared to the other providers in that it considers storage space on two levels. The first is individual, per-user storage space accessible to each person in their personal OneDrive for Business. This is the "My Docs in the cloud" that I referred to earlier. Second, there is a concept of tenant (or pooled) storage that applies to document libraries and sites in general. Think of this space as the "bucket" your file server in the cloud draws from in the form of shared folders.
Microsoft just upped these limits: OneDrive personal space and site collections can now hit 1TB in size, and best of all, overall tenant storage for your entire account is now unlimited. Redmond is serious about positioning SPO to compete not only on value-added features but on raw storage sizing as well, up against the likes of Dropbox and Box.
Two other limits I'll shed light on concern OneDrive for Business sync: a 5000-document sync limit per document library, and a 20,000-item limit for syncing your personal OneDrive for Business space. This is where proper document library planning comes into play, so that you aren't dumping 10K files into a single document library and having your users complain that half of the old file server is missing. Microsoft outlines all of these limits online for the curious.
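A trivial planning helper makes the point (my own sketch, using the sync limit above):

    # The 5,000-item-per-library sync limit means a big share has to be
    # split across libraries before migration. Simple ceiling math:

    LIBRARY_SYNC_LIMIT = 5000

    def libraries_needed(file_count, limit=LIBRARY_SYNC_LIMIT):
        """Minimum number of document libraries so each stays syncable."""
        return -(-file_count // limit)  # ceiling division

    print(libraries_needed(10000))  # 2 libraries for a 10K-file dump
    print(libraries_needed(12345))  # 3 libraries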
I will call Microsoft out on the lowly per-file size limit in SPO, which is 2GB per file. In light of Dropbox having no limit, and Google Drive now capping out at 1000GB per file, it's tough to see why Microsoft can't up this limit. I know the limits of SQL Server behind the scenes may have some effect here, but SPO needs to pick up its game if it wants to outshine the others.
We're not storing super large files like this on SPO, and instead rely on our traditional server for these needs. But as cloud storage continues dropping in price, and file size limits continue rising in SPO, I can see a day potentially when we will drop our legacy file server needs entirely.
You have full control over file versioning in SPO, and can technically place no cap on this in the SPO admin center. But be warned: version history does eat into your storage quota, so allowing unlimited version history is not really recommended. Nor would I think this to be practical for any organization. If you have greater versioning needs, perhaps set a limit of 50 or 60 versions back and also enable minor versioning, but refrain from using the unlimited setting. Your storage quota will thank you.
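If you're wondering what unlimited versions could do to your quota, a worst-case estimate is easy to sketch (the figures here are hypothetical, and SPO may store versions more efficiently in practice):

    # Worst-case quota consumed by version history, assuming each
    # retained version is a full copy. Figures here are hypothetical.

    def versioning_overhead_gb(files, avg_file_mb, versions_kept):
        return files * avg_file_mb * versions_kept / 1024.0

    # 2,000 documents averaging 2MB, capped at 50 versions each:
    print("%.0f GB" % versioning_overhead_gb(2000, 2, 50))  # ~195 GB

Even if SPO handles versions more cleverly than full copies, the direction of that math is why I recommend a sane cap.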
Security & Privacy in Cloud Storage
This is a tough discussion to have with some clients. There is usually a mental mountain to climb when it comes to believing that the cloud could actually have better security in place than what a company can place behind its own locked doors. But more often than not, I do find this to be the case. Piecing together how this is the case tends to be the tricky part.
Luckily, SPO is backed by some heavy artillery from Microsoft's concrete datacenter privacy and security commitments. Redmond has been working very hard on convincing the most stringently regulated industries that their data is safe in Microsoft's cloud. We didn't make the move for our data lightly either: we keep confidential client records, internal company data, marketing information, and other items on SPO now. We're also regulated by HIPAA since we are a business associate for numerous clients who are considered HIPAA covered entities.
Here's a glance at all of the security/privacy safeguards that Microsoft and the others have met so far:
Out of all the entities on the list, Microsoft arguably has the best resume when it comes to security compliance standards. And while it has not achieved FedRAMP for Office 365/SPO yet, its Azure datacenters reached this high benchmark as of last year, and Microsoft has stated in public documents that it is working towards the same for 365/SPO sometime in 2014.
A lot of our clients have been moving to Office 365 to meet HIPAA compliance needs, and it's a given that SPO has this covered. Not only can you use SPO for ePHI in the form of traditional Word or Excel documents, but you can also build database-driven Access apps on SPO. This is a great, integrated way to leverage database functionality without spending more on separate apps like the dreaded FileMaker I so love to hate.
I will point out that NO provider has met PCI DSS (Payment Card Industry Data Security Standard) compliance yet, which would allow you to store customer credit card information in their cloud. Just as with FedRAMP, Microsoft has received PCI compliance on its Azure cloud infrastructure, but this does not extend to SPO or Office 365. That will probably come hand in hand with SPO getting Transparent Data Encryption for its SQL backbone.
Of the rest of the providers, I think that Dropbox and Egnyte have quite a mountain to climb still in order to reach the same levels of what Google and Microsoft have carved out. Most notably, Dropbox has yet to receive HIPAA compliance, which means the healthcare industry has no choice but to stay far away from arguably the first innovator in cloud desktop sync.
Coupled with the granular permissions that can be set up on document libraries, I find SPO to be a near 1:1 replacement for what traditional file servers provide, provided you aren't regulated by FedRAMP or PCI. Those two standards aside, SPO's security paradigm is top notch, especially up against the competition. For the price point SPO comes in at per user, the amount of security afforded can't be beat.
SPO Still Has Room to Improve
Microsoft has a real gem on its hands with SPO. Is it perfect? Not by a longshot; I've noted some of the gripes we have with it. But with as many announcements as Microsoft has been making around OneDrive and SPO in the last few months, it's fairly safe to say that Redmond is serious about growing SPO into a powerful part of its cloud portfolio.
The app situation, especially on the Mac side, needs to get resolved. Office for Mac 2015 is just around the corner, and I'm dearly hopeful that OneDrive for Biz for Mac is bundled with that release. The treatment Microsoft has given Lync on the Mac is rather pitiful thus far, and they are repeating a similar path with SPO on all fronts aside from Windows. Judging from what I saw at SharePoint Conference 2014 (shown above), fingers are still crossed.
I also believe OneDrive's occasional tendency to get clogged up on file syncs needs to be improved. Microsoft isn't alone here; Dropbox and Google Drive have just as many instances where the gears get stuck and you need to force a sync or clear out sync issues. This is probably one of the biggest roadblocks to desktop sync apps presenting a problem-free experience. For now, it's a relatively smooth ride that requires some diligence along the way.
And I must question Microsoft's intention to keep the barrier to entry on SPO higher than it is on Dropbox or Google Drive's side. As I mentioned, without document libraries, moving a file server up to SPO is a disaster waiting to happen. But implementing document libraries properly still requires a fair amount of technical planning and foresight, something consultants like me need to assist with. Why can't Microsoft make SPO doc libraries as accessible in setup and operation as OneDrive for Business is for a user's personal files? Does it have to be this hard to get started?
Don't take my stance the wrong way. I fully believe the time and energy invested in setting up SPO reaps its rewards many times over the further you leverage the product. Between Office Online, OneDrive, coauthoring, search, and the cheap piecemeal storage offered, SPO has no direct competitor that can go toe to toe with everything it provides, aside from Google Apps.
Some of the biggest objectives for Microsoft in new SPO features for the next year or so: Android tablet support, tighter integrated communications, better desktop app to Office Online feature fidelity, and more. (Image Source: SharePoint Conference 2014)
But I guess it comes down to what your organization is truly looking for. If all you want is a never-ending dumping ground for cloud storage, Dropbox for Business may be your better bet. If you're after the best-in-class coauthoring experience and are willing to forgo Microsoft Office support, Google Drive may serve you slightly better. However, add up the sum of all your parts, and SPO stands above the rest in my professional opinion.
Are there good alternatives to a cloud provider like SPO if it doesn't meet the security regulations you're bound by, or presents other roadblocks? Most definitely, though I think both options below are a decent amount more expensive up front. One option we have been routinely pushing is a Windows Remote Desktop Services experience for hosting files internally while providing seamless outside access. This lets you keep your files on-premise and ditch the VPN, yet offer a much faster experience.
Another good option is using a Windows Azure server instance to host your file server in the cloud. The monthly costs of this approach are naturally higher, as you are asking Microsoft to host a virtual server on your behalf, but the security, speed, included Windows Server licensing, and low maintenance levels (not to mention no electricity usage!) are definite upsides. You can then connect to your cloud file shares over WebDAV and have a system that mimics nearly 100 percent of what a traditional file server can do, with all the prowess of traditional Windows AD permissions.
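For the curious, WebDAV is just HTTP with some extra verbs, so such a share can even be poked at from a script. A minimal sketch follows; the URL and credentials are placeholders, and on Windows you'd normally just map the share as a network drive instead:

    # Listing a WebDAV folder with a raw PROPFIND request. Placeholder
    # URL and credentials; Windows would usually map this as a drive.

    import requests

    WEBDAV_URL = "https://files.example.com/shares/clients/"  # hypothetical

    # "Depth: 1" asks the server for the folder's immediate contents.
    resp = requests.request(
        "PROPFIND",
        WEBDAV_URL,
        headers={"Depth": "1"},
        auth=("CONTOSO\\derrick", "password-goes-here"),
    )
    resp.raise_for_status()
    print(resp.status_code)  # 207 Multi-Status on success
    print(resp.text[:500])   # raw XML listing of files and folders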
With any potential move to a modern cloud-powered solution, there are always trade-offs to consider. Is your organization willing to forgo hefty up-front costs in exchange for a more gradual cost schedule? Have you considered what your various TCO alternatives look like? Will your necessary security baseline be met by your intended cloud provider? Will the cloud offer the storage space you need at a price you can afford?
All good questions to consider. If you're interested in seeing SPO and OneDrive in action, I highly recommend the webinar I made late last year touching on most major points of the product. It should give you some functional insight as to how this may look for end users.
Is SharePoint Online the right cloud ecosystem for your organization? That's for you to decide. I personally find it to provide the most compelling argument of all the storage clouds out there, but you'll have to weigh your pros and cons to see if it fits the bill.
I will leave you with this: if your company is already on, or thinking of going to, Office 365, then SPO should be almost a no-brainer. Not only does it tie into the same common ecosystem for your users, but it will most likely offer the cheapest option in your transition to the cloud.
Cheapest isn't always best, but in this case, stereotypes definitely don't apply.
Image Credit: frank_peters/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net
Your standard tech industry pundits just don't get it when it comes to Windows RT. They lambasted it when it came out in 2012 (in some ways, rightfully so). They doubted Microsoft would release a Surface 2 variant, and Redmond did just that. And they continue to beat the anti-RT drum loud and clear, using RT device sales figures as proof of a pending death notice.
Perusing Google, you can come across a wild variety of articles that purport to explain why Microsoft needs to ditch RT altogether. Chris Neiger penned one such piece, and even John Martellaro of MacObserver.com did his best to argue how foolish Microsoft was for even considering RT a serious contender.
I've always held Windows RT in a slightly different light. Not only have I argued that Windows RT is here to stay for numerous concrete reasons, but I outlined publicly how well RT tablets are working out for clients of mine as traditional x86 replacements. That customer rollout I highlighted in the aforementioned article is doing so well, in fact, that the only calls we are getting now are for menial support requests like password resets and spare power cords. Can you say it just works?
In light of all the RT critique out there, I was pleasantly surprised today when my RSS feeds turned up an article from Rod Trent of WindowsITPro.com. It's a wonderful read, and naturally one I heavily agree with, and he boils it down to this: Windows 9 is going to be more of a Windows RT than Windows 8 was, and Modern UI will only become a larger part of the core experience.
Unlike many of the tech analysts that make a living observing this wild industry from the outside, my rationale is formed by what I am noticing in the field as a consultant. Wall Street may live and die by financial performance figures, but the tech world isn't easy to predict on that basis alone.
In my article on why Windows RT is here to stay, I pointed out that Microsoft rode out some $3 billion in losses on the Xbox consoles for almost a full decade.
It paid off in the end, as Xbox 360 stood as the best selling console in the USA up against the PS3 for nearly 2.5 years straight. While Xbox One is getting off to a slower start than the PS4 initially (likely due to its $100 higher price tag) I think the competition will equalize as Microsoft continues to unleash living room-friendly features for households and bring the Xbox One price point down over time.
And now Google seems to be on the same course, busily converting its niche Chromebook line into a full-fledged case study for what netbooks should have been back in 2006. After years of dogging it in the market, Gartner is predicting a 160 percent surge in Chromebook sales for 2014 -- and some are even saying that this is a conservative figure. So much for the idea that everyone wants full Windows or Mac OS X.
Since people like Rod Trent and I clearly seem to be in the vocal minority, perhaps outlining some of the "writing on the wall" may be helpful.
#6: Windows Phone and RT Will Merge, And Only One Direction Makes Sense
It's already clear from leaks that Microsoft is going to be unifying its application bases between Windows Phone apps and Windows Store apps. The bigger point here leads into what Julie Larson-Green of Microsoft told us a few months back: “We have the phone OS, we have Windows RT, and we have full Windows. We are not going to have three.”
She was absolutely correct. There's little reason for Microsoft to keep supporting three distinct platforms. In fact, I'll go so far as to say that Microsoft could eventually get away with a single unified "Windows OS" platform that takes over for the consumer desktop and its tablet/phone lines. Risky, but completely possible and plausible if you see what I see unfolding.
A single OS for non-server devices makes complete sense in the long run. The reason I say non-server is that part of my prediction hinges on Windows Server keeping to a traditional x86 code base, sustaining a cloud platform for backwards compatibility powered in unison by Windows Remote Desktop Services and the Azure cloud infrastructure. All of that geeky backbone goodness still lives and breathes on Windows Server -- an x86 world through and through for the next 15-20 years or more.
But for the short term, Windows RT and Windows Phone will need to merge so development resources are not being exhausted coding for separate ecosystems. Developers are still staying away from the Windows Store platform in many ways for this very reason, and removing this barrier to entry would take away this argument in one fell swoop.
Windows RT is the better half of this duo for numerous reasons.
First, it already has proven viability running tablets quite well, judging by the Surface RT/2 and Lumia 2520, to name a few. It also carries the same Modern UI interface for touch-centric activities, so it's consistent with what Microsoft is already pushing on Windows Phone. But this is not the case the other way around -- Windows Phone is missing the Desktop UI that RT brings to the table for multitasking and traditional Office usage. And this will be the keystone that holds the RT advantage together.
Not to mention that RT is already compiled natively for ARM, which is what allows it to work on devices where battery life is at a premium. The Surface RT and Surface 2 have battery life on par with the iPad, yet provide leagues greater computing functionality for the end user. Intel keeps promising better battery life with each x86 line, but nothing is on par with what I have seen out of Surface RT devices.
Giving developers a common RT backbone to work off of for their tablet and phone apps, similar to how Apple does it with iOS? It only makes sense. Apple has proven it's a sustainable model, and Windows RT provides all the critical pieces of the puzzle that tablet users and phone users alike are looking for. Function, finesse, and feasibility.
#5: Chromebooks Proving That a Limited Mobile OS Can Work
My first real dive into Chromebook land was all the way back in late 2011, when the industry was questioning why the heck Google even attempted to create an OS out of a web browser. I was helping a school district out in Missouri roll out their first batch of Chromebooks for a pilot program for students. Education got it, but the rest of the tech world was scratching their heads.
How wrong the skeptics were. After abysmal Chromebook sales for over two years, Google finally got traction on the devices in 2013, and as we churn into 2014, Chromebooks are the hot topic in notebook computing.
Apple's advertising may want you to believe differently, but the numbers show Chromebooks taking up a whopping 21 percent of the US notebook arena now, with Apple at a minimal 4 percent. These cruddy, limited, barebones devices that have only been around for a couple of years are hotter than anything from the fruit company?
Google took a huge risk in 2011 with a laptop based on an OS more limited than Windows RT. Fast forward to 2014, and it's the hottest sector of notebook sales. Early critics of Windows RT are making the same premature judgment. (Image Source: ABCNews)
Even as Windows and Mac laptop sales are dropping, Chromebooks continue rising. “The presumption that desktop Windows is ‘most familiar’ [to users] no longer applies,” claims Jeff Orr of market research firm ABI Research.
The message to extrapolate from these indicators is that RT can succeed as a mobile platform even in light of its inability to run the plethora of millions of legacy Windows apps. As we shift into the cloud and grow comfortable getting work done in a browser or in DaaS environments, the unlimited compatibility playground we've been used to becomes increasingly irrelevant.
The 1.84 million Chromebook purchasers of 2013 let that be known fairly loud and clear.
#4: Desktop-as-a-Service (DaaS) to Become Almost Ubiquitous in 2014
DaaS penetration and market realization is quickly growing, and I've personally been having huge success with it in my own consulting for clients in the last year. We've had nothing but positive results from converting one company off netbooks to Surface RT tablets using Windows RDS DaaS hosted internally, and just this past weekend we replicated the same setup for a HIPAA customer in the south Chicago suburbs.
I tend to love what Microsoft offers first party with RDS (Remote Desktop Services), especially in the Server 2012 R2 release. It's super secure, rock solid in performance, and cuts down endpoint device costs considerably. For the above HIPAA entity I mentioned, we are already planning on replacing office desktops with a full fleet of zero clients on the next machine upgrade cycle. Zero maintenance, super low power consumption, and no endpoint licensing to worry about.
But not everyone happens to be an infrastructure geek like me, and I understand that. While my company tends to be installing these RDS DaaS setups on-premise for customers right now, we are already planning out steps to take this fully into the cloud in 2014 and beyond.
Windows Azure hinted at making this a full-fledged reality last year when Microsoft allowed SPLA-licensed hosting companies to offer RDS in their IaaS cloud, but this remains somewhat anti-consumer, since SPLA licensing is not easily acquired or maintained by smaller firms that want to control their own RDS setup on Azure.
It's only a matter of time before this changes, and I'm fully expecting Microsoft to make it as easy to spin up RDS desktops on Azure as it is on Amazon's new WorkSpaces platform. Amazon's pitch is as simple as it gets: no need to worry about RDS CALs, SPLA, or any of that other Microsoft legacy licensing cr*p. You bring the workload, we provide the desktop in the cloud for you.
Microsoft has been tweaking Windows Remote Desktop Services for the better part of the past decade and a half. It's so accurate in recreating a traditional desktop experience that most users barely know they are on Windows Server. Just good development, or preparation for a Windows RT future? (Image Source: WindowsITPro.com)
Not to be outdone, VMware announced last October that it had acquired Desktone, a heavyweight in the hosted DaaS market. It's not quite clear what VMware's intentions are for the service, as they have yet to formally release an offering utilizing the expertise they acquired, but all indicators point to a WorkSpaces-esque product.
And so all three of the cloud datacenter giants have some flavors of DaaS already available or coming soon. Google will likely join the fray at some point as well. They wouldn't be investing this heavily in a sector of cloud computing if it didn't have long term potential. And thus they see, as I do, that DaaS is here to stay.
What the heck does DaaS have to do with Windows RT? Quite a lot, actually. As I mentioned, Windows Server is likely to continue down an x86 path for the foreseeable future so Microsoft's RDS functionality can continue offering up the "legacy" Windows experience for clients.
Business customers have been using RDS in some flavor for nearly 15 years, previously under the guise of Terminal Services. You think Microsoft hasn't taken 15 years of lab rat testing in this field seriously? Guess again.
Terminal Services used to be fairly pitiful in its Server 2000 and 2003 forms. Per-user personalization was almost nonexistent, and the amount of setup effort required of an administrator was daunting. Microsoft literally had hundred-plus-page getting-started guides for the functionality. Not anymore.
With the advent of User Profile Disks that can provide virtual, dynamically-sized containers for end user profile data, there is little to no difference for a user on RDS compared to a traditional computer setup. Combined with the unlimited power that Group Policy affords, it's my DaaS of choice going forward.
Combined with Azure, Microsoft can easily pursue a Windows RT future on upcoming releases of Windows. Modern UI and Windows Store can power the mainstream, maturing vision for Windows usage, and legacy support can easily be handled through a mixture of on-prem or Azure-driven DaaS in the form of RDS.
The other vestige of a legacy Windows world, Active Directory, has already been proven to run in a Windows Azure environment. Businesses that require permissions and device management from ADDS (Active Directory Domain Services) can easily keep their needs going on Windows Server.
A lack of legacy support on the native endpoint thus becomes almost entirely moot. By the time Windows RT becomes a mainstream consumer offering (perhaps starting with Windows 10, as Windows 9 may be too soon), most app developers will have either already ported options to the Windows Store ecosystem or be well on their way. The fringe in need of serious horsepower or niche legacy apps would opt for spots on a potential DaaS offering on Azure.
Many leaks have already pointed to the development of Project Mohoro, which I honestly think is the internal Microsoft effort to make this dream become a reality.
I personally think there's more than meets the eye in Microsoft's decision to release official Remote Desktop apps for Android and iOS last year. If they're going to perpetuate an Azure-driven DaaS future, they figure people might as well have access no matter what endpoints they use -- even if those endpoints aren't Windows powered. If you can't win them fully over, Microsoft bets, at least let them in on the ride.
So in light of all this, it raises a question: what is it that Windows RT can't do, after all? Nothing that I can think of. A modern app ecosystem via the Windows Store, and a clean legacy path that can be scaled up with little effort through RDS. The best of both worlds in my eyes.
#3: Office Online, OneDrive, SharePoint: Eradicating the x86 Desktop
Taken separately, you may view the above trifecta as just natural progression. But as a whole, they represent the silent vision that will bring together a Windows RT future. That push into the cloud will soon become deafening, and the direction and capabilities the above platforms are providing now represent how Microsoft will position an RT-first pitch to the world.
Office Online is the rebranding of the under-advertised Office Web Apps, presumably to bring these offerings front and center next to desktop Office. Its functionality is gaining attention next to Google Docs and, in many ways, leaving Google's offerings lacking by contrast. Give Office Online another 3-5 years, and it will have the development investment to reach 80-percent-plus parity with desktop Office, if not more. I use Office Online quite a bit on the personal and SharePoint sides, and have been pleasantly impressed. And I'm a longtime Google Docs user who recently converted.
And if Office Online isn't your thing, you may already know that Microsoft has nearly the entire Office suite available on Windows RT. Reports from outlets like Neowin also hint that Microsoft is quietly developing the remainder of the suite for Windows RT for eventual release. So we may not be relegated solely to Office Online after all.
Microsoft surprised everyone when they delivered Outlook 2013 for Surface RT late to the game. But if information from Paul Thurrott is accurate, Microsoft may be preparing the rest of the Office suite for Windows RT after all. Internal builds at Microsoft purportedly already contain OneDrive for Business, Publisher, and all the other lesser apps. A cue to Microsoft's long term intentions? (Image Source: WinSuperSite.com)
OneDrive on the consumer side is quickly becoming the new norm for cloud storage, one that Microsoft is clearly pushing on the public in a fairly heavy-handed manner. Sign into your Windows 8.1 device with a MS account? You've already got OneDrive access and storage space, and it's baked into your OS whether you like it or not. This tight integration isn't going to decrease, if my Microsoft senses are telling me right.
And on the business end, SharePoint Online in Office 365 is replicating what file servers of yesteryear provided with relative ease. Tight permission controls? Expandable storage space? Browser-based and mobile device connectivity with local sync? SharePoint Online can do all that and more for a pretty affordable price each month. For any business that is already paying for numerous Office 365 flavors, they may not even know they have this tool at their disposal out of the box.
We moved our entire company file server over to SharePoint and couldn't be more pleased. We're actively in discussions with numerous other customers to do the same.
If Windows RT took over the future of Windows, these maturing services would represent how 80 percent or more of the populace could store and access their files. For that other 20 percent that needed legacy capabilities, some flavor of DaaS like RDS from on-prem Windows Server or Azure-powered RDS would fill the void.
#2: Windows RT Peripheral Compatibility is Good, And Getting Better
Microsoft has made device compatibility for Windows RT a fairly high priority, and my own tests with the OS have proven this to a great extent. Yes, it doesn't compare to traditional Windows (yet) by a longshot, but it's still leagues better than what Windows Phone or even Android/iOS have to offer.
Any eventual move by Microsoft to have RT take over as the primary desktop and tablet offering on the market will be met with a need to give people the device choices they have grown used to. Windows Phone has never held this as a priority for obvious reasons, but the RT side already has a nice leg up in this regard.
Unifying the Windows Phone and RT ecosystems under RT will afford Microsoft this solid baseline of device driver availability that is only getting better as time moves forward. They would have the backbone for a tablet, laptop, and desktop OS which would be seriously viable by the time it becomes reality.
#1: Features like Split Screen, Modern Apps on Desktop Hint at the Future
A Windows RT future is slowly being hinted at in how Microsoft is treating the duality that Windows 8 affords. With all of the work being put into melding the functionality of the Modern UI and Desktop modes, we're getting a glimpse of where Windows RT will take over once it becomes the flagship for Redmond.
Take, for example, the recently enhanced split screen functionality in Windows 8.1 that allows for dynamic resizing of Modern UI apps and desktop mode on the fly. Legacy desktop mode will continue to exist in Windows for some time, both as a holdover for traditional users and as a means to access full blown Office apps that RT already provides. Other useful features like Windows Remote Desktop tend to work slightly better in legacy desktop mode still, compared to the "app" version from Windows Store. But that could change with time.
And as Paul Thurrott mentioned back in January, sources are already hinting that Modern UI apps will be capable of being used fully windowed for Desktop Mode by Windows 9. If Microsoft were intent on fully ridding us of Desktop Mode, this definitely isn't the way to show those intentions. If anything, Microsoft will be keeping Desktop Mode for some time to come.
Windows RT has been following the same development path of traditional x86 Windows 8 and 8.1 at a steady pace, which leads me to believe that Microsoft is merely getting its paths on a parallel track until RT can fully take over as a flagship.
Taking Windows Phone out of the mix will be one barrier taken down, and then RT can have a peaceful adjustment to take over the rest of the Windows ecosystem outside of the server room.
Windows RT: The Sum of Microsoft's Cloud Vision
If you're viewing Windows RT as just one piece of Microsoft's "devices and services" equation, it's easy to dismiss it as a mere sideshow.
I take Windows RT as actually representing the summation of Microsoft's cloud-first future: the best of the desktop blended with the best of a modern app-store-driven OS. It's akin to what Apple is dabbling in with App Store penetration into OS X, but Microsoft's approach goes all in. Apple is too entrenched in its own ego and history with OS X to realize where the future of computing is heading.
I understand that it's far too easy to label Windows RT as a short term failure in the market. But as Microsoft has shown time and again, it's the initial rough ride that has given Redmond its greatest successes. From Xbox to Windows Server to post-MSDOS Windows itself. Microsoft's history has shown that it's willing to tread in unpopular territory in order to carve out its computing endeavors.
Don't be surprised when Windows RT makes its primetime red carpet comeback.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net
Over 1,000 hours of live footage covering 98 Winter Olympics events, all being broadcast over numerous digital and TV avenues -- simultaneously. As an infrastructure geek at heart, and someone who consults clients on their networks, I can't even fathom what kind of backbone is needed to stream a worldwide event like the Olympics. But NBC has fallen back on a familiar face in the cloud arena to make this magic happen: Microsoft.
NBC partnered with YouTube for the 2012 London Olympics and the experience for end users was less than ideal. Robert Cringely had some truthful words about his efforts to catch some footage, and even the NY Times published a scathing piece on the miserable pain YouTube was putting users through with buffering times. YouTube may be one of the most watched video services on the web today, but its secret sauce has traditionally been in handling statically uploaded footage. Live broadcasting is a whole different game, from the little I know of the media arena.
So it's going to be an interesting test for Microsoft's Azure platform this time around. The fact that 50 percent of the Fortune 500 is using Azure is quite reassuring. And I've personally had good things to say about Azure, as my IT consulting company is moving numerous clients to Microsoft's cloud for virtual server needs. Even our internal remote support software, ScreenConnect, is hosted in Azure, and its uptime and reliability have been exceptional for my staff.
But the Fortune 500 and my own tribulations with the platform can't replicate the monstrous swathes of viewers that will stress the platform. And if you think the 2014 Winter Olympics will be any smaller, consider that NBC stated the 2012 Games saw 219 million viewers, effectively making them the most watched television event in American history.
To be fair, Microsoft isn't the only IT giant with its fingers in making this all happen. Adobe is providing its expertise on the content publishing and monetization aspects. But Microsoft's IaaS and PaaS services hosted on Azure will be doing most of the "heavy lifting", if you want to call it that. Streaming, transcoding, storage, networking -- all that geeky goodness is being done on Azure. I'm sure Amazon and Rackspace would have loved to land this responsibility, as I can't imagine a better case study for a cloud giant to boast about.
Microsoft has touted the reliability and resiliency of Windows Azure for a few years now. But nothing can prepare a platform for the gargantuan traffic spikes that a global event like the Olympics will see. We'll see what Redmond's cloud is really made of over the next half month. If it all goes without a hitch, Microsoft may win over those skeptics who have traditionally turned to Amazon and Rackspace. (Image Source: Andrew Coates)
If you're curious just what kind of horsepower Microsoft is lining up for these Olympics, here's a hint: 10,000 virtual cores. To put that into perspective, most cloud servers we spin up on Azure for our clients are between 2 and 8 cores in size. Using Microsoft's own figure that each core in the Azure cloud runs at about 1.6GHz, the processing prowess behind 10,000 cores is roughly 16,000GHz. I haven't found out how much memory is being thrown into the IaaS equation, but I'm sure it's just as disgustingly massive in scale.
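If you want to sanity-check that figure, the back-of-envelope math is simple enough; here's a quick sketch (the per-core clock is Microsoft's figure, the VM comparison is my own framing):

```python
# Rough aggregate compute behind the Olympics deployment, using
# Microsoft's stated ~1.6GHz per Azure core.
cores = 10_000
ghz_per_core = 1.6

total_ghz = cores * ghz_per_core
print(f"Aggregate horsepower: {total_ghz:,.0f}GHz ({total_ghz / 1000:.0f}THz)")

# For scale: how many of our typical largest (8-core) client VMs that is.
print(f"Roughly {cores // 8:,} of our 8-core client VMs")
```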
On the PaaS side, Windows Azure Media Services is the actual engine that is managing and distributing the video feeds for viewing. It isn't surprising, then, that NBC is leveraging the same Azure Media Services offering for a bevy of other outlets such as GolfChannel.com and NBCSports.com. “There are a lot of stakeholders within our ecosystem that depend on this content being delivered at a high-quality rate, to anywhere and any device,” stated Richard Cordella, senior VP and general manager of Digital Media at NBC Sports.
According to Microsoft, the feeds for these Winter Olympics will first flow through NBC's International Broadcast Center in Connecticut (USA) and then get transmitted to two Microsoft datacenters, one on the US West Coast and the other on the East Coast. This geo-redundancy will help Microsoft load balance bandwidth needs and also deliver the clustering needed to leverage groupings of elastic virtual resources.
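To get a feel for what that geo-redundancy buys you, here's a minimal sketch of the client-side half of the idea -- try the nearest region, fall back to the other coast. The health-check URLs are hypothetical stand-ins I made up; NBC's and Microsoft's actual plumbing is far more sophisticated:

```python
# A minimal sketch of coast-to-coast failover from a viewer's point of
# view: try the nearest region first, fall back to the other coast.
import urllib.request

REGIONS = [
    "https://stream-uswest.example.com/health",  # nearest datacenter first
    "https://stream-useast.example.com/health",  # cross-country fallback
]

def pick_endpoint() -> str:
    for url in REGIONS:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return url
        except OSError:
            continue  # region down or unreachable; try the next one
    raise RuntimeError("no healthy region available")
```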
And not to be left out, 24 other countries and regions, including Japan, Canada, and all of Latin America, will be getting Olympics streaming via a custom solution built by DeltaTre and hosted on Azure. Microsoft is expecting hundreds of millions of viewers to tune into Olympic events from these countries.
If the 2014 Olympics coverage goes off without a hitch on Azure, what will all the skeptics be able to say about cloud viability?
Perhaps this will be the cloud judgement day that proves the naysayers wrong -- for good.
Image Credit: Rido / Shutterstock
Watching my RSS streams in Feedly on a daily basis has had my head spinning lately. It's not the usual flood of tech news getting to me. It's all the stories hitting recently about the so-called Internet of Things. For a topic that has so little to show for it in the real world thus far, it sure garners a disproportionate amount of attention in the tech media. So what gives?
Perhaps someone can fill me in on what this Internet of Things is supposed to look like. Is it a different internet? Is it a network solely designated for these newfound "things" that need to talk to every other "thing" out there? Or is it just more of what we already see in the market: giving every device possible an IP address to sit on? I'm just as perplexed by this bogus concept as Mike Elgan from Computerworld. He's calling it a wild idea that is rightly "doomed from the start" for numerous reasons.
Cisco's been one of the biggest commercial entities pushing its take on this blurry vision, in the form of an "Internet of Everything" (IoE). I've seen its print and TV ads come and go, and I'm left more stumped after each successive viewing.
What the heck is it pushing? Have a look and judge for yourself if such commercials mean much of anything:
Don't get me wrong; Cisco's done a lot of great things for the modern world. It is arguably one of the fathers of the internet backbone as we know it. And it's got some pretty decent equipment that I happen to use for customer projects. But not for one second am I going to sit here believing in this second coming of a machine-to-machine (M2M) communication revolution.
Cisco has even gone so far as to release a handy "Connections Counter", a running tally of all the devices that make up the firm's notion of an IoE. That counter alone makes me feel like I'm behind the times, as the only devices in my home with an IP are my smartphone, computers, and a sprinkling of home theater gear. Cisco claims we'll have 50 billion devices on the IoE by 2020? Time will tell, but I'm skeptical.
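Some napkin math shows just how aggressive that claim is. Assuming a world population of roughly 7.5 billion by 2020 (my assumption, not Cisco's):

```python
# What Cisco's 50 billion forecast implies per human being.
devices_2020 = 50e9   # Cisco's IoE projection
people_2020 = 7.5e9   # assumed world population circa 2020

print(f"{devices_2020 / people_2020:.1f} connected devices per person")
# -> 6.7 -- for every man, woman, and child on the planet
```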
The whole concept at large is fraught with lofty assumptions about compatibility, industry standards that vendors agree on, and penetration of these new-age devices. Not to mention that we've seen how such an unchecked drive towards machine autonomy can end up, at least metaphorically. The SkyNet storyline from the Terminator movies is but one example of what we should ultimately be afraid of. And not for the reasons you may be pondering.
SkyNet: A Metaphor for What the Internet of Things May Become
If you haven't seen the Terminator series (my favorite being T2, no less) you may not have the intrinsic insight as to what parallels I'm drawing between the IoT and the fictional SkyNet network. For the uninitiated, here's a brief synopsis; you can read about it in depth on Wikipedia.
SkyNet is portrayed as an autonomous US military command and control computer network that employed advanced artificial intelligence to make its own decisions. The program was intended to bring rational and calculated decision making to the way the US utilized its military assets, but as movie-goers quickly found out, SkyNet grows too dangerous for its own good. It launches nukes at Russia as a pre-emptive effort to save the USA, but this brings about nuclear destruction from the Russians in retaliation. In all, 3 billion people end up dying in this "machines gone wild" nightmare scenario, and SkyNet ultimately spends the rest of its existence in the Terminator series battling humanity to the very end -- all for the sake of self-preservation.
Text alone doesn't do the storyline justice, and I highly recommend you check out at least the first two movies in the series if you haven't already. It's a shocking, chilling scenario that makes for great movie fare, but I'm pinning it up as a direct allusion to the kind of doomsday the present-day Internet of Things is shaping up to resemble. What the heck am I getting at? Don't worry; it has nothing to do with raving mad cyborgs chasing John Connor.
The notion of machines run via artificial intelligence, autonomous of human control, has been around since Terminator hit the movie theatres in 1984. Exactly thirty years later, we're toying with the same concept once again in an alternate dimension. Toasters that talk to ovens which communicate with fridges, all of which connect back to the internet. What could possibly go wrong? (Image Source: Terminator 2 - movie)
Movie fiction shouldn't be what scares you about the potential downsides of an Internet of Things; there are plenty of real-world worries. Take for example this case of Philips Hue lights that were hacked and attacked, as reported by Ars Technica back in August. This otherwise neat technology that allows people to control their lighting systems from afar (even over the net) was thoroughly exploited, to the point where entire lighting arrays in places such as hospitals or other critical 24/7 operations could be compromised. You can read the full article on how it happened and what Philips is doing to fix it, but this is just the tip of the iceberg for what an Internet of Things doomsday could look like.
Hacking someone's home lights is one thing. But there's also news from just about a week ago, via Proofpoint Inc, which claims to have spotted what I think is my biggest fear from this IoT: devices taken over by command and control botnets for the purpose of spam, DDoS attacks, and much more.
A snippet of its findings just to frame the discussion here:
A more detailed examination suggested that while the majority of mail was initiated by "expected" IoT devices such as compromised home-networking devices (routers, NAS), there was a significant percentage of attack mail coming from other non-traditional sources, such as connected multi-media centers, televisions and at least one refrigerator.
And if you're thinking it must be referring to just a bunch of "old Windows devices", as many like to presume, this is far from the case. The press release describes devices from all corners of the computing and media electronics industry being represented. ARM devices. Embedded Realtek-driven media players. NAS boxes. Game consoles. And even set top boxes (yeah, your believed-to-be secure DVR).
Things that have never seen an ounce of Windows code are all suffering the same dreadful fate of attacker exploitation. In this case, the attackers are doing it for the traditional purpose of peddling spam onto the internet. But keep in mind this Internet of Things is just starting to take shape. If your fridge were responsible for sending out millions of spam messages in a year, how the heck would you ever know?
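Short of tearing the thing apart, the only practical answer is watching your network's egress traffic. Here's a minimal sketch of the idea, flagging any LAN device that opens outbound SMTP connections; the flow records are made-up sample data, since real ones would come off your router or a packet capture:

```python
# Flag LAN devices making outbound SMTP connections -- a fridge has no
# business talking to port 25.
from collections import Counter

SMTP_PORTS = {25, 465, 587}

# (source_ip, destination_port) pairs; hypothetical sample flows
flows = [
    ("192.168.1.23", 443),
    ("192.168.1.40", 25),
    ("192.168.1.40", 25),
    ("192.168.1.17", 80),
]

smtp_talkers = Counter(src for src, port in flows if port in SMTP_PORTS)
for ip, count in smtp_talkers.items():
    print(f"{ip}: {count} outbound SMTP connections -- is that your fridge?")
```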
Setting aside the fact that there is no antivirus software for fridges yet (as silly as that sounds), what kind of protections are manufacturers going to put into place to ensure this doesn't become the norm? A decade of Windows XP users fought back the legions of infections and botnets that used to make up the Windows malware scene. Manufacturers are going to make the same flawed pitch that Apple has made to its godly following: our products don't get viruses.
Fast forward to a day, ten or fifteen years from now, when this Internet of Things has begun to take shape. Your fridge has an IP address. Your home's electrical backbone is connected to the internet. Even your car jumps onto your home Wi-Fi when sitting in the driveway. If malware writers start to deliver the kinds of things computers have dealt with over the last 20+ years, what would we ideally do? Most average consumers can barely remember to renew their AV licensing -- how on earth would they be able to spot that their car is infected with botnet malware?
Security research firm Proofpoint is already shedding light on the reality that net-enabled fridges are making up the ranks of modern day spam botnets. Still interested in achieving technology nirvana in the form of checking email off your fridge? Count me out. (Image Source: GUI Champs Forums)
Even scarier are the possibilities of what such an interconnected home could experience at the hands of attackers. If it's currently possible to take over someone's entire lighting array (of the Philips kind only -- for now), what if attackers truly took their cunning skills to the next level?
Imagine sophisticated thieves working with hackers who take down your home's electricity from afar. They communicate via encrypted voice chats over Lync with bandits hiding out in a large van outside your home. They happen to tap into the home alarm system and render it useless as well. If they're good, they'll take over your router which controls traffic for your VoIP home phone system and redirect all calls to their own endpoint. Ocean's Eleven style.
The rest isn't too hard to map out. You can leave it up to your own imagination. Theft, kidnapping, etc. The possibilities become limitless when the bad guys have a disproportionate upper hand against the technologies you believed were keeping you safe. And that's the Internet of Things I'm most afraid of. The one that unlocks a Pandora's box of unknowns.
Universal Remotes Have Been Around for 30 Years -- And They Still Suck
Wikipedia claims that the first universal remote came out back in 1985, thanks to Magnavox (a Philips company; yes, the same Philips that made the exploitable Hue light system above). Mike Elgan makes numerous connections to the messy future of the IoT in his Computerworld article, and I think it's a brilliant analogy.
If you've ever looked into the universal remote market, or tried some of the devices available, you likely have a good frame of reference for what Mike and I are alluding to. In the 30 years that these remotes have been available, have they gotten any easier to configure? Has their compatibility truly matured to the level you would expect? And better yet, how usable are they in real life? You're probably laughing at the ridiculousness of universal remotes like I am while writing this, because they're anything but universal or easy.
Even if you can find a unit that successfully works with all of your televisions and receivers, you're likely to be roadblocked when it comes time to configure your set top box. Or, if you're like me and have an HD projector for a television, none of the remotes will be able to talk to it (I have a BenQ SP890, if you're wondering). And even if -- a big if -- you can get everything configured, try teaching your technologically challenged significant other how to use the darn thing.
If you've given up in frustration, you're not alone. The consumer electronics industry has failed abysmally at solving one of the seemingly easiest conundrums: IR compatibility in remotes. A full 30 years after the first universal remote hit the market, electronics users almost universally hate them for the reasons I mentioned. Every manufacturer would prefer you to use its remote for all your devices. Besides, they think they know best. What proper consumer electronics corporation wouldn't take this stance?
Remotes aren't alone in their anger-inducing compatibility dead ends. Betamax and VHS duked it out for years in the marketplace. HD DVD and Blu-ray rehashed a similar battle royale which ended only a few years back. Even smartphones are suffering from the same compatibility fragmentation when it comes to connectors and power adapters. Apple continues to think that the world wants to keep upholding its custom connector designs, while the rest of the sane phone market plods along with micro USB. So we're stuck as the lab rats in this giant consumerization experiment, purchasing adapters for dongles for converters just to connect simple things like phones to our computing devices.
Even HDMI can be considered a standard which should never have been allowed to overtake the market. The body that backs HDMI has created a "pay to play" ecosystem where vendors who incorporate the tech must pay thousands a year in "adopter agreement" fees, and then also pay another $0.15 in royalties per device sold. I try to use DisplayPort when possible to fight the good fight here, but it's tough when HDMI is becoming further entrenched in most media electronics. Take a guess who's footing the bill for these standards taxes. You are, at the register, when you buy anything with an HDMI port on it like a laptop, TV, receiver, cable, etc.
Most IT pros forget that if it weren't for the forces that backed DDR memory more than ten years ago, we would be living with expensive RIMM chips in our computers to this day. Just check eBay to see how inflated the pricing is for this killed-in-action technology. Some of it comes down to supply and demand, but just as much is from the "Rambus Tax" that was charged back to memory chip makers.
Catch my drift here? If this is what we have to contend with today, what could possibly be waiting for us in an Internet of Things scenario, as prescribed by the likes of feel-good Cisco? The only vestige of true standardization that has a chance of saving the IoT is the set of vendor-neutral underpinnings of the internet: DNS, IPv6, MAC addresses, and so on.
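And those underpinnings really are vendor-neutral. As one small example, here's a sketch of the standard EUI-64 method (RFC 4291) for deriving an IPv6 interface identifier from a device's MAC address -- plumbing that lets any "thing" address itself without paying royalties to a standards body:

```python
# Derive an IPv6 interface identifier from a MAC address (EUI-64).
def mac_to_eui64(mac: str, prefix: str = "fe80::") -> str:
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02  # flip the universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]  # wedge FF:FE in the middle
    groups = [f"{(eui64[i] << 8) | eui64[i + 1]:x}" for i in range(0, 8, 2)]
    return prefix + ":".join(groups)

print(mac_to_eui64("52:54:00:12:34:56"))  # fe80::5054:ff:fe12:3456
```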
But the same forces who have fought the standards wars in the computer arena, which we can thank for unneeded connectors like FireWire and Thunderbolt, are likely to be pitching their own tents in the IoT race. I wouldn't be surprised if a competing connection technology to Bluetooth and Wi-Fi got introduced, one that gets major backing, comes with licensing fees, and fractures a bevy of devices hitting the marketplace. So we'll be at it again, fighting with non-conforming standards that require expensive converters and adapters.
Mind you, the above examples merely entail simple entities like connectors, ports, and digital storage formats. Yet we're supposed to believe that ambulances are going to communicate with stoplights while transmitting patient info back to a hospital in real time? I'll believe it when I see it.
The Greener Side of the IoT
I've been quite harsh on the predestined M2M future being forced upon us. There are, however, some interesting things going on in this arena; things that don't require as much cause for alarm.
Computerworld reported just a few days ago about the work of Zepp, a California-based company that is working on sensors for all walks of the sporting world. It currently has sensors in the works for baseball, golf, and tennis, but has been infused with more cash that will undoubtedly allow it to expand its horizons.
As a lifelong hockey player, I would be quite curious to try some of the eventual gear that this company will hopefully create. Ideally, I may be able to skate down the rink, take a shot, and have statistics sent back to my phone about how long I had possession of the puck, how hard my shot was, and the positioning of every shot I took in an entire game. The system would be able to recognize which shots were actual goals, and designate them as such in my data analysis. So theoretically, after a game, I could sit down and view a dashboard showing me where I'm having the most success on the rink and where I need improvement. This is wicked in theory, and I don't think it's too far off, judging by what Zepp has already brought into reality.
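The data-crunching half of that dream is already trivial; it's the sensor hardware that's hard. Here's a hypothetical sketch of the post-game dashboard math, with made-up field names of my own (not Zepp's):

```python
# Aggregate raw shot events into per-zone stats for a post-game dashboard.
shots = [  # hypothetical sensor readings from one game
    {"zone": "left circle", "mph": 72, "goal": False},
    {"zone": "slot",        "mph": 81, "goal": True},
    {"zone": "left circle", "mph": 68, "goal": True},
]

by_zone: dict[str, list[dict]] = {}
for shot in shots:
    by_zone.setdefault(shot["zone"], []).append(shot)

for zone, zone_shots in by_zone.items():
    goals = sum(s["goal"] for s in zone_shots)
    avg_mph = sum(s["mph"] for s in zone_shots) / len(zone_shots)
    print(f"{zone}: {goals}/{len(zone_shots)} goals, avg {avg_mph:.0f}mph")
```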
Zepp is a company that currently makes some of the sensors which make up the Internet of Things some are predicting. As a baseball player, you may be curious about the speed of impact on your last swing and the height at which you hit the ball. A golfer may be able to up his or her game by finding that the follow-through on a par-3 drive just isn't what they thought it was. This kind of insight was only possible on virtual reality sims -- until now. (Image Source: Engadget)
I could imagine this being taken to the next level, and becoming something that sports teams begin to leverage. Say, for example, a basketball team could use such sensors on both players and on the basketballs used during pre-game warmups. The balls would be able to send back live data on which players are having the best shot completion percentage, what parts of the court they are playing best in, which teammates they were working best with, and so on. Coaches could then analyze the data from the 20-minute warmup period to make dynamic changes to a lineup. The tech media is raving about Big Data and its impact on business decisions. I think we can justifiably call this Little Data working to a team's advantage!
Microsoft Research has been busy on the home side of things, furthering a project it is calling the Lab of Things. There's a neat intro video on what the project entails right at the home page, and it covers just what Microsoft envisions for its own Internet of Things. Through a piece of software called HomeOS, tech enthusiasts can already piece together their own home IoT ecosystems with pre-fab sensors and gizmos that can watch over light levels, temperatures, provide webcam views into rooms, and much more.
But with so much focus being spent on wooing us into the IoT, little talk has been dedicated to describing what failsafes manufacturers are going to implement to prevent these SkyNet-like armies of spambots. How will consumers be able to get insight as to whether their devices have been compromised? Will firmware updates become seamless and painless, or will we have to chase firmware release files for all of our IoT gear on a monthly basis? What kind of security standards will be agreed upon in the industry to prevent this from becoming a wild west of machinery working at the beholden of criminals?
It's scary enough to hear that the US electrical grid is already the target of daily cyberattacks which are not getting any weaker. And these are complex, advanced networks pieced together with some of the best industrial, engineering, and technical minds in the world. If they're having a hard time keeping up, how are consumers going to fight back when their net-connected toasters and fridges are overrun? Not even Cisco is answering that one yet.
Mike Elgan made the assertion that the Internet of Things may never happen. I have to agree, and even if it does rear its head, it's going to be an unfulfilled fantasy for years (er, decades) to come due to all the rightful concerns about compatibility, security, etc.
If the manufacturers can't answer basic questions of how we're going to securely envelop ourselves in a sea of M2M-enabled devices, I'll opt to sit back and see how this unfolds. As much as I love the Terminator series, the last thing I need is a personal SkyNet arming itself in my home. Metaphorically, of course.
Photo Credit: Oliver Sved/Shutterstock
With as much time as I've spent in the education sector, as a student on one end and a high school IT specialist on the other, I know the landscape of educational learning management systems (LMS) decently well. And to be completely honest, it's a landscape rife with half-baked products delivering a fragmented me-too experience.
There's a lot to be desired from LMS environments, at least the ones I've played with in the last half decade. As a grad student at DePaul University (Chicago, IL USA) right now, I'm juggling no fewer than three distinct platforms the school relies on.
Professors that run classes in the computing school wing of DePaul (CDM) have the choice to use the department's homegrown LMS called "CDM Course Online" or a commercial alternative titled "Desire2Learn" (D2L for short, as students and faculty call it). And then the University administration throws a third platform at us, called Campus Connect, which is the student-facing portal system for class registration, grade delivery, etc.
Three distinct software environments, no less, and none of them does an exceptionally great job at anything. They're all so-so in most regards; you won't find anyone on campus raving about the great experience they have in any of them. And the University keeps leveraging each for its strengths, confusing students and professors alike in this juggling act.
Each LMS has its own nuances, settings, functional benefits, capabilities, etc. The only thing beneficial that all of them share is a single sign on profile with a unified account login. Thank god for that late 1990s era technology; I'd probably be ripping my hair out if I had to manage separate IDs for each portal.
It's not just higher education that's stuck in this LMS rut. My former high school district, which employed me as an IT staffer for four years after college, wrestled with the same technology conundrums. It's juggling three software platforms of its own, with each overlapping to some degree with what the other offerings provide.
There is a website CMS that runs department, staff, and class web pages as the "face" of the district. You've got a commercial SIS (student information system) called Aspen which was being messily custom-coded and updated around the clock just before I left my old job. And then of course there's Google Apps, the third leg of the charade, which is primarily used for student/staff email but is also leveraged for Google Sites, Google Docs, etc.
A fourth now-forgotten platform that crashed and burned at the above district was Moodle. Most people don't even mention that mess anymore. But I bring it up as a case-in-point.
Say hello to Moodle, one of the largest open source LMS projects on the internet, free for any educational institution to use. Unfortunately, it suffers from the same clutter-brain design mentality that plagues LibreOffice today. The platform may have a lot of bells and whistles, but end user uptake at my former high school district flatlined pretty early. Can SharePoint offer a better all-inclusive experience? (Image Source: Moodle)
Neither of the above scenarios is unique in any way. I'm sure IT pros across the spectrum in the education sector can speak to similar mashup approaches their districts and universities are taking. There has got to be a better way for students and teachers (and parents) to interact. I can't imagine that this fragmented, best-of-breed approach is benefiting anyone but the salespeople who peddle their solutions as golden remedies for education.
It's 2014 already, and I've previously explained why my company moved from Google Apps to Office 365. It's not a question of who's got the better browser-based client, or who's offering me a few more GB of storage today. It's about the ecosystem. Anyone can piece together a Frankenstein of solutions, as the educational examples I mentioned above have done. But if we're interested in user adoption and experience at the end of the day, fragmenting our technologies into a grab-bag of anything under the sun is a recipe for disengagement, pessimism, and ultimately technology rejection.
I fully believe that investing in a unified platform which delivers most of what you want is ten times better than adopting numerous, fragmented platforms that each do one thing exceptionally well. For my company, our virtual toolbox was growing too large to handle.
We had Google Apps for email and document collaboration. Podio was our work order and time entry system. GoToMeeting was our webinar and web conferencing hub. And since Google Drive really didn't care for Microsoft Office files, we continued to use a physical QNAP NAS storage box at our office. All of these systems did something really well, but none of them unified the workflow experience in any great way.
We moved to Office 365 back on Turkey Day 2013, and trimmed four platforms down into one common solution, with a single login, a unified functional experience, and one set of processes to learn that just make sense. Email now flows through Outlook Web App/Outlook. Our trifecta of Podio, Google Drive, and the NAS gave way to SharePoint Online. And GoToMeeting/Google Chat gave way to Lync, which handles all internal communication during the day as well as customer-facing meetings.
All of them share the same security compliance standards (like HIPAA) that Microsoft advertises for Office 365, unlike our previous approach which was a wild west of everything under the sun.
But the focus of this article is Learning Management Systems, and specifically, how SharePoint may deliver an alternate perspective on this arena. I'm not here to say that Microsoft's SharePoint is the end-all, be-all answer to our LMS dilemma this day and age. Besides, SharePoint is still primarily a large-organization-focused document management/collaboration platform at heart. You won't find Microsoft pitching a SharePoint tent at the K-12 educational tech conferences that I've been to anytime soon.
Nor have I ever personally worked with Office 365 or SharePoint as a student or educational IT pro in an LMS sense. I'm merely here to opine on what Microsoft has demo'd in a 20-minute live video at their offices.
Besides, with as many me-too vendors in the LMS game this day and age, what's the harm in Microsoft taking a stab? It's not like the bar for success is that high to begin with.
Where Does Office 365 Fit Into Education?
The Microsoft blog post that piqued my interest on this topic launched January 2 of this year, and went relatively under the radar. The LMS arena is a rather un-glamorous space in the technology sphere, so no, I don't blame anyone for not taking notice. But as someone who loves tech, and who has been rather interested in the LMS playing field for the last half decade or so, Microsoft got my attention here.
Google undoubtedly has stolen much of the glitz over the last few years for its deep-rooted efforts in bringing Google Apps for Edu front and center for primary and secondary education. My old high school district where I worked was one of the first big high school districts in Illinois (USA) to "go Google", so I was knee-deep in this technology for many years. It's a great product for the price tag of free.
But just like the struggles I had with Google's approach to unified communications and document management in my private sector life, I had similar headaches with the "Google way" in some aspects as an educational IT support person.
Luckily, Google isn't the only party in the free educational LMS realm. Microsoft's got Office 365 for Education (A2), which is also free of charge to verified institutions. The pieces of Office 365 that you receive as part of A2 are eerily similar to what the E1 licensing level gives the businesses I provide consulting for: 50GB email, SharePoint/SkyDrive Pro, Office Web Apps, Lync, and 24/7 support from Redmond. On a functional level, Google Apps for Edu and Office 365 for Education are for all intents and purposes extremely similar.
But the details are in the execution, and we all know that each respective suite has a certain feng shui to it.
Microsoft's video shows off how students and teachers can leverage Lync for online meetings and distance learning sessions. After using both Google's and Microsoft's competing video conferencing technologies for some time now, I have to say the Lync experience is just that much cleaner. Unlike Google, Microsoft generally doesn't boycott making apps for competing devices. And Redmond doesn't force a social network (Google+) down your throat in order to use its product. All key areas where I knock Google Hangouts.
Google Drive and its Docs component will happily work with your Microsoft Office files -- as long as you are willing to convert them all into native Docs format. SharePoint and Office Web Apps don't require any conversion and will slurp up any traditional Office file you can throw at them.
Google Hangouts is tied to Google+, an often-overlooked requirement for using Hangouts for group video calls (Google publicly admits it). Microsoft allows Lync users to invite anyone they want into meetings, internal or outside users alike.
Google Sites and Google Drive work decently well together for the purpose of creating class web pages and project sites. But they are still two separate apps that don't work 100 percent in unison. Microsoft's SharePoint platform can tackle all of these tasks in a single tool.
Of course, each suite can get the job done at the end of the day. But if you feel my pain from the LMS hell I described earlier, you'll have a different appreciation for what I mean when I say it's "all about the ecosystem."
SharePoint as an LMS: Hot Air or a Viable Option?
I'll leave judgment up to others as to whether SharePoint is a feasible platform to use for LMS purposes. As I said, I've got nothing more to go on beyond my private sector experiences with the product as a cloud file server and intranet portal. I'm merely going off of what Microsoft has demo'd in its recent video blog entry.
Honestly, the juicy part of the video starts at 6:15; the prior six minutes are just marketing fluff you can skip through. It starts off with an interview of Jethro Seghers, a Solutions Architect at J-Solutions out of Belgium. He discusses with the interviewer how distance learning has been buoyed by students and professors leveraging Lync and recording class sessions for later viewing.
As someone who is going to grad school right now as a distance learner, I can speak firsthand about how awesome such technology is. To not have to travel downtown for class after spending a full day consulting with clients is a godsend I wish I had been afforded during my undergrad years. And I can honestly say that using Lync, compared to the homebrew solution my school has now, would make the experience quite a few notches better. But I digress.
Jethro also covers how SharePoint, and the Office Web Apps specifically, offer native language translation for international students. This is huge -- not for me, but for foreign students who I see struggling in my classes when they get documents that are naturally all in English. Having the capability to translate such documents on the fly is a feature that I see enabling better geo-global learning for distance students, regardless of what native tongue they may come from.
After Jethro's interview on conceptual usage of Office 365 in the distance learning scenarios, Microsoft finally demos how these tools work together in real life. Yes, most of this was canned and prepped for this segment, but that doesn't devalue the positives behind what this tech can offer educators and students alike.
First off, a demo class web page was shown for a hypothetical "Marketing 302" college course. A discussion board, shared document center, and class site sections were all displayed and visible in this working demo. The site itself clearly runs on SharePoint, as if the logo in the upper left corner didn't give it away:
While I have never thought of SharePoint in an LMS sense, it actually does seem like it may have the prowess to compete with the other big players. All of the underlying pieces that students and teachers are yearning for are present: document sharing, calendaring, discussions, intranet-esque page design, and of course, tight permissions for users. Whether it delivers in the end is hard to say, but Microsoft sure makes it look smooth in its demonstration.
The two Microsoft employees who are the students in the video then go on to portray their usage of the various Office 365 technologies for a class project they've been tasked with. One of the students happened to miss class one day, as the story went, so OneNote notes were shared between the two:
And it's pretty clear that the above shot was taken from an iPad, not a Windows device. Microsoft is hitting Google pretty publicly where it hurts by insinuating that Redmond's cross-platform support is far superior -- and I have to agree. Being a Windows Phone 8 user myself, I felt the shock of not being able to get Google Maps or Voice on my Win Phone. Yet when I was on Android, I had access to a full array of Microsoft apps like SkyDrive, Lync, etc. For those who say Google doesn't play favorites, guess again.
Of course, a video demo wouldn't be complete without showing off a project collaboration site that the students used with their group. While not displayed below, the video shows off how the project team was able to use a shared Tasks app in SharePoint to pinpoint progress on their endeavor at each step of the way:
To round out the scenario, the two show off how they touch base on Skype and handle a professor meeting over Lync. All plausible situations for a modern class.
As merely a proof of concept, the video does win me over. I would have much preferred to see the suite in action in a real-life college or high school setting, as that is where the battle for the hearts and minds of administrators, teachers, students, and parents is taking place right now. But it is what it is, and for what it is, the video does a good job of showing an alternative approach to what the fragmented LMS scene has thus far failed to achieve: harmony.
I'd be interested in digging into SharePoint further to see how applicable Microsoft's use case scenarios are. They seem to fit right into the gap that I've been piecing numerous technology platforms and packages together to fill in my own day-to-day studies in grad school. My company has been having great success using SharePoint as a cloud file server so far, and Microsoft's demo situation isn't so far-fetched to believe.
It would be neat to hear from people who may leverage SharePoint and/or Office 365 in the education sector currently. How well does it work? How customizable is it? Do staff/professors like it better than students or vice versa? What are the pain points you are seeing with it? How does it stack up against Google Apps? This is the kind of real world insight I'm intrigued to hear about.
One example of in-the-field usage that I could find was a presentation about how Arts University College Bournemouth (UK) made a switch from Blackboard to SharePoint for its LMS needs. The post is live on one of Microsoft's MSDN Education blogs, run by Ray Fleming. There's even a dedicated commercial product called SharePointLMS, which seems to be one of the premier packaged LMS solutions bringing SharePoint into the classroom. This is a paid solution of course, not to be confused with Microsoft's completely free vanilla SharePoint Online that is part of the Office 365 for Edu A2 suite.
If Office 365 and SharePoint are truly as functional in the classroom as Microsoft makes them out to be in this video, then school districts across the USA are leaving valuable free tools on the table by not signing up. It's good to see that Google isn't the only one doling out complimentary licenses for education.
You can watch the whole video I'm referencing in its entirety below:
You're probably expecting me to write a scathing exposé on how I've come to dislike Google Apps. That's quite far from the truth behind why we left Google. There is a lot more to the story than meets the eye. It goes way farther than just a decision based on boxes checked off on a spec sheet. After more than one month since making the move to Office 365 full time, I can comfortably say we made the right decision as a company.
And I'd say I'm as well suited as anyone in the IT blogosphere to make an honest dissection of Google Apps against Office 365 and pass such critical judgement. Notwithstanding my own personal usage of Gmail since 2005 and Google Apps for my IT company since early 2010, I've likewise been both a Google Apps Certified Trainer and Google Apps Certified Deployment Specialist for years now. And I've personally been involved in Google Apps transitions for numerous small and large organizations in both the public and private sectors. So to say that I've been deeply invested in Google-ism for some time now is an understatement.
I've written some in-depth reviews of Google Apps and Office 365 separately in the past, and get frequent mail from partisans of both suites based on how I pitted one against the other in this category or that aspect. And while I'm not saying for a moment that I take back any of the statements I made in those pieces, I do honestly believe that "dogfooding" a given platform in your day-to-day business needs is the truest way to form the most accurate opinion of a product.
Surely, all of the monthly consulting time I spend helping other clients with their Office 365 and Google Apps installations gives me raw insight from which to form solid opinions. But eating the dogfood you're peddling to clients? That puts your own skin in the game in ways nothing else compares to.
So that was my intended experiment of sorts. After spending nearly four years on Google Apps, learning its every nook and crevice, I called an audible for my staff and told them we were transitioning to Office 365 by Thanksgiving 2013. And that's exactly what we did. By Turkey Day, we were fully transitioned off Google Apps and drinking Redmond's email kool-aid primetime.
The last month and a few days have been an interesting ride. From UI shock during the first week or so, down to natural comfort at this point. Here's the skinny on what we've learned from leaving Google behind.
Forget Spec Sheets: This is a Battle of the Ecosystems
Anyone who has cold-called me asking whether they should go Google or Microsoft for email knows full well I don't toe the corporate line that either company wishes. The big players in the cloud email arena tend to have pitched their tents in one camp or another. They're either Microsoft Office 365 only or, conversely, stuck in Google Apps-ville. Unlike car dealers, where buyers stepping through the doors know exactly what they're going to hear when they walk in, clients looking for honest direction for their email and UC needs want more than marketing drivel.
The battle between Microsoft and Google goes a lot further than who has bigger inboxes, more mobile apps, or whatever new whizbang feature can generate easy buzz. I've carefully learned that this is more a battle of the ecosystems at this point. Who's got the all-encompassing platform that is looking to solve business needs the way your company views them? Who's going to solve your email problems today, but offer you a segue to cloud document storage, unified communications, and more tomorrow?
That's the question companies and organizations should be asking themselves. Because it's the realization I've come to after our two-feet-first jump onto Office 365 a little more than a month ago. Google Apps isn't a bad platform by any means. In fact, it's pretty darn good. But in my eyes, when you view both suites as the sum of their individual parts, as a collective experience, Office 365 takes the upper hand. And I'll explain why in detail.
At face value, Google's core Apps offerings in the form of Gmail, Docs/Drive, Sites, and Hangouts are fairly solid. But as a collective whole, they lack a certain polish. That x-factor which takes a platform from just good or great to excellent. Google's way is just that -- the Google way or the highway.
This in-or-out dilemma exists in many facets in the Google Apps realm. For example, using Google's Hangouts functionality for video and voice chat requires you to have a Google+ account activated. It's basically a Google account that is opted into Google's social network, Google+.
I have nothing against Google+ as I find it vibrantly different and more gratifying than Facebook these days, but forcing your meeting participants to all have Google+ enabled on top of having Google accounts as well? That's more than a bit self-serving if you ask me. In contrast, Microsoft's Lync doesn't require any of this for me to initiate meetings with external users. As long as I myself have a paid account for Lync, I can invite whoever I want (up to 250 of them, in fact), no matter if they've ever had an Office 365 or Microsoft account in their life.
Google plays the same card in the way it treats Microsoft Office users. Sure, you can upload anything you want into Google Drive and store it to your heart's content -- but good luck trying to edit or collaborate on those documents in a web browser. Google will gladly convert those files into Google Docs, and force you to play the Docs-vs-Office juggling act in your file storage needs. We did it for years, but I had enough.
The same goes for Google's half-hearted support for Microsoft Outlook. I know very well that Google has been advertising its half-baked Google Apps Sync for Outlook tool for years. I'm far from an Outlook desktop app lover, as I use Outlook Web App nearly 99 percent of the time on Office 365, but I know many companies live or die by it. You don't want me to repeat the feelings that users of this plugin have conveyed to me. The comments I've heard in the field would make a Comedy Central comedian blush.
Not to mention that Google spent the better part of 2013 lambasting Microsoft for making changes to how Office installs via Click-to-Run, saying that its Sync tool wouldn't be compatible with the new 2013 edition of Office for this reason. And then it made a 180-degree about-face come November 2013 and released an edition of Sync that actually does work with Click-to-Run after all. I guess enough enterprise customers screamed their lungs out at Google support and the company eventually kowtowed.
Mind you, Outlook extensions of all sorts were functional with Outlook 2013 leagues before Google got their act together, including ACT! by Sage and ESET NOD32 Antivirus, to name a few. But I digress.
At face value, Google claims their Sync for Outlook tool is the perfect holdover for those who wish to use Outlook on Google Apps. In reality, I know this is far from the case. Of the numerous companies I've moved to Google Apps who are reliant on Outlook and use this tool, not one has been completely satisfied with the product due to bugs, glitches, and other oddities we run into all the time. Google should be advertising their Sync tool as sorta works, sorta doesn't. (Image Source: Google)
If you take a look at the ecosystem that Office 365 affords, its breadth and approach are different in every conceivable way. Google believes in an all-you-can-eat pricing approach; Microsoft believes in paying for only what you need. Google's Drive cloud storage app treats Office files like the plague; Microsoft believes you should be able to work on the desktop or in the browser as you choose. Google's Hangouts tool gets first class treatment in Google branded products only (Chrome, Android); Microsoft offers Lync capability nearly ubiquitously on almost every device and OS on the market.
The same can be said about Google's approach to an industry compliance necessity for the medical sector, HIPAA, which has begun affecting our company due to our status as a business associate for healthcare customers. While Office 365 has supported full HIPAA compliance since its early days, Google was a holdout until September of last year. Mind you, September 23 was the deadline under the HITECH Act amendment that stated health organizations had to be in full compliance by that date. In short, too little too late from Google's end -- it shouldn't be surprised that healthcare is staying far away from its platform.
It goes without saying, then, that if you are solely chasing feature matrices when making your decision between Google and Microsoft, you're only revealing half the story. An email platform in 2013 is not just an inbox; it's a unified communications tool that will make or break the way your organization works with the rest of the world.
SharePoint vs Google Drive: Who's Hosting Your Cloud File Server?
Up until Office 365, my company was living a double life in terms of its document storage needs. We had an office NAS box (a nice QNAP TS-239 Pro II+, which we still use for bare metal client PC backups) that was storing traditional Office documents for some aspects of our day to day needs. And then Google Drive, which was the hub for collaborative authoring that we needed for our onsite tech support team and training team.
But this duality was causing more confusion and headache as time went on. Was something stored on the NAS or Drive? Was it a Word document that we converted to Docs? Which copy was the master at that point? I call this mess the "Docs v Office juggling nightmare", and I was sick of it. Google Docs is awesome for sharing and collaborating, but Google forces you into using its online file format; it's an all-or-nothing proposition.
So we ate our own dogfood once again after the 365 move, and converted our two-pronged data storage approach into a single unified SharePoint "file server in the cloud." It's definitely not pick-up-and-play like Google Drive/Docs is, but the time invested in building out document libraries with proper permissions was well worth it.
First of all, Google's thinking around how it allocates and manages storage in Drive has always driven my clients and me nuts. Instead of being able to dole out storage space meant for separate, shared purposes -- like shared folders that represent the root file shares of a traditional server -- Google forces storage to be tied to someone's account. That is usually an admin, a lead user, or someone similar. In theory, it works decently, but you run into traps easily.
For example, if someone creates a root level folder outside the scope of a folder already owned and controlled by an account that has extra storage allocated to it, then anything placed inside that new directory will be counted against the respective owner's storage quota. So as your organization grows, and people start using Drive the way that it was meant to be used -- in a laissez-faire kind of way -- you had better hope all your users have a good handle on how Google Drive storage allocation works behind the scenes. If not, you'll be falling into "whose storage are we using?" holes. The K-12 sector taking up Apps in droves is running into these headaches head on, I'm hearing.
SharePoint Online and SkyDrive Pro in Office 365 skip that mess altogether. If you're working in true shared folders, or document libraries as SharePoint calls them, you're working off pooled storage space available to all permitted users in your domain. By default, all E-level Office 365 plans (the only ones we recommend to clients) come with a 10GB base of shared space, with an extra 500MB added for each paid SharePoint user on your account. So if you are a company with 15 users who all have SharePoint rights, you have 17.5GB of SharePoint space for your document libraries in the cloud. Simple as that.
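For the spreadsheet-inclined, that allocation rule boils down to a one-liner (treating 500MB as half a gigabyte, the way the marketing math does):

```python
# Pooled SharePoint Online storage under Office 365 E-level plans.
def sharepoint_pool_gb(paid_users: int) -> float:
    return 10 + paid_users * 0.5  # 10GB base + 500MB per paid SharePoint user

print(sharepoint_pool_gb(15))  # 17.5 -- matches the 15-user example above
```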
SharePoint Online works hand in hand with SkyDrive Pro and allows me to securely sync our company's entire cloud file shares to my laptop, which is locally secured by BitLocker in case of theft or loss. I have access to the exact same files on my desktop SSD (right side) as I do in the cloud and via web browser (left side). This is the cloud file storage nirvana I dreamt of with Google Drive starting back in 2012, but Google has thus far failed to deliver. As much as they claim otherwise, I can't live in a Microsoft Office-less world ... yet.
And SkyDrive Pro offers a completely distinct 25GB of space per person for their personal data storage needs. Think of it as a My Docs in the Cloud. It works more akin to the way Google Drive does, but for good reason: that space is ONLY meant for you, not to be shared with others in a file share environment. You can freely share docs out via Office 2013 or in Office Web Apps, but this is meant to be done on a limited basis with a few people. More formal sharing should be handled in document libraries in SharePoint sites.
Specs aside, have we lost any functionality on SharePoint? Not one bit. Usage of SharePoint and SkyDrive Pro is better in our organization now than it was under Google Apps on Drive, mostly because there's no more need to juggle which files can be Office documents and which ones have to be Google Docs. All of our Office documents (Word, Excel, PowerPoint, OneNote) can be shared and worked on offline and online equally well. Office Web Apps don't have 100 percent feature fidelity yet, but they're on par with what Google Docs offered for getting work done quick and dirty in a web browser.
And we're going to be leveraging SkyDrive Pro heavily soon with a new digital work order system we are rolling out for our techs, which mixes SDP with Excel, topped off with our new chosen endpoint devices: Lenovo ThinkPad X230 Tablets. A technician's workhorse laptop hybrid with full stylus-driven tablet functionality. Initial tests have been working out really well for us.
I plan on writing a longer exposé on how we made the move to SharePoint, but at face value, we are enjoying SharePoint Online due to the numerous benefits it has provided. We erased duplication headaches, streamlined our file storage into one platform, and combined what Google Sites and Drive had to offer with what SharePoint now does in one single interface.
I'm not saying Google Drive is a bad product in any way. But it didn't solve our needs the way I had expected it to back when Google rolled it out in 2012. After loathing SharePoint for how complex it has always been, the new 2013 iteration is a refreshing product that, when coupled with Office 365, is a no-brainer option for moving your file server to the cloud. And the clients we have been moving to SharePoint Online have been equally impressed.
Lync is for the Workplace; Hangouts is for Friends
Another huge tool we've latched onto quickly is Lync for our intra-company communication needs. Our office manager is pinging our techs in the field with info and vice versa; I'm able to discuss problems with my staff over IM or free voice calls even in areas where cell signal is dead. And as I wrote about last year already, we ditched GoToMeeting a while ago in favor of Lync for our online conferencing needs.
The battle of the ecosystems between Google and Microsoft is on full display in the video/voice/IM arena. Google is trying its best to transition a muddled three-pronged offering landscape into one single system. But for all its efforts, I'm still just as confused about its intentions, because as of January 2014, we still have three distinct options for unified communications in the land of Google Apps.
Google Talk, the previous go-to option for intra-Google communication via IM and voice chat, still exists in some remnants. My Google Apps Gmail account still offers it, for example. And then you have Google Voice, which has been Google's POTS-enabled offering for softphone telephone needs for more than a few years now. But some of that functionality is being tangled into Google Hangouts, which is Google's bona fide video and voice chat platform going forward.
If you asked me what Google's UC strategy looks like, I wouldn't be able to answer you in one succinct sentence. At Google, it feels like the left hand doesn't know what the right hand is doing. It has a fractured array of offerings that all do a little something different, many with overlapping features -- and so the Google Apps proposition is confounded by too many choices, none of which presents a single solid solution for what companies are yearning for in unified communications.
Stop the madness. Since our move to Office 365, Lync has been the answer to our frustrations. I don't have to juggle between Talk and Hangouts for my conferencing and IM needs. I have one single app, one single set of functions to learn, and a tool which arguably ties into the rest of Office 365 very nicely.
Whereas Google relies on Hangouts, a tool that is for all intents and purposes a function deeply rooted in their social network Google+, Lync is an all-inclusive app that can stand on its own via various Lync desktop/mobile apps, but is also present in some facets in the web browser as well. As a heavy user of Outlook Web App, I can see presence information for my staff in emails that they send me, and the same goes for docs on SharePoint document libraries. It seems that I'm never more than a click away from starting a conversation over IM or voice with someone on Lync, reducing the barriers to getting the answers I need fast.
My favorite aspect of Lync has to be the universal access I have to the app no matter what device I am on. If I start a Lync conversation on my laptop at the office, I can head out on an emergency client call and continue the conversation on my smartphone (I use a Lumia 925 now) without a hitch. On Google Apps, this was previously only possible from within the web browser or on an Android phone. Google doesn't offer much of anything for Windows Phones, yet Microsoft offers almost everything for Google's and Apple's platforms. Who's playing favorites in reality?
Google Hangouts limits you to measly 10-person meetings. Lync allows us to max out at 250 participants, and we can even tie into fancy Lync Room Systems as shown above for formal conference-style gatherings. It costs nothing beyond the price of an Office 365 E1 account or higher. A darn good deal in my eyes -- especially if you can ditch GoToMeeting/Webex altogether. (Image Source: TechNet)
Yes, I will admit Microsoft's approach to Lync for the Mac desktop platform is still a bit pitiful, as numerous features in the Windows version are not available on the Mac side yet. And I called Microsoft out on it in a previous post. But from everything I'm hearing in my circles, Office 2014 for Mac (as we expect it to be called) will bring Lync and the rest of Office up to speed with what Windows users are afforded.
Another offering I anticipate launching any month now is Microsoft's first-party Lync enterprise voice service, which will let us use regular SIP desk phones with Lync for full telephone capabilities. While we are using RingCentral right now without a hitch and love it, I think Lync-hosted cloud voice is the holy grail of unified communications for my small business. And judging from the number of people who email me asking about this functionality, I'm not alone in my wants. Seeing what Microsoft reps stated last May about this coming-soon feature, all signs point to an inevitable 2014 launch.
Is Lync perfect? Not by a long shot. Mac support is still dodgy and behind the Windows client. I deal with on-and-off bouts of messages that never reach my staff, with errors pointing to goofy behind-the-scenes Lync network issues. And Microsoft needs to vastly improve the Outlook Web App integration of Lync to the level of what Google Talk has in Google Apps; the rudimentary support enabled right now is a bit of a disgrace compared to what it could offer users like me.
But unlike Google, which continues to divide its efforts among three hobbled apps (Hangouts, Talk, Voice), Microsoft is 100 percent committed to building out Lync. And that's a ride I am comfortable sticking around for, as it's serving us well so far.
The Truth Behind Google Apps and HIPAA Compliance
One of the primary reasons I decided to take a swim in Office 365 land is Google's lackluster approach to HIPAA compliance for its suite. If all you use Google Apps for is email and document storage, Google's got you covered.
But is your medical organization interested in building out an internal wiki or intranet on Google Sites? Sorry, that's not allowed under Google's HIPAA usage policy. Or are you looking to do some video conferencing over Hangouts with other physicians or even patients? It's a hot and burgeoning sub-sector of healthcare called telemedicine, but don't plan on using Hangouts for it -- Google says you must keep Google+ shut down in your Apps domain to stay compliant with HIPAA. The list of don'ts doesn't end there.
I reached out to Google Enterprise Support for clarification on what they meant by requiring us to keep core services enabled to maintain HIPAA compliance, and Patrick from their department replied to me via email:
My apologies for the misunderstanding, you are indeed correct. If you are under a BAA, you can turn off non-core services, but core services such as Gmail, Drive, Contacts, etc. must remain turned on.
This is another unpleasant restriction for organizations that want to, for example, merely enable Google Drive for cloud document sharing between staff members but don't wish for Gmail to be turned on. Coming from the public education sector before going out on my own, I know full well that picking and choosing services is a highly desired function in Google Apps and one of its biggest selling points to begin with. So what the heck, Google?
Don't get me wrong. HIPAA compliance with Google is now fully possible, but only if you're willing to bow down to Google's backwards requirements of what you can and can't use on their suite.
I had full HIPAA compliance with Office 365 on the first day we went live, and I didn't have to sacrifice SharePoint, Lync, SkyDrive Pro, or any of the other value-added benefits that come with the ecosystem. Seeing that Google Apps has been on the market for over 3 years more than Office 365, I find it quite unacceptable for Google to come to the game with one hand tied behind its back, and late at that.
I'm calling Google out because I know they can do much better than what they are advertising now as HIPAA compliance with Apps. And until that happens, I'm refusing to recommend Apps for any clients even remotely associated with the healthcare industry so they don't have to go through the pains I described.
Office 365: Still Not Perfect, But A Value Proposition Better than Apps
There are a lot of things I love about Google Apps. Its release schedule for new features is blazing fast, much quicker than Office 365's. Google has a knack for releasing innovative features, even if they don't fill the gaps I'm yearning to close. And its all-you-can-eat price point of $50 USD per user per year is a hard figure to beat, even for Office 365.
But I've come to learn that wading through marketing speak and engrossing yourself in a product as massive as an email suite is the only way to truly uncover what each platform has to offer. No amount of consulting for clients could give me the insight that actually using these two suites day to day has afforded me.
I don't regret for a minute the four years we spent on Google Apps. It's a solid product, second only to Office 365 in my eyes. Hosted Exchange, Lotus, GroupWise, and all the other second-tier options are far behind these two suites in pricing, bang for the buck, and security/compliance standards. But Microsoft's value proposition is one I can relate to better.
Splitting the apps apart, you will likely find areas where Google's respective apps do a better job at this or that. But an email platform investment is a two-footed dive into an all-encompassing experience that goes beyond the inbox today more so than ever before. And that's where I find Microsoft winning the ecosystem battle: in providing an immersive experience whose rough edges aren't drowning in engineering experimentation.
At the end of the day, I have a tech consulting business to run. While I enjoy fiddling with the ins and outs of features for my customers' needs, when I come back after a ten-hour day onsite, the last thing I want to do is bend over backwards to work the way a suite expects me to. And that is increasingly what I was feeling with Google Apps. Google's vision of cloud computing is markedly different from most others', and if you can't abide by its rules, you will pay the price in lost time and functionality.
My customers have learned this very fact with the Google Apps Sync for Outlook tool. I've experienced this with our frustrations with Drive/Docs. And most recently, Google's HIPAA compliance stance leaves me scratching my head. So for the time being, we've bid Google farewell for our own internal needs.
Will we return someday? I hope so. But for now, Office 365 is doing a darn good job and I'm more than pleased, even if Microsoft has its own kinks to work out with Lync and Outlook Web App. I've brought my company onboard for the ecosystem, not purely for an email inbox. If you can step back and objectively compare email platforms in the same manner, you may come to a very different conclusion as to what vendor you should be sleeping with tonight.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA, and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council, which shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Just a short three years ago, skeptics called outgoing Microsoft CEO Steve Ballmer utterly foolish when he uttered his three iconic words: "We're all in." Ballmer, of course, was referring squarely to Microsoft's position on that 'cloud thing' which was rearing its head swiftly at the time. Most of us were taken aback, since Microsoft seemed like the last company interested in shedding market share to non-traditional licensing.
Turn the page to 2013, and Microsoft is one of the cloud's most vocal champions -- with growth numbers to prove the cloud is a hot market growing even hotter. In February 2013, Redmond re-launched Office 365 for Business, a big enough improvement over the suite's questionable former self that I finally gave it my two thumbs up. I'm not the only one seeing Office 365's about-face; a full 37 percent of organizations recently surveyed say they are adopting Office 365 within 24 months.
Cloud email and office productivity aren't the only things on Microsoft's mind. The software giant let loose that as of this past summer, a full 50 percent of the Fortune 500 was already using its cloud PaaS/IaaS ecosystem Windows Azure for internal needs. Keep in mind that Azure was only unveiled in 2010, making this second-place competitor to Amazon a feisty up-and-coming underdog.
Not to be outdone, Amazon is posting similar growth and uptake of its cloud services, with net sales as of Sep 30, 2013 hitting a monstrous $960 million. To put it into full perspective, that's a 58 percent year-over-year improvement for Amazon's AWS division. Even one of the oldest players in the cloud game, Salesforce, posted 36.5 percent year-over-year revenue growth recently.
This past year has been one full of experimentation for some vendors, maturation for others, as well as entrance for numerous new faces. 2014 is no doubt going to continue defining the shape of the cloud's many areas, and I've got some predictions for how the next year is going to play out.
Here's my cloudy outlook for the year ahead.
#7) Data Portability Comes Into Focus for Organizations
The cloud is all open arms when it comes to letting you in. Every vendor's got a red carpet leading into its own respective ecosystem. But leaving the party, whether to take data back in-house or merely to switch providers, is a bit trickier. The big players like Windows Azure and Amazon AWS claim to be making strides in data portability, but the status quo isn't anything to be proud of.
Welcome OpenStack to the fray. This first truly vendor-neutral cloud operating system aims to create a modular, level playing field for any vendor looking to offer cloud services to customers. What's the big deal? Imagine a future where switching cloud vendors is as easy as asking for a bulk transfer of your data into another datacenter of a competing vendor. No conversions, manipulations, transfigurations, or other similar wacko requirements are asked of you in order to make a switch.
Before OpenStack, the cloud was looking more and more like a modern age Wild West. 2013 saw a huge push from this new cloud OS community in bringing together swathes of partners dedicated to one purpose: bringing down the barriers and offering a vendor-neutral approach to cloud data portability. (Image Source: OpenStack)
OpenStack is still relatively young, and rather misunderstood by most IT professionals, let alone consumers. 2014 may indeed be the breakout year in which this initiative gains a ton of traction as organizations continue demanding openness and fairness in data portability. Check out this neat video Rackspace made as a sort of OpenStack 101 for beginners.
#6) Standalone Private Clouds Bid Farewell
I'm not sure where the private cloud push got its momentum, because I've always seen it as a bastardized, me-too approach to cloud computing. Clouds developed solely to serve a single organization's users... hmm, didn't that already exist? Yes, it was called the internal datacenter -- or merely servers hosted by the IT department with external access opened up. Will whoever tried to re-invent this has-been approach please stand up?
My thoughts on the private cloud's demise are shared by others, like Joe Kinsella of CloudHealth Technologies. Just a few days ago he wrote that "The private cloud has one major flaw in its armor: it is developed, deployed and managed by the same IT organizations that provided enterprises their current underutilized, underperforming, and non-agile data centers." Bingo! The private cloud moniker was nothing more than a perpetuation of the same yesteryear approach to enterprise IT, but in a prettier packaging.
If you want to keep your internal servers, feel free to do so. But don't peg them as a private cloud. 2014 is going to continue shifting the debate toward whether organizations should instead go hybrid or full-on public cloud for their needs. I've written recently about some of the hybrid clouds my company has been piecing together for clients, and I'm confident that the hybrid approach is going to balloon in popularity next year due to the flexibility, choice, and real cost savings it offers.
#5) The Legacy PBX Keeps Crumbling Thanks to VoIP
My last piece on BetaNews dove deeply into this very subject. VoIP in all of its various flavors is a hot subject. So much so that just a year ago, I could count on one hand the number of small businesses I support who were moving to VoIP. Now, everyone and their grandmother is itching to ditch their onsite PBX for good.
Unified communications, the all-encompassing umbrella covering web conferencing, telephony, voicemail, email, and IM, is driving this push in many ways. Companies are realizing that the old way of doing business -- fragmented technologies layered together to fake a cohesive end-user experience -- just isn't ideal anymore.
Picture a legacy phone system pushing calls to static desk phones and voicemails to an aging Exchange system, with web conferencing doled out to third parties like Webex and the conference-call bridge handled by yet another link in the chain. You catch my drift -- a sprawling technology mess. Unifying these experiences and reducing friction is key to getting all forms of VoIP into the hands of workers who were otherwise resistant to coming on board. The status quo in communication was too hard to handle for many, hence their apprehensive attitudes thus far.
I think that cloud-hosted VoIP is going to continue to be a huge driving force in bringing this technology down to the masses. This is due in large part to heavily reduced upfront costs and ease of administration. I've long voiced my support for the likes of RingCentral, but I'm expecting 2014 to bring Microsoft into the picture with a cloud-hosted Lync Online solution that ties in the missing piece of the puzzle: desk phone and enterprise voice service.
If Microsoft's comments back in May of this year still hold true, then we should see an Office 365 based cloud VoIP solution by the end of 2014. VaaS, or Voice as a Service, may get that much more appealing for those who have been yearning for an ultimate all-in-one offering, hosted in the cloud, that ties all their communication needs into one central service. If anyone can deliver on this kind of promise, my money's on Microsoft.
Who won't be enjoying 2014 as much? Legacy PBX installers and technicians. A word of advice to them: it's time to start repositioning your skillset in Lync, VoIP, and other modern voice alternatives. For all intents and purposes, the traditional on-premise PBX is (almost) dead.
#4) Hybrid Cloud Becomes The Go-To Choice
VMWare and Amazon both have it wrong. Each of them is wildly successful at what they do, don't get me wrong. But as the playing field continues to level out, their exclusive -- as opposed to inclusive -- approaches to the cloud are going to continue building vocal foes in the enterprise. How so?
Amazon and its AWS ecosystem are touting the public cloud as the best place for your data. Conversely, VMWare is tooting the private cloud horn, which I predicted a few paragraphs back will start biting the dust -- as far as forward-looking IT strategy is concerned. The only big competitor letting the customer choose seems to be Microsoft with its Azure offering. Who's to say that strictly private or public clouds are best for every organization? Do Amazon or VMWare truly know better than the CTO or IT Director?
Microsoft, instead, is kindly telling companies: you make that cloud path decision. We'll provide the backbone for you, no matter what you're most comfortable with. It's a viewpoint many of my own customers are opting for, and the reason Azure is an attractive alternative to the two former players.
#3) Cloud Document Storage & Sync Services Grow Up
Cloud file sync services like Box.net, Dropbox, and Egnyte have been marking their territory for years. What's been the biggest thing holding them back? Data security concerns and pricing, in my eyes. Neither of these former black holes is completely gone, but both are getting much better.
The cost of cloud storage continues to drop as big vendors like Box and Dropbox increasingly leverage their growing economies of scale. Likewise, rising user bases are providing the influx of cash needed to continue development on bringing more features that were relegated to the internal data center -- tight security controls, auditing capabilities, etc -- into the cloud infrastructures behind these products.
While security is still a touchy subject among C-level execs and IT pros alike, new surveys are showing improvement in the trust relationship between consumers and vendors. I touched on one comScore survey of small businesses a few months back which found that 62 percent of them said their data privacy actually improved after moving into the cloud.
And another recent survey highlighted by Infosecurity Magazine found similar rising confidence in cloud services and their ability to host confidential company data. There's no doubting that cloud vendors, especially the biggest players, are taking datacenter security more seriously than ever before. And the market sentiment is finally reacting to it.
Microsoft's newest cloud-based SharePoint Online is a remarkable departure from the traditional confusing releases of prior years. Not only can you take your document file server to the cloud with ease, but Office Web Apps are almost good enough to allow basic desk workers to ditch their desktop Office. No joke. I can't say the same about the competing cloud storage vendors yet. (Image Source: MS SharePoint Forums)
I'm eating my own dogfood on cloud document storage, too. My managed IT services company just ditched a hybrid file storage solution based around a NAS box and Google Drive in exchange for a pure SharePoint Online "file server in the cloud" approach. The centralized security controls, automatic document versioning, and SkyDrive Pro syncing capabilities are much better than what we had sprawled out before. And judging from the calls we're receiving, more of our customers than ever are itching to be moved down a similar path.
If your feelings towards cloud file storage were previously bitter, 2014 may be the time to honestly reconsider.
#2) Price Battles Turn Into All-Out Price Wars
The past few years have been nothing but posturing by the big boys. I'm predicting that 2014 will see even fiercer competition on the price front, especially among the 800-pound gorillas of the public cloud sphere. Amazon AWS, Windows Azure, RackSpace -- as more IT budgets set aside big sums for cloud-first hybrid initiatives, each of these vendors is going to yearn for a bigger slice of the customer pie.
Commentators were pretty vocal back in spring 2013 when it seemed like Azure and AWS were set to start an all-out price grudge match. RackSpace dipped its toes into battle as well this past year, and got bitten a little harder than it wanted. 2014 is going to let the gloves come off, I believe, for two reasons: the cloud market is becoming increasingly flooded with competition, and technology levelers like OpenStack will allow even the small fries to go toe to toe with the cloud's sheriffs.
The best part: the winner in this war is none other than the consumer. Better uptimes, greater choice, and lower prices. Who can complain?
#1) VDI Makes Its Second Coming: In It To Win It (Finally)
Thin clients. Zero clients. Call them whatever you wish. The technology for virtual desktop infrastructure (VDI) has been around for about a decade now, ever since Microsoft first offered Terminal Services in its infancy. But a number of things always kept VDI out of reach for the masses. The technology (both backend and frontend) was costly, tough to install, and even nastier to maintain at scale.
I know firsthand how terrible VDI used to be. We dabbled in it at my previous job working in IT for a public high school district. The product we attempted to leverage, offered by NComputing, was supposed to provide simple thin clients for students who needed general-access web browsers and Microsoft Word around our buildings. Needless to say, after just a few months of trialing and endless rounds of support calls, we pulled the plug early. There had to be a better way than that nightmare.
I've got renewed faith in VDI come 2014, as the latest market offerings are spurring tons of new hope. We've been deploying a flavor of VDI, namely Windows RDS off Server 2012 R2, for medical offices and mobile workforces with tons of success and low cost. The same can also be done purely from the cloud now, too, as Microsoft quietly opened the doors to Azure hosting Windows RDS sessions this past summer.
But Microsoft is far from the only player attempting to bring VDI back to the forefront. Amazon wants in on this area too, and it recently unveiled its huge new Amazon WorkSpaces offering, which hopes to bring DaaS to the masses. VMWare made headway as well by gobbling up one of the biggest existing DaaS vendors, Desktone, back in early November. What was a lonely market space just a short time ago is going to explode in 2014 as each vendor's respective offering gets spruced up and flown into battle.
I can only wonder how the non-Microsofts of the world are going to scurry past Redmond's licensing rulebook on this next go-around in VDI/DaaS. Former DaaS vendor OnLive got shut down by Microsoft's legal teams pretty quickly after the assertion was made that it was breaking longtime Windows licensing terms in how it delivered desktop instances to customers. Prior to being acquired by VMWare, Desktone published a massive four-part series of Q&As about how it handles these licensing gray areas. We'll all be spectators in this one as it plays out next year.
Amazon is taking its WorkSpaces offering into uncharted territory: bringing a twist on Microsoft's own RDS technology to the working masses. If you thought Amazon wasn't interested in Desktop-as-a-Service, guess again. (Image Source: Amazon)
There's one offering which hasn't even been announced yet, but which has every Microsoft watcher (including the likes of Mary Jo Foley, Paul Thurrott, etc.) claiming it's in development back in Redmond. I've touched on it numerous times over the last few months. "Project Mohoro," as it's rumored to be called, is Microsoft's Azure-based attempt to bring us a full-fledged Desktop-as-a-Service offering.
If Microsoft can truly consumerize the DaaS experience into a "pay as you go" environment like Azure, where worker desktops could be spun up with the same ease as a Windows Server instance, then it would give organizations looking to provide desktops on the cheap for internal or mobile staff an easy outlet.
Sure, you could build out your own internal Windows RDS infrastructure, or play with doing it on Azure (if you can handle wading through Microsoft's ludicrous SPLA requirements), but either direction involves a sizable time and IT investment -- both of which are increasingly in short supply at organizations.
I'll be one of the first to call Microsoft out: the time is now for Windows licensing rights to change so that VDI can take off once and for all. Windows RDS is great, but it's a Redmond shoo-in and Microsoft knows that full well. Level the playing field by allowing any version of Windows, down to Vista, to be run in a DaaS-based environment by any vendor who plays by fair rules. If you've got the better vision for how DaaS should be run, then let's see Mohoro come to fruition and fight an honest fight against the other innovators on the market.
But don't hold back the competition artificially through licensing red tape and legal black holes. Everyone including Microsoft will be the long term loser if the status quo continues.
Another Year Down, And Another One Full of Uncertainty Ahead
I don't have any magic crystal ball to peek into. My observations are based solely on what I see playing out in the marketplace, with a sprinkling of customer realities and external insights bearing down on my opinion. I'm always curious to hear what others have to say. Am I off track with my 2014 predictions? Did I miss a big item which you think should have made the list? Sound off in the comments area below.
But grab a drink while you're at it. 2014's cloud forecast is looking rather bright no matter how you slice it.
Photo Credit: Creativa/Shutterstock
It's a fairly typical situation these days: a small business approaches me with a need to replace an aging Exchange 2003 server and Office 2003 for 14 users. They want to compare purchasing their upgrades outright vs just renting them from Microsoft. The in-house server approach for email and Office software will run them roughly $10K USD before any consulting labor -- or they could opt to have us move them into Office 365 E3 for $280/month.
At face value, sure, you could say that the in-house approach pays for itself in about three years compared to paying for Office 365 E3 over that same period. But you're conveniently forgetting all the hidden nasties I brought into full light in a previous article on the TCO of cloud vs on-premise technology.
That in-house server will suck down gobs of electricity every month. You'll still need to pay companies like mine to support and manage it each month. And the "5 year rule" will have you back at the drawing board, pricing out a new server, licensing, and consulting costs all over again in due time.
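For the number-crunchers, here's a back-of-the-napkin sketch of that comparison in Python. The $10K quote and $280/month E3 figure come from the scenario above; the electricity and support estimates are placeholder assumptions of mine, so plug in your own numbers.

```python
# Rough five-year TCO comparison: in-house Exchange server vs Office 365 E3.
# The power and support figures are illustrative assumptions, not quotes.

upfront_inhouse = 10_000   # server hardware + licensing, per the quote above (USD)
monthly_power = 50         # assumed electricity for an always-on server
monthly_support = 150      # assumed managed-support retainer
monthly_o365 = 280         # Office 365 E3 for the same 14 users

months = 60                # the "5 year rule" horizon

inhouse = upfront_inhouse + months * (monthly_power + monthly_support)
cloud = months * monthly_o365

print(f"In-house total over {months} months: ${inhouse:,}")
print(f"Office 365 E3 total over {months} months: ${cloud:,}")
```

With these placeholder figures, the in-house route runs about $22,000 over five years against $16,800 for E3 -- and that's before a dime of consulting labor on the server build.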
On the other hand, going the Office 365 E3 route affords you access to the latest Office edition (in Mac and Windows flavors) on up to 5 machines for each worker, plus access to Lync, SharePoint, 50GB email inboxes, as well as industry leading security compliance and 24/7 uptime guarantees. I'm not trying to sound like a Microsoft salesperson, but for all intents and purposes, the modern rented alternative sells itself.
I'm not preaching an unwelcome message here. A clear majority of our customers are finding the cloud route, or SaaS (software as a subscription), to make financial and technical sense for their organizations. In an age where Microsoft's software release cycles have been cut by more than half, and the incentives to rent rather than buy are becoming hard to ignore, it rarely makes sense to view technology purchasing through a CAPEX lens.
Treating our modern software needs as an OPEX cost aligns business financials with a direct need to convert 6-8 year plans into 24-36 month modernization cycles.
The Case for OPEX Over CAPEX Is Hard to Ignore
Ever since we started offering Google Apps to customers back in early 2011, we've been trying to make the case for why turning software purchasing theory on its head is the future. For a while it used to be a tough sell. Many considered it heresy not to have full ownership over the licenses and hardware used to conduct your business.
A recent op/ed by Don Jones from Redmond Magazine put this discussion back into full focus. He was spot-on in wading past the usual marketing bliss offered by cloud vendors and putting a concrete dollars and sense position back into the argument.
"Many IT decision makers are pushing back. Some are doing so for valid reasons and some not so much," Don wrote in his piece. "You know who's almost never pushing back, though? The CFO. That's because the CFO has a special tool IT decision makers often lack -- a calculator."
His op/ed touched on numerous aspects which I fully agree with, but the one which business owners will care a lot about -- namely the CAPEX vs OPEX debate -- reared its head in a logical way. Don correctly asserted, "Buy a piece of software and in most cases you can't write off the expense as a capital investment right away. You have to depreciate it over five years." With Microsoft churning out a release cadence of 2-3 years on average now for Windows and Office editions, does the status quo make sense anymore?
"With rentals, you write off the entire amount in the same year you make the payment. Your bookkeeping becomes slightly easier. You're no longer sitting on capital investments that have yet to be depreciated," Don continued. And he's exactly right: are you truly benefiting from a five year depreciation schedule? And how many businesses truly take advantage of converting any leftover value on their older software by putting it on the auction block? I haven't worked with any.
Purchased software turns out to be worse than a new car. Activate that license serial code, and you know for a fact you're getting nothing in return on the backend. A pretty lousy deal these days, if you ask me.
Scale Up, Scale Down On a Whim
Don briefly touched on this same aspect in his op/ed I mentioned above. A lot of companies that we work with use outside contractors that are provided internal email accounts for image consistency. If your company had a project that was bringing 25 new contractors in the loop, the old way of doing business meant you purchased extra sets of Exchange licenses for them in bulk. One year later, when the project was over, you ate that sunk cost hard, especially if you were experiencing downturns in the number of contractors being brought on -- meaning many licenses sat idle and went unused (perhaps indefinitely).
Traditional licensing is a nasty affair in forecasting, budgeting, and tracking usage counts. The cloud rental approach flips this mess on its head. Office 365 licenses (above), for example, can be added in a matter of clicks, and Microsoft allows you to scale up by hundreds or thousands of licenses in a single purchase. Trimming your needs is just as easy. Licensing is no longer a CAPEX sunk cost. (Image Source: Microsoft TechNet)
The 2013 way of handling this in a platform like Office 365 entails purchasing a subscription for these 25 contractors, and dumping the licenses when the project is over. No sunk CAPEX costs to worry about, no depreciation to tangle with. It's done and over, and these users can be turned on and off with an hour of work on a web-based administration console. Sure beats the drawn out process of getting approval on a license pack purchase, waiting for its delivery, and asking your IT support team to install and configure the licensing along with new accounts on your own server.
This same methodology is great for businesses that deal with seasonal workers, like landscapers, general contractors, and retailers. The old way of handling licensing meant they had to bulk up their licensing portfolios just for the few months out of the year when they were super busy with large staff counts. Now, with software rentals, scaling up and down doesn't have to be a regretful experience. It simply happens, because it has to.
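Here's a minimal sketch of the sunk-cost math for the 25-contractor example above. The per-seat prices are hypothetical stand-ins, not actual SKU pricing.

```python
# Sunk-cost comparison for a 12-month, 25-contractor project:
# perpetual licenses bought up front vs subscription seats dropped
# when the project wraps. Per-seat prices are hypothetical.

contractors = 25
perpetual_per_seat = 400   # assumed one-time cost per user (CALs, client apps)
monthly_per_seat = 20      # assumed subscription seat price
project_months = 12

capex = contractors * perpetual_per_seat
opex = contractors * monthly_per_seat * project_months

print(f"Perpetual licenses (idle after the project): ${capex:,}")
print(f"Subscription seats for the project only: ${opex:,}")
```

Even in cases where the totals land close together, the subscription side leaves no licenses sitting idle afterward -- which is exactly the regret the old model bakes in.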
Rapid Release Is The New Status Quo
Microsoft delivered the sizable Office 2003 suite in the year it's titled after. Four years later, Office 2007 hit the scene. Just three years after that we had Office 2010, and a mere two years later Office 2013 came upon us. Catch the pattern? Microsoft's release schedule is looking eerily similar to Google's, which doesn't even put version numbers on Google Apps. New features are streamed into the Apps platform at a rate of 2-3 per month now, sometimes more.
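Those shrinking gaps are easy to see in two lines of Python; the list below simply encodes the release years named above.

```python
# Gaps between the desktop Office releases named above.
releases = [2003, 2007, 2010, 2013]
print([b - a for a, b in zip(releases, releases[1:])])  # [4, 3, 2]
```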
It's no secret that keeping its product line on the bleeding edge of freshness is a market necessity for Microsoft these days, especially up against Google's business email suite. Tony Redmond of Windows IT Pro penned a piece just a few days ago about how Microsoft views Office 365 for Business as the ultimate "dogfood" environment for what its on-premise editions of Exchange Server will get upon release.
Tony explained in detail how "Office 365 runs code for weeks or months before the code is packaged and provided to on-premises customers... I see lots of goodness here because a solid shake-down by running code used by millions of cloud mailboxes seems like an excellent way to find bugs." I fully agree.
Microsoft and Google aren't alone in their shift to subscription-based software models. Adobe destroyed any hope that boxed editions of Creative Suite would continue hitting shelves when it launched Creative Cloud as the new norm for all things Adobe. David Pogue of the New York Times weighed in on the dizzying array of pricing and feature levels that CC offers. For some users, it's a much cheaper way to use Adobe products; for others, slightly more expensive. But one thing holds true across the board: if you're on Creative Cloud, you're never out of date.
Not only does the Office 365 and Google Apps approach to software bring new features to users at a much faster rate, but they also solve this oft-forgotten IT black hole: many on-premise servers remain unpatched for months, if not years, over the course of their lifespan.
Even larger organizations with dedicated IT staffs aren't always the shining stars you might suspect. You don't want to know how many times I've been met with blank stares from IT staff when I was brought in for a consulting project and asked, "What kind of patching routine do you have?" Shocking, but not surprising, knowing how many IT pros prefer a path of least resistance in keeping systems up and running.
Swinging back around to an earlier point, I wonder how much tougher it is to justify the expenditure on a new software suite like Office 2013 when you know for a fact that Microsoft will have a new release available in just 12-24 months' time. Sure, you could opt to skip releases, as many organizations regularly choose to do, but at what point does falling further behind become a business need that has to be addressed?
Google Apps has perhaps one of the fastest agile development schedules in the cloud email industry. The above screenshot from my company's Google Apps domain shows an option Google offers just so companies can slow down the pace at which new features are introduced to their internal users. Microsoft is moving Office 365 toward a similar release schedule with each passing month. But I don't see this as necessarily a bad thing.
The margins of competitive edge are being driven closer and closer together. Having those extra few features that save you just a little more time and allow for an extra couple dozen sales will increasingly mean the difference between leading your industry and being left in the dust. I may be ahead of the curve with this assertion, but give the business world 5-10 years. Being 2-3 editions out of date will be the death of many if software development cadence keeps moving in the direction it's already headed.
We started the software space race in the realm of the cloud; slowly shifted over to virtualization; now the next big battlefields are already shaping up in unified communications (UC) and Desktop as a Service (DaaS). Renting software is becoming increasingly key to maintaining competitive advantage by avoiding software stagnation.
Who's Policing Your Internal Licensing?
Organizations such as the Business Software Alliance (BSA) work on behalf of software big shots like Microsoft, Adobe, and Apple, to name a few, cracking down on all things related to illegal licensing practices. Piracy among large organizations is still a big issue for the industry, no doubt, but a lot of the unsung victims of software licensing messes are far from determined white-collar criminals.
They're business owners like the ones I consult with regularly who merely don't have licensing experts on staff that have a degree in the byzantine rules set forth by the likes of Adobe or Microsoft. They get entangled in honest mistakes of over-installing copies of Office or Creative Suite on their machines, and get penalized in the form of audits by the software makers.
I feel sorry for those who get dinged with penalties but never meant any harm -- those who genuinely tried to keep in line with proper licensing. Even simple calls to my rep at CDW about licensing for things like Microsoft SPLA lead to days of delay and numerous rounds of emails between "licensing experts" at the distributor's offices. I guess I'm not the only business owner who can't wade through the volumes of licensing SKUs and subtext without coming out more confused than when I started.
And if you think having a coveted "Enterprise Agreement" with Microsoft is some sort of panacea, guess again. I spoke candidly with an industry colleague who works for a to-be-unnamed cloud tax software company in northern Illinois (USA) which happens to have an EA with Microsoft, and she was brutally honest about what getting and keeping such licensing status entails: hours upon hours wasted preparing usage counts in huge spreadsheets, determining shortcomings or over-allotments of given licenses, and then re-negotiating pricing with a Microsoft representative. This is a yearly or bi-yearly affair, mind you.
Microsoft touts its Enterprise Agreements as a way to "streamline" licensing for organizations, but anyone with insider knowledge knows this is anything but the case. Rented license platforms like Office 365 put the ball squarely in Microsoft's court for keeping accurate license counts, and prevent companies from overpaying for unused licenses. They force an honest hand from each side of the table, and eliminate the dizzying self-policing that organizations under Microsoft EAs had to endure. (Image Source: MaximumPC)
Whatever per-unit savings may be gained on pricey SKUs like SQL Server or Lync Server get slowly washed away by the reality that companies likely over-commit themselves to higher license levels just to stay on the legal side, in case Microsoft asks to do a complete audit of their license usage counts. Each infraction brings a penalty, further dwindling any semblance of savings.
Microsoft isn't the only software giant guilty of making licensing a living hell for IT professionals. Adobe, Oracle, Apple, and all the rest have similar structures and policies in place that complicate licensing past the point of comprehension for the average person. Some licensing apologists claim this is a necessary evil due to the amount of piracy out in the world today, but I choose to believe there has to be a better way to treat customers.
Welcome Software as a Subscription (SaaS). It solves nearly all of these ills in one fell swoop. Since companies like Microsoft and Adobe are charging you by the month for their software, the onus to be fair on license counts becomes a two-way street. A company won't pay any more than what it needs to license on a rental basis, and likewise, the vendor ensures it receives honest compensation for what is being rented out.
Teleworkers or contractors who need short-term licenses can also be accounted for to a T. For example, Microsoft gives away five copies of Office for each Office 365 E3 user you license. As soon as an organization disables that account, all instances of Office tied to the former user get disabled -- as should be the case in an ideal world. You're not hounding former staff to play honestly by ridding their machines of Office licenses, and Microsoft isn't breathing down your neck with licensing audits anymore. The technology polices itself, turning software licensing from a hated necessity into a mere fact of business life.
End your lease on a car, and you have no choice but to hand the keys back to the dealer. Software is finally shifting to work the same way when it comes to Office 365, Creative Cloud, Google Apps -- name your pick. Is this really such a bad thing? Not unless you made a living as a software licensing specialist or consultant -- in which case, I feel doubly sorry for you.
Purchasing Software Will Soon Become Passé
The modern business world is moving too fast to be kept back by five year release schedules and purchasing/implementation cycles that take just as long. While the small and midsize business sectors are taking up the notion of rented software at a faster pace than big business, it's only a matter of time until it makes sense for organizations of all sizes.
Purchased software licensing carries a lot of negative baggage: unfriendly five-year depreciation schedules, license-tracking nightmares, and apathetic approaches to internal patching that foster insecure computing environments. I won't trust big software companies with everything they do, but when it comes to rented licensing like Office 365 and Creative Cloud, I genuinely believe they're onto something genius here.
My own company FireLogic rents (almost) every software license we need today. Our email runs exclusively on rented Office 365 inboxes. A rented QuickBooks Online subscription is how we keep our books. Our "file server" is now in essence a team site in the rented SharePoint Online ecosystem, which is part of Office 365. And we handle all inter-staff IM along with web conferencing over Lync Online -- once again, a rented piece in Office 365 land.
I know very well that there are still situations that benefit from outright purchasing. But I'm personally seeing these become few and far between as time goes on. Maybe we can tear down the Berlin Wall that is traditional software licensing once and for all by the end of the decade. That would be a tyranny few would miss.
Photo Credit: alexmillos/Shutterstock
When it comes to Office 365, some people think I'm too soft on Microsoft because I'm always writing about the good things I see in the service. And don't get me wrong, I think the platform is leagues better now than it was just a year ago. Just peruse some of the brutal honesty I wrote about Office 365 in the head-to-head piece I did against Google Apps back in mid-2012, and you may be shocked at my current viewpoint on the product.
Microsoft has indeed come a long way with the service as a whole. Before the 2013 edition of the suite, I found 365 to be a cluttered "me too" offering that did nothing to differentiate against Google Apps. My biggest gripe was that Microsoft was working too hard to cram desktop-first software into a cloud experience that felt half baked in the end. That notion got turned on its head earlier this year, and my feelings about the latest Office 365 for Business ecosystem are pretty positive overall.
Not for a minute, however, do I believe Microsoft doesn't have room for improvement. Being knee-deep in Office 365 domains for many of our customers on a weekly basis, I see the issues being raised on the front lines in a very raw manner -- the bugs most people wouldn't know about unless they worked in such a diverse set of end-user scenarios day to day. I've taken the liberty of collecting a list of my biggest gripes with Office 365 as it stands today, both from my customers' experiences and from my company's internal trials as we prep to migrate to the platform from Google Apps.
This public flogging of Office 365's biggest sore thumbs was not filtered by Microsoft or anyone else, for that matter. In exchange for all the good word I've written about Office 365 lately, this is my chance to shed light on the aspects which I feel need proper fixes, not just proposed lip service. I'm holding nothing back in my feelings on the cloud suite in this article.
Without any further hesitation, here we go.
The Lync Clients: A Joy Ride in Feature Disparity
Lync has to be one of my favorite aspects of Office 365 for Business. But it's also the one which causes me great frustration when I have to deal with its plethora of disparities across all the iterations it exists in today. For all intents and purposes, Lync has been growing in different forms since its earliest iteration as Live Communications Server 2003. That's one full decade of innovation, but I wish we were further along in some aspects of Lync's maturity.
First and foremost, the feature set among the desktop clients for Lync (on Windows and Mac computers, namely) is nothing less than a carnival ride of comparison charts. In fact, it's so confusing that Microsoft has an official TechNet post dedicated to outlining all the differences. In a throwback to the craziness that was Windows edition counts in the pre-Windows 8 era, you can count no fewer than four different clients (two Windows clients, a Mac client, and the browser-based Lync Web App).
For starters, an Office 365 user with Lync rights needs to wade through a complicated decision process, usually (hopefully) pre-purchase, to ensure they will have the Lync features they want to leverage, based on the client they choose to use. Any Lync user can download and install Lync Basic 2013 on Windows, but they lose things like the multi-party video gallery, OneNote meeting collaboration, and my personal favorite: meeting recording to MP4.
Even worse, the full Lync 2013 client comes bundled with E3-level subscriptions and up as part of Office 2013 Pro Plus, but if someone wants to purchase Lync 2013 standalone instead, finding someone who sells it is nearly impossible. You could opt not to mess with any of the installable clients and use Lync Web App or Lync via OWA, but then you get even further reduced functionality, which makes day-to-day Lync usage pretty cruddy. This decision process should not be this excruciating; nor should it take a seasoned IT pro like me just to consult people on the differences.
And don't even ask me about Lync from within Outlook Web App. I prefer OWA over Outlook any day of the week (I've been a browser user for years on Google Apps), but the Lync experience embedded into OWA is pitiful. You can't view a list of contacts you can chat with, a la Google Chat in your Gmail inbox; your only option is to click IM icons within the address book to message people. Even this provides a disjointed experience, since you have to deal with separate popup windows that look like pre-AOL Instant Messenger chat boxes. Microsoft has got to bring the browser experience up to par with the desktop apps -- at the very least, to make it usable.
Mac users, like one of my daytime support technicians, have just as tough a time on the Lync for Mac 2011 client. From initial Office 365 sign-in problems right off the bat, to no recording capability, to no program sharing, this is also a no-man's land of feature disparity. Is Microsoft hard at work on Lync for Mac as part of the new Office, likely Office 2014? Here's hoping they will stop treating our Mac friends like second-class citizens.
Microsoft's myriad of Lync client options, especially on the computer side, is almost dizzying. For Windows you can choose between Lync 2013, Lync Basic 2013, Lync Web App, and now the Lync Windows Store App -- all with varying levels of functionality and options. Confused yet? Microsoft needs to clean up its app footprint once and for all with Lync and stop confusing end users with what is otherwise a very respectable, stable messaging service. (Image Source: Microsoft)
I hate getting into discussions with clients about all the nice aspects Lync provides, and the power of its unified messaging, only to find out that they will be using Macs as their primary computers. Some backtracking ensues, and Lync's appeal suffers depending on which features the end users can't have on their Macs. And I'm not one to start recommending Boot Camp or Parallels just so customers can work properly; it's a silly patch for a bigger problem.
If you're curious about the steps needed to get signed into Lync for Mac 2011 as an Office 365 user, Microsoft has a lovely 7 step checklist for you which Windows users don't have to deal with. It's not an issue for me as I'm on Windows 8.1, but still, these basics are rookie mistakes that should have been cleaned up by now.
My list of gripes with Lync doesn't end there. Mobile clients, like my Windows Phone 8 app, do not allow conversation history to sync to my Office 365 inbox. Google has had this down with Google Talk on its mobile apps; why can't Microsoft bake this in already? And why does Lync seem to have a mind of its own when deciding where to send conversations if I'm logged in on both my Windows Phone and the desktop client? Can't it just treat incoming messages like Google Chat does, and send them to both places intelligently so I don't have to play device hot potato? It's these aspects of polish that are holding Lync back from primetime glory.
While I have a lot of love for what Lync does well for my clients and my own business, it needs to ditch the overlapping client editions, clean up its bigger bugs, reduce feature disparity, and actually make a valid effort in the Outlook Web Access client for browser-first users.
SkyDrive Pro and SharePoint Online: Hobbled by Sync Limits
SharePoint Online and its companion service SkyDrive Pro are two facets of Office 365 that are finally allowing my company to move our document file server needs up into the cloud. Office Web Apps are markedly better than what Google Docs provided us on Google Apps, especially because we don't have to play the juggling act of deciding what stays in Office file format and what gets converted into Google Docs, and vice versa. It's a mess that I believe is holding back Google Drive from mass enterprise adoption, and I'm hearing gripes about it from customers first hand every month.
But this isn't a post on Google Apps' shortcomings. I'm here targeting aspects of O365 that aren't up to snuff yet, and SkyDrive Pro happens to be in its rookie year still. This isn't a bad thing by any means. I think the product is working well for what it is, and once configured, it tends to churn away and do its thing. But the keywords here are once configured.
Dropbox and Google Drive, and heck, even SkyDrive Personal, have the "easy setup" notion down pat. Install, sign in, and select what you want to sync. That sadly isn't the case with SkyDrive Pro. While signing into the product gets you far enough to access your personal "My Docs in the cloud," you have to go out of your way to set up sync links to the document libraries you have access to.
SkyDrive Pro asks users to set up syncing via manual library URLs, which means setup is rarely a point-and-click affair for my clients. Why can't SDP be as easy to set up for shared library usage as Dropbox or Google Drive? There is no reason engineers can't have SDP check which document libraries a user has rights to, and offer a checklist for turning syncing on and off.
For the uninitiated, document libraries are how we can transform SharePoint Online sites into bona fide cloud file server shares for your users. I explain them in some detail in my overview tutorial I made a few weeks back for YouTube. You can create any number of these libraries on SharePoint Online and map them into SkyDrive Pro for local file access that works exactly like Google Drive, Dropbox, you name it.
But Microsoft forces you to browse into the libraries you want syncing access to, and in the instances where the native "sync" button in SharePoint Online fails to auto-configure SkyDrive Pro, you're then forced to copy library links into SkyDrive Pro manually. It's a bit archaic, and reminds me of something that belongs in a Windows XP era computing environment -- not a circa-2013 cloud platform like Office 365.
And can Microsoft explain why there isn't a SkyDrive Pro client for Mac yet? We have had to turn a few customers away from Office 365 because they couldn't let their Mac users take advantage of the localized syncing offered by SharePoint Online document libraries. A shame, because moving your file server into the SharePoint cloud is a pretty darn attractive alternative to replacing aging hardware these days.
The other nasty thorn for SkyDrive Pro users is a little-advertised 5,000-item limit on the number of items you can sync to your PC from a SharePoint library. This is especially odd, because the same SkyDrive Pro can sync 20,000 items from your personal "My Docs" library to your desktop. Does that make any reasonable sense? I'm not sure why Microsoft did this, as most companies these days are using SharePoint for cloud file hosting in shared libraries -- not asking their users to keep gobs of personal data in their own little silos on SDP. This thinking has got to be revised for future releases.
And I'm personally begging Microsoft to increase the amount of shared storage space that SharePoint Online gets by default. 10GB plus 500MB for each subscribed user in your domain is nothing spectacular, especially since they are giving individual users a full 25GB for their personal cloud storage needs. I'd much rather see this space given to SharePoint Online base storage, with smaller amounts for personal usage, since companies moving file shares up to the cloud need this space more than Jack or Jane do.
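If you're weighing a similar move, a couple of quick sanity checks against the limits just described can save a headache later. This Python sketch assumes the numbers above (10GB of pooled storage plus 500MB per subscribed user, and the 5,000-item shared-library sync cap); the share path in the final comment is a hypothetical placeholder.

```python
import os

def pooled_storage_gb(users: int) -> float:
    """SharePoint Online pool: 10GB base plus 0.5GB per subscribed user."""
    return 10 + 0.5 * users

def item_count(path: str) -> int:
    """Count files and folders under a share, to compare against the
    5,000-item SkyDrive Pro sync cap for shared libraries."""
    total = 0
    for _root, dirs, files in os.walk(path):
        total += len(dirs) + len(files)
    return total

print(f"Pooled storage for 20 users: {pooled_storage_gb(20):.1f} GB")
# item_count(r"\\nas\company-share")  # run against your own file share
```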
Mobile Access for Shared Contacts, Calendars Non-Existent
Most of our customers these days are highly concerned about the mobile access their email platform affords, and in most respects, Office 365 has this area well covered. Until, that is, you want to start doing anything outside the box of kosher traditional mobile email on your device.
For starters, I'm not certain why Microsoft has made available such a great mobile client for the personal Outlook.com service (I'm referring to the Android scene here) and totally neglects the Business edition users of Office 365 who have to use native Exchange functionality on their phone. Unless Google has improved things in the newest KitKat release, my recent experiences with the native email client on my old Galaxy S3 were pitiful to say the least. While Gmail got first party status, it felt like Office 365 email was relegated to the back of the bus in every sense.
Sure, I could have gone with a third-party client like the stellar TouchDown, but I ask: why should I have to pay for a third-party mail app on top of my Office 365 subscription? This doesn't make any sense. O365 usage on my Windows Phone is quite heavenly, but it being a Microsoft product, I wouldn't expect any less.
It would be nice if Microsoft finally released a single Outlook app that behaves just like the current Outlook.com offering in the Play Store, but allows for Office 365 functionality. Many customers have been asking me for the same for months now, and I have no solid answer for them. Perhaps it's a pitch to get a Windows Phone, or just one of those moments reminiscent of what some Lync users deal with, as described earlier.
One of the biggest gripes my own company is dealing with now, as we transition FireLogic over to Office 365 around Thanksgiving, is the cruddy way shared calendars and shared contacts are handled on smartphones. In all fairness, I'm not going to single out Microsoft here, as they are just as bad in this regard as Google Apps (Google got calendaring right, but dropped the ball on shared contacts entirely).
At least on Google, we were able to use calendar sharing natively with Google Calendar, and shared contacts were held together with FlashPanel, a free tool that has since gone paid. That tool had a feature called Mobile Contact Sync which allowed the shared contact address book in Google Apps (which is not accessible from the Google Apps admin panel -- go figure!) to be copied into people's personal address books. Great tool, until they turned this and most other cool functions of the app into paid options behind our backs.
From previous experiences with customer deployments, we have found that Microsoft's shared calendaring and contacts implementation has not matured past its Outlook-centric approach. And this holds true across the board: users on computers with full Outlook can share and co-mingle calendars and contacts with near-fluid ease. Move this into the OWA browser experience, or worse, onto a smartphone, and you're dealing with 1990s-era limitations. For a mobile onsite service company like mine, in the field with clients near incessantly, whipping out full Outlook to get client contacts or appointments off our shared calendar is a non-starter.
Why can't Microsoft incorporate proper shared contact functionality into the Office 365 administrative console, using the already included Global Address List functionality? The current setup works great as long as you don't need access to contacts on the go. This little modern necessity is where Office 365 (and Google Apps, for that matter) falters badly. It's hard to believe that I'm the only one running into this as a gaping feature hole.
I posted on the Office 365 forums recently to see if any headway had been made in fixing this glaring feature hole. The only suggestion the forum support reps could offer was to use Outlook Web Access on our phones to gain access to shared contacts. This is completely unfeasible, unworkable, and senseless. Am I supposed to pull over on the side of the road to log into OWA on my Lumia 925 in order to see my GAL or shared calendar with appointments on it? A bit ludicrous, and I'm surprised support reps are even offering such options to customers.
Darrel Webster took the time earlier this year to blog about his suggested workaround: using a separate Exchange Online Plan 1 account for the purpose of sharing. This is a great workaround, which provides native calendar and contact access on smartphones of all stripes, but it does NOT solve the problem of keeping access to contacts and calendars controlled. This is especially true of shared contacts; I much prefer to have full administrative control over our shared contact list, so we don't begin suffering from the "too many hands in the cookie jar" dilemma.
Oh well. Here's hoping Microsoft can either reform the way shared calendars and contacts work in Office 365, or make shared mailboxes fully accessible from smartphones. We don't live in an Outlook-centric mobile atmosphere anymore. Office 365 dearly needs to adapt to the needs of its modern user base in this area.
Office Web Apps: Maturing Nicely, But Still Showing Amateur Growing Pains
Given that Office Web Apps have lived roughly half the lifespan of their Google Docs counterparts, I have to say that Microsoft has made incredible progress over where it was merely a year ago. But being as good as the other guys isn't enough for me as an IT professional. I want to see Microsoft get Office Web Apps to a level where feature parity with the desktop apps is closer to 95 percent, not 70-75 percent.
For instance, OneNote Web App still represents a very young sibling to the full desktop app. My biggest gripe with it? Unless I'm missing something, it has zero inking support. My Thinkpad X230 Tablet with its awesome stylus would be worthless if I used the web app iteration of OneNote in its current shape.
And while real-time co-authoring has made great strides for users working within the Web Apps together, it's unfortunate that collaboration between desktop app users and web app users hasn't kept a similar pace. Just today I tried to demo Excel Web App for a client looking to move from Dropbox to SkyDrive Pro, and my desktop/web app demonstration was cut short when the desktop version of the same spreadsheet showed up as locked because another user was working in it. I presume that meant the Web App side, but aren't the two sides supposed to work in unison now? Isn't that the purpose of SkyDrive Pro to begin with?
For the most part, the Office Web Apps are progressing at a rapid development pace, but quarterly updates aren't enough these days, when Google is pushing nearly bi-monthly changes to Google Apps. Microsoft needs to catch up and get ahead if it plans on winning the browser-based office suite war in the long run.
Outlook Web App: Still a Sad Excuse for Full Outlook, But Why?
Most people forget that out of all the Office Web Apps Microsoft has now, Outlook Web App has been around in the form of Outlook Web Access since Exchange 2000. That's nearly 13 years of presence on the market, which has consumer Gmail beat by more than a few years. So why the heck is Microsoft still playing catch-up in the web-based email interface wars?
No clue on that answer, honestly. You would think 13 years of innovation would have yielded something more fine-tuned than the Outlook Web App 2013 we have now. The app works decently enough; I think it's much improved over the pile that was Outlook Web App 2010. But being neck and neck with Google doesn't seem to have lit any fires under Microsoft's rear end.
My first area of pain as a browser-based user is tasks. I can make my own tasks, but I can't delegate any to other users, and receiving a delegated task results in a drab email that does nothing other than alert me to the info placed within the task. It does not show up on my task list as you would presume. It's almost as if tasks are a complete afterthought of the OWA experience, which is a darn shame because we were hoping to fully replace our reliance on Podio for task delegation.
I mentioned above how shared contacts and calendars are almost non-existent in the web interface. OK, to be fair, shared calendars do work, but they need to be handled on a user-by-user basis by adding the calendar to each person's Outlook calendar interface. Google makes the calendar sharing and adding experience much more seamless, and this is one of my favorite aspects of Google Apps. Microsoft acts like the only people who should have full parity with shared calendaring are Outlook desktop users. Don't force me to use the full Outlook client if I don't want to!
Outlook Web Access (version 2000 shown above) has been around since Exchange 2000 -- years before Gmail even hit the scene. So with such a long lifespan, why hasn't OWA trounced Gmail already? The answer: Microsoft has been too busy pumping features into Outlook while forgetting about browser users until recently. Here's hoping OWA gains feature parity with full Outlook in two releases or less. (Image Source: Microsoft)
One of the biggest things I miss about Google Apps is the excellent colleague-to-colleague chat integrated into the browser experience. For Google, Gmail is the first-rate experience, and it definitely shows. Chat is seamless within the UI, not tacked on the way Lync is within OWA. In fact, if it weren't for the status color showing my Lync presence in OWA, I'd almost forget that Lync was accessible within OWA at all. For all intents and purposes, the experience is completely half-baked, and forces users to dig through their address book to initiate conversations with coworkers. I see this as one step back for IM maturity, which is unfortunate, because the full Lync desktop client really rocks my socks otherwise. It's just a shame I have to use the full client to have a decent messaging experience.
There are numerous other small things Gmail did exceptionally well in the browser that OWA is a no-show on. For one, Google's Undo Send Labs feature, which let me take back emails within 10-30 seconds of sending them, was a killer feature for me. Almost daily I would catch an email with a missing attachment or a forgotten CC contact, adjust it, and re-send without the recipient ever having to deal with my "oops" moment. OWA also has no way to easily add pictures to signatures -- something I personally dislike, but clients seem to have grown accustomed to. Yes, I know you can host a photo at an external URL and link it via HTML, but this is clumsy and not friendly to regular end users without web coding experience.
Overall, I think Microsoft needs to dedicate as much time to the OWA interface as it does to Outlook development. More and more users like myself are ditching desktop apps in exchange for browser experiences, and after living in the browser on Google Apps for nearly four years, I can't imagine going back to full Outlook. I'm a minimalistic user who prefers a seamless experience across computers, no matter if it's my own or not. OWA needs to stop being a second-cut offering and move up the priority list so people don't have to keep doling out for Office editions just to use their Office 365 email in its full glory.
Microsoft's Missing Lync Offering: Cloud-Hosted VoIP
If there's one aspect holding Office 365 back from being the de-facto standard for small business unified messaging, it's that Microsoft hasn't yet delivered cloud-hosted Lync VoIP into users' hands. There was a sub-par offering from a company called JaJah, known as Lync Hybrid Voice, which Microsoft half-heartedly supported until this past May, when the functionality was shot behind the barn with no exact date on a proper replacement.
Hybrid Voice was a way for companies to leverage the Lync client to make regular calls to landline phones -- exactly what an on-premise installation of Lync Server 2013 can provide a larger organization. Cloud-hosted VoIP is such an attractive option for the small businesses I consult because it allows them to take advantage of modern telephone service, ditch their PRI and PBX legacy hardware, and not have to maintain a single onsite server for phone service. It's an incredible value proposition for small organizations, and why these installs have spread like wildfire for us since last year.
Yes, you can get Lync to take over your PBX today in the form of Lync Server along with Office 365 E4 account subscriptions, but the process for getting this done involves so many loops and headaches that I don't know a single small customer (under 100 seats) that has even contemplated trying to do this. For larger organizations, we have been lucky enough to partner with Lync specialist SPS in getting such solutions into place, but the time, expense, and expertise needed to get this working is far beyond what the average small business would ever invest in.
The only way Microsoft can get Lync Voice into the hands of the average Joe on a cost-effective basis is to offer the service cloud-hosted, right alongside its already stellar Lync options in the Office 365 E plans. I'm not sure why they didn't have this game plan together already. The experiment with bundling JaJah to provide hybrid Lync voice service was a terrible mistake, as proven by the canning of the partnership within a year of its start. I'm guessing the quality of service was not up to customer expectations. Microsoft clearly needs to control the experience from end to end, just like it does with all other facets of Office 365 today.
It's interesting to note that, from my research, Lync Server cannot currently handle any form of outgoing faxing. This means that unless Microsoft makes adjustments for its supposed upcoming offering, traditional fax machines won't work with the service (if current limitations persist). I personally think faxing should have died years ago, but regardless, many of my customers still rely on this technology for their daily work. Even in the name of progress, Microsoft should make strides to accommodate fax users so they don't look to services like RingCentral instead.
Once Microsoft unveils Lync Voice hosted from the cloud, I think the vision of unified messaging that brings together the Lync experience across all devices will finally become a reality. And judging by what I've seen of this functionality on the enterprise level, using Lync Server 2013, it will absolutely knock the socks off the competition. One Lync platform to handle your IM, voice, and video chat in a single easy to use package is something the market as a whole has been yearning for.
I can say right now that I would switch from RingCentral to Lync Voice if this sort of offering were released, and I'm sure I'm not alone when I say this. I've taken a look at some of the hosted Lync voice providers on the market, and my initial dealings with them have all been less than stellar (although I will not name any names, you can easily do your own research and get an idea for who I'm referring to here).
What Are Your Gripes With Office 365?
I know the list I shared above is not complete in any fashion. What kind of areas do you feel Microsoft needs to improve in its Office 365 for Business product? Or for that matter, even Office 365 Home Premium? Are there areas where the platform has let you down? Have you made the switch from Google Apps and feel some buyer's remorse over certain aspects? Conversely, have you switched from Office 365 to Google Apps because Microsoft's offering didn't meet your needs in some way?
Let's hear what you have to say so hopefully Microsoft will take heed and get working on improving its service. Constructive criticism is good for any cloud provider, so feel free to let loose below.
To cloud or not to cloud? It's a question a lot of my clients are asking more often, and the move it describes is undoubtedly one of the biggest trends in the IT industry right now. SaaS, PaaS, IaaS, and soon DaaS -- all acronyms which represent offloading critical functions of some sort to the cloud or into virtualized environments. All the big cloud providers are guilty of throwing fancy numbers around to make their case. But do their trumpeted cost savings really add up?
You'll have to make it to the end of this piece to find out what I think about that personally. Because in all honesty, it depends. Too many business owners I work with make the same cost comparison blunders over and over again. Most of them are so blindly focused on raw face value costs -- the "easy ones" -- that they lose focus on the bigger picture, namely their TCO (total cost of ownership).
Too many people want to compare cloud vs on-premise costs in a single dimension, which is akin to judging a book by its cover alone. For on-premise, they mistakenly believe that costs start and stop with how much new hardware/software is needed to put a solution into place. And for cloud, similarly, all they see is that recurring monthly service cost.
I can make some general statements about trends I've seen in my own comparisons, but by no means does this apply to 100 percent of organizations. And mind you, my company FireLogic specializes in 25 seat and under organizations, so my experience is a bit biased towards the needs of this crowd.
For example, on average, I have yet to see a case where hosting email in-house is still a worthy endeavor from a cost or uptime perspective. The same can be said for niche LOB (line of business) single-app servers. Office 365 and Windows Azure are two cloud services into which we are offloading a lot of formerly on-premise workloads.
But the cloud doesn't win them all. Instances that are still usually best suited for on-premise servers include large capacity file shares (50GB in size or more) or operations that would be bandwidth-prohibitive in a cloud scenario. Offices that have smaller pipes to the internet usually have to take this into account much more acutely. My suburban and urban customers usually have plentiful choices in this arena, but rural offices unfortunately don't always have this option.
Regardless of what technology you are trying to chart a future path for, don't just follow the crowd. Take a look at the areas I am going to shed some light on, because your initial assumptions may shift once you find out the entirety of the costs entailed in staying on-premise or moving to the cloud. The real picture of what's cheaper goes much deeper than the cost of a new server or a year's worth of Office 365 subscription fees.
6) Why You Should Be Factoring in Electricity Costs
I can't remember the last business owner who actually batted an eye at getting a grasp on how much his or her on-premise servers were costing 24/7. I don't accuse them of being ignorant, because realistically, most of us don't think about energy as a cost tied directly to IT operations. It's that invisible commodity that just happens, and regardless of any computers being in an office, we need it anyway.
That's true. Business doesn't operate without it. But even though today's servers are more energy efficient than anything from the past twenty years, this doesn't mean their electrical consumption should be ignored. Unlike standard desktops or laptops, the average solid server has a mixture of multiple socketed processors, dual (or more) power supplies, multiple sets of hard drives in RAID arrays, and numerous other components that regular computers don't need, because a server's entire purpose is uptime and availability. That doesn't happen without all the wonderful doodads that support such a stable system.
But with that stellar uptime comes our first hidden cost: electrical overhead. And not just the power needed to keep servers humming. All that magic spits out higher than average levels of warm air, which in turn also needs to be cooled, unless you prefer replacing your servers more often than every five years or so.
Putting a figure on your server's total energy footprint is hard to do accurately unless you have a datacenter with powerful reporting equipment. And most small business owners I consult don't keep a Kill A Watt handy for these purposes. So in an effort to find an industry average that provides a nice baseline for discussion's sake, I found an excellent post by Teena Hammond of ZDNet that actually ran some numbers to this effect.
According to her numbers, which use an average kWh cost for energy from the US Energy Information Administration as of January 2013, she figures that an average in-house server in the USA (accounting for both direct IT power and cooling) sucks up about $731.94 per year in electricity.
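Her average is a fine baseline, but you can rough out your own figure easily enough. This sketch assumes an illustrative 700W combined draw (server plus its share of cooling) and a hypothetical $0.12/kWh rate -- swap in your own numbers:

```python
# Rough annual electricity cost for an always-on server. The 700W combined
# draw (server + cooling share) and $0.12/kWh rate are illustrative
# assumptions of mine, not figures from the ZDNet piece.

HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_energy_cost(watts: float, dollars_per_kwh: float) -> float:
    kwh_per_year = watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * dollars_per_kwh

print(round(annual_energy_cost(700, 0.12), 2))  # 735.84 -- right in the
# ballpark of that ~$732/year industry average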
While this certainly makes the case for virtualizing as much as possible if you have to keep on-premise servers, it could also sway your decision to just move your workload to the cloud. It's hard to justify keeping email or small file server needs insourced if $730+ per year per server is accurate, especially as the number of users you may have gets smaller.
If you think this point is not that serious, just ask Google or Microsoft. They manage hundreds of thousands of servers across the world at any given time. Efficiency and energy use for them is a do or die endeavor. The industry standard these days for measuring efficiency by the big boys is known as PUE (power usage effectiveness). It's as simple as taking the total energy of your facility (or office) and dividing it by the direct energy consumed by IT equipment (in this case, servers).
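In code form, the calculation is trivial; the kWh figures below are made up purely to illustrate a hypothetical back-closet server room:

```python
# PUE as described above: total facility energy divided by the energy the
# IT equipment itself consumes. The kWh figures are made up for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Hypothetical back closet: servers draw 9,000 kWh, but cooling and other
# overhead push the total to 18,000 kWh.
print(pue(18_000, 9_000))  # 2.0 -- half the power never reaches the servers
```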
Microsoft's newest datacenters have PUEs ranging from 1.13 to 1.2, and Google also does a fine job with a PUE of about 1.14. Keep in mind that these entities have large budgets solely dedicated to energy conservation efforts. The average small business throwing servers into a back closet will likely never give this a second thought. Which is reasonable given the circumstances, but again, it accentuates the notion that we should be decreasing server footprint -- potentially entirely -- if the numbers add up. I wouldn't want to see how bad the PUE would be for the average SMB client I support.
Of course your own needs may represent a different story. If your office is in a large building with a shared data closet, as is common at places like the Sears Tower in Chicago where we have some customers, then you may be able to share some of these direct electrical costs. But for most smaller organizations that are on the hook for all the electricity they use, energy needs should be on the table when discussing a potential move to the cloud.
5) Bandwidth Can Be Your Best or Worst Friend
The cloud can bring a lot of potential savings to the table, but just like energy consumption is the sore thumb of keeping servers in-house, cloud migrations can slap us with a nasty realization in another area: that we may not have enough bandwidth.
When your servers are in-house, your only limits are your internal network's infrastructure -- switches and cabling that are usually more than sufficient to serve bandwidth-hungry applications inside office walls. Cloud scenarios are getting smarter with how they leverage bandwidth these days, but there is no getting away from the fact that offloading hefty workloads to the cloud calls for a bigger pipe.
Cloud email like Office 365 gets around this in two main ways. For one, no one ever reads the entire contents of their mailbox in a single sitting. Even massive searches across a 365 mailbox are usually handled server-side, and only the pertinent emails that need to be opened are downloaded in real time. And similarly, Outlook users on 365 almost always keep a local cache of their mailbox -- which further negates any major issues here. Depending on how many users an office may have, this dynamic could change, because bandwidth needs become tougher to predict as user bases start to grow.
Another prime example of a function moving to the cloud at smaller organizations is file sharing. With the advent of services like SharePoint Online, businesses (including my own) are moving to cloud file servers for their document sharing and storage needs.
Moving a 30-40GB file share or set of shares to the cloud is an excellent example, but in turn, we need to keep in mind that without sufficient bandwidth in and out, any savings from not running an internal server could be negated. If you are hitting consumption limits (as is the case with cellular services) or simply don't have enough pipe to go around at a given time (DSL or T1, I'm looking at you), slowness and productivity loss will start rearing their heads.
This excellent online calculator from BandwidthPool.com can give you a fairly decent idea of what you should be looking at in terms of connection speeds for your office. It may not have enough detail for some complex situations, but as a broad tool to guide your decision making, I definitely recommend it. In general, my company is usually moving offices from DSL or T1 over to business coax cable in the Chicago area, and in the case of T1 -- at a huge cost savings while gaining large increases in bandwidth.
If you're contemplating any kind of major cloud move, speak with a trusted consultant on what kind of bandwidth you should have in place to have a comfortable experience. We've been called into too many situations where owners made their own moves to the cloud and were frustrated with performance because they never took into account the internet connections they needed for a usable experience.
Any increased needs in internet connection costs should be accounted for in an objective comparison of going cloud or staying in-house. Situations that call for unreasonable amounts of bandwidth which may be cost prohibitive could sway you to keep your workload(s) in-house for the time being.
4) Outbound Bandwidth from Cloud Servers Will Cost You
I absolutely love cloud virtual servers. The amount of maintenance they need is less than half that of physical servers. Their uptime is unmatched by anything I can guarantee for on-premise systems. For my SMB customers, their TCO tends to be considerably less than that of a physical server box. But ... and there's always a but. They come with a hidden cost in the form of outbound bandwidth fees.
All the big players work in a similar manner. They let you move as much data as you wish into their cloud servers, but when it comes to pulling data out, it's on your dime after a certain threshold. And I can't blame them entirely. If you're moving those kinds of data sets across the pipes, you're likely saving more than enough cash versus hosting these systems onsite that you can spend a little to cover some of the bandwidth costs they are incurring. There is no such thing as a free lunch, and cloud server bandwidth is no different.
Microsoft's Azure ecosystem includes a fairly generous 5GB of free outbound bandwidth per month, which means that for relatively small workloads -- or even scenarios where nothing but Remote Desktop is used to work on a server in the cloud -- this limit may never be touched. The 900-pound gorilla in cloud servers, Amazon's EC2, is a bit stingier at only 1GB of free outbound bandwidth per month.
Pricing after the initial freebie data cap per month is pretty reasonable for the customer workloads I consult with. Microsoft and Amazon are neck and neck when it comes to pricing for bandwidth, with a reduced pricing scale as you move up in consumption (or volume). For example, if you wanted to move 100GB out of Azure in a month down to your office, after the first 5GB of free pipe, you would be on the hook for an extra $11.40 in bandwidth charges.
On a 1TB workload of outbound bandwidth in a month, you would owe an extra $119.40. Again, depending on your comfort zone and workload transfer levels needed by your scenario, every situation may come to a different conclusion. In general, I'm finding that smaller workloads which require little in/out transfer to the cloud are great candidates for the "server in the cloud" approach.
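Both of those figures are consistent with a flat per-GB rate kicking in after the free allotment, so you can rough out your own egress bill with a sketch like this (the $0.12/GB rate is inferred from the two examples above -- real provider pricing is tiered and changes over time):

```python
# Outbound (egress) bandwidth cost after a free monthly allotment. The flat
# $0.12/GB rate is inferred from the two examples above; real provider
# pricing is tiered and changes over time.

def egress_cost(gb_out: float, free_gb: float = 5, rate_per_gb: float = 0.12) -> float:
    return max(0, gb_out - free_gb) * rate_per_gb

print(round(egress_cost(100), 2))   # 11.4  -- matches the 100GB example
print(round(egress_cost(1000), 2))  # 119.4 -- matches the 1TB example
```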
And one other symptom of moving large workloads to the cloud is the amount of time it takes to transfer data. If your business relies on moving tens (or hundreds) of gigabytes between workstations or other onsite servers for daily operation, cloud hosting of your servers may not be a smart approach. At that point, you aren't limited by the fat pipes of providers like Azure (which are extremely large -- better than most connections I see at the organizations I support).
The limiting factor becomes your office internet connection. Even a moderately priced 50Mbps pipeline may not be enough to transfer these workloads between endpoints in a timely manner. In this case, staying in-house is likely a solid bet.
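To put rough numbers on that, here's a quick sketch of raw transfer time at line rate (real-world throughput will always be lower once protocol overhead and contention kick in):

```python
# Raw transfer time for a given workload and pipe, ignoring protocol
# overhead and contention (real-world numbers will be worse).

def transfer_hours(gigabytes: float, mbps: float) -> float:
    megabits = gigabytes * 8 * 1000  # using decimal GB for simplicity
    return megabits / mbps / 3600

print(round(transfer_hours(100, 50), 1))  # 4.4 hours on a 50Mbps pipe
print(round(transfer_hours(100, 1.5)))    # ~148 hours on a single T1
```

Over four hours to move 100GB on a decent coax connection -- and the better part of a week on a T1 -- is exactly why daily bulk transfers and cloud-hosted servers don't mix.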
We're in the process of moving an entire accounting firm into a hosted Azure Windows Server 2012 R2 instance; after running the numbers against doing it in-house or on a niche cloud VM provider like Cloud9, the customer is going to save hundreds of dollars every month. In this specific instance, the cloud made perfect sense. Putting their "server in the cloud" was a good business decision due to the uptime requirements they wanted; the geographic disparity of the workforce; and the fact that we wanted to enable staff and clients to use the system as a cloud file server for industry-specific sharing needs.
For those interested, you can read my full review from earlier this year of Windows Azure's Virtual Machine hosting service, and why I think it's got Amazon, Rackspace, and the others beat. I touch a bit more on Azure's credibility further down in this piece.
3) The Forgotten "5 Year Rule" For On-Premise Servers
Don't try Googling the 5 year rule. It's not industry verbiage in any way, but it is a term I'm coining to describe a hidden cost that is almost never discussed. The 5 year rule is very simple. On average, from my experience, organizations are replacing on-premise systems every 5 years. It may be different depending on your industry, but on the whole, five years is a good lifespan for a 24/7 server used in most workplaces.
For those of you pushing servers past that lifespan, this discussion also applies, but your rule may be closer to a seven or eight year one -- as bad a practice as that may be. I say that in all honesty, because when you stretch a server's lifespan that far, you're usually courting unexpected failure during the migration to a new system, or risking costlier migration fees because your software has slid much further into obsolescence than it would have at a sensible five year mark.
Back to my original point, though. The 5 year rule is something many decision makers don't take into account because they see the cost as too far off to consider now. Yet when weighing cloud vs on-premise, I think it's super important to consider your TCO (which I will touch on at #1, since it is the most important hidden cost). Part of that TCO entails the upgrade costs that hit every so many years, when it comes time to retire old on-prem servers.
And this is where the sticker shock sets in, and where cloud services tend to justify their higher recurring costs quite nicely. Yes, while you are usually paying a slight premium to keep your needs up in someone else's datacenter, their economies of scale are offsetting what it will otherwise cost you around that nasty "5 year" mark. Even in situations where staying in house may be cheaper than going to the cloud on a monthly basis, your five year replacement/upgrade costs may be so hefty due to the size of the hardware needed or licensing entailed, that going to the cloud may still be the better long term option.
The above sample calculations I made using the TCO Calculator compare a generic line of business app server on-premise vs hosted up in a large Azure virtual machine instance. While the yearly recurring costs up to year four are fairly similar, you can see the spike that year five introduces. That extra $6190 in year five is an average aggregated cost of new hardware, software, labor fees, and related costs to replacing one physical server at end-of-life. Business owners are always oblivious to this reality in their own comparisons.
The cloud approach always entails a subscription cost, which brings higher recurring monthly fees than hosting in-house, but this is not necessarily a bad thing. You aren't getting knocked with any 5 Year Rule spikes in capital expenditures, because cloud providers are constantly moving their services to better servers behind the scenes, giving you the benefits of hardware refreshes without you ever noticing. It's one of the biggest benefits of the cloud route.
Not everyone will find that the cloud approach is better for them. For example, in a case where an old file server is being decommissioned in favor of an energy-efficient NAS box, the cloud may be tremendously more expensive if you're looking at providers like Dropbox for Business as the replacement. If your business was moving to Office 365 for email needs, then co-mingling SharePoint Online as a cloud file server would be ideal and fairly inexpensive. But it all depends on how much data you are storing, what your internet connection options are, etc. The variables are too numerous to put blanket statements on in a single piece like this.
The biggest thing I want people to take away is that you cannot base an on-premise vs cloud comparison solely on the initial capital outlay of a server and the recurring monthly fees of a cloud service. Your decision making will be skewed by looking at the situation through a narrow lens, one that does not do your business's long-term finances justice.
2) What Does Each Hour of Downtime Cost Your Business?
The cloud sure gets a lot of publicity about its outages. Google had Gmail go down in late September to much fanfare. Amazon's virtual machine hosting on EC2 got hit with issues in the same month. Even Microsoft's cloud IaaS/PaaS ecosystem, Windows Azure, experienced its third outage of the year recently.
However, compared to on-premise systems, on average, the public and private clouds still see much better reliability and uptime. The numbers prove it, as you'll see shortly. After dealing with organizations at all ends of the spectrum in my own consulting experience, I can definitely say this is the case.
The real question here which applies to both physical and cloud environments is: what does each hour of downtime cost your business? $500? $50,000? Or perhaps $500,000? The number is different for each organization and varies per industry, but go ahead and run your own numbers to find out. You may be surprised.
InformationWeek shed light on a nice 2011 study done by CA Technologies which tried to give us an idea of what downtime costs businesses on a broad scale. Across 200 surveyed businesses in the USA and Europe, it found that a total of $26.5 billion USD is lost each year due to IT downtime. That's an average of about $55,000 in lost revenue for smaller enterprises, $91,000 for midsize organizations, and a whopping $1 million+ for large companies.
At the study's average of 14 hours of downtime per year per company, and using the above $55,000 in lost revenue for smaller enterprises, it's fairly safe to say that an hour of downtime for this crowd equals roughly $3,929 in lost revenue.
At the large enterprise level, this comes down to about $71,429 in lost revenue for each hour of downtime. You can see how important uptime is when it comes to production level systems, and why considering downtime costs is a hidden factor which shouldn't be skimmed over.
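Reproducing those per-hour figures is as simple as dividing annual losses by the study's 14 average downtime hours:

```python
# Per-hour downtime cost from the CA Technologies figures above: annual
# lost revenue divided by the study's 14 average downtime hours per year.

AVG_DOWNTIME_HOURS = 14

def hourly_downtime_cost(annual_loss: float) -> float:
    return annual_loss / AVG_DOWNTIME_HOURS

print(round(hourly_downtime_cost(55_000)))     # 3929  -- smaller enterprises
print(round(hourly_downtime_cost(1_000_000)))  # 71429 -- large companies
```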
Even organizations that aren't necessarily profit bearing, such as K-12 education and nonprofits, should place some weight on downtime. Lost productivity, disruption of communications, and inability to access critical systems are all symptoms of IT downtime which can be just as damaging as lost profits.
In a very common example, we can look at an easy candidate for being moved into the cloud: email and calendaring. In the last three years, I have yet to run into a situation for small and midsize organizations where going in-house for email was the better option. Not only from a cost and maintenance perspective, but also from an uptime viewpoint.
As Google made public from an otherwise paid-only study by the Radicati Group, in-house Exchange email systems see an average of about 2.5 hours of downtime per month, or about 30 hours per year. That translates into an average across-the-board uptime of about 99.66 percent -- or, on the flipside, a full 1.25 days of downtime per year. Ouch.
Office 365, in contrast, has had a proven track record lately of achieving better than 99.9 percent uptime (as tracked between July 2012 and June 2013). For the sake of calculation, keeping to Microsoft's advertised SLA of 99.9 percent uptime, this translates into a downtime of only about 8.76 hours per year. If Microsoft's track record keeps up, it will be downright demolishing the uptime numbers of its on-premise email solution of the same blood.
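Converting between uptime percentages and downtime hours is a five-minute exercise worth doing with your own numbers; this sketch reproduces the figures above:

```python
# Converting between uptime percentage and downtime hours per year,
# matching the figures quoted above (8,760 hours in a year).

HOURS_PER_YEAR = 8760

def uptime_percent(downtime_hours: float) -> float:
    return (1 - downtime_hours / HOURS_PER_YEAR) * 100

def downtime_hours(uptime_pct: float) -> float:
    return (1 - uptime_pct / 100) * HOURS_PER_YEAR

print(round(uptime_percent(30), 2))   # 99.66 -- in-house Exchange average
print(round(downtime_hours(99.9), 2)) # 8.76  -- Microsoft's 99.9% SLA floor
```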
Microsoft has got the uptime numbers to back up just how stable 365 is proving to be. On-premise Exchange servers are maintenance heavy; aren't always managed according to labor-intensive best practices; and usually don't have the expansive infrastructure that providers like Microsoft can offer on the cheap, in scale. This explains why in-house Exchange systems average an uptime of only 99.66 percent, and Office 365 is hitting a 99.9 percent+ month after month. You just can't compare the two anymore. (Credit to EzUTC.com for the visual uptime calculator.)
How do cloud providers achieve such excellent uptime figures, even in light of the bad PR they get in the media? The vast technical backbones that power cloud data centers are technologies that are out of reach for anyone except the largest enterprises. Geographically mirrored data sets, down to server cluster farms which number in the hundreds of systems each -- the average small business with a Dell server just can't compete.
And even after being put under the microscope recently for a spate of short outages, Windows Azure has been winning over the experts in some extensive head-to-head comparisons. For example, cloud storage vendor Nasuni released a study in February 2013 showing that Azure had bested Amazon S3 as the best overall cloud storage provider, based on stress tests of availability, speed, scalability, and other aspects.
"The results are clear: Microsoft Azure has taken a significant step ahead of Amazon S3 in almost every category tested," the study went on to say. "Microsoft’s investment in its second generation cloud storage, which it made available to customers last year, has clearly paid off,” according to Andres Rodriguez, the CEO of Nasuni.
Whichever direction you are heading with your IT plans, be sure that you know what your downtime cost per hour figure is, and what kind of uptime your prospective approach is going to afford you. That cheap on-premise solution that could be seeing 15-30 hours of downtime in a year may not be so attractive after all if you can put a hard number on your losses for each hour of being out of commission.
1) Anyone Can Compare Recurring Costs, But Do You Know Your TCO?
I saved number one for the very end, because this is the biggest gotcha of them all. The one that only the most in-depth business owners tend to put onto paper when running their numbers. Most of the time, this never even gets discussed because as I said earlier: out of sight, out of mind. How wrong such a mentality can be.
Total Cost of Ownership is the most accurate, objective way to place on-premise and cloud solutions on an apples-to-apples comparison table. Viewed through the traditional lens, the two paths are innately disparate: one is dominated by recurring monthly costs, the other by initial hardware CAPEX and the nasty "5 Year Rule" spike. TCO puts all of these into perspective within the same paradigm.
Just because cloud services are almost exclusively sold on a subscription-based model doesn't mean we can't compare TCO across a five, eight, or ten year timespan. Using the simple TCO calculator I mentioned earlier in the 5 Year Rule discussion, we can find our ten year TCO for a given comparative IT decision in the same manner.
Let's circle back to a very common scenario for my client base: where should I host my email now that our eight year old Dell server is about to die? If we keep things in-house, we need a generously powered Dell server (I used a PowerEdge T420 tower from a recent customer quote in this example) with licensing for Exchange 2013, along with server user CALs and Exchange user CALs for ten people.
Likewise, a move to the cloud introduces the need for Office 365 E3 licensing for everyone which provides email, Lync, and Office 2013 download rights for all users. To make things fair, I included the cost of a $900 QNAP loaded with enterprise Seagate drives for on-prem file storage after the old server goes down, since our scenario doesn't make SharePoint Online a feasible option.
Here's how the eight year TCO lines up for such a comparison:
At year 1, on-premise hits our example customer with a heavy load of fees. Not only do we account for initial upfront hardware/licensing, but we need antivirus software, onsite backup software (we like ShadowProtect a lot), cloud backup (via CrashPlan Pro), not to mention monthly maintenance and patching of the server, electricity consumption, and unforeseen hardware costs. 365 wins out on recurring month-to-month costs (by a little), and then our 5 Year Rule comes into effect and new system costs drive TCO up quite heavily.
Most importantly, look where our TCO is after eight years. The on-premise route has us at around $65-68K, while over the same timeframe, the cloud approach with Office 365 would have put us at just about $30K. That's more than a 50 percent savings over eight years. Most people wouldn't have figured such costs, because once again, they are so squarely focused on their upfront CAPEX for on-prem or recurring costs on the cloud side.
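For those who want to replicate this kind of comparison without the calculator, the underlying logic is just cumulative yearly costs with a refresh spike. The yearly figures below are deliberately made up for illustration -- only the shape matters:

```python
# Cumulative TCO sketch behind the comparison above. Yearly figures are
# deliberately made up -- only the shape matters: a year-5 hardware refresh
# spike for on-premise vs flat subscription costs for the cloud.

ON_PREM_YEARLY = [16_000, 6_000, 6_000, 6_000, 16_000, 6_000, 6_000, 6_000]
CLOUD_YEARLY   = [3_750] * 8  # hypothetical subscription plus odds and ends

def cumulative(yearly):
    total, running = 0, []
    for cost in yearly:
        total += cost
        running.append(total)
    return running

print(cumulative(ON_PREM_YEARLY)[-1])  # 68000 -- in the $65-68K range above
print(cumulative(CLOUD_YEARLY)[-1])    # 30000 -- right at the ~$30K mark
```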
Does the cloud always win out as shown above? Absolutely not. If I pitted an on-prem NAS box, like the aforementioned $900 QNAP with some enterprise Seagate drives, against something like Dropbox for Business or Box.net, the NAS's TCO would be tremendously lower. The cloud is not yet mature enough in the area of mass file storage for it to make financial sense to dump physical in-house storage in exchange for such services. On a smaller scale, we've found SharePoint Online to be a great alternative for lighter needs, but not if you are hosting mass dumps of CAD drawings, media, or similarly bulky file sets.
Run the numbers, do your math, and find out where your needs stand. If you don't understand what your 5, 8, or 10 year (or longer) TCO looks like, you cannot pit cloud and physical systems head to head at any accurate analytical level. Making the same mistake I see time and time again at the organizations we consult for will give you the false impression that your chosen approach is saving you money, when in the end, that may be far from the case.
I'm not trying to be a blind advocate for one camp or another. My company is busily helping organizations come to proper conclusions so they can make their own decisions on what path is best for their futures. While the cloud is tipping the scale in its favor more often than not recently, I outlined many scenarios above where this is not the case.
Don't let salespeople guide your decisions based on their pitches alone. If you can objectively compare your own standing, from TCO to downtime cost per hour, among other factors, you're in a position of power to make the best choice rooted in solid fact. And that's a position that invariably leads to the best results. There are enough tools available today, as I described above, to make this comparison process as easy for the average Joe as it is for the CTO.
Photo credit: Tom Wang/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on hot growing technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
In the mid-2000s, walking into a college classroom holding a laptop that came with a stylus for the purpose of note-taking was without a doubt out of place. The smartphone craze was still years away, and for all intents and purposes, touchscreens were relegated to two platforms: the Nintendo DS, and the last hurrah of Palm devices like the Treo. So when I sat in my undergrad classes taking notes in OneNote 2003 on my Thinkpad X41, people looked at me like I was an alien. Professors even asked from time to time whether I brought my paper notebook to class, so I wasn't playing with my "toy" the whole time.
Tablet PCs had a real personality dilemma way back then. Aside from OneNote, they were a sort of pariah in the PC industry. Cool, sleek, powerful, and usually fairly light -- but they were held back in one major way: the operating system. I bless Microsoft for taking chances in areas where no one else dared, which undoubtedly led us to the current revolution being driven by Windows 8.1, but the first wave of Tablet PCs had real potential. The hardware was there; the operating system was by far the biggest bottleneck.
Seeing how OneNote is hitting its 10th anniversary right about now, I wanted to take a moment to reflect on how useful the tool has been in my personal and professional life. In fact, I don't think any other app in the Office suite has come so close to fulfilling the long-held dream of users like myself to ditch the paper notebook once and for all.
In version 2003, Microsoft was merely proving a concept. Ten years later, come version 2013 and Windows 8.1, I think OneNote shifts the entire debate about how useful touch input can be, and why new-age touch devices can allow us to rethink where computers fit into our daily lives.
I never shied away from giving others impromptu demos of what my Thinkpad could do. Besides, the laptop landscape at the time was fairly drowned in uniformity. You had the usual flock of PC users, with a scattering of white Macbooks here and there by the trendsetters. But aside from one other student I vividly remember using a Fujitsu Lifebook (a computing school student, no less) there was no such thing as market penetration for Tablet PCs on campus. "Eww, what's that?" was more commonly the response from fellow students.
But again, the problem wasn't the form factor itself. Five or so years after I ended my first foray into tablet PC computing, I've come full circle back to another beloved Thinkpad, the X230 Tablet, and I've fallen (back) in love with not only the device itself, but this time, the operating system providing the UX as well. OneNote 2013 is indeed the killer touchscreen app for any tablet PC user (with stylus capability) and Windows 8.1 fills in the rest of the gaps.
Windows XP Tablet PC Edition Sucked, Yet OneNote Rocked -- What Gives?
I swear, if it wasn't for that godforsaken awful excuse for a tablet operating system, XP Tablet PC Edition, Microsoft could have gotten the credit it deserved for getting the public thinking seriously about touch. Five years before anyone even contemplated Apple releasing a tablet, and a couple of years before the iPhone started the smartphone craze, Microsoft already had the seeds planted for what the next generation of computing could look like.
No doubt, it took us over seven years and no fewer than three major revisions of Windows to get there, but in many respects, Windows 8.1 and, to some extent, OneNote 2013 are the culmination of the vision Microsoft set out to achieve in the mid-2000s. As I said, the hardware wasn't the problem. My Thinkpad X41 Tablet of circa 2005 had excellent battery life of 4-4.5 hours; the touchscreen and stylus worked admirably for their time; and the form factor was just as good as what's on the market today (for convertible tablets -- not pure slates, which are an apples-and-oranges comparison). Windows itself was the anchor holding the boat at bay.
Tablet PCs are nothing new. Steve Ballmer helped introduce the Thinkpad X41 Tablet (above) back in 2005, the same device I used throughout much of my undergrad studies at DePaul University. While OneNote 2003 was a fantastic tool that held its own weight, the hardware was clearly way ahead of its time in contrast to the software (namely, the cruddy Windows XP Tablet PC Edition). Windows 8.1 finally brings us the missing piece of this two-pronged functionality puzzle. (Image Source: CNET)
Perhaps Microsoft was trying to fill a void that was just way ahead of its time. OneNote 2003 was truly a hidden gem of the Office suite back then. Most people hadn't heard of it, let alone tried it. Those who dabbled in it usually couldn't leverage it for what it was worth because, as someone told me back in the day: why don't I just use Word? They didn't get it, but I couldn't blame them. Without a proper device and OS to show OneNote in its full glory, the application was relegated to the corner of the room when it came to mindshare.
Windows Vista came out in late 2006 and attempted to pick up where XP Tablet PC Edition left off, but it didn't fare much better. While it solved the problem of integrating touch capability into the OS (instead of relying on layered-on drivers, as was the case with the dreaded XP Tablet PC Edition) and made the user interface slightly more touch capable, it ruined other aspects of Windows that we had come to love. Battery life (pre-SP2) was horrid, and application and driver compatibility were nightmarish at best (again, pre-SP2). So it was still a lose-lose situation for those like me who wanted to dive into a digital post-paper realm.
Windows 7 upped the ante further, giving us battery life and performance levels on lower-end hardware that were never before seen, but the touch experience was still not fully leveraged. 7 was still, at heart, a desktop-only OS, with touch as a slight afterthought. OneNote was on a ship of its own, maturing through versions 2007 and 2010, and landing finally on version 2013 -- a product that truly deserves every ounce of its ten years of maturity.
It wasn't until Windows 8 launched that everything just clicked. Finally, an OS that provides the UX necessary to do OneNote and other touch-enabled power apps justice. Not only that, but the current generation of emerging Windows devices is proving to fill both the personal and work gaps for today's users. While Apple continues to push the notion that multiple devices are needed to fulfill digital lifestyle nirvana -- a Macbook for work, an iPad at home, as they prefer it -- Microsoft has gravitated towards a 'one device for all purposes' future.
What value proposition does the iPad bring to the table against Windows 8.1 devices aside from app selection? iWork? "The Apple Experience?" As I've said before, anyone can build a great consumption-driven tablet. But giving users something they can leverage to fulfill a paperless lifestyle? The iPad doesn't come close to the Surface here.
If you think about it, much of it makes sense. How many people have upwards of $2000 to spend on a combination of laptop and tablet (going the Apple route, specifically)? Outside of the USA, this is far from feasible for the average person. Instead, Microsoft is slowly proving that a single device like a Surface can kick just as much rear end at the office as it can at home on the couch. Even at the $900 price point of the Surface Pro 2, that's still half of what it costs to equip oneself with a combo of Apple devices to fill the same needs. And in my eyes, you aren't sacrificing functionality in any way, either, unless a million-count app store is your primary deciding factor. And even that disparity is drifting away, as the Windows Store already carries over half of the most popular iOS apps -- and that figure is from over two months ago.
If you are still curious about what Microsoft is trying to get at with the unified Modern UI across its ecosystem, you're clearly not seeing the writing on the wall. Next-generation computing is just as much about touch as it is about extending the traditional capability of Windows apps as we know them, and the Windows platform is here to solve all those needs without increasing the number of devices in your bag. That's true technical evolution in the purest sense.
Evernote is Great, But OneNote is Still King
A lot of people like to compare OneNote to its obvious not-so-distant cousin, Evernote. They are both billed as note-taking apps. Each has similar ways of dividing content into areas and sections. And both have searching capabilities, import/export tools, and extra bells and whistles aimed at replacing our pen pads for good.
The big difference? One actually accomplishes the vision of replacing a paper notebook, while the other simply provides a very clean and organized way of collecting and organizing content. For those who don't have access to a true tablet device with stylus input, either app will work just as well for your needs. But when it comes to true digital handwritten note-taking, OneNote sweeps Evernote away in some major ways.
It's not that Evernote doesn't have handwriting support via inking, and it also claims to have OCR. But both of these functions seem like afterthoughts, at least from my experience. Even on version 5.0.3, the latest Windows release as of October 28, 2013, the program couldn't recognize my simple rendering of the handwritten word "welcome" even after I re-wrote it 4-5 times. Evernote's OCR process is server-side, and free users get pushed to the back of the line. If I need to pay $45/year to become a premium Evernote user, it doesn't seem like that great of a value to me.
Why do I say that? Pound for pound, Evernote Premium really doesn't provide much "must have" value compared to its free offering. Sure, I could shell out the $45/year and pick up a few novel features, but I think it's only fair to pit Evernote against what Office 365 Home Premium provides, since that is what most average consumers would purchase to get access to OneNote 2013. In the interest of fairness, here is a little comparison chart I made to pit the two sides apples to apples:
As the above comparison chart shows, OneNote 2013 (as a part of Office 365 HP) is a stellar deal for what it provides. At $45/year, it's tough to justify the few benefits that Evernote Premium brings to the table, even if it is less than half the cost of Office 365 Home Premium. Getting 27GB of cloud storage, Office on Demand, and full desktop Office rights for up to 5 computers is tremendous value that easily trounces what Evernote is offering here. Mind you, none of the areas where Evernote trumps OneNote (namely in device support) are aspects of the paid Premium subscription.
While Evernote Premium admittedly has a more solid showing across a broader array of devices, the benefits end there. Office 365 Home Premium not only gives us access to OneNote, but the rest of the Professional equivalent of the suite for up to 5 computers (Mac or PC), which on its own is a stellar deal compared to the retail price of Office 2013 Pro. And when you don't have access to your desktop apps, you can launch the formidable Office Web Apps from SkyDrive, or opt for the full-blown "cloud streamed" equivalents via Office on Demand.
If you have your doubts about Office on Demand, have a look at the live demo I did at my Office 2013 class this past summer (it's towards the end of the video at about 1 hr 37 mins). It works, and it works darn well. Nothing about my live demo was canned in any way, so you may be just as surprised with how awesome Office on Demand really is.
Storage space is also much more plentiful on OneNote's side. Not only is the per-notebook cap 10x higher at 1GB, but Microsoft's 27GB of space on the SkyDrive platform is also generous. This aspect can get a little thorny because, yes, Evernote's cap is a monthly upload quota while Microsoft caps your account by space used, not bandwidth. But by the time you reach 27GB, you can easily purchase cheap upgrade space -- and that's if Microsoft hasn't raised its cap by then.
And don't forget that Evernote's cloud storage is strictly for its own notes. SkyDrive is a bona fide personal cloud storage space which you can use to store not only OneNote notebooks, but photos, music files, Word documents, and so on. I just switched my own personal storage needs over to SkyDrive full time and happen to enjoy the service quite a lot. If you want to go the freebie route, you can get 7GB of space at no cost just by signing up for a free Microsoft account.
The built-in OCR capabilities of OneNote 2013 are also far beyond what Evernote provides. Evernote Premium users get pushed to the "front of the line" when it comes to getting their notes OCR'd, because the service relies on cloud processing of anything that needs conversion. Free users get relegated behind Premium users, and in my testing, that shows. Of the few simple tests I ran on getting my handwriting searchable within the Windows desktop Evernote app, it failed on every attempt.
Seeing as Microsoft has been building on OneNote's OCR capabilities since 2003, it definitely has a maturity advantage here. And since Microsoft's product doesn't rely on doing OCR in the cloud, you have the same level of search access across your notebooks on a plane with no WiFi as you do at your home office with internet. This is a big benefit for travelers who rely on local note-taking away from consistent internet connections on the road.
One of my favorite features of OneNote 2013 is how darn accurate it is at instantly searching my handwritten notes. It's not perfect -- my handwriting is pretty sloppy, I admit. But as the photo above proves, OneNote had no issues tracking down the word I was searching for ("variable") in a multi-section notebook I am building in my latest grad school class. Evernote has to handle its OCR in the cloud, which, when it works, is a slow experience for free edition users. None of my OCR tests with inking on Evernote were successful.
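If you're curious how your own handwriting might fare, you can run a rough local sanity check with the open source Tesseract engine -- to be clear, this is not the engine OneNote or Evernote uses, just a quick way to see how hard messy handwriting is on OCR in general. A minimal Python sketch, assuming a hypothetical scan named my_note.png and the pytesseract and Pillow packages installed:

    # Rough local OCR sanity check using Tesseract (not OneNote's or
    # Evernote's engine). Requires a local Tesseract install plus:
    #   pip install pytesseract pillow
    from PIL import Image
    import pytesseract

    text = pytesseract.image_to_string(Image.open("my_note.png"))
    print("Recognized text:", text)

    # Tesseract is tuned for printed text, so sloppy handwriting will
    # often fail here too -- much like my results with Evernote's OCR.
    if "welcome" in text.lower():
        print("Found the test word -- this sample was legible enough.")
    else:
        print("Test word not found -- too messy for this engine.")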
It's also the extra fine touches OneNote provides that push it over the top for me. For example, I can record audio (or video) right from within OneNote 2013 and have it embedded directly and seamlessly into my notebook page. As a grad school student, this is a great way to take snippets (or full recordings, if you wish) of class lectures and have them tucked contextually into my notes for that class. The files can even be played back right within OneNote; there's no need to manage separate media files in a clumsy manner. These functions aren't just feature fluff -- they work as advertised and genuinely come in handy.
Some other interesting aspects are also readily apparent once you dig in. Math students should give the "Ink to Math" capability a try, as you can naturally write out tough equations that are otherwise a downright bear to get into text form without the help of paid Word add-ons like the famous MathType, which runs nearly $60 for students on its own (more than half the cost of Office 365 Home Premium). You can also insert Visio and Excel documents natively into your notebook pages for quick markup without having to export your source files first.
Not to mention that you can screen clip anything on your computer and push it into your notebook for direct markup. In my above photo, you can see screen clips of PowerPoint slides that my grad school professor gives us access to before classes. While other students frantically copy her presentation and whiteboard notes into paper notebooks, I'm merely clipping copies of slides into my OneNote page and capturing her whiteboard notes with my own inking. I spend less time transcribing and more time adding important context to my class notes, making studying for exams much easier than in my paper notebook days.
OneNote isn't just great for traditional note taking. It's also a fantastic drawing, mindmapping, and visual brainstorming tool. A perfect example is the above diagram I drew out for a client who asked me to describe a backup plan for an office server overhaul we are doing. Instead of spending 30+ mins in Visio, or writing a novel over email, I had a simple hand drawn diagram emailed out in 5 minutes.
This is not to say I don't use paper at all. It may be a little while before I can fully ditch my beloved yellow flip notebook for day to day customer notes; it's almost a security blanket for me. But that's merely the old guard at work, what I am "used to" in the way I operate. Those processes can all change. And my move to a reliance on OneNote will slowly pull that crutch away until I no longer need the feel of pen and paper for those instant moments of information gathering.
It's funny that it took me half a decade to come full circle and try the same technology again that I grew so accustomed to back in college. But in many ways, it is an entirely new experience. Windows XP Tablet PC Edition was a sad, sad excuse for a tablet operating system; it did the hardware no justice in providing a clean and seamless user experience. I fought against the tide because OneNote 2003 itself was a wonderful program for what it provided. Windows 8.1 finally marries the OS and the program to the hardware in a way that was sorely lacking back then.
I've played with more than enough iPads during my time working in the public education system, and I actually happen to own a Nexus 10 Android tablet (which is not half bad for consumption, but gets little regular use). But for my day to day consulting work and getting through grad school, Windows 8.1 with OneNote on my Thinkpad X230 Tablet is a hardware/software combination that does everything I want, personal and work related alike.
I don't subscribe to the Apple notion that you need two distinct devices to be satisfied. One is suiting me quite well, and I don't yet have any regrets about the move I made. As more working professionals start pushing back on the FUD companies like Apple love to sow, they may finally "get it" like I did.
NOTE: A lot of colleagues ask me why on earth I didn't just move to a Surface Pro or Surface Pro 2. It came down to a lack of docking options months back (which has recently changed) and the necessity to have access to a wired network port on my primary device. I'm an IT consultant by day which means I'm knee deep configuring and troubleshooting infrastructure equipment in the field for clients often. For that reason alone the Surface Pro 2 could not suit my day to day needs without carrying multiple devices. Hint to Microsoft on what the Surface Pro 3 should have?
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Google Docs and Google Drive were all I knew when it came to personal cloud document storage until this summer. I never got on the Dropbox bandwagon, and was so entrenched in the Google ecosystem that SkyDrive didn't interest me at first when it came out. While I have nothing personally against Google Drive, as it has served my company and myself quite well, I had to take a deep dive into SkyDrive territory to prepare for an Office 2013 class I taught this past summer. I was pleasantly surprised with the service, so much so that I began using it side by side next to Google Drive for my personal needs.
Fast forward to when Windows 8.1 went RTM, and I subsequently moved my primary Thinkpad X230 Tablet over to the new OS. One of the least publicized aspects of 8.1 is hands-down the tight integration between the OS and SkyDrive, meaning you no longer need a standalone app to save and open files on the service. Some have called it Microsoft going too far, but I completely disagree. The service is 100 percent optional (you can still save locally as you would expect), and if you use a local account instead of a Microsoft account for your computer login on 8.1, the service is a moot point anyway.
Unlike Google Drive, which lets you upload Office documents for viewing but not edit them in place, I can now access my entire SkyDrive file collection in any web browser and work on any file with ease. I'm not relegated to Google Docs like I was with Drive, which forces you to import Office files if you want to edit them -- a nasty thorn when it comes to files cohabiting between ecosystems. This is one of the biggest things I prefer over Google Drive: no more worrying about which file format to use depending on my situation.
I absolutely love it now. The fact that I don't have to run an extra program taking up system tray space, processing power, and memory is a fringe benefit. I've also used Drive enough, personally and at customer locations, to know that it has a nasty propensity to crash at random times without notice. Just do a quick Google search on "Google Drive crashing" and you can sift through some of the 5 million+ results referencing Drive's awkward stability fits. The problem is getting better, but is still experienced by Windows and Mac users.
I don't recall a single crash when I used the standalone SkyDrive program, and of course, with the service baked right into Windows 8.1, the chance of any crashing is next to zero. I'm not trying to brag blindly in Microsoft's favor; I'm just stating it how I see it from experience with both products. But Google does have the upper hand (right now, at least) in one area: free storage space.
The Free Space Wars: How Google Drive and SkyDrive Compare
It's a tit-for-tat war for the hearts and minds of cloud storage users, and it's likely far from over. While Google has the undoubted lead in the personal space right now by offering all Gmail (and Google Account) users a free 15GB, Microsoft provides personal SkyDrive users a petty 7GB. While that's less than half of what Google is offering, it's actually plenty for most people's needs -- including my own. I moved my entire collection of personal documents (including my whole archive of college and grad school work) from a messy scattering across my business NAS and Google Drive, and consolidated them without problem on SkyDrive. And I still have about 4.6GB of space left.
You may be curious what I haven't moved over just yet. And this is where you need to make some decisions about what will be taken to the cloud and what won't. None of my old music (which I don't even organize or worry about anymore, since I have lived on Pandora for over four years now) or my large photo collection made the trip into SkyDrive. I am still keeping these on my business NAS, which is soon being moved onto an instance of FreeNAS running off the free Hyper-V Server 2012 R2 (which I'll be penning a story for separately in due time).
The fact of the matter is, none of the cloud storage warriors yet offer enough free space for anyone to move the majority of their digital life over. As in my experience, where I had to keep my media separate, this will likely be the case for you too. Our digital lives are bloating quickly, and free cloud storage is still in its infancy. This is why I decided to move just the essentials I need most day to day -- the files I can also leverage SkyDrive's Office Web Apps for.
I do want to note that while Google offers 15GB of space for Google Drive now, this increase came along because Google changed its policy so that the 15GB is shared among all the core Google services. As Google states, this 15GB bucket spans Gmail, Drive, and Google+ Photos. So while nice at face value, those with years of heavy Gmail history may find this headroom bitten into if their inbox has grown a beer gut of sorts. In my own case, my 15GB has been brought down by nearly 2.5GB from Gmail usage alone. So technically, I only have about 12.5GB left for pure Drive storage.
I was curious why Google was showing my Drive as using 2.52GB when my exported Google Drive contents only made up about 49.1MB. The caveat? Google's new storage policies pool your quota across Gmail, Drive, and Google+ Photos, which can eat into your storage space quickly if one of the other services is being used heavily. You can hover over your usage percentage in the lower left hand corner of Drive at any time to see which service is using what from your quota.
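The math itself is simple enough to sketch out. Here's a quick Python illustration of the pooled quota, using my own approximate numbers from above:

    # Google's pooled 15GB quota, using rough figures from my account.
    # All values in GB; yours will differ.
    total_quota = 15.0
    gmail_used = 2.5      # years of Gmail history
    photos_used = 0.0     # Google+ Photos
    drive_files = 0.05    # my actual Drive contents (~49.1MB)

    remaining = total_quota - (gmail_used + photos_used + drive_files)
    print(f"Effective space left for Drive: {remaining:.2f} GB")
    # Roughly 12.45 GB -- in line with the ~12.5GB figure I quoted.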
I think it's only a matter of time before Microsoft ups the ante on free storage for personal SkyDrive users. I say this because of all the love it has been giving SkyDrive Pro users on the Office 365 side of things lately. Back in August, users of SkyDrive Pro had their storage pools more than tripled at no cost, from 7GB (the current personal SkyDrive limit) up to a fat 25GB.
Just a few days after that, Microsoft one-upped itself in the email arena for Office 365 users as well, by giving all Exchange Online accounts a free 50GB cap. That's a complete doubling of the former 25GB. In comparison, Google's free Gmail sits at 15GB shared as I stated above, and higher end Google Apps accounts have double that at 30GB (shared, of course).
Not only that, but Outlook.com, which is Microsoft's competitor to free Gmail (and replaced the cruddy Hotmail service a year ago), has no technical limit to how much email you can store. Microsoft says you have "virtually limitless" headroom for email storage which just increases as your needs grow. I'm sure there are fair use rules around what "limitless" entails, but still, I'm glad to see that Microsoft is tearing down the usual walls around free email storage space.
I happen to like Outlook.com quite a bit too, and wrote a nice article on how you can leverage the service as a custom domain email host at no cost -- a la what Google Apps Free Edition used to provide until Google killed that off.
And to be fair, Microsoft is throwing free bones at consumers with complimentary SkyDrive space in many of its popular offerings on the market. For example, anyone purchasing either a Surface 2 or Surface Pro 2 is entitled to a free 200GB of SkyDrive space for two whole years. Students in college can pick up the dirt cheap Office 365 University edition for only $80, which covers 4 years of usage, includes Office download rights on up to 2 computers, and also throws in a free 20GB of SkyDrive space. Office 365 Home Premium includes the same free SkyDrive storage offer, but turns the cost into a yearly $100 fee and ups your download rights to 5 computers.
So to say that Microsoft is being stingy with personal SkyDrive is a bit of a misguided statement. The company is being quite generous in most other regards to its online services these days, and picking up extra room is either cheap or, in the above instances, free with other purchases.
Where Does SkyDrive Have Google Drive Beat?
In light of the fact that Google has Microsoft beat in the amount of free space they provide on Drive, you may be wondering why the heck I even bother with SkyDrive to begin with. For numerous reasons, actually. There may be more, but these are the ones that come to mind most:
While it's not a huge deal breaker for most, it's also interesting to note that Microsoft has the widest array of mobile apps for SkyDrive, including Windows Phone, which Google avoids in order to keep up its subliminal boycott of Microsoft devices. Conversely, Microsoft has made a SkyDrive app for Android, so you're not being punished for whichever device mixture you choose to use -- Apple iOS included. +1 for Microsoft in this regard for keeping consumers above competitive spats.
SkyDrive accounts come with access to Microsoft's Office Web Apps, free web-based versions of its popular desktop apps. Above is a screenshot of a Word document I have open for editing in Word Web App. Notice much of a difference between editions? The feature gap between the two is still considerable, but Microsoft has been pushing updates out every quarter or so, reducing the barriers. For what I need to get done, Office Web Apps usually work just fine -- aside from OneNote, which cannot handle tablet pen input over the web interface. Yet.
Are you ready to make the switch from Google Drive/Docs over to SkyDrive? If you've looked at your storage footprint and made a conscious decision that SkyDrive could serve you better, then follow along as I walk you through the transition process step by step. It took me about a half hour to make the move, and mind you, my situation was complicated by a collection of items that lived not only in Google Drive but also on an offline NAS box -- something a majority of people likely won't be running up against.
Luckily, Google has put the tools into place to make the process quite easy. Here's what's involved:
1) Log into your Google Drive account and ensure you are not using over 7GB of space for your Drive needs.
As I showed in the screenshot earlier in this article, you can hover your mouse over the storage usage number in the lower left hand corner of Drive to see a detailed reading of what your Drive account is using in GB. For me, Gmail is the main pig for storage, with Drive using not even 1GB of room. If you are running close to the 7GB cap that SkyDrive provides, you may need to make decisions on what stays in Drive, what goes on a personal NAS, and what may need a second SkyDrive account. You can get creative with your file placement these days; it's not like Microsoft is limiting the number of SkyDrive accounts per person. Just remember that you can only tie one Microsoft account to the SkyDrive on your computer (either via the standalone app, or by using Windows 8.1 with integrated SkyDrive).
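If your files are already staged in a local folder (say, pulled down from a NAS like mine were), a few lines of Python can total everything up against the 7GB cap before you commit to the move. The staging path below is just an example:

    import os

    # Hypothetical local folder holding everything destined for SkyDrive.
    STAGING = r"C:\Users\You\Desktop\skydrive-staging"
    FREE_CAP_GB = 7

    total_bytes = 0
    for root, _dirs, files in os.walk(STAGING):
        for name in files:
            total_bytes += os.path.getsize(os.path.join(root, name))

    total_gb = total_bytes / (1024 ** 3)
    print(f"Staged data: {total_gb:.2f} GB of a {FREE_CAP_GB} GB free cap")
    if total_gb > FREE_CAP_GB:
        print("Over the cap -- trim files or budget for extra space.")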
2) Head over to Google Takeout to make a personal ZIP archive of your Drive data.
Google Takeout is a little known tool that Google provides for free, and it's your gateway for taking anything out of the Google ecosystem when moving to different services. Here, you can not only make a ZIP download of your entire Drive contents, but also easily export a copy of your contacts, YouTube items, Google+ profile items, Blogger posts, and more. The only thing missing is Gmail, but you can easily transfer messages from Gmail to other services using Thunderbird or Outlook over IMAP.
Within Takeout, you just need to use the "Create an Archive" button on the homepage and then select Drive (and any other services you may want to pull down) and you will get a ZIP file to download to your desktop which is a full hard copy of all files in your Drive account.
Google Takeout offers you the chance to change file formats on exported files. Considering a switch from Office to LibreOffice and want to leverage OpenDocument formats instead? Go ahead and select them. SkyDrive plays just as nicely, from my experience, with OpenDocument files as it does in native Office formats.
NOTE: You will NOT get any files downloaded that are considered "Shared with You" from Drive. These are items that others have shared out from their accounts. The way Drive works, these items are not considered under your ownership and you need to manually pluck these items off if you want to move them. A pain in the rear if you have a lot of them, but it is what it is.
3) Unzip the ZIP file that Takeout gave you, and just use "drag and drop" on your computer to move it into SkyDrive.
Your ZIP file from Takeout should contain a clean copy of every file that used to be in your Google Drive, in Office formats: all Google Spreadsheets become Excel files, all Google Docs become Word files, and so on (unless you opted for OpenDocument exports, that is). This makes it super easy and clean to move them over to SkyDrive. I merely did a drag and drop after cleaning up my folder structure and I was set. After merging in contents from my NAS box, which held the other half of my personal doc collection, I now have a single spot to access all of my critical personal documents on SkyDrive. No more double searching across two places.
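If you'd rather script the extraction than drag and drop, a short Python sketch can unpack the Takeout archive straight into the SkyDrive folder and let the sync engine take over. The ZIP name is hypothetical, and the folder path assumes the default SkyDrive location on Windows 8.1:

    import os
    import zipfile

    # Example paths -- adjust to your own download and SkyDrive locations.
    takeout_zip = r"C:\Users\You\Downloads\takeout-archive.zip"
    skydrive_docs = os.path.expandvars(r"%USERPROFILE%\SkyDrive\Documents")

    with zipfile.ZipFile(takeout_zip) as archive:
        names = archive.namelist()
        archive.extractall(skydrive_docs)  # the sync engine picks these up

    print(f"Extracted {len(names)} items into {skydrive_docs}")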
And That's Really About It!
Depending on how much "stuff" you moved into SkyDrive (and whether you had to purchase extra space to fit all your items), your sync time from desktop to cloud could be anywhere from 10-15 minutes to a few hours. I moved in about 2.3GB of files between my old Drive items and my NAS stuff, and it took about 1.5 hours for everything to fully sync between the cloud and my desktop.
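Your mileage depends almost entirely on your upload bandwidth. Here's a back-of-the-envelope Python estimate, assuming a typical residential upload speed of around 5Mbps (pure transfer time, before sync overhead):

    def sync_time_hours(data_gb: float, upload_mbps: float) -> float:
        """Rough transfer time: GB -> megabits, divided by link speed."""
        megabits = data_gb * 8 * 1000   # 1 GB = 8000 megabits (decimal)
        return megabits / upload_mbps / 3600

    # My move: ~2.3GB over an assumed ~5Mbps upload link.
    print(f"{sync_time_hours(2.3, 5.0):.1f} hours")  # ~1.0 hour raw,
    # which squares with the ~1.5 hours I saw once overhead is added.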
At this point, I truly don't care if my Thinkpad X230 gets lost. I see it this way: my laptop is always backed up to SkyDrive for anything I properly save into those folders, and the contents of my SSD are encrypted with BitLocker at all times now too. This is probably the first time in my computing life that I can call my day to day workhorse a truly disposable system, without the pain and damage that theft or loss would have caused even just a few years ago. If something were to happen, I can load up any machine with Windows 8.1 and have my entire computing situation back (barring programs, of course) in relatively short order. Even apps aren't a big deal today, as most of mine are freeware or open source, and those which aren't -- namely Office 2013 and Creative Suite 6 -- are easily and quickly downloadable from the cloud.
Give SkyDrive a look, even if you've dismissed it in the past like I did. If you are running Windows 8.1, there is little reason you shouldn't be leveraging Microsoft's free cloud storage solution. They're truly a match made in heaven at this point, and with Office Web Apps maturing at a very quick pace, it likely won't be too long until we won't even need desktop Office to get our work done. Fingers crossed!
Photo Credit: dencg/Shutterstock
If you wanted to build a case study in the perfect recipe for IT project disaster, you wouldn't have to look any further than the new official Obamacare website, Healthcare.gov. The site, which was supposed to be the official gateway for Americans to purchase cheap (now clearly up for debate) health insurance, has become an overnight poster child for just how badly the government can fumble a high-profile technology implementation. The common symptoms are all present: budget overruns, too little time to test, poor design and planning, and you can take it from there.
So when the President came out for an impromptu press conference about the disaster this past Monday, it struck me as a little odd how he addressed the situation. He went on to describe the numerous problems plaguing the website, and then turned course to let us know that the feds were calling in the "best and brightest" in the tech industry to help solve the woes. Verizon was publicly mentioned -- though I'm not sure why that namedrop mattered, since I don't know what Verizon's mobility expertise has to do with a large federal website catastrophe.
But oh well. As the President implicitly admitted: if the hundreds of millions already spent didn't entitle us to the best and brightest in the industry, a few million more won't hurt at this point. Yet this really shouldn't surprise anyone. The government's track record in handling large IT projects is abysmal.
Government IT contractors took the stand on Capitol Hill Thursday to answer grilling questions about why the launch of Healthcare.gov turned out so badly. As the above contractor admitted, even he couldn't get in properly during minimal stress testing. Poor technology behind the scenes, inadequate testing, or unrealistic deadlines from above? It's a nasty mixture of all three, if we're being honest.
Jim Johnson, founder of The Standish Group, keeps tabs on roughly 50,000 development projects that the government and commercial sectors undertake. He gave Computerworld some frank words on Healthcare.gov earlier this week. "They didn't have a chance in hell. There was no way they were going to get this right -- they only had a 6 percent chance," he said.
More precisely, his group's data shows that about 52 percent of large projects are "challenged" -- an all-inclusive term for projects that run over budget, fall behind schedule, or fail to meet user expectations. Another 41.4 percent are complete failures, meaning they are abandoned or started over; that leaves a success rate of barely 6 percent, which squares with Johnson's quote. It didn't help that the project's bosses at HHS were pushing unrealistic timelines, and stress testing of the website was severely underutilized.
What Was The Rush To Get The Site Launched By October 1?
As former Presidential hopeful Newt Gingrich has continually brought up on CNN's Crossfire, the White House's implementation of Obamacare has officially seen half of its deadlines missed. A full 41 of 82 hard deadlines set forth in the controversial Affordable Care Act have come and gone without a blink. So it's all the more interesting that the White House so desperately wanted to get this half-baked site off the ground when it already had a failing grade in most other aspects of rolling out this law.
Much of the problem that engulfed the monstrous website was an audible called by someone in the government to require all users to register on the site before they could go "window shopping" and compare plans. The same kind of thing we do on Amazon, eBay, and the like every day was barred initially, and for what reason, no one yet knows. I'm smelling more than a rat here, and many public voices across both sides of the aisle are calling it a politically motivated bait and switch, enforced to keep the true costs of these plans away from users as long as possible. If that was indeed the intention, it worked.
For example, a Manhattan Institute analysis of the new Obamacare plans found that, on average, premiums for the new plans being offered to men will be 99 percent more expensive. Women don't get hit as hard, but still see a 62 percent increase across the board on average. This is compared to plans offered under the traditional system. And for those who are very healthy (like myself), the analysts found the increases will be even wilder. My own initial discussions with a health insurance consultant are tentatively justifying these suspicions.
Behind The Scenes: Garbage In, Garbage Out
From Thursday's testimony by contractors came admissions that significant amounts of work were being manually keyed in by humans rather than handled by technical automation. While not the biggest issue plaguing the site's performance and operation thus far, it's troubling indeed to see such a massive operation relying even partially on widespread manual entry.
Jeff Rauscher, the Director of Solution Design at Redwood Software, takes direct aim at this flawed mentality. "The fact that the contractors involved with the healthcare.gov site have admitted that application data is being manually keyed in is alarming, given the fact that this procedure is no way sustainable with the number of applications expected... With millions of dollars and hundreds of hours spent on the healthcare.gov site, the Obamacare team cannot simply do-over this project -- the best solution at this point is to make use of the existing infrastructure and sync up the vulnerable holes in the site with automation."
A recent Time.com article shed some light on just how bad the situation looks behind the scenes, and how much cleanup work insurance companies are having to do in light of the technology mess. For example, Healthcare.gov pumps out nightly reports on the back end showing details about new enrollees day to day. The problem is, the data flowing into these reports is coming in garbled, duplicated, or riddled with messy syntax errors. Some people are getting multiple enrollments and cancellations pushed through the system, but without basic timestamps on submissions, the information is useless to insurance companies for managing new signups.
It may not be directly indicative of the poor coding that went into the final product, but you have to wonder why dummy data files present on the live website have developer 'easter eggs' pointing to Star Wars and Transformers characters. Was anyone auditing the code being pumped out by the development team? And why didn't the White House call in the "best and brightest" from the tech industry to begin with, instead of waiting until the damage was done? Taxpayer money well spent once again. (Image Source: Washington Examiner)
The short term recourse? One silver lining of so few people actually being able to sign up is that insurance companies have been making manual phone calls to customers in order to double or triple check information coming from the mangled backend. For the time being, it's a workable patch to a bigger problem -- but there is no way this messy cluster can continue forward with this much manual labor involved. What's the purpose of a high tech website infrastructure if we are relying on pre-1970s workflow management?
Steve Tack, VP of Compuware APM (an application performance management company), hit the nail on the head earlier today when talking to industry magazine InformationWeek. "High-performing applications are not just something nice to have. Businesses can be halted without them."
Mangled IT Projects: The Status Quo For Federal IT
Who's really surprised about the mess that is Healthcare.gov? The writing was on the wall in some ways, if we judge the implementation against how similar federal IT projects have ended up: ailing, overspent, or in some cases simply dead on arrival.
Last year, the US Air Force pulled the final life support on its ill-fated ERP project, which burned through about $1 billion in taxpayer money before the decision was made to cut losses. Dubbed the Expeditionary Combat Support System, it was meant to replace over 200 legacy systems, and the project began in earnest all the way back in 2005 -- a full 8 years before it was ultimately canned. The project would have necessitated another $1.1 billion just to deliver 25 percent of the original scope, and even that was shooting for a target completion of 2020 at the earliest.
Other prime examples of dead-end IT projects include the FBI's infamous Virtual Case File initiative, which was dumped in 2005. Although the cost and scope of the project were far smaller than the Air Force catastrophe, at only $170 million, it's still a black eye on the agency.
The replacement for Virtual Case File, called Sentinel, went live in 2012 only after $451 million in taxpayer money was spent. The project grew out of the aftermath of the September 11 attacks, intended to let government systems share intelligence information more easily. Eleven years after the attacks, America's most public law enforcement behemoth finally had a system in place that allowed it to work the way it needed.
Sentinel incurred project overrun costs of "only" $26 million. The emphasis on only is purely sarcastic.
And while there aren't many public figures available, the USPS relaunched its front-end website and related backend infrastructure only very recently. While FedEx and UPS have been providing advanced functionality like to-the-hour package tracking for over a decade now, the USPS just joined the digital world at this level of expected performance. I can remember my dealings with the USPS in the mid-2000s regarding lost packages (which used to be common), and having postal workers tell me that their systems considered cheaper service level packages "delivered" when they reached the post office.
It's like the IRS telling you that your income tax return was considered "delivered" when it reached your bank. Not your bank account, just your bank. Insanity is what I call that.
Accountability for packages on the level that FedEx and UPS have grown famous for is in its infancy at the USPS. Federal bureaucracy at its best, or a pure lack of innovative capability? Call it as you see it. But time and time again, the federal government pales in comparison to the stability and advancement we have grown accustomed to in the private sector.
Watching this entire Healthcare.gov dog and pony show unravel is part comedy, part tragedy. Americans are being told that they will be facing increasing penalties for not jumping feet first into Obamacare if they are uninsured now, but even the Obamacare call center reps are admitting publicly that no one is impressed with what they are seeing. Will this tech nightmare give way to the delay of the controversial individual mandate? Here's hoping that's the case.
Photo Credit: sunabesyou/Shutterstock
One of the biggest problems I have with all those fancy iPad rollouts in corporate America is that they merely patch a larger problem instead of solving it. Let's face it: nearly 60 percent of tablet buyers currently are not replacing their primary mobile devices -- they're merely supplementing them. Fewer than 9 percent truly see themselves replacing their laptops with tablets. If tablets are the future of mobile computing, there is a serious problem with how non-consumption-driven buyers perceive them.
When one of my customers approached us about helping them migrate an aging, near-crippled fleet of netbooks into modern tablets, I knew there had to be a better way than the "iPad standard". We initially toyed with the idea of getting tablets to use in conjunction with GoToMyPC or LogMeIn, but the recurring costs on such an approach started to balloon. Plus, a workforce that lives and dies by the full Microsoft Office suite would never adjust to a touch-only future.
So our needs dwindled down to the list I covered in detail in a previous article.
The iPad was a potential option, but its severely limited input options, funneled through a single proprietary port, were a big knock against it. Android tablets were also a possibility, but they didn't solve the keyboard/mouse problem in any way, and they too had no way of leveraging Microsoft Office other than through expensive LogMeIn/GoToMyPC routes -- which would have increased, not decreased, the device footprint we had to manage. We toyed with full-blown replacement laptops, but those would need constant antivirus licensing and patch management, which would not solve our maintenance dilemma.
What did the customer's network look like before the Surface move? A tangled mess of inefficient technologies. Netbooks and VPN working off a 1.5Mbps T1 were one of the biggest nightmares, from slow access down to VPN router problems. Three physical Server 2003 machines were doing what one physical Server 2012 box handles today. It was a patchwork quilt of technology that in the end didn't provide much value for end users, and was a downright management nightmare for IT.
After tossing around ideas, we settled on giving the Surface RT tablet a trial run. It was just the device we were looking for. It gave us the port options we wanted; the tactile keyboard and touchpad users needed; a slim form factor with excellent battery life; and most of all, the ability to access the entire array of traditional Windows applications, including Office, over Windows Remote Desktop. Yes, the near-decade-old technology provides a zero-recurring-cost gateway into a full featured Windows desktop that no other tablet can offer its users.
In a surprising about-face, the same day my last article went live, Microsoft announced it was working on Remote Desktop apps for the iOS and Android realm. The apps should be out around the time Server 2012 R2 launches. However, as with any new software release, their feature sets will likely be crippled compared to what native Windows devices get, and will take some time to reach parity. Even knowing this, I wouldn't place my bets on a tablet fleet using Remote Desktop from Apple or Android devices while the apps' functionality and stability remain unproven.
Where The Fun Begins: What's Needed On the Backend
As an infrastructure geek, the cool parts of this project were not in the end user devices. Sure, I like the sleekness and capabilities of the Surface RT like any other person, but the real joys from this project were the awesome tools that we had to deploy behind the scenes. From new switches to a fast cable internet connection, down to Windows Server 2012 and the virtualization powerhouse known as Hyper-V. The icing on the cake was Windows Remote Desktop Services which was the sandbox for providing everyone their own personalized virtual desktop accessible from the Surface RT tablets.
So what kind of necessities are there for getting this kind of setup into place? Here's a brief overview of what we used: a single beefy server running Windows Server 2012 with the Hyper-V and Remote Desktop Services roles, new network switches, a Comcast Business cable internet connection with VPN between offices, and the Surface RT tablets themselves.
And that was it. Aside from the Microsoft licensing for this project, which is always a bear and a half, the breadth of infrastructure needed was far from expensive or crazy. We were doing a server consolidation to begin with, so we went from three physical Dell servers down to one running a host OS and a guest RDS environment. We also managed to downsize our desktop PC footprint from about 12-14 systems to only three, as all mobile users at the company are now on Surface devices. And we transitioned from cruddy T1 service between two offices to a high speed VPN powered by Comcast Business cable internet, giving us roughly 25Mbps down and 7Mbps up on our most recent speed test.
I always believe that less is more, and the revised IT footprint we implemented at this organization exemplifies simplicity through a lean backbone that does its job. If I were doing this over today, I probably would have gone straight for Server 2012 R2 to take advantage of some new Hyper-V capabilities Microsoft baked in. But that's a minor gripe. Overall, the setup as outlined is working great with no issues, and the customer is very happy with what we accomplished.
The Magic That Is Windows Remote Desktop Services
All the above hardware and software items are fine and dandy, but without one critical piece of the puzzle, a Surface fleet is completely useless beyond its local capabilities. That secret ingredient is Remote Desktop Services, which was called Terminal Services until Server 2008 R2 came out. In Server 2012, Microsoft has vastly increased what Terminal Services used to be able to do by offering you the choice of Session Hosting, Pooled VMs, or Personal VMs. While I won't get into the details of the VM-based options since they don't pertain to this project, those two (more costly) routes are closer to what competitors like Citrix provide in the marketplace. I suggest you take a look at the excellent overview video on RDS in Server 2012 from the TechEd 2013 conference earlier this year.
We decided to use Session Hosting for our Remote Desktop approach for two big reasons: cost and ease of management. Firstly, the amount of labor time that would need to go into not only the initial deployment, but ongoing maintenance, for a Pooled VM or Personal VM approach would be too much to bear for a smaller organization. Every extra VM you need to maintain is just like an extra server that needs upkeep, albeit without the hardware aspect.
That translates directly into consulting cost, which would have rendered this project unviable to sustain for the longer term. Secondly, the complexity needed to maintain any of the non-Session based approaches skyrocketed beyond what we wanted to accomplish. If this were a large corporation with dedicated server staff on board, those options may have been feasible. But not for an organization of 25 seats or fewer using my company for managed IT services.
Remote Desktop Services in Windows Server 2012 provides you with three options for providing virtual desktops. The way we went, which is also the cheapest to roll out and maintain, is Session Hosting. You can also branch out into Pooled VMs, or for the most personalization per user, Personal VMs. Session Hosting is the new name for traditional Terminal Services as we know it. Watch the excellent TechEd 2013 overview video about RDS in Server 2012 for a further look into the tech. (Image Source: Microsoft)
Providing session based RDS access for Surface tablets is pretty much what it sounds like. You set up an RDS server (in our case, a virtual server on Hyper-V) with the RDS role enabled for session hosting, and users can remote in from anywhere with internet access to use full blown Windows apps, as if they were on a traditional laptop or desktop. Our application needs weren't that heavy: Microsoft Office 2013, a line-of-business app, Adobe Reader, Internet Explorer 10, QuickBooks Enterprise 14, and a few other odds and ends.
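On the client side, users connect with the standard Remote Desktop app and nothing more. To save them from typing server addresses, you can pre-stage a .rdp connection file per user; the setting keys below are standard Remote Desktop client options, while the hostname, domain, and account are placeholders. A quick Python sketch of what we could hand out:

    # Generate a pre-configured .rdp file so a user can double-click
    # straight into the RDS server. Hostname/domain/user are examples.
    def write_rdp(host: str, user: str, path: str) -> None:
        settings = [
            f"full address:s:{host}",
            "screen mode id:i:2",          # 2 = full-screen session
            "authentication level:i:2",    # warn on certificate issues
            "prompt for credentials:i:1",
            f"username:s:{user}",
        ]
        with open(path, "w") as f:
            f.write("\n".join(settings) + "\n")

    write_rdp("rds.example-client.com", "CONTOSO\\jdoe", "Office Desktop.rdp")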
Perhaps just as powerful as the ability to get access to full traditional Windows apps is the fact that we can easily sandbox our data footprint within the virtual desktop environment. Unlike a traditional VPN approach where users have to dial in and then get full open access to internal file shares, when users are working within file shares over RDS, the files never leave the internal network. They are doing their work over a "window" into the office RDS server.
There is no such thing as file loss from stolen Surface tablets. The tablets themselves are encrypted through the use of Microsoft accounts for local sign in, so the loss of a device is near meaningless to the company beyond the cost of the hardware. A thief is very unlikely to get past the Surface login password, and even then, they would need to get past the login for Remote Desktop into the RDS server -- another hurdle they are unlikely to clear, as we would have forced a password change on the user's network credentials by then. A double mountain that 99+ percent of thieves would never overcome.
One question I always get: do users lose any functionality compared to using a traditional laptop or desktop? Remember, with this radical new approach to a tablet fleet, we weren't aiming for the standard "PC-plus" model of tablet usage -- equipping everyone with a tablet while leaving their traditional computer in place at the office. That's old hat to me at this point. This Surface fleet was fully replacing any need for a user to rely on a Windows laptop or desktop. And for the most part, we did it. Our situation didn't have any gaps that we couldn't fill.
There are a few niche instances where you may have to fall back on a non-RDS approach. For one, using Lync 2013 through an RDS connection is supposedly supported now, but as you can see from Microsoft's blog post, it's definitely a tricky endeavor. While I have had great results streaming audio and video over RDS (thanks to the excellent improvements in RemoteFX), I have yet to test any kind of VoIP or Lync connectivity. This customer moved onto RingCentral VoIP, which uses desk phones and regular cell phone capabilities, so we didn't hit this roadblock, but I am forewarning anyone about the stickiness that RDS could present for VoIP.
All other critical functions are working great. File share access is fast thanks to the new SMB 3.0 that Server 2012 supports (which, compared to the old Server 2003 systems they had, is a huge improvement in speed and management). Aside from a few manufacturer-related driver glitches in the beginning, printing now works for all users at both offices they frequent. As mentioned, streaming media from YouTube and playing RingCentral voicemail audio through the Outlook client pose no problems either. RDP version 8, available in Windows 7 SP1 and higher, handles the slowest to the fastest internet connections with ease thanks to its innate ability to scale features up or down as needed.
Users are even using the Surface USB port to transfer photos from their cell phones or flash drives and dump them straight into their RDS desktops for long term storage on file shares. No software beyond the standard Remote Desktop client on the Surface was necessary to get this done. And if you are using more traditional PCs to connect over RDP, you can redirect local printers through sessions so that users can work in, say, Word 2013 on their virtual desktop yet still print to the desk printer attached over USB.
And while some technicians may scoff at this, users are actually getting accustomed to the Start screen interface on both their local Surface devices and the RDS desktops on Server 2012. Touch DOES work over RDS, so they can use full pinching, swiping, and other gestures on their remote desktops. Touch when they need it -- and traditional keyboard and mouse when they don't. That's the flexibility Surface RT provides which Android tablets and iPads can't lay claim to. And for our purposes, it's working darn well.
A Note on RDS User Customization: Say Hello to User Profile Disks (UPD)
If you have used Terminal Services in the past, you may remember that there was no such thing as personalization of the desktop environment between users. What they got from the administrator is what they were forced to use: no custom shortcuts, no personalized wallpaper, and so on. That's a definite thing of the past. Microsoft introduced a cool new feature called User Profile Disks in Server 2012 which takes Terminal Services to a whole new level.
Previously, Terminal Services forced all users to work off a common set of settings: the same shortcuts, desktop layout, customization options, and so on. UPD is built right into the RDS platform at no extra cost, and lets you specify a share on your file server where per-user VHDX "profile disks" are stored for each individual user. This happens behind the scenes for each new RDS user and requires no configuration on their end. After you turn the feature on within RDS, a template VHDX gets auto-created, and this provides each new user with a baseline for their virtualized profile storage area. It's like folder redirection and roaming profiles on steroids, without the headaches of either.
Does the technology work? I absolutely love it. I don't think I would have recommended RDS if we couldn't have the level of customization that UPD now provides. Every single user can customize their desktop shortcuts and wallpaper, and even keep per-application settings, such as within Outlook, that would otherwise have been wiped away on logoff. If you have a lot of users, you may need to be watchful of how much space the UPD files take up (we're averaging 200-400MB per UPD now), but seeing as we have under 25 seats at this organization, our 2TB RAID 1 storage array is more than plenty.
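Keeping an eye on those sizes is easy to script. Here's a quick Python sketch that walks a hypothetical UPD share and flags the largest profile disks -- the share path is an example, and I'm simply assuming the per-user VHDX files live at the top level of the share:

    import os

    # Hypothetical UPD share; RDS stores one VHDX profile disk per user.
    UPD_SHARE = r"\\SERVER01\UserProfileDisks"

    disks = [
        (name, os.path.getsize(os.path.join(UPD_SHARE, name)) / (1024 ** 2))
        for name in os.listdir(UPD_SHARE)
        if name.lower().endswith(".vhdx")
    ]

    for name, size_mb in sorted(disks, key=lambda d: d[1], reverse=True):
        flag = "  <-- getting large" if size_mb > 1024 else ""
        print(f"{name}: {size_mb:.0f} MB{flag}")

    total_gb = sum(size for _, size in disks) / 1024
    print(f"Total: {total_gb:.1f} GB across {len(disks)} profile disks")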
I highly recommend that anyone implementing RDS 2012 leverage User Profile Disks for their deployment. Your users will definitely thank you for it.
Post-Conversion: A Lean, Mean IT Backbone
While the Surface tablet deployment was not the sole driving force in streamlining the network's IT footprint, the time was ripe for getting rid of overlapping technologies and servers. And that's exactly what we did. From three barely functional Server 2003 boxes to a single beefy Server 2012 box that handles everything the three separate boxes did pre-conversion, and more. That's because on top of the traditional server roles we moved onto the new server -- Active Directory Domain Services, DNS, file shares, and so on -- we also threw Hyper-V onto the box.
In the year 2013, installing a separate server just to run Hyper-V would have been ridiculous for our needs -- especially for a smaller organization of 25 seats or fewer, where load on the server is never going to be tremendous at any given time. It was a no brainer for us. With dual 3+ GHz quad core Xeon processors and 24GB of RAM at our disposal, there was no reason not to consolidate. We also managed to get dual RAID 1 arrays going on the box: one dual-SSD setup for the OS installation and the RDS virtual server, and a second array solely for storage (file share) purposes. The arrangement is working without fuss, and markedly better than the old RAID 5 that was hobbling along.
After the consolidation was done (shown above), our IT footprint was exactly where we wanted it: a single Server 2012 box hosting a virtualized Hyper-V instance of Server 2012 for Remote Desktop purposes, with email moved from on-premise Exchange 2007 to Office 365. Surface tablet users now connect directly over RDP to the RDS server from anywhere in the world they have internet. We increased security, ease of accessibility, and overall speed many times over. It was a resounding success that took only a few months to complete.
Some people in the IT world claim that Remote Desktop has security concerns for usage over the internet. Of course it does -- just like any other technology that is implemented without forethought and planning. The amount of options Microsoft provides for RDP security, especially in the 2012 and 2012 R2 release of Server, trounce its 2008 and earlier brethren quite a bit. You can combine the use of certificate authentication, Network Level Authentication, RDP version restrictions, session encryption, RDP port obfuscation alongside firewall rules, as well as only allowing a small subset of your AD user base to access the RDS server. If that combination of options isn't strong enough for your tastes, you shouldn't be rolling out a mobile tablet fleet in the first place.
A guide I used for hardening our Remote Desktop Services environment was this excellent post from the UC Berkeley IT department. While their guide targets 2003 and 2008 servers, the guidance is all still highly relevant.
For hardening the actual RDS server user interface, I implemented Windows AppLocker, the grown-up version of the Software Restriction Policies of years past. We did this at the local GPO level for that virtual server only, but we just as easily could have done it at the domain policy level. The technology works great at locking down a whitelist of applications users should be able to access, and closing up everything that shouldn't have prying eyes. This simple getting started guide from Microsoft should get you off on the right footing.
Microsoft had an awesome demo on how AppLocker works back at TechEd 2010 which you can re-watch for yourself.
How Does Hyper-V Compare to VMware for Virtualization?
I know a lot of IT pros may be skeptical of the new kid on the block. Windows Hyper-V, Microsoft's first serious foray into the enterprise virtualization arena (Virtual PC doesn't count), may surprise more than a few of you. VMware is the 800-pound gorilla, with excellent products like ESXi and vSphere on the market that have defined the new era of server virtualization. But being first doesn't always mean being the best.
Making a decision on what platform to virtualize your servers upon should be qualified not only through the prism of cost, but also in respect to capacity, stability, virtual networking resiliency, and other qualitative factors. While no deployment I use Hyper-V for will ever touch the headroom the platform has, I do want to show some evidence of the kind of scalability Hyper-V allows for in a truly enterprise setting.
Here is a chart comparing the limits of what Hyper-V (2012) and VMware's vSphere offer today:
If you thought Hyper-V wasn't ready for primetime enterprise usage, guess again. Hyper-V allows for double the number of logical processors; 125 times the physical memory and memory per VM; and double the number of active VMs per host system. That's quite a showing for a relatively young product, especially up against a behemoth like VMware. (Image Source: Chris Avis)
The upsides don't end there. Microsoft allows for native VM replication, incremental backup capability, failover prioritization, and more:
(Image Source: Chris Avis)
You can also check out some related information on how Hyper-V handles availability better than VMware, as well as how the two giants compare in terms of feature licensing. Again, many of these aspects don't even apply to my circumstances since our deployment is so low scale and rudimentary, but if you are from a larger organization debating the merits of Hyper-V vs VMware, these discussion points should without a doubt be on the table.
Another big reason Hyper-V was a much better option for us is pricing. We didn't want to go the dedicated bare-metal hypervisor route for this project, because it would have required a second physical server and increased our IT footprint, not lowered it. This small business wanted to keep their backbone lean. So that meant running the hypervisor alongside a full server OS on the same box, and the two obvious choices were the Hyper-V role (completely free) or VMware in the form of Workstation.
The cost of a VMware Workstation license would have run us an easy $250 just for the hypervisor, plus another outlay for a third party backup program to automate backup of a VMware Workstation VM, since the software provides no out-of-box backup functionality. By contrast, Microsoft provides integrated backup of Hyper-V VMs through the native Windows Server Backup utility included in every copy of Server 2012. And if I didn't mention it already, installing the Hyper-V role on full Server 2012 is always free.
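If you want a supplemental safety net on top of Windows Server Backup, Hyper-V's PowerShell module also ships an Export-VM cmdlet you can schedule. Here's a hedged Python sketch driving it -- the VM name and export path are examples, and note that on Server 2012 the VM generally needs to be off or saved for export, as live export only arrived in 2012 R2:

    import datetime
    import subprocess

    # Supplemental nightly VM export via Hyper-V's Export-VM cmdlet.
    # This complements, not replaces, Windows Server Backup.
    vm_name = "RDS-GUEST"                       # example VM name
    export_root = r"D:\VMExports\{}".format(datetime.date.today())

    subprocess.run(
        ["powershell.exe", "-NoProfile", "-Command",
         f"Export-VM -Name '{vm_name}' -Path '{export_root}'"],
        check=True,
    )
    print(f"Exported {vm_name} to {export_root}")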
You can see how the decision to go Hyper-V was a no brainer for us at this point. Virtualization licensing on Server 2012 is dirt simple, and basically comes in two forms: Standard edition, which covers up to two VM instances per license, and Datacenter edition, which allows an unlimited number of VMs on the licensed host.
The price difference between Datacenter and Standard is quite hefty, and therefore, we logically went with Server 2012 Standard for our needs. But a company that may need more than 2 VMs running off a single host may want to invest in Datacenter edition, as it will greatly reduce licensing costs over buying separate physical servers for the long term.
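The breakeven point is easy to sketch out. Using Microsoft's approximate 2012 launch list prices (around $882 for Standard and $4,809 for Datacenter -- actual pricing varies by reseller and agreement), a few lines of Python show where Datacenter starts to win:

    import math

    # Approximate Server 2012 list prices; real-world pricing varies.
    STANDARD_PRICE, STANDARD_VMS = 882, 2   # Standard covers 2 VMs
    DATACENTER_PRICE = 4809                 # Datacenter: unlimited VMs

    def cheapest_option(vm_count: int) -> str:
        licenses = math.ceil(vm_count / STANDARD_VMS)
        standard_cost = licenses * STANDARD_PRICE
        if standard_cost < DATACENTER_PRICE:
            return f"{vm_count} VMs: {licenses}x Standard = ${standard_cost}"
        return f"{vm_count} VMs: Datacenter = ${DATACENTER_PRICE}"

    for n in (2, 6, 12):
        print(cheapest_option(n))
    # Past roughly 11 VMs on one host, Datacenter becomes the cheaper buy.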
Hyper-V 2012 Brings a Lot of New Enterprise Features to the Table
The latest iteration of Hyper-V has a lot to offer, and much of it we aren't even leveraging at our customer location since the scale of our project is small compared to what some companies may need. But Microsoft put a lot of work into ensuring that Hyper-V 2012 isn't just another "me too" product in their portfolio. This puppy packs a punch.
You can look through all of the improvements introduced in the 2012 release yourself, but some of the major ones that benefit organizations big and small include Hyper-V Replica for native VM replication, live storage migration, and the new VHDX disk format with support for far larger virtual disks.
You can get started on your own Hyper-V environment by getting the full Server 2012 Trial which allows you to install the Hyper-V Role for free. For those planning on setting up their own RDS environment on a single physical server, this is the best approach to take. Conversely, if you wish to just have a dedicated Hyper-V server that can host numerous VMs for your production or testing needs, you can grab the completely FREE Windows Hyper-V Server 2012 from Microsoft (yes, it is truly free).
I hope this lengthy post provides not only proof of concept that Hyper-V is a production worthy virtualization host, but also that a true "tablet only" future IS possible if done on a proper platform like Surface RT. Don't buy into the hype that iPads are the only way to go when it comes to mobile tablet fleets. The Surface line offers all the same benefits as the iPad, but brings a full traditional desktop and input options to the table in a way no competitor yet can.
Have you rolled out a Surface fleet of your own? Or, did you just finish an iPad rollout and wish you waited? Let us know in the comments area below. And we would also love to hear your feelings on the Hyper-V vs VMWare debate. Which platform is your organization using or planning on moving to?
If you didn't read part 1 of my miniseries on going Surface at your organization, please ensure you check out last week's entry titled "5 reasons Surface tablets blow away iPads for a mobile business workforce".
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
The countdown to Windows 8.1 is officially on. Whoever thinks that Windows 8.1 is squarely a consumer-centric release is heavily mistaken. After spending a month with Windows 8.1 Pro on my Thinkpad X230 Tablet, I can definitively say that Windows 8.1 is shaping up as a rock-solid option for the enterprise. I've previously written about why businesses should have been considering Windows 8 for their next upgrade cycles. With 8.1, Microsoft's latest OS is a service pack on more than a few steroids.
By any measure, I've been a vocal, bullish early adopter of Windows 8. My day-to-day consulting work for customers doesn't allow me to stay stuck on previous generations of Windows. Even if I did prefer Windows 7, my mixed client base is moving to 8 whether I like it or not. I need to be prepared for the questions and troubleshooting that ensue, which means I need to be their resident Windows 8 expert.
And I won't lie: Microsoft has pleasantly surprised me with Windows 8. Even on a non-touchscreen Thinkpad T410, my experiences with the new OS have been generally positive. Sure, my initial foray was met with skepticism and a nagging desire for the old and familiar, but I got past the mental speedbump and haven't turned back.
I readjusted my workflow when I made the switch off Windows 7. Mind you, I spend about 90 percent of my time in the Desktop interface on 8 (and recently, 8.1), and after a few hours of hunting for the Start button, I started to take advantage of the fruits that 8 offers: searching for what I need instead of pecking through menus or organizing rows of shortcuts; making use of the Charms. After a while it hit me: this new interface just made sense. I found myself saving time on mundane tasks that were otherwise nestled in click-here, click-there combinations.
My day-to-day work is highly dynamic, and the needs of the hour determine what I'm working in. One minute I'm administering a server over Remote Desktop or helping a customer with our live support tool, ScreenConnect. The next I'm cranking out a proposal in Google Docs. Perhaps working on my company's financials in QuickBooks Online. Throw in some Hyper-V and VirtualBox usage for testing different OSes, and round out my day with some number crunching in Excel. I have yet to throw a use case at Win 8 that didn't feel right or work as expected. I also cannot remember a single "Blue Screen" since jumping ship from Windows 7.
This transition for the Microsoft OS ecosystem is no doubt a generational one. Not unlike Office, when the Ribbon trashed the menu-heavy interface in version 2007. Not unlike how XP completely transformed the Start Menu from the die-hard Classic look/feel of Windows 2000. And to a lesser extent, not unlike Windows 3.1, which slapped a full GUI on us and rejected the notion that "everyone wanted DOS". I'm still not sure why companies like Apple can toss the revolutionary iOS on a generation of users without a hint of an uproar, while Microsoft is held to a last-generation expectation by the masses. Double standard, anyone?
Just two decades ago, the command line of DOS gave way to the first wave of Windows GUI desktops in the workplace. Still yearning to go back -- for old time's sake? The removal of the traditional Start Menu is no different in conceptual progression. (Image Source: Microsoft)
Enough with the soapbox. Windows 8.1 brings a ton of new functionality to the table for the business, government, nonprofit, and education sectors. And the best part about the new OS? It's completely free. As in zero dollars, zero cost, you name it. Businesses that have already either partially or fully moved to Windows 8 will be able to get the update for free, just like any consumer. The same goes for organizations that leverage Software Assurance or Volume Licensing for their Microsoft software rights. If you have licensing for Windows 8 or are getting access soon, you will be eligible to make the move to 8.1 at no charge. As you'll see below, there are few reasons not to take advantage.
So what are these neat new functionalities that Microsoft has baked into Windows 8.1? Here's a closer look at the ones I see as most relevant, divided into logical sections so you can peruse by category with ease.
Windows 8.1: Security
Drive Encryption for Everyone: Microsoft is dead serious about encryption security in Windows 8.1. That's why every edition of Windows 8.1 will have the underlying technology behind BitLocker baked in, with the Pro and Enterprise flavors getting full administrative control, as has been the case to varying extents since Windows Vista. Upon login with a Microsoft Account, any Windows 8.1 machine will have encryption for the main OS drive turned on by default. This is excellent news for medical offices facing the mad dash towards IT HIPAA compliance.
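If you want to confirm what a given machine is actually doing with encryption, one quick way is to query BitLocker's status with the manage-bde tool that ships with Windows. A minimal sketch wrapping that command from Python, assuming an elevated prompt and that C: is the OS drive:

    import subprocess

    # Query BitLocker/device-encryption status for the OS drive.
    # manage-bde ships with Windows and requires an elevated prompt.
    result = subprocess.run(
        ["manage-bde", "-status", "C:"],
        capture_output=True, text=True,
    )
    print(result.stdout)

The command works just as well on its own at a command prompt; wrapping it only makes sense if you're folding the check into a larger fleet-audit script.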
Remote Selective Data Wipe: Up until now, BYOD in the workplace has been a pain in the rear. When it comes to differentiating access to corporate data, traditional VPNs and share permissions off servers have done little to paint the black and white lines of what employees should keep and what needs to get returned once a worker leaves a business. We deal with it all the time in my consulting business. The next best thing to completely blocking BYOD has been Windows Remote Desktop Services, leveraging Session-Based Desktops, but this doesn't meet everyone's needs.
Remote Selective Wipe in Windows 8.1 allows companies or organizations to specify what data is owned by the employer and should get wiped or indefinitely locked on the end user's device when they leave. This means companies can dole out more liberal access to critical data without fretting about what will get plucked when a valuable employee gets picked up by the competition. This helps reduce capital investment in laptops/tablets while enabling a secure approach to BYOD that has never been offered on such a pervasive scale before.
Work Folders to Simplify File Access: The way we used to dole out file access to users was never intended for the modern BYOD workplace. Offline Files, remote VPN access, and the like were all poor band-aids for a big problem: how can workers get access on the go with relatively little effort, without ruining file synchronization on the server? Work Folders in Windows 8.1 (coupled with Server 2012 R2 on the backend) solves this brilliantly. The technology is a lot like what SkyDrive offers on the personal and Office 365 side of things. Files can be assigned and synced to user devices for on-the-go editing access, and when an internet connection is restored, the files are streamed and synced back to the central server. Remote file access that finally just works.
Work Folders bring the power of SkyDrive-esque technology to the native Windows 8.1 experience specifically for the workplace. Microsoft's cruddy Offline Files technology finally bids a long overdue farewell. (Image Source: Microsoft Channel9)
Microsoft has a great overview video of the tech on Channel9 as well as a nice in-depth writeup on TechNet.
Workplace Join is like AD-lite: Previously, you were either on a Workgroup or an AD Domain -- and that was it. Not anymore. Windows Workplace Join allows sysadmins to provide a "purgatory" of sorts that is not quite full AD, but not as loose as being joined to a Workgroup. You can watch a full demo of how simple this technology is from this year's BUILD 2013 conference.
Native Biometric Recognition Support: Fingerprint readers have been a mainstay on business-line laptops for years. Most Thinkpads I have owned over the last ten years have had one. The problem is, they always required additional vendor software to use the functionality. This increases overhead and administrative effort, and in the end, also increases the likelihood of issues over the long run -- which is one of the reasons biometric uptake on Windows has been slow. Windows 8.1 has native fingerprint and virtual smart card support out of the box to secure the logon process. About time, Microsoft.
Network Behavior Monitoring in Windows Defender: While Defender alone should not be considered a comprehensive security solution, the fact that it will now have some HIPS (Host-based Intrusion Prevention System) functionality is reassuring. My company's favorite AV product, ESET NOD32, has had HIPS functionality since version 5. Microsoft claims that in Windows 8.1, Windows Defender will be able to spot suspicious network behavior from known and unknown threats alike.
I wouldn't hinge my entire company's security plan on it, but it is reassuring to know that every 8.1 system will have a fairly decent level of protection from modern complex threats natively, right out of the box.
Assigned Access for Dirt Easy "Kiosk Mode": This new feature is especially nice for the education and business sectors. Previously, when organizations wanted to restrict access to a single interface -- for example, a web browser locked to a certain site or a given app -- they had to leverage combinations of complex Group Policies and/or AppLocker rules. These solutions worked, but boy, the amount of legwork and testing required made even the seasoned IT sysadmin sweat.
Assigned Access, which is available in Windows 8.1 RT, Pro, and Enterprise, allows admins to lock down an 8.1-powered device to function within a single, designated Windows Store app. It could be a reading application for Surface RT tablets in K-12 education, or perhaps a line-of-business program for a sales force out in the field on convertible tablets. The uses are nearly limitless.
"The Most Secure OS Ever Made": According to Microsoft, Windows 8 is 21 times safer than Windows XP, and 6 times safer than Windows 7, from a malware infection perspective. Renowned AV vendor ESET has said similar praises about Windows 8 being the "most secure version of Windows ever." After years of noise about a mass exodus from Windows to Mac due to security, researchers are coming out and claiming that Macs have nothing inherently more secure about them, and new malware that hit OS X earlier this year is continuing to show the cracks in Apple's security-by-obscurity approach. For the enterprise stuck on an imploding island of XP machines, Windows 8.1 is looking like a much better replacement than Windows 7 or Mac.
Security in Windows 8.1 is a big deal. Security researchers have already begun saying that government agencies looking to upgrade to Windows 8 are making a "good move." And seeing that Microsoft has promised support for Windows 8 (and 8.1) through January 2023, the OS will enjoy a plush 10-year lifespan for organizations that make the move now.
Windows 8.1: Networking & RDS
Big Improvements for RDS: When used in conjunction with Server 2012 R2, Windows 8.1 with Remote Desktop Session Host (RDSH) desktops allows for neat extended capabilities. For example, sysadmins and IT support staff can now actively view and control user desktop sessions to provide help desk style support and answer questions on functionality through Session Shadowing. The networking side of RemoteApp has been markedly improved as well, allowing for crisper usage of RDS-hosted RemoteApp programs in low-bandwidth scenarios like mobile 4G cards or Wi-Fi hotspots. RDS on 8.1 also finally introduces multi-monitor support for users, meaning that "Terminal Services" doesn't have to be a second-class experience. This also includes full touch capability for RDS sessions!
New Simple Printer Connection Options: Two new wireless-enabled printing capabilities are being added natively to Windows 8.1. The first, Wi-Fi Direct, allows anyone to connect a mobile device to a Wi-Fi Direct printer to leverage ad-hoc printing without the need for drivers or an existing wireless network. NFC tap-to-pair printing is very similar, and can allow an NFC device like a smartphone to merely "bump" a printer that has an NFC tag placed on it to establish a simple direct printing connection on the spot.
Easy Hotspot Creation from your PC: Internet Connection Sharing has been around for a long time. But it's been a junk feature for as long as I can remember -- hard to configure, and even more difficult for others to leverage. Windows 8.1 brings easy hotspot creation to every PC, which is great for situations where, for example, staff members are trying to share a single wired LAN connection in a conference room before a sales presentation. Wi-Fi isn't ubiquitous or free everywhere yet, so this is a welcome new feature for mobile road warriors.
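For the script-inclined, Windows has exposed hosted-network plumbing through netsh since Windows 7, and you can drive it from code today. A minimal sketch, assuming an elevated prompt and a Wi-Fi driver that supports hosted networks; the SSID and passphrase are hypothetical placeholders, and whether 8.1's new hotspot UI rides on this exact plumbing is an open question:

    import subprocess

    SSID = "ConfRoomShare"         # hypothetical network name
    KEY = "Pick-A-Passphrase-123"  # your own WPA2 key, 8-63 characters

    # Configure the software access point, then bring it up.
    subprocess.run(["netsh", "wlan", "set", "hostednetwork",
                    "mode=allow", f"ssid={SSID}", f"key={KEY}"], check=True)
    subprocess.run(["netsh", "wlan", "start", "hostednetwork"], check=True)

    # Note: actually sharing the wired LAN through this hotspot still requires
    # enabling Internet Connection Sharing on the wired adapter's properties.

Either way, the point stands: ad-hoc sharing no longer needs a third-party utility.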
Auto-Triggered VPN Access: We're all used to the errors and access issues that crop up when applications need VPN access to the corporate network and we simply forget to connect while in the field. Windows 8.1 looks to solve these dilemmas once and for all with Auto-Triggered VPN settings that can prompt for the necessary VPN credentials whenever certain items require them. Forgetting to connect to the VPN won't end in a molasses-crawl to the expected "access denied" prompts anymore.
Better Out-of-Box VPN Client Support: You can cross four vendor VPN application add-ons off the list of programs needed to get onto the corporate network. Microsoft has baked in native support for SonicWall, Juniper, Check Point, and F5 VPN firewall connectivity. You can check out the short spiel on this from the BUILD 2013 event on Windows 8.1.
Simplified, Easy Management of Devices in Windows Intune: Windows 8.1 and 8.1 RT bring the next progression of Windows Intune to the forefront, allowing for unified, centralized cloud management of all your Windows devices. Schools, nonprofits, and businesses can all benefit from such management capabilities, especially on the Windows RT side, where devices cannot be natively joined to Active Directory. What used to be relegated to System Center 2012 Configuration Manager on an on-premises server is now extended fully to the cloud with Intune.
This is excellent news for organizations with little budget for capital expenses on servers and extensive setup labor. Intune comes in as cheap as $6 USD/month for licensing without Software Assurance -- a real bargain compared to some BYOD management suites.
Windows 8.1: UI & Functionality
Smart Search: If you happened to enjoy the excellent built-in search capabilities in Windows 8, then 8.1 is not going to let you down. The same speedy, integrated search experience still permeates Windows 8.1, but it is brought to the next level via Bing inclusion. This means you can search for anything from the Modern UI interface: last week's financial spreadsheet, your family photos from vacation, the apps you use day to day, and internet-powered results for items like news, weather, and research material. Paul Thurrott goes into excellent detail on this functionality on his blog.
Windows Button "Right Click Menu": The fact that the Start Button is back in a limited form is old news. The best part about the new Start Button is the hidden "right click menu" that provides a juicy amount of easy access to numerous functions most IT pros take for granted. Heck, many of these items are now easier to get to then they were in Windows 7. This not-so-secret menu makes it simple to shut down your machine, open command prompt, or a bevy of other options. Yes, this easter egg of sorts existed in stock Windows 8, but the addition of the new power-related options make this revised menu a keeper. See below for more information.
Boot Straight to Desktop Mode: This is awesome news for workers like me, who spend a majority of their time in the Desktop interface and only dabble in the Modern UI for a little after-hours activity. In 8.1, anyone can configure their PC to boot straight to the familiar desktop interface, meaning you can get straight to work, bypassing the Start Screen entirely. A small, but timely, touch.
Just as Fast, Just as Slim as 8: This isn't necessarily a feature per se, but it's one heck of a nicety. Not only does Windows 8.1 have the same exact hardware requirements as Windows 8 -- but these happen to be the exact same requirements Windows 7 had. What gives? Microsoft has created a lean, mean processing machine in Windows 8, and the trend continues. While I don't have any hard numbers to back it up, my non-scientific impression is that 8.1 runs smoother than 8, most notably on startup and shutdown. SSD users are in for a blazing fast experience. You will likely miss the new boot screen with the funky Betta Fish if you don't look carefully.
This also means that those computers which would have otherwise gotten canned at the office can more than likely be upgraded to Windows 8.1. Seeing that Windows 8 did away with the memory- and battery-intensive Aero interface, you should also see better pound-for-pound performance in normal usage from Windows 8.1 than you otherwise would with the same specs loaded up on Windows 7.
The bloat that Microsoft was known for from Windows 2000 to XP, then from XP to Vista is a thing of the past.
Miracast for Wireless Display Projection: Miracast is an open alternative to (but fully compatible with) Intel's WiDi technology that started the wireless projection market. Windows 8.1 has this neat technology built right in, which means if you have a Miracast-enabled projector in the board room but your laptop doesn't have the DisplayPort connection it needs, you can use Miracast to set up a Wi-Fi Direct signal to the unit in order to present your material. The Xbox One has this built in as well, and many new displays and projectors are promising to bundle the technology going forward. Could this be the beginning of the end for cable- and dongle-heavy board rooms? Time will tell.
(Image Source: BusinessInsider.com)
Windows 8/RT Store Not a Ghost Town Anymore: The media kept ribbing Windows 8 for not having much of a useful Windows Store at launch last year. How the tables have turned. While the situation is not on par with what Android or iOS have, the fact that the Windows Store now offers 54 percent of the top 100 iOS apps is pretty commendable 10 months in. As of September 1, the Windows Store had over 114,000 apps and counting. The list will only keep growing as time goes on.
Internet Explorer 11 is Faster, Safer: As if a new Internet Explorer release was ever missing those two descriptors, IE11 promises a little bit of "better everything": faster load times, better compliance with web standards and HTML5, side-by-side browsing of sites, and deeper anti-malware integration for catching nasties before they are even allowed to launch. For companies that rely on a strict IE-only policy due to corporate protocol or ease of administration, IE11 will be a no brainer to make life easier on the end-user and management fronts.
Surface RT gets Outlook 2013 with 8.1 RT: A lot of news outlets are forgetting to cover this, but business professionals leveraging the Surface RT should know that Outlook 2013 is launching alongside Windows 8.1 RT for the Surface RT. Free of charge, of course. This means Office 365 users with Exchange Online accounts, or other IMAP/POP users, can easily connect their email to arguably the most powerful email and organization app ever made for the computing world. I'm rolling out Surface RT tablets at some of my customer offices, and this will make the product that much more attractive for the workplace.
3D Printing Support Out-of-the-Box: Not the biggest deal for most businesses (yet), but if the 3D printing revolution ever takes off, Windows 8.1 has the necessary hardware/software support built in to leverage the technology natively. My only experience with 3D printers was at my old school district job, where the Applied Arts department had one that could produce CAD-developed prints out of plastic in 30-60 minute timeframes. Perhaps the newest generation of these units is much quicker. I expect numerous markets to start taking advantage of the dropping price points and improving technology to bring their once-fanciful creations to life.
Multi-Monitor Support that Rocks: I thought my dual-screen capabilities were awesome in stock Windows 8, but 8.1 takes them to a whole new level. Dragging and dropping individual Modern UI apps to different screens is fully supported for the first time. The Start Screen can be accessed via Charms that now work on any screen. The taskbar can be extended across every monitor as well. And the level of customization now inherently provided for snapping and resizing Modern UI apps and your Desktop interface is nearly limitless. Multi-monitor workstations will greatly benefit from Windows 8.1 -- further creating screen envy among your coworkers.
Prefer a hybrid Desktop-Start Screen setup with your dual screens? Such new combinations, and more, are finally possible in Windows 8.1. Microsoft has taken multi-monitor support to the next level and created a truly customizable experience. (Image Source: Microsoft TechNet)
(Even) Better Battery Life: My day-to-day Thinkpad T410 got a boost of about 30-45 minutes when I made the switch from Windows 7 to Windows 8 last fall. Pretty significant. While I don't have as much data to back up a statement on Windows 8.1, Microsoft naturally says the OS takes big advantage of the power-saving capabilities in Intel's Haswell platform. I'm itching to see how the final RTM build handles my Thinkpad X230 Tablet as my new primary mobile workhorse, but the 8.1 Pro Preview has been giving me a solid 6 to 6.5 hours of life on an 8-cell battery. Impressive for a pre-release, buggy iteration.
Hand-in-Hand SkyDrive Integration: It's no surprise that Microsoft is tightly weaving the features of SkyDrive Personal into Windows 8.1. The cloud storage tool is a worthy competitor to Google Drive; heck, I use it now for all of my personal Office document storage because it works more cleanly than Drive, due to the fact that I can leverage Office Web Apps on the go when I need to. SkyDrive is now a persistent default save location for numerous functions and apps in Windows 8.1, going beyond what Office 2013 brought to the desktop in terms of pushing SkyDrive. Hey -- there's life beyond work, after all.
Auto Predict for Touch Typing / Tablet Mode: For those using Windows 8 on tablets, like the Surface RT, typing in touch mode was a bit of a chore for lack of proper auto-predict. Windows 8.1 resolves this quite nicely. Now, you can continue typing and use predictive suggestions from the OS with complete ease. This means that Surface RT users in the field in tablet mode shouldn't have to switch into laptop mode that often anymore for short bursts of typing. It should make Word 2013 a breeze to use without the cover opened!
Windows 8.1 hits the market officially for everyone come October 17, and is a completely free update for all. My company will definitely be recommending 8.1 for both consumer and business usage as time goes on, especially if the wonderful improvements from the Preview translate into real-world benefit for end users come October.
I'm calling it now: Windows 8.1 is the new Windows XP for the workplace. Just watch.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Just two years ago, before the Surface RT was even on the horizon, another alternative entrant in the computing market was posting miserable (Surface RT-esque) sales after launching. The suspect in question, the Chromebook, posted only about 5,000 units sold for Acer in the two months after its June 2011 launch. Samsung supposedly fared even worse. Analysts across the industry were taking bets on when Google would throw in the towel on the Chromebook. They all but called the device destined to fail.
Fast forward just two years, and Chromebooks now represent the fastest growing PC segment. In fact, as of July 2013, they officially snagged 20-25 percent of the sub-$300 laptop market. And the warm feelings for Chromebook are far from over. The radical alternative to Windows and Apple laptops is poised to grow another 10 percent in 2013 alone. The burning question still stands: how did the analysts get it so wrong?
Within two months of its official launch in June 2011, Acer and Samsung were reporting horrendous sales for the Chromebook. The industry as a whole was waiting for Google to just cut off life support. Fast forward to 2013, and Chromebook is poised to grow another 10 percent on top of its already commanding position as the fastest growing PC segment. The Surface RT today is the Chromebook of 2011 -- it just needs a fair chance.
Here's exactly what happened, and why all the hotshot analysts got it wrong. As an alternative computing device, the Chromebook wasn't allowed to enjoy a rookie year before getting shredded in the overzealous media. Google launched the Chromebook as a departure from what Windows and Apple and even Android were doing. They said the browser could be the focus of our computing lives. They said we didn't need rows and rows of apps on our screens. It was tough to believe Google's bet that the market would eventually come around, seeing how the iPad and its never-ending App Store were the center of most positive media coverage on alternative computing starting in 2010.
Yet, even as an underdog that was kicked while it was down, Google pushed forward with the Chromebook to bring it to where it is today: a solid sub-$300 laptop that is now rolled out in over 2000 school districts in the USA, and is being embraced by more consumers month over month -- many of whom are likely looking for something more capable than single-purpose tablets like the iPad and Nexus 7. And whereas Google launched the Chromebook with only two available flavors in June 2011, you can now choose from no fewer than eight through retailers like Best Buy.
Is the Chromebook a success story? As a platform that was being prematurely called dead meat, absolutely. The real reason I wanted to highlight the success of the Chromebook is that we now find another fresh competitor in the same position: the Surface RT. There's been much hoopla in the tech media as of late over Microsoft's embarrassing $900 million inventory writeoff on the Surface RT, and once again, the blogosphere is jumping the gun on calling the device down and out.
History is going to repeat itself with the Surface RT, just as Google proved the world wrong with Chromebook. And I'm going to be the first one to go against the tide. Here's my reasoning on why the Surface RT is in it to win it.
1) Microsoft Has a Good Record in Turning Around Loss Leaders
Too many people call Microsoft a stupid company that is out of tune with customers. These same voices are the ones which center their entire perception of Microsoft on Windows. When Windows is doing very well (Win 7) they give it a pass, but when Windows is having a tough time in the media (Win 8) they're all over Redmond like flies. The Surface RT is no different, and recent sales figures were just the ammunition the opportunistic media needed.
Little do they know, or worse -- care to remember -- that Microsoft is not the new kid on the block when it comes to turning around products that need a certain incubation period on the market. Most people won't remember, but the original Xbox (and even the 360, for much of its life) was a profound loss leader. Microsoft's Home and Entertainment division was reporting the largest operating losses of any division at Microsoft back in 2002, driven nearly entirely by the overwhelming profit disappointment that was the Xbox.
And if you think that the Surface RT's recent $900 million loss was a black eye for Microsoft, consider that the Xbox as a whole has reportedly lost almost $3 billion over ten years. Those calling the Surface RT a disaster should probably label the Xbox brand a colossal atomic bomb for Microsoft. Yet, surprise-surprise, the Xbox has now officially been the best selling game console in the US for 31 straight months and has posted decent profits since turning the losing tide in 2008.
Microsoft launched the original Xbox in 2001, only to endure almost a decade of financial blunder, with the system costing the company nearly $3 billion in losses. The Xbox 360 is now solidified as the best selling game console in the USA for the past 2+ years, and Microsoft is looking forward to similar success with the upcoming Xbox One. The Surface RT naysayers forget that Microsoft has a knack for 'toughing it out' on new brands. (Image Source: DigitalTrends.com)
Xbox isn't the only category where Microsoft has proven the skeptics wrong. Windows Server was another questionable entrance into a market that was dominated by Novell, Unix, and a growing Linux contingent back in the 1990s. The product was met with skepticism at first, and Active Directory was questioned as to whether it was a plausible technology for the enterprise. Shift gears to 2013, and Microsoft now commands a dominating 52.2 percent of the server market, sitting at the top of the heap.
And even as the physical server market continues to contract while the public cloud arena takes over, Microsoft has nothing to be scared about. Its Azure PaaS/IaaS platform hasn't overtaken Amazon in market share yet, but Microsoft has enjoyed nine straight quarters of growth at 10 percent or better. And if it continues picking up legions of clients in line with the 50 percent of the Fortune 500 that are supposedly on Azure already, then bright days are indeed ahead.
So, in a long roundabout way, what's the moral of the story here? Early sales figures alone should not, and will not, determine how successful the Surface RT will be 3, 5, or 7 years from now.
2) The Surface RT Value Proposition Against iPad Is Substantial
Robert Johnson from BetaNews posted a Surface vs iPad ad video that Microsoft released a short time ago which is right on the money. Interestingly, it hits Apple twice as hard because, for those of us who recall them, the ads indirectly mock the fruit company by mirroring the potshots taken at Microsoft some half decade ago in the famous Mac vs PC TV ads. The delivery is a bit different, as Microsoft opts to tell its narrative in a head-on 'show and tell' approach focusing on the devices, while Apple hired actors to do the (rather entertaining) storytelling.
The meat of the ad is spot on, and it's one Microsoft should have released on TV for the masses immediately when the Surface RT launched. While the original "Surface Movement" commercials, featuring synchronized dance around an ocean of Surface tablets being moved, clicked, opened, and thrown, were catchy, they failed to capture the edge in an important debate: why exactly was the Surface RT better overall than the iPad?
This newest advertisement lays it all out in a matter of 30 seconds. And the mockery of the robotic Mac OS machine voice adds the finishing touch. The lack of a USB port. The lack of any keyboard or touchpad. And a point the ad surprisingly forgot to touch on: the lack of Microsoft Office. At $350, with as many glaring advantages as the Surface RT holds over the iPad, the only thing holding it back at this point is an app store selection that is admittedly still playing catch-up.
Why it took Microsoft ten months to come out with such a spot-on ad is beyond me. But they finally hit a home run in the war on functionality. Exposing the iPad where it really hurts, namely in port selection and input limitations, and finishing off on price, should have been a launch date recipe for success. (Image Source: Microsoft TV Ad)
But the app store equation is quickly changing. It was reported earlier this month that the Windows Store has 54 percent of the top 100 iOS apps available already on the Windows 8/Surface RT ecosystem -- more than half only 10 months into its life. And this number is only going to improve as time goes on, seeing that developers are finally taking the Windows Store market seriously.
The debate around functionality should not be solely centered on app store counts. A million apps at your disposal are worthless if a majority of them are merely decent, or, similarly, if you're only bound to use a handful in the end anyway. The real question is which tablet provides the overall experience and capability to meet the full circle of needs of today's users. As I argued in my Chromebook vs Surface article from last year, I don't think the iPad (and likewise, Android tablets) is the silver bullet people are looking for in a PC replacement.
iPads and Android tablets may have a leg up in sales right now due to being first to market, but I fully believe that Surface RT is going to win the value proposition debate in due time.
3) Surface RT is Anything But Another Zune
Longtime Windows blogger Paul Thurrott weighed in on the future of the Surface earlier this month, and while I generally value his opinion on all things Microsoft, his comparison of the Surface to the now-defunct Zune brand is a little puzzling. For those who are unaware, the Zune was Microsoft's attempt to dethrone the iPod starting in 2006. It's one of those offerings most at Microsoft would rather forget entirely. Zune could easily be a case study in how to flop a product.
But where does the Surface RT fit in with the Zune? How can we draw parallels between two products that are leagues apart in market positioning, competitive advantage, and opportunity? These are the factors many refuse to dissect.
There's no apples-to-apples discussion possible between the Surface RT and Zune. Here's why:
To make it worse, the Zune didn't offer any true value proposition that trounced the iPod during its time -- not even on price. Sure, it may have had a few specs that were better here or there, but nothing blockbuster like what the Surface RT brings to the table over the iPad: Office preinstalled; a keyboard/touchpad; native USB support; and the duality of the familiar Windows desktop alongside the Modern UI. Once more, Zune and Surface RT are as apples-and-oranges as comparisons get.
4) "Trying to Fit a Square Peg in a Round Hole": iPads in the Workplace
I originally highlighted Liz Davis' comment a month back in my article about the education sector and its misguided plans surrounding 1:1 computing initiatives. And yes, while her statement was aimed at expressing frustration with melding iPads into a modern curriculum in a school setting, the sentiment she holds is similar to what I experience with many of my technology consulting customers looking initially at iPads for their lines of work.
Before we can even establish a set of requirements or business needs, too many people already have their hearts and minds set on iPads. Questions such as "Does this fit in my existing ecosystem?" and "Will this device have the capability to work the way my staff needs it to?" fall by the wayside. Instead, I'm deflecting questions like "How many iPads do we need to get this done?" and "How much will they cost?"
Putting a solution on the table before establishing your use case needs, wants, and limitations is a losing proposition. The business community is not alone in making these mistakes; I've already given US K-12 education a piece of my mind on its own blunders lately. If media hype is the determining factor for which technologies are best for our needs, we're all in trouble.
While I am not saying the Surface is a better option in all situations, as saying so would be a complete lie, I'm finding it to be a better bet in many scenarios for the commercial and education sectors. For example, my company is knee deep right now in replacing a fleet of netbooks for a small business with a line of Surface RT devices coupled with Windows Server 2012 Remote Desktop Services (RDS) to handle day to day mobile computing needs. And we aren't reinventing the wheel to get it done, either.
Microsoft Office 2013 out-of-the-box; a full size USB port; a MicroSD slot; and a keyboard/touchpad cover to actually get work done. If the iPad offered the same features targeted for the workplace at a comparable or lesser price point, the Surface RT would be a non-starter. Microsoft's going to win over those who find the iPad and Android tablets too limiting as laptop replacements. (Image Source: Microsoft)
We're ridding workers of flaky VPN access and a clunky form factor in exchange for a session-based RDS desktop environment with touch capability as necessary. Leveraging RDS allows us to deliver super secure traditional Office-based file creation and sharing within the company network, without the need to pay for LogMeIn or another recurring access service as would be the case on an iPad. Plus, workers have access to physical keyboards and touchpads, which are key to getting work done in short order. Touch is nice, but not when accurate customer proposals are on the line.
Situations like this are just a sample of the scenarios where the iPad (and Android tablets) are having a hard time filling the void in the workplace. Their lack of Office, physical keyboards, and the familiar Windows desktop (as well as Windows Remote Desktop functionality built in) makes it tough to say that the iPad is a better sell for the commercial customer. If your needs are fully based around consumption, any tablet out there will do.
But for those looking to get work done in a familiar, secure environment with little frustration, I have yet to see any tablet deliver as easily as the Surface RT (or Pro).
5) Surface RT 2 is Already in the Works, and Outlook 2013 is Coming
I'm not going to dwell too much on this final point, but nVidia recently let it leak that it is already well on its way to developing the innards for the Surface RT 2. Many were already suspecting that Microsoft was secretly developing the second generation device, but now we know: it's coming. We just don't know when.
It's also worth noting that the launch of Windows 8.1 is going to provide some nice improvements for the Surface RT as well. Not only will it benefit from all the interface changes, like the new Start button and boot-to-desktop functionality, but Surface RT users are going to get free copies of Outlook 2013 added into the array of other Office 2013 apps they already get for free.
This was the big sore thumb that stuck out for Surface RT purchasers, who had no recourse for getting Outlook since the RT runs a specialized edition of Windows which cannot install a regular copy of Office. Seeing how many business professionals desire Outlook for their email needs, it's only natural for Microsoft to get it onto the Surface RT. One less reason to knock the Surface RT.
Why Microsoft Blew the Surface RT Launch
The Surface RT is a wonderful piece of tech. It's not the device itself holding back sales and uptake. Redmond made a few big blunders at launch which it has been slowly fixing, but here are my top 3 root causes for a stunted start:
Will the above blunders kill the Surface RT's chances? No way. Microsoft has been slowly changing course on these mistakes, and improvements in sales are already starting to show. The road ahead is rough and will not turn around overnight. But given Microsoft's repeated history of taking products with underwhelming sales and staying the course until profitability, I fully believe that the Surface RT can turn into a top-3 contender in the tablet market -- if not an outright leader at some point.
Don't count Microsoft out yet. Google proved us all wrong on the Chromebook, and Microsoft has a good chance of doing the same with the Surface RT brand.
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Google shocked the tech world back in December 2012 when, out of nowhere, it announced that Google Apps Free Edition was going bye-bye. I was a bit disappointed to hear this, as it provided a free way for clubs and small businesses of 10 users or fewer to leverage the power of Google Apps for their email, calendaring, contacts, and the like.
Yet as a consultant supporting numerous clients on the Free edition, who knows the unreasonable expectations they sometimes hold the (free) service to, I can see Google's justification for pruning the bushes here. We can all agree: it was good while it lasted.
Luckily, there is another stable (and also completely free) option for those who want to route email for their custom domain through something other than terrible POP or IMAP email from their webhost. I've got nothing against webhosts themselves -- but their 'free' email solutions are horrible. From GoDaddy to 1and1 to the other big boys all advertising their 'free' email services, none of them are worth their salt.
Enter Outlook.com. Yes, that fierce new competitor to Gmail that is quickly picking up steam. BetaNews writer Mihaita Bamburic pitted it head to head against Gmail, and while he claims it doesn't topple Gmail (yet), it is still a very solid solution to free email. And Microsoft has been doing anything but sitting on its heels with this new cloud service.
In a self-boasting blog post on the Outlook blog, Microsoft wanted to ensure we all knew that the platform has hit 400 million users as of May 2013; that it's the 'fastest growing email service' in history; and that the platform has received over 600 improvements in just 12 months since inception. Not a shabby resume for the service's rookie year.
But is Outlook.com Actually a Good Service?
The definition of good when it comes to free cloud email service is highly subjective. Let's frame our discussion here on the same baseline so that expectations are appropriate. Google Apps Free Edition is dead. You may be someone who would have gone that route for your non-critical custom domain email needs, but instead you're likely just forwarding messages to another service, or worse -- having to put up with pitiful registrar- or webhost-provided POP or IMAP email. What other decent options do you have at this point?
Now that we've framed the debate on proper terms, you can either stick with awful half-baked webmail through a webhost, funnel everything into a bloated Outlook PST file, or look to a more plausible alternative. Outlook.com is fully capable of working with your custom domain email with just a little elbow grease. I set my own personal domain up on Outlook.com and it took me no more than about 30 minutes. And no, I hadn't had any prior experience doing this for customers.
So what do I actually think of the service? For someone who lives in Google Apps, yet is fairly familiar with the Office 365 Outlook Web App experience, Outlook.com feels right at home in most ways (you can read my full thoughts on the new O365 OWA interface in my full Office 365 review). The interface is more than similar to Outlook Web App 2013; the two seem to have come from the same mother with a few nuances changed here and there. See below to understand what I mean:
The Office 365 Outlook Web App 2013 experience (top) bears considerable resemblance to the Outlook.com interface (bottom). From the reading pane options to the common blue top navigation bar, Microsoft is clearly unifying its cloud email ecosystem, akin to what Google has between Google Apps and free Gmail. The biggest difference? An ad bar that flanks Outlook.com on the right-hand side.
The similarities are not a bad thing, per se. I think Outlook Web App 2013 is a BIG improvement over what all previous versions had to offer (particularly pre-2010). It's partially why only recently have my customers become comfortable with ditching desktop Outlook in favor of Outlook Web App full time on Office 365. And I'm glad to see Microsoft taking cues from what's working on the commercial side of its offering.
The core workings of Outlook.com as a custom email domain host work the same as what any Outlook.com user already enjoys. Spam filtering, for example, is top notch and matches the pleasant experiences I have been seeing on Office 365. While Microsoft doesn't come out and say it publicly, there is a blurb on a tucked away Spam Filtering and Hygiene page in the Outlook Web App help center which alludes to Outlook.com using Exchange Online Protection.
EOP is the bona fide powerhouse backbone that provides all things malware- and spam-filtering related for Office 365, and if it is indeed in use for Outlook.com users, then piggybacking on Outlook.com for your own free email domain needs is a sizable win-win (and a definite bargain).
Moving your email needs to Outlook.com? No need to ditch your aging AOL account, even if those scoundrels don't offer a free forwarding service. Outlook.com natively supports additional POP accounts that can be pulled right into your main inbox. Send and receive email from your legacy AOL, Yahoo, Comcast, etc. accounts with no effort after initial setup!
Microsoft even offers a fairly decent mobile app targeted at Android users (sorry, iPhone folks) which mimics the email experience on the Windows Phone side of things. And I happen to enjoy it. The app isn't getting the hottest reviews on Google Play, and I'm not dismissing all of the negativity, but in my own personal time with the app, I didn't experience any of the slowdown or gripes that some people highlight. It could be that I haven't spent enough time on it. Try it for yourself, I say, before passing judgment.
A screenshot from the Outlook.com Android app, culled from the Google Play listing, is shown above. If you've ever touched a Windows Phone before, you would be hard pressed to notice a difference. Folders slide in from the left side; there are extended options that are highlighted in the screenshot; and the buttons used are large and easy to use without making mistakes.
There Has Got to be a Catch
So what downsides could possibly exist to using Outlook.com for your own domain's email needs? There's no such thing as a free lunch, and Outlook.com doesn't break that mantra. While many reviewers have focused solely on subjective opinions about usability and how the service (rightly or not) compares to Gmail, here are some solid objective points that may sway your decision: the web interface is ad-supported; accounts that sit dormant for 365 days can be closed; there is no uptime guarantee and no real support outlet when things break; and Microsoft could change or pull custom domain support at any time.
So just like anything, Outlook.com has its upsides and downsides. But as a clear alternative to the free Google Apps edition that was wiped away, it's a pretty positive option for those who do not want to keep using Outlook locally to store their PSTs, or who are tired of the 1990s SquirrelMail interface from their webhost's free email service.
How Can I Get My Domain Email Working on Outlook.com?
The procedure for switching your domain to use Outlook.com as its email backbone is pretty smooth, as long as you follow the instructions. Microsoft doesn't advertise these instructions publicly, but numerous individuals online have created similar tutorials since last year showing the same results. Is there a chance that Microsoft may pull the plug on this at any time? Sure -- but then again, Google shocked us when Google Apps Free Edition died, too.
I am going to walk you through a step-by-step process of turning Outlook.com into the funnel for all of the email for any custom domain you may own. Yes, this means you can potentially use this for business email service. If I said you couldn't, I would be lying. But should you?
As an email service consultant to my clients, I would never recommend this for any sizable operation that needs 24/7 uptime and has to have an outlet of immediate support when things break. In some situations -- like a one-person small business or a professional organization with no IT budget -- it may be justifiable. Just don't come crying to Microsoft if things don't work as intended at 3am when you need to get that proposal or other important email out.
With that said, as a host for a personal email domain, Outlook.com fits the bill nicely. My own personal website domain email doesn't have to be operating like clockwork, as it only deals with a handful of messages per week (non-business related) and a little downtime won't hurt anything. So I decided to make the move and see what Outlook.com had to offer. The process was seamless and the transition took little effort.
Here's how you can do the same!
1) Sign into the Windows Live Admin Center. This is where Microsoft has tucked away the slightly-hidden area that initiates the capability for Outlook.com to connect with your domain's incoming email. This is located at: [https:]. You can click on the Get Started link on this page after signing in.
2) Plug in your custom domain name. From here, you need to merely enter your custom domain name into the proper box. And be sure that "Set up Outlook.com for my domain" is chosen just above the Continue button.
3) Confirm your settings look correct. This is where Microsoft ensures everything looks right, and also asks you to accept their TOS for Outlook.com usage with a custom domain.
4) Prove your domain ownership through an MX change. This is where you need to be able to log in to your registrar to make the change that enables mail flow to hit your new host, Outlook.com. Be aware that this WILL start routing email to Outlook.com fairly quickly (depending on the TTL, or Time to Live, setting on the MX record currently in place). Don't go past this step if you are unsure about using Outlook.com for your ongoing email needs.
The first graphic below shows the screen where Microsoft gives you the necessary information to change your MX record. The second is a sample shot from my domain registrar, GoDaddy, for changing the MX records in their DNS Zone Editor. This process is different on every domain registrar, so consult your registrar for the correct instructions. Call support if necessary -- this step can have devastating consequences, including lost email. (A quick way to verify your DNS changes is shown in the sketch after step 7 below.)
5) OPTIONAL but recommended -- SPF record creation. Some hosts like 1and1 do not allow TXT records in their DNS settings area, which prohibits you from using this spam-fighting feature, but most other reputable domain registrars support them. You can use the TXT record entry option at your registrar to add what is known as an SPF record. This simple little entry allows other email services that enforce the functionality to reject spam that would otherwise masquerade as coming from your domain. It's not critical to have, but it does help put a dent in the toolbag of the scummy spammers out there slapping your email domain on their messages. (The same sketch after step 7 checks for this record, too.)
6) Add your email accounts, up to 50 of them. This is where you will now be allowed to create all of the addresses (or accounts) you need on your new Outlook.com backbone. You have full control as the administrator to make any combination of addresses, up to 50, and give an inbox to everyone who needs one. This beats the 10-account limit the free Google Apps used to have, so in some ways, this is actually pretty neat! If you use Outlook.com to host your family's personal domain, you can give every uncle and aunt their own email account.
7) Sign into your new Outlook.com powered custom email account! Now that you have set up your account, you can freely sign into your inbox and start playing around. Go through www.outlook.com just like any traditional user, and use the custom email domain address and password you just created. If you got the steps above right, you should see your inbox, and you can even write your first email.
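If you'd like to sanity-check steps 4 and 5 without waiting on trial-and-error test emails, you can query your domain's DNS records yourself. A minimal sketch using Python and the third-party dnspython library (pip install dnspython); the domain here is a placeholder, and the MX value you see should match whatever host the Admin Center handed you in step 4:

    import dns.resolver  # third-party library: pip install dnspython

    DOMAIN = "example.com"  # replace with your custom domain

    # Step 4 check: the MX record should point at the host Microsoft gave you.
    for record in dns.resolver.resolve(DOMAIN, "MX"):
        print("MX:", record.preference, record.exchange)

    # Step 5 check: look for an SPF entry among the TXT records.
    try:
        for record in dns.resolver.resolve(DOMAIN, "TXT"):
            text = b"".join(record.strings).decode()
            if text.startswith("v=spf1"):
                print("SPF:", text)
    except dns.resolver.NoAnswer:
        print("No TXT records yet -- DNS may still be propagating.")

Keep the TTL from step 4 in mind: depending on what it was set to, changes can take anywhere from minutes to a day or so to show up everywhere.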
There may be a point where Microsoft asks you to verify your mobile telephone number in order to fight fake bot-created accounts; this is to be expected. I only had to go through it once and was never asked again. It looks a little something like this:
Even in light of some of the negatives, like the slightly overbearing ads and the 365-day dormant account closure policy, the upsides of this completely free and plausible platform for your custom email domain needs are plentiful. Unlimited inbox storage space, check. Mobile access on all modern smartphones as well as in desktop Outlook, check. A darn good web interface that rivals Gmail... check as well.
I am going to continue testing the platform to see if I find any items that throw me a curveball, but the service as a whole seems pretty straightforward. It will be interesting to see if Microsoft changes any of the functionality surrounding custom domains on Outlook.com in the future, seeing that the Windows Live Admin Center is chock full of references to the now-dying Hotmail platform and 'Windows Live' brand. It could simply be that Microsoft is in the middle of a branding transition for its cloud services, or it could be the vestige of soon-to-be-closed functionality. Only time will tell.
For now, I am making an educated guess and expect Microsoft to fully keep supporting this sly alternative to the now-dead Google Apps Free Edition. Besides, Microsoft has made its intentions plain in its bitter fight to keep Google at bay in the online services arena, especially with free email. It just integrated Skype connectivity for all Outlook.com users, and the 'Scroogled' campaign is by all accounts far from over. If you don't have the budget to pay for Google Apps or Office 365, then Outlook.com is a darn good option in my eyes.
Photo Credit: 2jenn/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over eight years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud-hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
With Microsoft's recently-announced reversal of its anti-consumer DRM policies for the Xbox One, the next gen console war has suddenly been brought back to a level playing field. It's no longer a battle of who had the upper hand at E3 this year, who does/doesn't require impractical internet connectivity 24/7, or whose specs are better on paper. The most hated DRM underpinnings on the Xbox One have been unequivocally rescinded, meaning that we can finally have an honest discussion of what the two next gen consoles will offer apples-to-apples.
While Sony has been squarely riding its momentum on cruise control since E3 this year, trying to keep the debate on its terms after usurping Microsoft in the opinion war, reality is finally returning to the middle ground. And seeing that we are just months away from these hotly anticipated console launches, Microsoft is slowly unveiling its winning plans for the Xbox One.
Microsoft's vision for the next gen console war isn't concentrated on a single area. Instead, it's putting its money on a holistic approach to winning the hearts and minds of gamers and media consumers alike. Here's my take on why Microsoft is on track to make it happen.
4. Hardware Specs Alone Won't Win a Console War
The PC gaming culture is used to duking it out for bragging rights. Better video cards, more memory, faster processors -- in general, these usually equate to better in-game performance. And rightfully so: PC gamers lining up spec sheets against one another, like car enthusiasts of the digital realm, are engaging in an apples-to-apples debate. That is, PC gaming performance is measured along a level playing field representing the same underlying platform.
Unlike consoles, the guts of any gaming PC can be overclocked, decked out, and expanded because, in the end, we're usually talking about the same software experience. Whether it be Crysis 3 or Call of Duty or any other PC game, computer gamers are all purchasing the same bits and running their titles on some flavor of Windows (yes, I know Mac gaming exists, but it's still the exception). When one set of variables is relatively stable (the software platform), raising and lowering the countering variables (hardware specs) has a measurable, meaningful impact on what a gamer sees.
So why are all the online discussions describing every bit and byte by which the PS4 outpaces the Xbox One essentially hot air? Because of the very plausible reality that the Xbox One will be able to do 'more with less', just like the Xbox 360 has consistently done against the PS3.
Just as a few examples, NowGamer ran a head-to-head comparison of the multi-platform title "Rage" and found that the 360 version looked better in most instances. VideoGamer.com came to similar conclusions about the 360 version of Grand Theft Auto IV. And CVG rounded up comparisons of four other hot titles, including Call of Duty Black Ops, and offered up reasons why the 360 versions just plainly looked and played better.
Hold on a minute: weren't the PS3 and its Cell processor dubbed upwards of 3x more powerful than the Xbox 360 at launch? Even more troubling for Microsoft should be the claim that the PS4's GPU is 7.5x more powerful than the Xbox One's GPU chip. But if the current 360-PS3 generation console battle has taught us anything, it's that internal specs are little more than cannon fodder for fanboy wars. Xbox executive Albert Penello was absolutely right when he called the spec-war of upcoming consoles "meaningless".
Does a more powerful game console lead to better games? Even though the PS3 has more internal horsepower, this isn't translating into a better gaming experience -- at least not one that shows in the sales charts. The Xbox 360 represents a full 60% of the top 10 best selling console games in the USA from Jan 1 to Aug 3, 2013. Of those 6 Xbox 360 titles, 4 have PS3 versions, further lending credence to the notion that hardware specs have little correlation with great games.
Multiple big name game developers have stated just as much publicly. John Carmack, the mastermind behind the engine powering the Doom series, let it be known that PS4 and Xbox One are "close" and "common" in capabilities based on his early experiences with the consoles. So even with a superior spec sheet, this doesn't necessarily translate into everyday performance or graphical advantages.
This is further exacerbated by Sony's unorthodox stance on easing development for its consoles. The PS2 was notoriously difficult to develop for, and the PS3 was no different -- with the CEO of Sony Computer Entertainment actually defending the practice back in 2009. Kaz Hirai of Sony CE said: "We don't provide the 'easy to program for' console that (developers) want, because 'easy to program for' means that anybody will be able to take advantage of pretty much what the hardware can do, so then the question is, what do you do for the rest of the nine-and-a-half years?"
I'm perplexed by his reasoning for making Sony's consoles so tough to write code for. If game development in Sony's eyes is an exclusive Club Med for the select few, then it is going to continue fostering a culture of sub-par development for the PS3 and, soon, the PS4. Unlike the Xbox 360 and Xbox One, which are built with a knack for bringing in development knowledge from the PC game arena (where many budding game developers get their initial experience), Sony's approach is to hand game developers a completely new 'vision' to develop for with each new PlayStation release.
Is this working for them? Some people would say yes, but I'm going to disagree -- especially with how many vocal developers are speaking out against Sony's arcane development requirements.
Am I necessarily saying that there aren't titles which can leverage the power of Sony's systems? Of course not. Crytek, for example, publicly admitted that it was harnessing the prowess of the PS3 a bit better for Crysis 2 back in 2010, and other titles out there are in the same boat. But the truth of the matter is, there is no chorus of developers coming out and talking about the easy time they're having working with Sony's systems. Some development shops have much greater people-power to dump into learning the ins and outs of Sony's development style, and for those that do, the results shine. I find them to be more the exception than the rule after scouring opinions from developers across the industry online.
For its part, it seems Microsoft isn't immune to wanting its own share of the bragging rights. It has been reported that Microsoft is upping the GPU core clock speed by 53MHz, from 800MHz to 853MHz. It remains to be seen how much of an improvement this will bring, but this late in the game, Microsoft probably had good reason internally to call the audible and tweak the Xbox One's hardware specs. This speed increase doesn't take it over the top against the PS4's GPU, but it should add some extra kick in the long run.
Clock speeds and bandwidth pipelines aside, history has already told us: if you can't effectively utilize the hardware you're given, spec sheets are worthless. As far as gaming consoles go, of course -- PC gamers need not apply here.
3. Toppling the Leading Incumbent is Easier Said than Done
The upcoming console war between the PS4 and Xbox One doesn't resemble a traditional election in most respects. But look at it another way: both consoles are vying for your vote -- albeit a dollar vote. And in this light, just like in an election, Microsoft and Sony are each fighting for mindshare, the best titles, and any other edge that will give one side a leg up.
Yet if the console war can take anything away from political life, it's that toppling incumbents is a hard thing to do. Just ask Mitt Romney. He was riding waves of positive signs before the 2012 US Presidential Election, from highly-accurate college researchers predicting certain victory down to polls which were showing Obama falling flat due to the economy and the rising Benghazi scandal. Even former Bill Clinton confidante Dick Morris predicted a huge victory for Romney.
They were all wrong. Incumbency, it seems, blindsided everyone and once again proved how large a mountain the underdog has to climb in order to unseat a sitting leader. When it comes to current generation gaming consoles, the Xbox 360 is by all accounts the leader of the pack (in the States specifically, which is where I'm focusing). According to official sales data from NPD, as of the end of June 2013 the Xbox 360 has held the Wii U and PS3 at bay for nearly 30 consecutive months in the United States. That's 2.5 years of leadership in the sales charts; a remarkable feat for a relative newcomer to consoles (especially compared to Sony) that only made its debut with the original Xbox just over a decade ago.
And being the best selling game console for that long carries quite a bit of 'good karma', so to speak. I already showed above how the 360 carries 60% of this year's best selling USA console game titles so far, and it may finish the year even stronger if the 360's holiday season delivers on hot up-and-comers like the new Assassin's Creed, Call of Duty, and Battlefield games.
Gamers in the States have a clear affinity for the 360 and its game selection. Where the PS3 carries a hardcore gamer stigma, and the Wii (and Wii U) sit on the younger and more gimmicky side of gaming, the Xbox 360 has been able to carve out a following among 'average' gamers: those who enjoy a wide variety of titles and happen to leverage their console for more than just gaming.
2. Microsoft's Online Gaming Backbone Puts Sony (and Nintendo) to Shame
For the longest time, Sony's one-up on Microsoft in the online gaming arena was the fact that PlayStation Network on the PS3 was always free. Why pay for something that someone else can give you for the price of nada?
Yet, unfortunately, Sony learned a long, hard lesson with its laissez-faire approach to online gaming: a free service that happens to be fairly cruddy may as well not exist in the first place. This reality came full circle back in 2011, when Sony had to keep PSN out of commission for nearly an entire month after what was later called a security breach. The full truth about this incident likely still has not been exposed, and I fully believe more happened behind the scenes than Sony wants to publicly admit. It's hard to believe that the second largest online console gaming infrastructure could be brought to its knees for weeks on end.
Regardless of what was or wasn't at play in Sony's PSN cleanup of a lifetime, the fact that it has done a full 180 on its promise of a "free PSN" speaks for itself. Back in June, Sony did some careful word-treading in describing why it finally decided that PS4 owners would be charged for online gaming in the form of PS Plus subscriptions. This equivalent of what the Xbox 360 has had since inception, Xbox Live Gold, is going to be a requirement for playing any PS4 titles online when the console launches.
I'm calling it a silent admission that the "free to play" approach to PSN just wasn't working. The month-long outage of 2011 aside, the whole PSN experience is a disjointed, half-baked interface which barely scrapes the surface of what Xbox Live offers. Sure, it delivers the same access to Netflix and other streaming services as Xbox Live, but households and gamers alike are yearning for more: a clean experience that ties an online community into seamless game purchasing, app usage, and everything else a modern console can offer. I just don't see it from PSN today, and I'm truly hoping the PS4 iteration of this ecosystem improves to where it should be -- especially if Sony is putting a price tag on the functionality.
Sony spent years chirping on Microsoft for charging gamers on what it was "giving away for free" on the PS3. After a less than stellar showing by PSN since the PS3's launch, and a blistering month-long outage that exposed its flaws back in 2011, Sony changed course: paid PSN Plus will be required to play online with the PS4. While PSN is playing catch up, Xbox Live has been growing and thriving for Xbox owners since its release in 2001. Anyone calling PSN a better ecosystem than Xbox Live clearly hasn't been on XBL lately. (Image source: The Verge).
And let's not forget that Microsoft has also let slip that self-publishing on the Xbox One will be made easier than it ever has before. Details are still scant, but it seems that Microsoft wants to unleash the game developer in each and every one of us. Microsoft's Corporate VP for Xbox, Marc Whitten, explained the basics to Engadget. "Our vision is that every person can be a creator. That every Xbox One can be used for development. That every game and experience can take advantage of all of the features of Xbox One and Xbox LIVE. This means self-publishing. This means Kinect, the cloud, achievements. This means great discoverability on Xbox LIVE."
Streaming video? Check. Enhanced TV DVR functionality? Check. Online gaming? Check. And now making your own games right in your living room? You bet.
1. Modern Consoles Are Entertainment Hubs -- And the Numbers Prove It
Let's face it: console gamers who consider themselves 'pure gamers', using their consoles for gaming and nothing else, are the clear minority this day and age. Market research firm Ask Your Target Market (aytm.com) released numbers in November 2012 which show game consoles are becoming a central fixture of the modern living room -- and not just for gaming, but for many other things like watching videos and listening to music.
To put things in perspective, only 30% of those who responded to the AYTM study claimed to use their consoles just for gaming. A whopping 69% of respondents claimed to use their consoles for much more, whether to watch videos, play DVDs, or use the internet connected apps available to them. Is Sony's approach of marketing the PS4 as the "gamer's console" necessarily the best one, then? I truly don't believe so, if studies such as the one above are an accurate depiction of what we're doing with game consoles today.
Another influential body in gaming, the Entertainment Software Association, released its yearly "2012: Essential Facts" report, which looks at trends in the video gaming industry. For starters, nearly half of US households (49%) own some sort of dedicated gaming console. The average age of gamers today is 30 years old. And there is almost no gender gap when it comes to gaming in the US today: 53% of gamers are male and a full 47% are female. These demographics further reinforce the notion that hardcore gamers are not the ones controlling sales charts these days. It's the thirty-something family, likely with kids, leveraging their game console for the all-around media experience. Gaming is just one slice of the usage pie these days; far from the sole activity it represented back in the original Xbox and PS2 era.
Another article, this one a CNN news posting, reported that 40% of all activity on the Xbox 360 is non-gaming related. The author correctly notes that much of this comes from Netflix's and Amazon's respective video streaming services, which are dirt cheap compared to a night out at the movies. That 40% of non-gaming activity, Microsoft says, translates to about 30 hours of video watching per month.
Think those numbers are off? Nielsen has already told us that 65% of game consoles sit in the living room today. If you can connect your own dots, it's not hard to imagine why video watching has become so popular on game consoles.
And the same CNN piece above touches further on something I alluded to earlier: the 'War of Graphics Capabilities' is becoming ever less relevant to most casual game players. "Gamers that care intensely about graphics will continue to do so, but I think there are fewer now than there were in the past," said the Creative Director of Zynga, Paul Neurath. "Big leaps in graphics no longer exist. Unless there's some futuristic holographic display or direct brain implant we don't know about, it's hard to get a lot better."
Which leads me to my next point. While Apple and Google are fighting tooth and nail to get their first party devices into the living room, Microsoft comfortably already has the attention of many households. In terms of value, the disparity couldn't be any greater. Sure, I could get an Apple TV device for $100. It's surely a cheap price point, but considering that I could opt for a low end Xbox 360 for only $200 -- double the price tag of the Apple TV -- and get three to four times the return on entertainment value, is there any comparison anymore?
The Xbox 360 can do everything the Apple TV can (barring some Apple-specific functions) plus gaming, web browsing, Skype for video chat, and a growing array of access to live streaming TV events. The Apple TV looks more like a paperweight in contrast to the 360 when you pin them head-to-head, especially on price and value.
Microsoft has been suspiciously silent about it since the Xbox One touched down, but the recently revealed IllumiRoom technology has some exciting potential. This Kinect-enhanced projector would enable players to light up their whole wall with fluid game experiences that are literally larger than life. It could very well be the ace up Microsoft's sleeve that gives the console a decisive edge in the race for meaningful technology innovation. (Image Source: Microsoft)
Today's households are increasingly yearning for an all-around media device, and the Xbox 360 has been carving out its digital strategy for a number of years. But the head start Microsoft has over Sony is going to reap its rewards tenfold as Microsoft's content partnerships continue getting more numerous, and the justification for keeping the cable line continues to shrink.
And in the same way Microsoft's ingenuity led to the Kinect camera (which Sony quickly copied to keep from looking out of touch), it is busily preparing other neat innovations for the living room. While it likely wasn't ready for a primetime showing at E3 this year, Microsoft's research team has already released video of its upcoming IllumiRoom technology. Those wanting to expand their gaming horizons past the 50 or 60 inches that max out most 'affordable' consumer-level TVs will be happy to know that IllumiRoom is looking to take the game experience onto your entire living room wall. You really have to see it for yourself, as words cannot do this new technology justice.
Whether or not IllumiRoom will be the killer Xbox One feature that gets tacked on after the fact, like the Kinect was for the 360, remains to be seen. But on the whole, it's utterly clear that Microsoft is dedicated to becoming the all-inclusive console choice.
Holiday 2013 Sales Will Tell All
The console war will get quite interesting as it heats up for the holiday season. Sony is aiming its sights directly at its most loyal following, hardcore gamers. Microsoft is going all-in on giving families a console that is just as appealing to frequent game-a-holics as it is to mom, who just wants to kick back with a few hours of Netflix. While some analysts are already making bold predictions about next-gen console sales, I'm going to take a cautious tone and play the wait-and-see game. When it comes down to it, no one will remember what happened at E3 this year, and holiday shoppers certainly won't be carrying around spec sheets comparing hardware between the giants. Instead, choices will be made on what provides the immersion and breadth of experience households are looking for today.
I'm not a hardcore gamer anymore. I'm looking for the next gen console that gives me the most value for the money. So far, the Xbox One seems to be the device that will deliver on my needs this holiday season. Here's hoping Microsoft can bring that price tag down a bit -- my only gripe with the Xbox One so far.
I've long been a believer that a judgment gap, influenced largely by negative media coverage, is what continues to hold back cloud adoption among small organizations. And judging from the results of a recent study completed by comScore, my intuition has been fairly on track. The biggest issue surrounding cloud uptake, at least for small businesses worldwide, seems to be none other than one of perception.
How so? The study, which surveyed companies with between 25 and 499 computers in the USA, Germany, France, and the UK, found that 42 percent of small businesses which had yet to adopt cloud technologies were concerned about reliability and uptime. Likewise, a full 60 percent cited data security issues as their reason for staying cloud-free.
In stark contrast, however, the study highlighted numerous beneficial aspects that companies which adopted the cloud were enjoying:
Adrienne Hall, the general manager of Trustworthy Computing at Microsoft, highlighted the findings of the study. "There’s a big gap between perception and reality when it comes to the cloud. SMBs that have adopted cloud services found security, privacy and reliability advantages to an extent they didn’t expect," she said. "The real silver lining in cloud computing is that it enables companies not only to invest more time and money into growing their business, but to better secure their data and to do so with greater degrees of service reliability as well".
Is the cloud really the black hole the media makes it out to be? If a recent comScore study has any weight behind it, the only thing seemingly holding small business owners back is an incorrect perception about the cloud and its numerous XaaS offerings. (Image Source: Microsoft)
While the study itself didn't delve too far into where this perception gap stems from, it's not hard to connect the dots if you've even had a single eye on the media over the past few years. While uptime, security, and costs have been near universal benefits of cloud adoption, business owners are getting, for lack of a better term, 'spooked' due to the disproportionate coverage garnered by even routine outages.
InfoWorld even dedicated an entire slide series to "The Worst Cloud Outages of 2013" and noted examples like Amazon's AWS outage from January, which lasted a mere 49 minutes, and Microsoft's two-hour outage of services like Office 365 in early February. In all fairness, some outages do last longer, but these are few and far between.
To put things in perspective, the few customers my IT consulting company has left running on-premise servers like Exchange experience, on average, about 4-8 hours of downtime from first report to final resolution. I'm not referencing little issues; I'm talking about items you would classify as 'outage' worthy. These problems are exacerbated as the systems age, especially when clients prefer to run their IT operations on a shoestring budget. While they think they are saving money on recurring cloud costs, they forget about emergency labor fees, time spent on the phone troubleshooting outages, and other negatives.
Some of these over-emotionalized reports of cloud outages are clearly out of touch, and their effect on perception is still pretty strong. I'm in discussions every month with customers who are sifting through information that is disproportionately bogged down with negative news reports about small outages of services they may be researching.
When I ask them how many hours their own internal systems spent out of commission the last year, the focal point of the discussion immediately changes as reality sets in. Just because there is a lot of heat on cloud providers in the media for downtime, it doesn't mean their systems are faring any worse than on-premise installations. In fact, I'm finding the exact opposite to be the case more than 9 times out of 10.
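To make that comparison concrete, here is the back-of-the-envelope availability math in Python, using the figures above: the 49-minute AWS outage versus a 6-hour on-premise incident (the midpoint of the 4-8 hour range I typically see). These numbers are illustrative only, not a formal SLA calculation:

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def availability(outage_minutes):
    """Percent uptime over one year, given total minutes of outage."""
    return 100.0 * (1.0 - float(outage_minutes) / MINUTES_PER_YEAR)

print("Cloud, 49-minute outage:   %.3f%%" % availability(49))      # ~99.991%
print("On-premise, 6-hour outage: %.3f%%" % availability(6 * 60))  # ~99.932%

Three decimal places may look like splitting hairs, but spread over a year that gap is the difference between under an hour of downtime and the better part of a workday.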
SpiceWorks Study: SMB Cloud Adoption Up, and Still Rising
In a completely unrelated set of findings from enterprise helpdesk giant SpiceWorks, SMB adoption of cloud services has been surging upward since 2010. In its annual "State of SMB IT" report released a few months back, the organization found that current cloud adoption by small businesses stands at about 61 percent, but is expected to jump to 66 percent by the end of 2013. While growth has slowed considerably compared to 2010 and 2011, the fact that over three-fifths of small businesses are in the cloud by now is a wonderful benchmark for the industry.
SpiceWorks reported in its latest "State of SMB IT" report that cloud adoption among small businesses is currently at about 61 percent, and is projected to hit 66 percent by year's end. This growth is very likely being spurred by rock-solid solutions like Office 365, Google Apps, and Windows Azure -- all platforms my own customers are adopting at unbelievable rates. (Image Source: Spiceworks)
That's not the only bright spot in the SpiceWorks findings. Virtualization of servers, which was only under consideration by most small businesses just 3-4 years ago, has taken the industry by storm as heavily as cloud services. 72 percent of small business IT pros claim they are leveraging virtualization technologies right now, and an even higher share -- 80 percent -- plan on adopting tools like Hyper-V or VMware by the end of the year.
I've been getting into the virtualization arena myself, but have been steadily recommending Windows Azure for clients needing to take down physical boxes. The service has been exceeding customer expectations in reliability and cost, as I mentioned in my recent review.
Other Factors Likely to Notch Adoption Upwards Soon
I personally think the wholesale move to the cloud by small businesses is being carved out by two camps: those which adopted early, and those which will make the move when the lifecycles of their on-premise systems come up for consideration again. I'm seeing this in my own customer base quite a lot.
Some clients that were on near-dead Exchange 2003 systems made the move to Google Apps or Office 365 when their time was up, and the remaining suspects running newer platforms like Exchange 2007 and 2010 are merely biding their time -- or, more appropriately, squeezing every last dime of ROI out of their physical servers.
Either way, the overall movement to and adoption of cloud services is gradually becoming an easier sell. First, with so many others having already acted as the 'guinea pigs' for the rest of us, a lot of success stories are being shared, which helps overcome the media's nasty propensity to jump on cloud outages with disproportionate vigor.
And second, as the economies of scale behind these large platforms evolve, as is the case for Windows Azure, prices continue to drop. If the cloud is to live up to its cost-saving reputation, prices on these SaaS and PaaS offerings need to keep creeping downward.
Are pure cloud solutions always the right choice for everyone? Of course not. Hybrid clouds and even fully private clouds are acceptable alternatives to the big boys on the market like Office 365, Google Apps, and Windows Azure. Some industries have very specific data privacy concerns which cannot be properly accommodated in a public cloud ecosystem. But as the industry continues down its path of maturity, these issues will continue to shrink.
If there's any sector out there leading the charge on cloud adoption, it's quite clearly small business. And who can blame them? With budgets that can rarely support dedicated IT staff, and a need to spend money on profitable functions, keeping dinosaur technology systems up and running is becoming less and less appealing. Let's see if this trend trickles upward into midsize businesses and the enterprise as well. Only time will tell.
Photo Credit: jörg röse-oberreich/Shutterstock
In what can be called nothing less than a clever use of well-planted deflection, the head of the NSA, General Keith Alexander, recently let loose that the NSA relies heavily on Microsoft SharePoint for its data sharing needs. Or, more accurately, he namedropped SharePoint to imply the mistaken notion that the mere use of SharePoint was the reason the NSA got breached by Edward Snowden.
The information was first picked up by The Register after the General was caught making the admission at a recently broadcast cyber security forum (which can be viewed on YouTube in its entirety). He said of Snowden: "This leaker was a sysadmin who was trusted with moving the information to actually make sure that the right information was on the SharePoint servers that NSA Hawaii needed".
Further in, Gen Alexander went on to say how sysadmins have a "need" to use removable media for their day to day jobs. He also touched on the fact that there was a "break in trust and confidence" due to the Snowden incident, and the NSA is purportedly picking up the pieces and fixing the broken links in its internal security procedures.
The funny part about this entire charade is that this isn't the first major SharePoint incident the government has dealt with. Remember Bradley Manning and his massive document dump a few years back? His recent acquittal on the most damaging charges against him has him back in the news, but Wired Magazine reported back in 2011 that Manning used similar scraping scripts to rob the US Military of a comparable treasure trove of confidential information off SharePoint backbones.
An easy cop out is to point the finger at SharePoint for these breaches. But the problem is, SharePoint itself -- its mechanisms for handling security, the code that controls the platform's workings -- had nothing to do with how these two individuals made out like bandits. The Register pointed to a group of researchers set to present findings on holes in SharePoint at DEFCON this year, but again, none of these supposed bugs was used in any way by Manning or Snowden. It's like blaming the manufacturer of the deadbolt on your front door for a criminal entering your home with a spare key. The US government has fallen victim to threats from within in every possible way.
If we're to use these incidents as examples of how enterprise security, especially in government, needs fixing, then we need to aim our efforts at the people and protocols behind how these 'big data' systems are employed.
'Best Practices' Documentation on SharePoint is Prevalent
If the NSA and its fellow government IT administrators are anything like the corporate world, their approach to securing IT infrastructure (especially SharePoint) is probably way too laid back for its own good. This lax attitude towards securing data sharing platforms has proven to be an issue time and time again. Most recently, InfoSecurity magazine covered a (not-so) shocking study claiming only about one-third of businesses using SharePoint have proper security policies in place detailing usage and access of the leak-prone platform.
The same study found that 65% of respondents admitted their organizations are not classifying data into categories or sensitivity levels in any way. And again, while the study doesn't cover US government branches, it wouldn't be surprising to see government IT admins succumbing to the same laziness as the corporate sector. Judging from the Manning and Snowden incidents hitting the US government only three years apart, it seems that the curator of some of the biggest secrets of the free world still can't get it right.
The NSA doesn't seem to be alone in its sub-par security practices surrounding SharePoint. A recent study by Emedia, covered in full by InfoSecurity magazine in February 2013, found that only about one-third of organizations with 25-5000 users employing SharePoint have security policies covering the platform. Even worse, just over one-fifth, or 22%, admitted that they don't have one and won't be making one. Security complacency seems to be an industry wide problem. (Image courtesy: Boldon James)
Call it poor planning, lack of oversight -- whatever you wish. It's not like the documentation isn't out there for how to properly implement SharePoint environments for highly secretive and confidential scenarios. A simple Google search yielded hundreds of helpful results, and I'm as far from a SharePoint administrator as could be. For example, Microsoft has this publicly available document on "Governance 101: Best practices for creating and managing team sites" with basic steps on how permissions should be doled out in SharePoint environments.
Even further, Microsoft has a "Plan your permissions strategy" article posted that goes into detail about the various aspects that help restrict data access in SharePoint. And finally, they published an in-depth TechNet wiki on security planning techniques that go deeply into best practices surrounding proper preparations for sites and content in SharePoint.
Not enough reading material for a head sysadmin? Just Google "security planning for sharepoint" and over 7 million results come back. To claim that resources for properly planning and executing a confidential SharePoint environment are hard to come by is nothing short of lying through one's teeth.
Lessons for the Enterprise: How to Prevent Your Own Data Leak Disaster
Even though my company primarily consults smaller organizations ('50 and unders', you could call them) on their IT needs, this doesn't mean the same security protocols don't apply. Basic principles like leveraging security groups and assigning rights on a "least privilege" basis are common-sense staples that any sysadmin worth his salt should know. I don't actively manage any SharePoint servers in my day to day work, but perhaps some outsider knowledge wouldn't be bad for those in charge of our nation's most sensitive SharePoint systems.
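Purely for illustration, here is a toy Python model of those two staples -- security groups plus least privilege. The group and user names are hypothetical; in a real deployment they would map to Active Directory security groups tied to SharePoint permission levels, not Python dictionaries:

# Rights flow only through group membership, and each group carries
# the minimum permissions its role requires (least privilege).
GROUP_RIGHTS = {
    "Finance-ReadOnly": {"read"},
    "Finance-Editors": {"read", "write"},
    "SP-Admins": {"read", "write", "manage_permissions"},
}

# Hypothetical users; nobody is granted rights directly.
USER_GROUPS = {
    "jdoe": ["Finance-ReadOnly"],
    "asmith": ["Finance-Editors"],
}

def can(user, right):
    """True only if one of the user's groups grants the requested right."""
    return any(right in GROUP_RIGHTS[g] for g in USER_GROUPS.get(user, []))

assert can("jdoe", "read")
assert not can("jdoe", "write")                # least privilege holding
assert not can("asmith", "manage_permissions")

The point of the exercise: when an auditor asks "who can manage permissions here?", the answer should be a short, deliberate list -- not a shrug.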
If you're tasked with protecting a SharePoint ecosystem, or other valuable data-rich infrastructures, here are some key steps that should be considered. A few items are really only practical for government implementation, but I'm going to mention them anyway (since our fine gov sysadmins could clearly learn a thing or two from the Manning-Snowden double header by now):
Can organizations of all sizes learn from the mishaps of the NSA and US Military? Surely. Common-sense, back-to-basics security principles don't take a genius to figure out. The unfortunate truth of the matter is that most system administrators think these kinds of incidents won't happen to them, or that security planning can be put off until later down the road. This kind of flawed thinking is what leads to oversights and downright fumbling when it comes to critical data breaches.
Remember: the Snowden and Manning incidents weren't examples of external threats. These leaks happened directly from within, behind closed doors that provided a false sense of security and, ultimately and ironically, enabled the largest data breaches the United States has known. It's time we stop blindly blaming the software developers behind the products we employ, and start looking inward at those in charge of the keys to the information.
Photo Credit: Picsfive/Shutterstock
After spending a number of years working in the educational tech sector, I can safely pinpoint the two camps that make up the meandering discussion about 1:1 computing plans for K-12 education today. On the one side, we have eager innovators who are determined to place a device in each student's hand -- even if that device fulfills nothing more than a checkbox on an administrator's 'five year outlook' plan.
And in contrast, we have the technical neophytes who are well entrenched in their opposition to devices in the classroom. These folks are the ones most likely to be invested in the "industrial force-feeding" approach to education, which by most accounts is falling flat on its face. As the US continues to slide in education, most recently ranked 17th globally, the debate is no longer about whether we need a wholesale adjustment of how we teach our youngest minds. Much more importantly, the discussion should be laser focused on how we get US education out of its growing rut.
The status quo, as it stands, clearly isn't working in America. As the rest of the world evolves in education, we've been selfishly holding onto the industrial force-feeding approach: fear of change; an endless reliance on the broken model of teachers' unions; and wasted funds dumped into black hole initiatives. Our continuing decline can definitely be turned around, and I fully believe forward-looking plans like 1:1 computing and flipped classrooms are the answer for the long run.
For as much as we claim to be spending on K-12 education, nationally we have little to show for it:
The above is just a sliver of the many issues plaguing our modern education system. For districts looking to infuse technology in an effort to overhaul the way they approach learning, there needs to be a focus on avoiding the mistakes being made nationally in emotionally-inspired initiatives.
Is an iPad in every pair of hands the silver bullet to solving this dilemma? Absolutely, positively not. Alan November penned an eye-opening piece surrounding the lackluster implementation of most tablet based 1:1 initiatives going into place around the country. He wrote, "Unless we break out of this limited vision that one-to-one computing is about the device, we are doomed to waste our resources."
Too many educational leaders are buying into the notion that cool tech will get us to where we want to be. Little do they realize that we are merely spinning in circles, placing high hopes on this year's newest iPad or Nexus without a clear justification for where the device fits into the students' overall learning infrastructure.
Liz Davis had similar thoughts on the iPad rollout in her district just outside of Boston, MA. Like many other short-sighted K-12 districts in this country, hers named a tool before a clear set of needs was even defined. "In many cases trying to make the iPad fit the curriculum has been like trying to fit a square peg into a round hole." My discussions with former colleagues in the high school education arena yield very similar emotions, so Davis' remarks are not unique.
#7: Follow Functionality, Not Fads, In Choosing Tech
Cool is nice. It makes students and teachers feel good. It gives administrators easy content for press releases trumpeting their plans. And many times, it masks the innate problem with spending large sums of money on technology that will do little to raise test scores or improve literacy.
While things like iPads and Android tablets are the in-thing right now, are they really the standard upon which we foresee our students learning 5, 8, or 10 years out? I'd argue the answer is 'no' more often than not. Having come from the educational arena as recently as last year, I know very well how group-think overtakes discussions on technology in the classroom. We fall victim to media hype over hard evidence that could easily be gleaned through simple, guided pilot testing combined with observation of other districts' maneuvers. The falsehood that what works in the consumer market is always applicable to the education market needs to be squashed fast and hard.
I'm not one to fall for the ridiculous argument that the PC is dying or going away soon. The mainstay of meaningful computing will continue to be a screen, mouse, and tactile keyboard -- whatever form factor that may entail in the marketplace today or tomorrow. But the argument that a limited-function tablet either replaces or eradicates the PC is willfully short-sighted in both thought and expectation.
#6: Touch Shouldn't Trump All Else
Aside from the grossly expensive (and highly impractical for education) Pixel, Chromebooks do not offer any form of touch capability on their K-12 focused models. Yet their uptake in schools nationwide has been skyrocketing, inching in on territory that Apple claims as its own with the iPad. What gives? Isn't touch the keyword to success in the classroom?
Far from it. Let me correct myself: touch is not the end-all, be-all of leveraging 1:1 devices in schools. If it were, we would have already been ditching PCs wholesale from schools in favor of iPads and Galaxy Tabs. While touch capability is a nice way to enhance the learning experience, it by no means defines the future of classroom learning.
Chromebooks don't have touchscreens, a monstrous App Store, or a fruit logo company backing them. Yet the value proposition they bring to the classroom is unparalleled due to a focus on computing features well suited for students and educators alike. (Image courtesy: Google)
Administrators need to keep in mind the balancing act that needs to be played when it comes to evaluating new devices for 1:1 programs. What kind of ports does the device offer? What is the average cost of repair for the platform? Does the device offer tactile-based input options for those who need it?
For example, handling video-out presentations on the iPad is a painstaking game of purchasing the right high-priced adapters and hoping the iPad will support projector mode in a particular app of choice. Devices like the Chromebook and Surface allow for near seamless projector usage through relatively low-priced dongles. I've even seen many Chromebooks (namely Samsung units) come with the needed VGA dongles in-box. This ease of connectivity extends to the standard USB ports that Chromebooks and Surface devices offer. Reducing the barriers to functionality is another key consideration for the connected classroom. The pretty packaging is only as nice as the experience it provides.
Zeroing the discussion in solely on devices drenched in touch-everything does nothing but play into today's marketing keywords. In three years' time, when the media is onto something new, early adopters of misguided technologies will wish they had broadened their horizons.
#5: What Do Repair, Replacement, Support Costs Look Like for a Platform?
Narrowing the discussion about a 1:1 device solely to its up-front per-unit cost ignores the ongoing costs of keeping this newfound ecosystem running. Articles such as this one by Mike Silagadze and another by Tom Daccord paint some realism back into the debate by highlighting the repair costs, maintenance tasks, and other challenges of keeping an iPad classroom functioning like a well-oiled machine.
SquareTrade, for example, took it upon itself to let loose some stats on 50,000 iPads that it ran across during repairs in 2012. This is the very company which insures all kinds of electronics for websites like eBay and other similar outlets. They get their hands on a wide array of hardware, and keep in mind that their experiences aren't even focused on an educational audience -- one which is quite rough and tumble on the tech it uses due to the age of the users at hand.
This aptly named 2012 iPad Breakage Report told us that one in 10 iPad 2 owners damaged their device within the first twelve months of having it, giving the iPad 2 an overall failure rate of 10.1 percent. Even worse, the iPad 2 was found to be 3.5 times more prone to accidental damage than the original iPad.
SquareTrade's iPad Breakage Report of 2012 was brutally honest about the cost of ownership when it comes to iPads. Converging a 1:1 computing plan around iPads comes with its share of headaches. (Image Source: SquareTrade)
And SquareTrade isn't alone in warning us about the repair costs of iPad ownership. USA Today ran a great story covering the findings of numerous districts across the USA and how much they are spending to keep their iPad fleets functioning. Marathon Venture Academy of Wausau, WI started providing iPads to sixth through eighth graders in 2011, and of 135 devices loaned out, a full 25 needed repair. A majority of the repairs were to replace damaged screens, at an associated cost of $275/unit. Seeing that a school can purchase a brand new Chromebook for the price of an iPad screen repair, it's easy to see why comparing long term costs is very important for tight bottom lines.
Still more is to be said about the MDM (mobile device management) capabilities offered by various platforms. Chromebooks are an excellent fit for districts with existing Google Apps for Education ecosystems, since Chromebook Management licensing runs a mere $30/year per device. No extra software, no extra licensing; it's all integrated within the same Control Panel that handles email and user management for the Google Apps ecosystem.
And Microsoft now offers a solution for Windows Surface tablets and traditional Windows devices in the form of Windows Intune. Educational pricing is not publicly described, but I have heard from colleagues that Intune licensing is "very affordable" for districts that already have volume licensing in place with Microsoft. This means districts can manage not only Surface tablets, but also Windows laptops, and even district-issued Windows, Android, and iPhone smartphones.
I don't know of any first party MDM solution from Apple, even in the upcoming iOS 7 that is about to be released. I'm not entirely surprised, which is why I've previously penned a piece on why I think the enterprise computing sector as a whole will never take up Apple as a Windows alternative, in light of its reluctance to get serious about seamless and inexpensive management and security of the devices it pushes.
#4: Lack of Staff Professional Development is Like Tossing Money Away
As the owner of a company that now consults school districts on their technology plans, I have zero patience for administrators who fail to see the importance of professional development when new tech is welcomed in. My feelings have been hardened by first hand experience working in education and seeing how hopeless new tech initiatives were when built on the assumption that "if you provide it, they will learn."
Merely laying claim to the fact that technology was purchased and dropped into classrooms doesn't change the fact that it will be next to useless without the correct training. And not just training, either: districts need to make concerted efforts to champion not only the functional aspects of technical aptitude, but also the integrative possibilities of transforming instruction through a digital paradigm.
Our schools have very successfully taught legions of students how to use Microsoft Word. This doesn't necessarily mean that they will have the intuition to leverage the software correctly towards meaningful content creation. The same dilemma exists when we are discussing 1:1 programs and the indiscriminate "dumping" of technologies on schools in the hopes that "spraying and praying" will solve all ills.
Only when teachers have a common understanding of where the technology is taking their instruction will the student body be capable of being led by this next generation of instructors.
#3: Collaboration Should Be a Focus of Every 1:1 Plan
I know very well that younger students embrace tablets because they fit the individualized focus on self-development at that age. But this point is squarely aimed at districts going 1:1 that are home to middle school and especially high school students. Tablets like the iPad are great devices for a single end user. But in a world where teamwork and group interaction make up a large majority of post-school work life, we should be preparing these older students for what will be expected of them after graduation.
If you look at the entire landscape of employment choices, very few industries resemble tablet-heavy classrooms. Pick your poison -- law, medicine, research, agriculture, high tech -- the people who make up these industries aren't aimlessly perusing app stores and creating self-contained content. It boggles my mind to hear about high school districts expecting students to be college ready when they are placing iPads in their hands and wishing them the best.
Kids have a drive towards working together and creating content cohesively, not just alone. Collaboration can't be an afterthought in making a concerted 1:1 computing plan for your district. Too many districts today are focused on how to make the case for costly "digital notebooks." (Image Source: Tabtimes.com)
While they come with their own flaws, Surface and Chromebook devices are much better suited to building knowledge in groups, sharing that wisdom, and leveraging everything the modern internet has to offer in curated experiences. iPads, Galaxy tablets, and the like are little more than glorified digital notepads that double as eBook readers. If our only goal is to rid classrooms of printed textbooks, then the tablet-first approach solves that one goal well, but little more.
The larger discussion of flipping classroom learning and creating purpose for the 1:1 environment goes a lot further than just saving a few trees (as noble as that is).
#2: Flipped Classrooms Are Here to Stay
I'm not the only one extolling the successes of the first generation of flipped classrooms in America -- from principals to teachers, all the way down to the students themselves. Not only is the quality of the education proving to be better, but accountability is becoming much more transparent and easier to gauge due to the inherent way technology is built into this new model from the ground up. Teachers' unions may be apprehensive about making the 'flip', but judging from what I have seen and read so far, this new teaching style is at the forefront of what tomorrow's classroom should be built on if tech is at its heart.
To this end, administrators who are signing off on 1:1 programs today without any second thoughts about how easy it will be for their students/teachers to engage in flipped classrooms are going to be behind the curve before their chosen tech even gets shipped out. Pilot programs today at K-12 districts should be either welcoming forethought into plans for flipping, or at the very least, observing how successful districts are making the move.
Khan Academy has been a key player in much of the focus surrounding 1:1 flips around the nation, and Bill Gates has already put his respected dollar vote on this up-and-coming name. As screencasting technology becomes prevalent and seamless on new devices, I foresee Khan Academy clones budding sooner rather than later.
#1: Stop Focusing on Consuming Content -- Producing It Matters Much More
If there's one big fault I place on the iPad-first approach to the K-12 tech mindset, it's that this device is placing consumption on such a high pedestal that organic production of content becomes forgotten. I argued this same point when I made the case for why the Microsoft Surface is much better suited for today's classroom than the iPad. There's something to be said for seeing a student engaged in learning by means of leveraging a cool app from the App Store. But seeing meaningful content produced, collaborated-on, and visually expressed in a classroom setting is every teacher's dream.
So why are we so entrenched in the belief that a single-function tablet is the savior of classroom learning? In the same brief I referenced earlier, Alan November quoted an unnamed district superintendent who was brutally honest about the 1:1 landscape so far, based on the programs he has encountered: “Horrible, horrible, horrible implementation from every program I visited. All of them were about the stuff, with a total lack of vision.”
Hybrid devices that combine the best of the tablet world with that of a traditional computing platform, namely the Chromebook and Surface, are ecosystems which do not place such an unbalanced focus on consumption. Of course, proper professional development and teacher engagement from the start are big keys to winning 1:1 programs, but districts moving students onto sole-purpose tablets are shortchanging the engaged 21st century student.
Asking so little of our students by pigeonholing them into an incessant culture of App Store consumerism amounts to placing a toddler into a walled garden. Have all the fun you want -- but the boundaries aren't coming down. It robs them of the intellectual growth we should instead be promoting. The bar is being raised all around them, whether we make the maneuvers necessary to prepare them or not.
Microsoft slipped one under the radar for everyone who relies on its Azure service for Windows virtual machines. In a move that took even me off guard, Microsoft has reversed a longstanding policy of not allowing any form of RDS (Remote Desktop Services) on Windows Azure.
Previous policies strictly enforced remote desktop access on Azure only for the purposes of "administration and maintenance." As of July 1, this stumbling block for many Azure early adopters is finally gone, with a few caveats which I'll point out shortly.
I've been following the silent cold war Microsoft is slowly ramping up in the desktop-as-a-service arena for the past half year or so. In many ways, Redmond is competing against a very unlikely foe: itself. Just like Microsoft is trying to wean people off traditional desktop Office and onto Office 365 plans, with the ultimate goal being Office Web Apps taking over entirely, its plans to combat a gargantuan user base of desktop Windows users may be brewing sooner than you think.
A week ago, I covered the plausibility of Active Directory-as-a-Service offerings through Intune or Azure -- or a little mix of both. The idea doesn't seem so crazy if you take into account Microsoft's aggressive behind-the-scenes research under the guise of "Mohoro". This top-secret initiative is supposedly a full blown effort to bring Windows desktop-as-a-service capability to the masses sometime later next year.
So why isn't Microsoft's latest laissez-faire about-face on Azure Remote Desktop that surprising? If you view it in the context of everything else that has seemingly gone Azure-centric as of late, it becomes clear that Microsoft is just bringing down the barriers to a quickly evolving 'Azure first' future.
What the new RDS licensing entails for Azure VMs
Remote Desktop Services (RDS) is a collective technical backbone baked right into Windows Server 2012 that allows for numerous use case scenarios. The most classic one, which we still use heavily with customers of my company, is what used to be called Terminal Services. Microsoft calls these virtual desktops run on a centralized server "session-based desktops" over RDS, and this is precisely the item which was strictly forbidden on Azure -- until recently.
Session-based remote desktops are very convenient in many ways. For one, Remote Desktop is a free tool that has been bundled with every copy of Windows since 2000. It replicates nearly everything that GoToMyPC or LogMeIn can do, minus a few bells and whistles. Yet companies looking to host these "virtual workspaces" for clients or for workers of their own organization were locked out from doing so by Microsoft's arcane licensing policies regarding RDS on Azure virtual machines.
Session-based Remote Desktop hosting was strictly forbidden on Azure virtual machines until July 1. While the relaxed rules do allow a lot of flexibility in using RDS on Azure going forward, the concession brings with it a requirement to license each device or user on Azure RDS with Subscriber Access Licenses (SALs), which are primarily geared towards service providers. That's a tough sell for companies heavily invested in RDS Client Access Licenses (CALs) through Enterprise Agreements.
Microsoft signaled that it is moving its Azure rules in the right direction when, on July 1, it opened up the use of Azure for serving virtually hosted desktops. Mind you, if you believe this means you can run your copy of Windows 7 or 8 in the cloud, you're absolutely wrong.
While Microsoft does allow desktops to be served off virtual Windows Server editions (like 2008 R2 or 2012), tossing client copies of Windows up into Azure is still strictly forbidden. By the same token, Microsoft's legalese prevents anyone from hosting client copies of Windows in any shared multi-tenant environment -- which pretty much covers Azure, Amazon AWS, Rackspace Cloud, etc.
Before you get too excited about the rule changes, Microsoft is sticking it to the large majority of organizations that could have benefited most from this move by requiring a wildly different licensing model than what corporations are used to. The traditional way that Microsoft has always licensed access to RDS session-based desktops is via RDS Client Access Licenses (CALs) sold through distributors or Enterprise Agreements.
Azure's foray into RDS desktop-as-a-service runs slightly off course and requires RDS Subscriber Access Licenses (SALs), which are procured solely through a Microsoft Services Provider License Agreement (SPLA). You can read through the full FAQ provided by Microsoft to see what steps are necessary to leverage RDS on the Azure platform. As far as I know, it's technically permissible for any organization to purchase these licenses, but it's just another layer of expense, especially if you're already knee-deep in traditional CALs.
RDS on Azure only a stopgap until Mohoro is ready?
It's pretty evident that Microsoft is going only halfway in on its licensing concessions for Azure RDS in order to keep an ace up its sleeve for something to come. More likely than not, Microsoft opened the gates to those who were clamoring for Azure RDS the most (that is, developers who needed it to host software for customers in the cloud) while keeping the average corporation or small business at bay -- presumably because it's got grander intentions down the road in the form of 'Mohoro'.
Looking at how Microsoft is falling in love with the subscription model that Office 365 has been pushing, it's not far fetched to believe that a hosted Windows desktop-as-a-service future is on Microsoft's near-term two-to-four year radar. And by strictly prohibiting the use of long-established CALs in favor of the more peculiar SALs under Azure's new RDS licensing model, Microsoft seems intent on making it a rocky road for any company looking to move its remote desktop backbone into the cloud too soon.
As a proof of concept, Azure's formal allowance of RDS may not be the golden ticket that many companies were hoping for by now, but it does solidify the fact that Azure is more than capable of becoming a remote desktop powerhouse -- even if it serves it up in a flavor that isn't our cup of tea right now.
Photo Credit: auremar/Shutterstock
When Active Directory first hit the enterprise computing scene over a decade ago, the tech pundits dismissed AD as just another Microsoft sideshow -- something that would never see wide-scale adoption in the face of NetWare and the other heavy hitters in the LDAP arena. Even longtime Microsoft watcher Paul Thurrott got it wrong and doubted its success. Thirteen years later, organizations small and large live and die by their Active Directory domains.
It's funny, then, that AD is the sole dinosaur running atop on-premise servers at corporations worldwide which supposedly "can't" be moved to the cloud. Microsoft has been busily converting its on-premise products into cloud platforms with relatively good results over the last three to four years. While Microsoft surely doesn't want to become a has-been in the physical server arena for organizations hesitant to move to the cloud, it has no doubt been playing both sides when it comes to on-prem vs cloud-hosted solutions.
For the uninitiated, Active Directory is the de-facto LDAP (Lightweight Directory Access Protocol) platform for most organizations 25 users or larger today. The service is a critical component to companies and entities that have sprawling corporate networks because it allows user accounts, network resources, computers and file shares to be managed quite harmoniously. Rights are easily controlled from a sysadmin's chair through the standardized concept of least-privilege. The form of AD found in traditional on-premise server scenarios is called Active Directory Domain Services (ADDS) and it has been around since Windows 2000 transformed the corporate networking scene.
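To make the LDAP side of that concrete, here's a minimal sketch of the kind of directory lookup AD fields all day long, written in Python against the third-party ldap3 library. The server name, credentials, and OU are hypothetical placeholders, and this illustrates LDAP in general rather than any specific deployment discussed here.

```python
# Minimal sketch: an LDAP user lookup against a hypothetical AD domain
# using the third-party ldap3 library (pip install ldap3).
from ldap3 import Server, Connection, NTLM

# All names and credentials below are made-up placeholders.
server = Server('dc01.example.local')
conn = Connection(server, user='EXAMPLE\\svc-reader', password='S3cret!',
                  authentication=NTLM, auto_bind=True)

# Pull every person-type user object in one OU, with group memberships --
# the bread-and-butter lookup behind logins and resource permissions.
conn.search('ou=Staff,dc=example,dc=local',
            '(&(objectClass=user)(objectCategory=person))',
            attributes=['cn', 'memberOf'])

for entry in conn.entries:
    print(entry.cn, list(entry.memberOf))
```

The least-privilege idea shows up even in this toy: the service account only needs read rights to the OU it queries, nothing more.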
So with seemingly every other core Windows Server offering being transformed and re-envisioned for the cloud age, what's holding Active Directory back? Why can organizations run their email, SharePoint, file shares, SQL instances, etc in the cloud today, while Active Directory stays stuck in a virtual purgatory? As an IT consultant by day who is trying to answer this very question for customers of mine, I decided to do some digging.
The short answer? Active Directory is well on its way to shedding the on-premise dependencies it has today. The writing is clearly on the wall if you know how to read Microsoft's bread crumbs. Microsoft is publicly keeping mum on the matter, but its intentions are speaking for themselves.
Windows Azure Already Has AD, Right?
A common point of confusion among fellow IT pros and customers is: doesn't Azure already have Active Directory capability? Yes -- and no. I was originally under the assumption that the cloud-hosted Active Directory offered through Azure was meant to replace its on-premise cousin. After some fact finding, I discovered this just isn't the case. Microsoft evangelist Chris Avis penned an excellent article about how Windows Azure Active Directory (WAAD) is not the same beast as the Active Directory Domain Services (ADDS) we have befriended -- and about how Microsoft fooled me.
For those organizations thinking they can shimmy their Active Directory domains right up into the Azure cloud as WAAD domains, this simply won't work. For one, Azure Active Directory is a service offering that (in its current form) only allows for identity authentication across cloud services like Office 365. It definitely is possible to sync up with your on-premise servers, but this doesn't reduce your on-premise physical footprint in any way. Companies trying to hit the magic number of zero servers onsite won't accomplish that goal through WAAD. Yet.
Windows Azure Active Directory (WAAD) is a great answer to the single sign-on dilemma posed by the plethora of cloud services that otherwise necessitate separate logins. But it does nothing to rid us of the on-prem AD server controlling users, computers, etc at our offices. Those looking for a server-less office solution should steer clear. (Image source: Microsoft.com)
Second, the Azure Active Directory service has no way of communicating with your physical computers at the office on its own. The problem is compounded by the fact that WAAD only works with User objects -- traditional user logins that merely provide authentication and single sign-on for access to cloud services. For WAAD to behave like traditional ADDS for company computers and resources, it would have to natively support the other object types, like Computers and Groups. Whether Microsoft is working to introduce this is anyone's guess.
In its current form, I find WAAD to be an extremely stripped-back version of AD that is only beneficial for organizations looking to leverage SSO (single sign-on) for cloud software like Office 365, SharePoint Online, etc. The rest of us looking for truly cloud-hosted AD must look elsewhere in Microsoft's portfolio.
Server 2012 Domain Controllers, DirectAccess on Azure
If there's one thing that Azure does darn well, it's hosting virtual machines in the cloud. My previous review of Azure highlighted its winning aspects in full detail. I've tried quite a few different vendors in the cloud VM space, including big players like Rackspace, Amazon EC2, and SoftSys, among others. The pricing and excellent interface of the Azure platform keep making it an easy recommendation for my customers. And knowing how determined Microsoft is to make Azure a bona fide staple in the marketplace, its dedication to innovation is evident in the enhancements that now seem to land almost monthly.
In the hunt for a better alternative to WAAD, I stumbled upon numerous online postings from individuals who have actually moved their Domain Controllers up to the Azure cloud. Not as part of WAAD, but as fully distinct instances running in private Server 2008 R2 or Server 2012 VMs. In fact, Microsoft's own support pages even passively condone the practice of setting up Windows Server Domain Controllers (using standard ADDS) in the cloud. Going a step further, Microsoft even advertises how you can leverage an Azure Virtual Network to connect your physical office(s) to Azure-hosted domain controllers over VPN for seamless integration. Could this be it?
For workers who happen to be situated at a single location and don't need to work from home, this is great. It really does replicate the functionality we know and love from physical Active Directory servers located onsite. However, the routers and firewall devices that Microsoft has stamped with approval for this functionality are few and far between. Cisco ASAs, Juniper boxes -- expensive gear that the smallest of small businesses just can't afford to purchase or maintain.
Fellow industry colleague Brian Lewis wrote about how Azure now allows for neat Point-to-Site VPN capability, but even this functionality is not a foolproof solution because of the complexity of the setup. Point-to-Site VPNs are a way for single computers to connect to Azure Networks remotely (from anywhere) for secure access to an Azure fabric.
This can be used, for example, to gain direct Remote Desktop access to Azure boxes without exposing the usual RDP port openly to the entire world. But for the small organizations looking to offload their IT infrastructure to the cloud, who likely don't have dedicated IT people, this is a tall order as a route to offloading Active Directory easily.
I ran across another possibility: DirectAccess. This excellent alternative to traditional VPN configurations has been around since Server 2008 R2, but it used to be a royal pain in the rear to configure and to meet minimum requirements for. Server 2012 has simplified DirectAccess considerably, rendering many of the arcane requirements of the Server 2008 R2 iteration unnecessary.
Two problems exist with this approach as well. On the one hand, Microsoft doesn't consider Azure-hosted DirectAccess officially supported -- interesting, given that numerous blogs have outlined how to get the functionality working already. And more importantly, perhaps the biggest downer: only Windows 8 Enterprise has DirectAccess baked in.
Extremely disappointing -- seeing that this could have been an easy way to leverage cloud-hosted AD on Azure with minimal effort. Enterprise edition is usually only accessible to companies or organizations with an Enterprise Agreement with Microsoft. Small businesses need not apply.
DirectAccess could be a core component of any future Active Directory Domain Services alternative introduced as an offering in Azure. This relatively recent technology eschews (sometimes complex) VPN setups for a directly-brokered tie to client computers, making connections to the corporate network seamless and automatic for the user. Just what cloud-hosted AD needs. (Image source: Microsoft.com)
In the end, there is no great "short" answer which meets the goal of ridding the office of the AD server in exchange for an Azure alternative. While Microsoft is accommodating in allowing customers to host Active Directory Domain Services on Server 2012 and 2008 R2 boxes in Azure, their usefulness is only as good as the VPN connectivity you can muster for your staff. DirectAccess, while a very promising technology, has its hands tied due to its own unfortunate limitations.
Perhaps 'Project Mohoro' Can Answer Some Questions
I've touched on this still (relatively) secretive initiative gaining steam within Microsoft a few times in the past. While Mohoro happens to double as a lovely town in the Comoros Islands, Microsoft has supposedly tagged the moniker to a clandestine project team within the company dedicated to bringing a first-party VDI solution to the masses. While VDI (virtual desktop infrastructure) aims to simplify the delivery and management of application catalogs, it has been marred by notoriously tough deployment routines and extremely hefty upfront costs.
More specifically, Mohoro aims to give Microsoft a way to deliver a cloud-hosted version of RemoteApp, an extension of the existing Remote Desktop Services (RDS) function. Why is this important? Because if Mohoro sees the light of day, this "RemoteApp-as-a-Service" offering could render cloud-hosted Active Directory completely unnecessary. Knowing Microsoft's hardened direction in moving swiftly to the cloud, it could see this as a clear path for organizations looking to take their physical systems up to the interwebs completely. Since the applications are deployed through RDS, with AD assumed to be connected on the back end, the need for clients to speak directly with an AD controller on Azure becomes a non-issue.
RemoteApp (above) already exists in Server 2012, and allows applications to be delivered via a public web page accessible anywhere internet is available. Is Microsoft expecting this feature to go mainstream? If Mohoro delivers a cloud version of RemoteApp as rumored, and Intune matures into a true AD counterpart, then this powerful duo may represent the future of BYOD corporate computing. (Image source: Microsoft.com)
Besides, there's a powerful AD arch-enemy already rearing its head in Microsoft's cloud. It's called Windows Intune, and it's been a hot topic from Redmond for a few years now. Intune is basically a poor man's Active Directory in the cloud that focuses less on granular policy enforcement and more on device oversight, security, and tracking targeted towards mobile workforces. The service recently entered its Wave D release, which continues the platform's transition from a mainly Exchange ActiveSync (EAS) powered entity to one utilizing native MDM functionality more akin to what Windows Phone 8 and iPhone support out of the box. This translates into extremely powerful management capabilities that can be tied deeply into the core backbones of any connected devices.
Windows Intune is the most compelling alternative to full-blown Active Directory not only because of its extreme ease of use, but also due to its very cost-effective price point. The service now runs a mere $6 per user per month. Best of all, that cost doesn't cover just a single device -- you can have up to five devices enrolled per user. The service supports nearly every ecosystem, including Android, Mac OS, Windows, Windows Phone, Windows RT, and iOS. It's only missing BlackBerry 10 support, it seems. BlackBerry's own Server 10 platform is relatively powerful in its own right, but it lacks any control over Windows devices -- a big knock for Windows-first shops.
The biggest roadblock for most small businesses that have wanted to roll out AD is the heavy cost involved in not only planning and implementing the technology, but also maintaining it. It's no secret that consultants like me charge a pretty penny to keep these expansive systems humming.
And a lot of our small customers right now are not interested in upgrading their aging 2003 domains to 2012 for these reasons. They want to move into a cloud-powered, server-less future. Workers can enroll all of their devices into Intune and receive integrated patch management, tracking, remote access, security software, and more. Intune looks a lot more attractive than the thousands of dollars it takes to deploy or upgrade an AD backbone on-premise.
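The back-of-envelope math for a hypothetical 25-user shop illustrates the gap. Only the $6 per user per month figure comes from Intune's published pricing; the on-premise hardware, licensing, and consulting numbers below are rough assumptions of my own, so plug in real quotes before drawing conclusions.

```python
# Rough year-one cost sketch: Intune vs an on-premise AD refresh.
# Only the $6/user/month Intune price is a published figure; the
# on-premise line items are assumed ballpark estimates.
users = 25

intune_year_one = users * 6 * 12   # $1,800 -- covers up to 5 devices/user

server_hardware = 3000             # assumed: one modest server box
licensing = 1500                   # assumed: Windows Server + CALs
deployment_labor = 2500            # assumed: consulting hours
ad_year_one = server_hardware + licensing + deployment_labor  # $7,000

print(f"Intune, year one:     ${intune_year_one:,}")
print(f"On-prem AD, year one: ${ad_year_one:,} (before ongoing maintenance)")
```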
What's The Future of AD Looking Like? A Bit Of Everything
I wish I knew exactly where Microsoft was heading with its endeavors. What I hoped would be a simple, clear answer on how to deploy cloud-hosted Active Directory led me instead to numerous meandering options which may eventually careen back towards a single path. Microsoft has been busting its rear end taking popular on-premise technologies like Exchange and Lync and moving them to the cloud (yielding Exchange Online and Lync Online, respectively), and the end results have been generally quite commendable.
It's not hard to connect the dots on Microsoft's intended future for enterprise computing, and it's got cloud written all over it. Intune, RemoteApp, and Azure are all taking slightly different paths towards a likely common end goal of bringing the power of corporate AD into the friendly ecosystem of the cloud. With remote workers spending more time either on the road or at home, the assumption that a corporate VPN connection is always plausible is becoming a thing of the past.
Mohoro is scheduled to launch sometime in the second half of 2014, so it will be interesting to see what direction it takes -- and whether it even sees the light of day. My money's on it shipping, if I'm calling Microsoft's next move correctly. Redmond already took a portion of its staple Windows Server line to the slaughterhouse last year, and I wouldn't be shocked in the least to see traditional AD meet the same fate.
Photo Credit: 2jenn/Shutterstock
Ever since Edward Snowden leaked what seems to be the mother lode of the decade, the internet has been fervently abuzz with speculation about Prism. The (aptly named) program was set up by the United States NSA (National Security Agency) to work hand in hand with internet giants to comb through mountains of data related to users of numerous services, from Facebook to Gmail to Hotmail. Whether or not this information is actually being used for its intended purpose -- thwarting terror attacks -- is still up for debate. But one thing we do know for sure is not only the type of data being plucked, but more importantly the overarching power this data yields.
It seems the crafty folks at MIT haven't been sitting back and watching this drama unfold. They've gone ahead and launched a representative cloud tool called Immersion that paints a strikingly accurate portrait of the inner workings of your entire digital life (or at least the part contained in your Gmail account). National Journal's Brian Fung first covered this astonishing project, and it has since been picked up by eWeek as well.
In a nutshell, Immersion is a cloud tool that takes a 'big data' approach to representing the relationships exposed by the contents of your Gmail account. There is no cost to use the tool, and it asks for your explicit permission to tie into your chosen Gmail address and pull information for analysis. For those worried: yes, you have the option of having the tool completely erase all of the data pulled once you are done perusing the colorful representation of your entire Gmail history.
How MIT pieced together Immersion is not really a big secret at this point. It uses the same basic underpinnings to represent your digital life as Google already has access to, and which the NSA either "asks for" or gleans on its own -- you can come to your own conclusion on that one. This information, known as metadata, is the digital equivalent of what the US Postal Service collects yearly on every piece of mail touched in America. Who your emails went to, how many you sent to each person, the file types of your attachments, the first and last time you spoke with each person -- pretty much everything besides the actual contents of the emails themselves. That clarification is the fine legal line that the NSA and other agencies are riding to uphold their legitimacy.
Piecemeal, these loose data sets are useless. Collectively, however, they paint a hyper-realistic portrait of one's personal life, in the context of the entirety of the data being woven together. MIT gives us a relatively toned-down perspective by delivering a snapshot stemming only from our Gmail accounts, but you can imagine the power the NSA has with its monstrous data centers that churn through this, and more, on a daily basis.
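If you're curious how far header fields alone can go, here's an illustrative Python sketch in the same spirit as Immersion -- not MIT's actual code -- that builds a crude relationship web from a local mbox export of a mail account. The file name is a placeholder, and it assumes the mailbox is stored in roughly chronological order.

```python
# Illustrative only: tally correspondents and their connections using
# nothing but To/Cc/Bcc and Date headers from a local mbox file.
import mailbox
from collections import Counter
from email.utils import getaddresses, parsedate_to_datetime
from itertools import combinations

volume = Counter()       # message count per address (bubble size)
edges = Counter()        # address pairs seen on the same message (lines)
first_seen, last_seen = {}, {}

for msg in mailbox.mbox('gmail-export.mbox'):   # placeholder file name
    headers = sum((msg.get_all(h, []) for h in ('To', 'Cc', 'Bcc')), [])
    people = sorted({addr.lower() for _, addr in getaddresses(headers) if addr})
    try:
        when = parsedate_to_datetime(msg['Date'])
    except (TypeError, ValueError):
        when = None
    for p in people:
        volume[p] += 1
        if when is not None:
            first_seen.setdefault(p, when)   # assumes chronological order
            last_seen[p] = when
    edges.update(combinations(people, 2))    # two addresses, one message

print(volume.most_common(10))   # the biggest "bubbles"
print(edges.most_common(10))    # the strongest relationship lines
```

Even this toy version never touches a message body, which is exactly the point: the envelope alone is enough to sketch who matters in your life.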
While the execution behind Immersion is likely very close to the methods being used by the NSA, it by no means shows the totality of one's digital life, which would require cross-examining all of your digital stomping grounds together. If you believe the NSA isn't leveraging such power already, you're just being plain naive.
So what does an Immersion plot look like? I tested the service on myself, using my oldest Gmail account, created back in high school (late 2004). Mind you, Immersion is very clear in stating that it only looks at the To, CC, BCC, and timestamp fields of your messages. This is just a portion of the metadata the NSA is likely curating to create digital mugshots of each of us every day.
Immersion's primary dashboard graphic gives you your "web of relationships" at a glance. My personal web, shown above, took Immersion 5-7 minutes of digging through my Gmail to display. The NSA's systems can probably tackle hundreds of these in seconds.
The immense breadth of information that metadata can provide when viewed as a whole is breathtaking. My Gmail account has been in existence since late 2004, and Immersion claims I've got 31,000+ email messages in it. The sheer fact that it could take nearly nine years of data and piece it together into a visually cohesive narrative of who I've spoken with, how long I've been speaking to them, and how they are related to everyone else in my life is something you have to see with your own eyes.
The tool doesn't stop there, however. If you want to see how specific contacts in your email history are related to others, simply click on any bubble in the graphic Immersion spits out (in wonderful Ajax fashion, might I add). Each bubble represents a single person you may have communicated with, and the larger their bubble, the heavier the email volume you've had with this person. On a lighter note, this may not be a bad way to see who is eating up all those minutes (or hours) in your inbox each day.
Clicking on any of your contact "bubbles" in Immersion gives you a top-down view of the relationships between that person and others you have spoken with. Direct lines between contacts represent associates that somehow know each other -- likely by being involved in email threads.
I labeled the first screenshot from my own Immersion readout with the major "webs" that exist in my Gmail account life. Seeing as this has been the primary inbox I've used since my high school years, it has the deepest insight into my personal communications. From work colleagues to hockey teammates down to past relationships. Immersion leaves no stone unturned.
Going even one step further, Immersion lets you view some interesting specifics about any of the contacts you associate with. I decided to take a peek at a few of the larger "bubbles" in my contact web, and was able to find neat (or scary?) statistics about my relationships with these people. Immersion not only shows me when my first email to a person was, but also when my last message to or from them was, along with a bar graph of the overall volume of interaction across our entire history of speaking to one another. Slightly less accurate was the "Introduced You To" area, which tried to explain who I met as a result of knowing a given individual -- but for an agency using similar info-mining like the NSA, it's the bigger picture that matters most.
Should any one party be privy to such information? Again, another area up for massive debate. A lot of the metadata that surrounds the emails you send is already considered public knowledge to some extent. For any message on the internet to reach its intended destination, whether on a Gmail account or on Office 365, metadata needs to be shared and read between systems so that items can be properly routed. The NSA is undoubtedly culling a lot of the information it gleans from such siphoning of the public web, but one can easily see how abuse of this information in a big data context could spiral out of control -- especially in the wrong hands, for the wrong purposes.
I'm not opposed to all forms of digital surveillance to keep the likes of al-Qaeda at bay in this age of virtually-equipped terrorism. But the cop-out excuse of "I've got nothing to hide" is nothing more than complacency toward digital tyranny. The further we allow government to creep into every aspect of our lives unchecked, the less meaning we can attach to claiming we live in a "Land of the Free".
If Immersion shows just a sliver of the exponential power that metadata can hold, then the capabilities of the NSA are limited only by your mind's imagination.
Image Credit: Balefire/Shutterstock
Perpetual release cycles. Windows 8.1. The unified Windows ecosystem. If there are any key takeaways to remember from Microsoft's cornerstone keynote at the Build 2013 conference, these three items would sum it up quite well. Microsoft CEO Steve Ballmer reminded thousands of developers on stage last week that the company isn't getting left in the dust and it has a solid plan going forward.
While most of the tech world was keenly focused on Build 2013 solely as the gateway to the first official peek at Windows 8.1, Ballmer's keynote had a few other important messages to deliver. The Windows update, formerly known as "Blue", may have stolen the show, but Microsoft had a grander agenda to piggyback onto the developer conference.
How did Ballmer do in giving us the "State of Microsoft" at Build this year? Here are the hits, and more importantly, the misses from his tactful performance this last Wednesday.
Hit: Rapid Release Is the New Norm
This single facet of Ballmer's speech (which you can watch here) was by far the biggest theme of his nearly 25 minutes on stage. If Microsoft's approach to Office 365 and Windows Phone didn't make it apparent already, the man solidified it in a few hard-hitting but spot-on sentences:
We're certainly going to show you Windows 8.1 today. But you can think of that, in a sense, as the new norm for everything we do. For Windows releases, in addition to what we're doing with devices through our partners, what we're doing with Azure and Office 365, rapid-release cadence is absolutely fundamental to what we're doing and, frankly, to the way we need to mobilize our ecosystem of hardware and software development partners.
In a way, Windows 8.1 is not just a pseudo-reset for the Windows 8 brand. It is the poster child for Microsoft's evolutionary foray into perpetual rapid release cycles. Never before has such a sizable update hit the release platter so quickly after a major core release as we are seeing unfold with Windows 8.1. Microsoft used to bring us innovation on a three-to-four year release schedule. As Melissa Webster of IDC puts it, "Three years is a lifetime in tech these days".
And the truth of the matter is, she's absolutely right. BusinessWeek had an infographic which said Apple was able to bring five iterations of the iPad to market in the same time it took Microsoft to develop Office 2013. Google trounced Microsoft just as easily, delivering five major versions of Android in the same time-frame. While Office 2013 is no doubt a solid upgrade to its predecessor, the days of justifying a single major release every three years are over.
Who better for Microsoft to learn from than Google? After all, Office 365's main competitor, Google Apps, publicly lists hundreds of meaningful updates that were announced and released into the platform in the time it took to roll out Office 2013. If Ballmer holds true to his promise of a rapid release cadence, Microsoft should be aiming to beat -- not just match -- Google's extremely aggressive schedule.
Time will tell if Microsoft is on the right track with this. The massive and generally favorable update to Office 365 for Business from February didn't hit most existing customer accounts until May. A near three month lag in getting everyone on the same page is a bit disappointing. Here's hoping the future of 'rapid release' at Microsoft isn't actually rapid for some and eventual for others.
Miss: No Mention of Office 365?
I get it -- Windows 8.1 was the star of the show. But completely passing over Office 365 (both the consumer and business editions) when Microsoft is pushing a generational shift to rapid release is a bit confusing. Isn't Office 365 the primary catalyst that introduced Microsoft's intentions around persistent updates? Giving the first product to receive this treatment the backseat at Build 2013 was a perplexing message conflict on Ballmer's part.
It shouldn't come as a surprise that Microsoft is steadily building toward a future where Office 365 is the face of the suite as we know it. Keeping the cloud productivity platform in the dark at the conference, then, was an interesting decision from the company's upper echelons.
Miss: Blaming Slow Windows 8 Uptake on a Lack of Touch Devices
I found it quite ridiculous that Ballmer took the liberty of blaming the slow adoption of Windows 8 on a lack of touch devices last holiday season. I'm not sure what kind of "Noah's Ark" scale flood of devices Microsoft was expecting in a holiday season that followed only months after Windows 8 reached RTM (Release to Manufacturing) status. And more importantly, I don't think a sea of touch devices would have overcome an unjustly negative media narrative that Microsoft did little to stem itself.
Windows 8, in my opinion, was a positive step forward in many ways. Under the hood, the operating system has all of the power of Windows 7 and then some. On mobile devices, its battery life tops that of its predecessor. Most importantly, I haven't seen a single blue screen since I installed Windows 8 last October -- and that's saying something. Security-wise, it appears to be the most hardened Windows iteration to date. Why wasn't Microsoft able to turn the media firestorm around with all of these positives? Ballmer's reactive, not proactive, marketing department dropped the ball on what otherwise could have been an easy sell.
There's more to an honest discussion about Windows 8 than just the Modern UI and Start screen. Public opinion was wrongfully hijacked by the media, and Microsoft never made an effort to take it back. Windows 8.1 is a start, but it's hard to reverse course when you're letting critics lead the way.
Hit: Windows Is an Ecosystem Now -- Not Just the PC on Your Desk
Another important area that Ballmer actually nailed was Microsoft's commitment to evolving Windows into an experience across devices in its ecosystem. This is important because the face of Microsoft needs to continue changing. The baggage that the operating system's name still carries today is that it represents the old-school. Computing at your desk -- not on the go, in your lap or in your hand.
Ballmer addressed this dilemma:
The Windows device of today doesn't look like the PC of five years ago or 10 years ago or 15 years ago.
I think the line of Windows devices we have today exemplifies this transition from a static, stationary computing device that sits in your cubicle to an array of options for the home, the office, and the road. And it's only getting better. Microsoft is working with hardware vendors to bring more options to the table for consumers and enterprise customers alike -- something Apple is sorely lacking with its limited focus on just a few devices, all of which suit only deep-pocketed buyers.
Consumers naturally like choice. In the same way that Android continues to lead the US smartphone operating system landscape, Windows can keep an honest lead in the general computing arena by giving buyers breadth of selection for various needs. Apple famously poked fun at the conformity of the 1980s IBM PC-era landscape in its monumental "1984" commercial, but the fruit company hypocritically seems to be making inroads towards such a future again with each and every iPad and iPhone release.
Windows needs to differentiate itself as the alternative to blind uniformity. And Microsoft surely doesn't have to look far from the tree for inspiration. Which company's famous tagline used to be "Think Different"? You can draw your own conclusions there.
Hit: Windows 8.1 Is for You, and You, and You
Windows 8.1 hits later this year, but Ballmer rightfully set a half-conciliatory, half-visionary tone for the (massive) point release at Build. This "service pack on steroids" brings over 800 updates to Windows 8, covering nearly every inch of the platform. It represents Microsoft's budding commitment to turning long build cycles into a path of more or less consistent, agile development for the operating system as a whole. It's the right move at what could not be a more opportune time.
For the most part, Ballmer's presentation of what Windows 8.1 includes and why it matters was equally important. Sure, there were no crowd-awing moments in Jobs-era Apple fashion, but Ballmer is a bona fide salesperson at heart, not a software engineering expert.
The amount of time spent on promoting Windows 8.1 and its "Boot to Desktop" functionality was critical for Microsoft's enterprise perception. Many believe that forcing users into the Modern UI without an option to bypass this app-centric interface was a harbinger of Windows 8 failing in the corporate sector. "Boot to Desktop" puts this concern to rest and ensures that those who rely on the millions of traditional Windows applications are what they should be -- equal citizens in the Windows world.
The reintroduction of the Start button is also a way to quiet much of the noise that surrounded the Windows 8 launch. I personally haven't lost any sleep since Microsoft ditched the Start button, but numerous others aren't as kind to the change. It's important to remember that Microsoft may have brought the button back, but the way it works is quite different from before. I went over the changes almost a week ago, which include the new power menu and the fact that clicking the button merely brings you to the Start screen. I'll call that a half-concession on Microsoft's end.
Will all of these changes help foster a new wave of Windows 8 uptake? Will Microsoft's renewed call to bring Windows into a Google-style rapid, consistent development cycle ring positively with the technical community? My guess is as good as yours. If the coming shockwave of Office 365 conversions is any indication, perhaps Microsoft does have a knack for rebranding itself as the cool, yet secure, bet for the industry.
It's not the technology that is holding back Microsoft's resurgence in the computing world. Windows 8 is a darn solid OS. Windows Phone 8 blows me away in terms of usability -- and I'm a proud Android owner. Microsoft needs to win us back by winning the storytelling game. Tech specs and changelogs don't represent an experience. It's the message that does.
The hype bubble around Windows 8.1 is steadily building this week. Microsoft will supposedly dump a full preview version of 8.1 in ISO format, and the rumored date across the net happens to be June 26. In step, BetaNews readers have been sounding off on Wayne Williams' post asking the big question at hand: Will you be installing Windows 8.1?
Interestingly, I found out by chance that you don't need to wait until the 8.1 ISO hits the web. Some of the biggest, and most requested, changes are already floating around in the wild -- albeit in a slightly different package than you may expect. Both the new Start Button and the 'Boot to Desktop' option are fully viewable in the latest Windows Server 2012 R2 preview build. You can download a full preview copy for yourself over at TechNet.
I stumbled upon this nugget during the course of firing up a fresh testbed server on Windows Azure. I wanted to get a new server going in the cloud to test out a new remote support product for my tech business called ScreenConnect (great software -- more on that another day) and found that Server 2012 R2 was a fully available option as a host OS.
I thought I'd give it a shot, figuring this test server was going to be shredded in a few weeks anyway.
Two things jumped out at me right away. First, as soon as I jumped onto the machine over Microsoft Remote Desktop, I was taken straight to the desktop. I don't recall this happening in the RTM version of Server 2012. Second, the new Start Button was fully out in the open, gracing the same lower left-hand corner where the old, now-defunct Start Button used to sit.
If you have high hopes of this new button bringing back the familiar rich menus of yesteryear, you will be quite displeased, to say the least. If Server 2012 R2 Preview is any indication of what Windows 8.1 features (there's little reason to believe MS would deviate much across platforms), then this Start Button will serve two primary purposes.
First, the new Start Button lets you jump back and forth between Start Screen and Desktop mode. When in the Start Screen, the button goes invisible, but bringing your cursor down to the lower left-hand corner brings it into focus on rollover, giving you a simple way to bounce between interfaces. The Windows key on any modern keyboard serves the same purpose and is generally faster, so I'll use this function minimally. Visual users will likely appreciate it more.
More interestingly, if you right-click on the Start Button, you're presented with the same secret power-user menu that stock Windows 8 provides via the "Windows Key + X" combination. It even sits in the same lower left-hand area of the screen. And even though the menu is very similar to the one Windows 8 features, it adds two extra items which I find very neat.
Network Connections is now available off the menu, and so is the full bevy of shutdown options you normally had to access via the right Charms bar. My screenshot below shows a "disconnect" option which I presume is specific to Server 2012 R2 because I was connected via Remote Desktop. Traditional 8.1 users will likely not see this option; I imagine it would be replaced with a line item for Sleep.
The other big item which has been nagging users of Windows 8 is the lack of a way to boot straight to desktop mode. I already use Windows 8 full time without many issues, and spend over 95 percent of my workday in desktop mode anyway. Not being able to launch straight into the desktop has been one of my biggest gripes with the RTM version of Windows 8, and I'm glad Microsoft is finally addressing the flaw. The enterprise will surely appreciate this gesture, judging from what business users have been saying nearly universally since 8 launched.
Luckily, the Server 2012 R2 preview has the options necessary to control this aspect of your boot experience as well. Again, I am fairly certain it will function similarly in the 8.1 preview this week. Getting to the boot-to-desktop option is as simple as right-clicking on your taskbar, going into Properties, and clicking into the brand new "Navigation" tab of the Taskbar Properties window.
I highlighted the option in yellow in the above shot so you can easily spot it. There are a few other niceties Microsoft has tucked into this new menu area, however. For example, you can stop the Charms bar from appearing when you mouse into the upper right-hand corner -- a quirk I find myself running into regularly when using desktop apps. And likely more useful for IT professionals, there is an option (on by default in Server 2012 R2) that forces the right-click menu on the Start Button to show PowerShell-related items instead of Command Prompt, as is the case in Windows 8 RTM.
As soon as Microsoft releases the Windows 8.1 ISO this week, whichever day it may land on, we will all be able to try this new functionality out relatively easily. Until then, for those interested in a sneak peek, spin up a free Server 2012 R2 Preview trial server on Windows Azure for a little taste of what 8.1 will likely feature on the interface end of things. While some claim these concessions from Microsoft don't go far enough, I find them to be a happy middle ground that should suit power users and average folk alike. Myself included.
For nearly the last decade, online meetings have been synonymous with well-known platforms like GoToMeeting and Webex. And rightfully so. Both cloud collaboration suites are fairly mature offerings, with expanded feature sets that replicate (nearly) every aspect of a face-to-face meeting. As an IT professional by day, I'm frequently involved in client meetings over both platforms and have helped countless others leverage these products for their own businesses.
Yet numerous things irk me about the status quo from these two offerings. The biggest happens to be the substantial cost attached to each. It's hard to believe that two platforms of such maturity have not been able to bring their price levels down considerably, given how much engineering muscle and prevalent, cost-effective cloud technology exists today.
The other nasty side of the GoToMeeting and Webex pricing models is their relatively low meeting attendee caps. Sure, I could dump nearly $1K a year on a GoToWebinar subscription, but that's outrageous for the few times a year my company needs to host expansive online gatherings. Some radical new entrants like Join.Me do exist, but their free options are fairly limited, and in Join.Me's case there is no 24/7 phone support when issues arise. For a critical client meeting, I don't want to be wading through low-level knowledge base articles at the last minute.
Luckily, Microsoft has taken notice of this lopsided duopoly in the online meeting space and has unleashed a competitor most people don't give two thoughts about. Lync, and specifically its cloud-based cousin Lync Online, has really grown on me since the large February 2013 overhaul that hit the rest of Office 365 for Business. I penned a mostly positive review of the new iteration of the business suite earlier this month, and Lync Online is no exception.
Lync Online Pricing Has GoToMeeting and Webex Beat, Hands Down
A cheap product is only as useful as the feature set it brings to the table. But Microsoft's recipe with Lync Online is right on the money. I have no qualms with the free Join.Me for the most basic, one-off, low-stakes meetings. But it has a low 10-person meeting cap on the free version, and the free version of Webex fares even worse with a three-person cap.
Microsoft's Lync Online doesn't play tit-for-tat with attendee limits. The Plan 2 option, which is the only one you should be considering for meeting usage, runs a mere $17.50/month or $210/year when bundled with Office 365 Pro Plus. I'll get to why Office 365 Pro Plus is needed shortly. But at face value, the closest apples-to-apples comparison still makes Lync Online less than half as expensive as the other big players. If you're currently a yearly subscriber to GoToMeeting or Webex, you can see the extensive potential savings already lining up.
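The annual arithmetic is simple enough to run yourself. The Lync figure is the $17.50/month quoted above; the competitor figure is an assumed typical monthly list price for a comparable G2M or Webex plan, so swap in whatever you're actually quoted.

```python
# Back-of-envelope annual cost comparison. Only the Lync price is from
# the article; the competitor price is an assumed typical list price.
lync_monthly = 17.50          # Lync Online Plan 2 + Office 365 Pro Plus
competitor_monthly = 49.00    # assumption: comparable G2M/Webex plan

lync_annual = lync_monthly * 12              # $210
competitor_annual = competitor_monthly * 12  # $588

print(f"Lync Online: ${lync_annual:,.0f}/year")
print(f"Competitor:  ${competitor_annual:,.0f}/year")
print(f"Lync costs {lync_annual / competitor_annual:.0%} as much")  # ~36%
```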
What do you get for that $210 per year? First off, the reason you need Office 365 Pro Plus along with a Lync Online Plan 2 subscription is that you can't get all the bells and whistles of some advanced meeting features, like meeting recording, solely via the free web-based client or the slimmed-down Lync Basic desktop client. In familiar Microsoft fashion, there's always a gotcha attached, which is why I'm making this fully clear. But even with Office 365 Pro Plus included, the net price of your services is still substantially less than what the (somewhat more limited) G2M and Webex offerings cost. And the upside? You get access to nearly every desktop app from the Office 2013 suite except Visio and Project.
You can view the above comparison chart in its full glory, with complete references included, as a Google Doc. All of the items highlighted in yellow denote a category where a platform holds a respective lead over all other options. It's pretty apparent Microsoft's offering holds its own in many important aspects.
Microsoft happens to have a leg up over the others in most feature categories I compared, as well. If you don't like the lowly 25-person attendee limits that G2M and Webex impose (at the compared plan levels, of course), you will be pleased to know that Lync Online has a 250-person limit. That's 10 times the potential number of guests in your meetings -- nothing to laugh at. Some have no need for this, but if you're even considering hosting a webinar for customers, a 25-person cap could bite you quite quickly. As if that weren't enough headroom, Microsoft has whispered on its blogs that it's already working on upping the cap to a massive 1,000-person limit.
While all of the compared vendors claim to offer "HD video" on their service pages, only Microsoft offers true HD with video up to 1080p quality. The others can be regarded as "near HD" from what I can tell. Another boon for Lync Online is that it can record meetings directly to the industry-standard MP4 format. The other two allow meeting recording, but aside from G2M's Windows Media codec option, you're otherwise forced into vendor-specific proprietary formats which must then be transcoded to something more standardized. With Lync Online, you can take your recorded MP4 and toss it straight onto YouTube for private sharing with participants. Convenient and easy indeed -- I use it fairly regularly now, even for some of our internal company meetings when all of my staff can't attend.
It's also important to note that, surprisingly, Microsoft has the most expansive list of client and app options for guests joining meetings. Redmond was never known for offering open arms to users of competing platforms, but Microsoft has proven us all wrong this time around. Lync Online is accessible through traditional Windows and Mac clients, but also has Windows 8 and Windows RT apps (for the Modern UI), a rich web client usable in nearly any modern browser, and mobile apps for every major ecosystem aside from the new BlackBerry 10. Webex is the only provider offering a BlackBerry 10 app so far, and I don't see Microsoft being far behind.
Security in Lync Online follows the same path as the rest of Office 365, and Microsoft states that the service fully encrypts all transmitted instant messaging and audio/video. For the technically interested, this blog post by Microsoft goes over the bevy of encryption standards used in Lync, including a mixture of TLS, SRTP, and others. Microsoft knows that meeting security is a big deal to large organizations, and Lync Online doesn't skimp in this regard. I recently argued in favor of moving our US Congress to the virtual realm, in part by using Lync, and believe that it truly has the security gusto to pass muster for those in charge of this country.
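You don't have to take the TLS claim entirely on faith; any endpoint speaking TLS will happily report what it negotiated. Here's a small sketch using Python's standard ssl module against the scheduler endpoint mentioned later in this piece -- it checks the transport handshake only, and says nothing about SRTP or Lync's media channels.

```python
# Peek at the TLS parameters a server negotiates -- a generic handshake
# check, not a Lync-specific audit (SRTP media encryption is separate).
import socket
import ssl

host = 'sched.lync.com'   # the Lync Web Scheduler endpoint
ctx = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print('Protocol:', tls.version())       # e.g. 'TLSv1.2'
        print('Cipher:  ', tls.cipher())        # (name, protocol, bits)
        print('Subject: ', tls.getpeercert()['subject'])
```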
Microsoft doesn't win out in every category, however. In terms of a free trial, GoToMeeting has everyone beat with a lofty 45-day demo period. And Microsoft's biggest sore spot in the comparison chart is its lack of any integrated conference bridge phone service for Lync Online. The only audio options you have are VoIP over your computer or a third-party provider of your choosing. This isn't the end of the world, as numerous free conference call providers like FreeConference.com exist on the net (my company gets free audio conferencing from our cloud-hosted PBX provider, RingCentral), but it's still something Microsoft needs to offer at some point. Perhaps it's on Microsoft's roadmap, along with its plans to turn Lync Online into a full-blown cloud-hosted PBX provider by the end of 2014.
Finally, I will make it known that the experience Mac users get on the Lync 2011 client is still a tad buggy, and using the web-based client is likely a better bet for now. Microsoft watcher Mary Jo Foley recently revealed that Office for Mac 2014 is on its way next spring, and we can expect a bundled Lync 2014 client in that release. The next version should put the full Mac client on par with what the Windows side already has. Until then, the Lync web app is full-featured enough for most needs aside from meeting recording.
The Meeting Experience: Feature Rich, Easy to Use
We can drown in comparison charts all day, but how does Lync Online stack up as a bona fide online meeting tool? It's not perfect, but then again, I wouldn't consider GoToMeeting or Webex perfect in their own respects either. Every platform has its bugs and nuances, and Lync Online isn't alone here. On the whole, I find that Lync Online has a few rough edges but hosts high-quality online meetings with relative ease.
Any meeting can be started in one of three ways. If you have an internal contact on your Lync Online domain whom you wish to meet with, and they are online, it's a simple point-and-click affair. More formal meetings with larger groups need to be established either via the Outlook plugin or through my favorite route, the newly revamped Lync Web Scheduler (see below). Since I'm not an Outlook fan myself, I find the web-based scheduler much more friendly and accessible from any computer I may be using at the time.
The new Lync Web Scheduler, available anytime from any computer at sched.lync.com, allows for easy setup of pre-planned meetings via a web browser. Set your meeting details, adjust lobby settings, and email out the link to your guests -- it's as simple as that.
As stated earlier, attendees have numerous options for joining the fray. In most instances, I find that guests prefer either the Lync Basic client or the rich web app that is now available (a plugin is needed only for some features). Having the full Lync 2013 client as the primary meeting host gives you a few extra bells and whistles, namely the meeting recording capability that is invaluable for me. Microsoft has a full rundown of the ins and outs of its various desktop access options on TechNet.
I find the Lync Online experience to be a bit more controllable on a regular computer, and actually prefer it to the overwhelming interfaces that eat up your screen in GoToMeeting and Webex conferences. I'm not saying that the others necessarily have flawed experiences, but I personally feel they require a lot of real estate even for the most basic meetings. Lync, via web app or full desktop client, is fully collapsible to a small window if necessary for times when you just need to multi-task during a meeting.
The usual gamut of options exists for making the most of your meeting. Microsoft introduced a neat new video view called Gallery in the latest release, which neatly lines up everyone in the conference, highlights active speakers, and shows avatars or name blocks when a party has video turned off. You can also easily share a full screen, just a particular program, or a PowerPoint presentation directly. People who are adept with OneNote can even share an active notebook for participants to mark up. This may be old news in a few months, as all of the Office Web Apps are getting full Google Docs-esque live co-authoring very soon.
The options available in online meetings are numerous and should meet most people's needs quite well. You can share single screens, multiple screens, single apps, or PowerPoints, or even throw up a common whiteboard for attendees to mark up. Quick polling is even possible in the latest release, with results tallied automatically on the fly. (Photo credit: Microsoft)
As a meeting host, you have a bevy of options at your disposal. You can easily assign presenter rights to nearly anyone else in the meeting at any time, mute other attendees, and block them from displaying video. The whiteboard feature is probably one of the coolest aspects of Lync meetings, virtually replicating what a large meeting room whiteboard provides during face-to-face meetings. This virtual whiteboard is at everyone's fingertips, though, and doesn't require any smelly markers. Another benefit: it can either be exported to a format of your choosing or, if you are recording your meeting, embedded natively into the meeting recording by default.
The meeting experience as a whole is pretty smooth as long as each party has sufficient bandwidth on their end. Most normal conferences not taking advantage of HD video don't eat that much bandwidth, but pumping full 1080p HD video over Lync is specified as needing upwards of a 4Mbps consistent connection both ways. This can vary depending on the situation and number of attendees, but for the average meeting I partake in, audio is crystal clear with very good to great video quality as long as people are on Wi-Fi at the very least. Cellular connections (3G/4G) prove to be an issue for mobile Lync attendees at times, but this is just as problematic with Skype and related competing products from what I've seen in the field.
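To put that 4Mbps figure in perspective, a quick worked calculation shows what a full-HD hour actually costs in raw transfer, which is worth knowing before you put a metered or capped connection in front of Lync.

```python
# What a one-hour 1080p Lync session costs in raw transfer, each way,
# at the 4 Mbps sustained rate quoted above.
mbps = 4
seconds = 60 * 60
megabytes = mbps * seconds / 8          # bits -> bytes: 1,800 MB
print(f"~{megabytes / 1000:.1f} GB per direction per hour")  # ~1.8 GB
```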
Should You Dump GoToMeeting or Webex for Lync Online?
Of course, preference is a large factor in the software we choose to use. Personally, I see no glaring holes in Lync Online that would prevent me from recommending it to clients of my tech company FireLogic. For the price point that Lync Online brings to the table, and a feature list which outstrips G2M and Webex in many ways, I see no reason why you shouldn't at least consider a switch. I know there is a certain stigma attached to saying Microsoft has a better solution than the others, but I honestly think this is becoming the case in the online meeting sphere. Unless I am missing something big in my comparison, Lync Online is a feature-rich alternative to the status quo.
The key areas that I believe Lync Online has the other big players beat:
If you want to give Lync Online a spin, you can easily sign up for an Office 365 E3 level demo account, which requires no credit card and allows for a full 30-day trial of what the service has to offer. If you are happy with the service, you can easily switch your subscription level down to just Lync Online Plan 2 and Office 365 Pro Plus within the Office 365 administration area.
I've been using Lync Online for customer and in-house meetings for nearly a half year now, and must say I have been pleasantly surprised by the quality of the product Microsoft pumped out. This is coming from someone who runs Google Apps for nearly all other aspects of the FireLogic email backbone! While the rest of Office 365 for Business has been a true comeback kid in its own right, Lync Online stands out as a reliable alternative to the established names in online conferencing. Lync doesn't need to prove itself as a worthy conferencing tool, as nearly 70% of the Fortune 500 are already utilizing Lync in some fashion.
It no doubt has its share of shortcomings, which I mentioned earlier, but unless those are showstoppers for you, I would highly recommend giving Lync Online a shot. You may be pleasantly surprised, and the need for GoToMeeting or Webex may soon be a relic of the virtual past.
Photo Credit: nmedia/Shutterstock
The very notion of telecommuting has been present in the mainstream white-collar workplace for well over a decade now. Yet for one of the worst offenders in padding operating and travel expenses, namely the U.S. Congress, the mere mention of telecommuting seems to be downright sinful. One would think that these calls for a "virtual Congress" come from watchdog groups of various political stripes. But shockingly enough, one of Congress' very own -- House member Steve Pearce of New Mexico (R) -- is leading the push to bring our legislative branch squarely into the 21st century.
The premise behind the technical, and very much cultural, shift in thinking for how Congress does its business is quite down to earth. "Corporations and government agencies use remote work technology; it’s time that Congress does the same," says Pearce on a landing page for his initiative. "Members of Congress can debate, vote, and carry out their constitutional duties without having to leave the accountability and personal contact of their congressional districts." A wholesale breath of fresh air, I say.
While eliminating wasteful spending is the primary reason my interest was piqued by this legislation (House Resolution 137), there are numerous other perfectly sane considerations behind the bill. Just a few of those mentioned by Pearce include:
And the list goes on. I laud Pearce for his efforts to markedly improve the way that our expense-tattered Congress does its business year in, year out. But as I mentioned, the bread and butter behind this legislation, as far as I'm concerned, sits within the fields of wasted dollars that Congress produces. While most congressmen prefer to keep a hush-hush silence on the money which oozes at the seams from their lavish culture, I have no issue telling it like it is.
The Numbers Behind "Business as Usual"
The digital age has blatantly uprooted the way most of us "average Joes" go about our daily lives. We're reading a majority of our news online as opposed to in print. Email has cut back on what was exclusively a vestige of the postal system. And most purposeful data is created -- and kept -- in the digital realm. Yet if you learned about any aspect of the workings of Congress, you'd be shocked at how '20th century' our nation's capital operates.
Take, for example, the ludicrous policy which forces the Government Printing Office to physically print hard copies of all bills on the floor for consideration in Congress. It has been estimated that between 325 and 475 copies of any single bill are printed for congressmen at a time. The final copy of Obamacare, for example, comes in at 974 printed pages. If you want to kill some trees en masse, try printing off the nearly 20,000 pages of Obamacare regulations that Sen. McConnell sarcastically posted a photo of online a few months back. A bill to stop this madness passed the House back in 2011 and could have spared taxpayers roughly $35 million over 10 years -- but not surprisingly, it failed to make progress in the Senate.
The 20,000 pages of Obamacare regulations sitting stacked in Sen. McConnell's office, printed by our very own Government Printing Office. A prime example of the waste our government basks in. A virtual Congress could essentially slim the GPO printing budget down to pennies. (Photo Credit: Senator Mitch McConnell)
The waste at the hands of Congress stemming from its digital phobia doesn't start or stop with its printing obsession. Even as Americans feel the pinch from sequester related cuts, Congress and its pampered staffers took no less than $1.45 million worth of trips in 2012. While this figure represents local and foreign travel alike, this number is surely bloated by the senseless back and forth that congressmen make between their hometown districts and Washington each month.
And the madness doesn't stop there. The red tape that dictates everything from how bureaucrats can travel to what kind of reporting they are required to make, along with other rigid regulations, inflates administrative overhead on these travel costs by an easy 30 percent (based on Defense Department rules). To add insult to injury, it doesn't help that congressmen regularly book seats on multiple flights to ensure they get where they want to go, at any expense. And don't forget the super expensive front-row parking spots they reserve at airports to cut down on commutes. The perks of being in Congress, I guess.
As if traveling in style and comfort weren't enough, the vast offices that members of Congress and their expansive staffs have access to while in Washington are equally wasteful. The Congressional Research Service publishes a regular report aptly titled Congressional Salaries and Allowances which provides some nice insight into these plush congressional "homes away from home." Every member of the House is provided what is called a Members’ Representational Allowance (MRA), a slush fund of sorts to cover everything necessary in serving out their public duties. The Senate has its own version, the Senators’ Official Personnel and Office Expense Account (SOPOEA).
The CRS says that the average MRA for 2012 was $1,353,205, but could be as high as $1,564,613. That's per year, per congressman. Senators are even worse offenders, having an average SOPOEA budget in 2012 of $3,209,103.
This means that House representatives can churn through an average total of $588,644,175 per year, with Senators claiming an average total allowance of $320,910,300 per year. In total, this comes out to $909,554,475 of taxpayer money set aside merely to keep official Congressional life moving. How much of these budget allowances are truly necessary is up for discussion. But one cannot for a second say that this entire sum is a necessity of being a congressman.
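For those who want to check my math, those totals fall straight out of the CRS averages. Here's a quick sketch using the standard chamber sizes of 435 House members and 100 Senators:

```python
# Recomputing the office-allowance totals from the CRS averages cited above.
HOUSE_MEMBERS = 435
SENATORS = 100

AVG_MRA = 1_353_205     # average MRA per House member, 2012
AVG_SOPOEA = 3_209_103  # average SOPOEA per Senator, 2012

house_total = HOUSE_MEMBERS * AVG_MRA   # $588,644,175
senate_total = SENATORS * AVG_SOPOEA    # $320,910,300

print(f"House total:  ${house_total:,}")
print(f"Senate total: ${senate_total:,}")
print(f"Combined:     ${house_total + senate_total:,}")  # $909,554,475
```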
Per the official CRS report, this money is used for items such as "staff, travel, mail, office equipment, district office rental, stationery, and other office supplies." I'm no budgetary expert, but I would make an educated guess that a virtual Congress could cut this yearly waste by a good 75% if not more. We don't need to look far to see where the sequester should have been targeted.
How Congressional Telecommuting would Work
It's not a question of whether a cultural transformation of the way Congress handles its business is possible. It's purely a matter of how much benefit it could have, not only on budgets, but on increased face time between congressmen and their constituencies. Members of Congress could easily do a majority of their grunt work, including voting, debates, staff meetings, and much more, over the same technology used by 70 percent of the Fortune 500: Microsoft Lync.
There is nothing pie in the sky about Congress shifting its work away from Washington and back to the towns and cities where its members got elected in the first place. The number of trips to and from D.C. would be cut by more than half. The useless pallets' worth of printed drafts of bills could be eliminated and replaced by a solution like SharePoint. And all of the supporting appendages that keep this monstrous machine moving could be drastically reduced, if not eliminated in most areas, because congressmen would ideally be working from the comfort of their homes or small local offices.
70 percent of Fortune 500 companies use Lync for their videoconferencing and telecommuting needs. Could this be the way Congress does its business day-to-day in the near term? If it works for corporate America, there's no reason government can't follow suit.
In practical terms, there are few reasons left why a move to a virtual Congress would be such a difficult change. Beyond training everyone involved, and ensuring that a reasonable support staff is in place across the country to keep this digital change humming, the largest pieces of the puzzle are already widely available. And not surprisingly, various branches of government are already leveraging these technologies. The Federal Aviation Administration has already turned to Office 365 for Government, along with the likes of the Environmental Protection Agency and the US Department of Veterans Affairs. Similar high-profile converts in the public sector have likewise turned to Google Apps for Government.
Just how little could the technology behind a virtual Congress cost? While Steve Pearce didn't include any estimates in his proposed bill, I'm going to go out on a limb and provide some food for thought. Numbers provided by C-SPAN back in 2000 put each House member at an average of 14 staffers, and each Senator at roughly 34. Based on this, the number of direct members of Congress and their staff who would need a virtual platform for telecommuting and related work is 10,025. If each of these were given, just for discussion's sake, an Office 365 for Government E1 subscription at $6/month, the cost to equip Congress and its entire support staff would run a lowly $721,800 per year.
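Here's how that figure shakes out, sketched from the C-SPAN staffing averages just mentioned and the $6/month E1 list price (the headcount includes the members themselves plus their staffers):

```python
# Estimated headcount and Office 365 E1 licensing cost for a virtual Congress.
house_seats = 435 + 435 * 14   # members plus ~14 staffers each = 6,525
senate_seats = 100 + 100 * 34  # senators plus ~34 staffers each = 3,500
total_seats = house_seats + senate_seats  # 10,025 accounts

E1_MONTHLY = 6  # Office 365 for Government E1, $/user/month
annual_cost = total_seats * E1_MONTHLY * 12

print(f"Accounts needed:  {total_seats:,}")
print(f"Annual licensing: ${annual_cost:,}")  # $721,800

# Versus the ~$909.5M in combined office allowances computed earlier:
ALLOWANCES = 909_554_475
print(f"Share of current allowances: {annual_cost / ALLOWANCES:.2%}")  # ~0.08%
```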
This cost would not only cover Lync Online to provide end-to-end videoconferencing, but a fully secure cloud-hosted Exchange Online email account, and access to SharePoint Online and Office Web Apps. Call me crazy, but that's enough for most people to be fully functional in an online workspace environment. For those who would counter a virtual Congress by pointing at security, Office 365 provides out of the box Lync Online encryption and optional fully encrypted Exchange Online email service. And for high stakes file sharing, encrypted VPN capability has been maturing for over a decade now. The technology to get this done already exists and is clearly already being used by facets of our government as mentioned earlier.
For those keeping tabs, such a cost is less than 1 percent of what we are currently allowing Congress to splurge on offices, mailings, travel costs, and the rest of the niceties that go hand in hand with running an office in Washington. Of course, my estimates don't take into account the support costs that would go into keeping this digital infrastructure humming, but I'm sure even that total would be chump change compared to the massive waste being dumped into travel and the like to maintain a traditional "in-person" presence. The status quo in Washington has a tendency to balloon costs due to red tape, administrative overhead, and other pork-barrel fluff, so the real savings could be far greater than my estimates suggest.
So what's likely keeping Congress from putting their official votes down on Rep. Pearce's common-sense bill? The culture of corruption; the era of entitlement; call it what you wish -- the logic behind the lack of real change is the same. Former President Reagan understood this dilemma best: "Government does not solve problems; it subsidizes them."
Photo Credit: Joe Wilcox
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with 8+ years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Merely a half year ago, my thoughts on Office 365 were salty at best. Outages continuously plagued the service. Its treatment of browser-based users who wished to forgo desktop versions of Outlook and Office disappointed. And spam filtering was bottom tier, proving to do little in stemming waves of junk mail. In the February 2013 release, Microsoft turned a new page and proved why it's a reliable comeback kid in the cloud.
If you don't believe Microsoft is transforming itself into a company solidly rooted in the cloud, you're clearly missing the writing on the wall. The company's past three years have been nothing short of a cloud-cluster of budding services while simultaneously sun-setting legacy on-premise products. Windows Small Business Server bid its farewell, while runaway hits like Azure sweep the Redmond, Wash. horizon. Yet even as Office 365 for consumers came out to relatively loud fanfare, the main attraction of the Office 365 product line is the business-oriented offerings.
Only seven months ago, I put Google Apps and Office 365 head to head in a cloud email showdown. Google Apps came out ahead with a comfortable lead in most aspects, and rightfully so. I lauded Google's top notch spam filtering engine; their dedication to treating browser users as first class citizens; and best-in-class release schedules for new features and bug fixes.
Microsoft listened to the IT community's (and my own) lashings on Office 365, because the recent February 2013 overhaul of the business suite is night and day compared to its predecessor. And luckily, most changes made were for the better.
For the uninitiated, Microsoft rushed to market a half-baked service called Business Productivity Online Suite, the predecessor to Office 365. This was back in 2010, when Google Apps was starting to become the hot ticket in cloud hosted email. The service itself was just as awkward as its truly maligned name: botched in execution and put forth as a competitor to Google Apps in name only.
Office 365 launched in 2011 to wipe away the mess that BPOS left, and it spent two years in the minor leagues getting numerous rookie mistakes cleaned up. On February 27, 2013 Microsoft finally put two solid feet forward and showed the world it's dead serious about cloud email.
Whereas my company turned customers away from Office 365 in droves pre-2013, we're now embracing the product with open arms. What gives? It's a multi-faceted about-face from Microsoft, and from what I've seen from the numerous customers we have deployed on the platform since, experiences have been pretty positive thus far. Some of the key areas Microsoft overhauled in the February 2013 release of Office 365 include:
In light of all the above positive enhancements, we wholeheartedly recommend Office 365 to customers now. Instead of being relegated to a last-resort option for customers who are just horrified at the vision of using Google for email, Office 365 actually presents a value proposition now that can stand on its own. While Office 365 is without a doubt the preferred option over on-premise or hosted Exchange, it's now getting a fair shake in situations where I would have solely considered Google Apps.
Online Administration gets a Welcome Overhaul
For a cloud platform, the old Office 365 had a downright pitiful online control panel. Microsoft tried harder to tame an Office 2010-esque color palette than to provide a meaningful, organized administrative experience. The result was a disparate interface, pieced together from numerous sub-sections that had zero flow or logic behind their placement. As an IT professional who administers Office 365 domains day to day, the new online interface powering Office 365 since February is a breath of fresh air.
The new Office 365 online control panel dashboard is well-organized, visually pleasing, and easy to operate in every respect. In contrast, the old interface was a daunting experience with no sense of togetherness in either layout or functionality. Now, everything just flows -- as it should.
For example, just figuring out which area of the control panel you were in used to be quite the chore. Instead of carving up the admin interface cleanly, as the new version does with a simple dropdown menu toward the upper-right corner of the screen, the old experience used disjointed text links to shuttle administrators between Exchange Online controls and regular admin features. To some, these small caveats may not matter, but they summed up Microsoft's rookie attempt at the old Office 365 to a T.
More importantly, the old control panel did not display at-a-glance health information on the current up/down status of various Office 365 facets. Let's be honest: most users who wear the administrator hat will usually log in to check on downtime-related problems. Having this critical information displayed in your face as soon as you log in is another indicator of Microsoft's eagerness to usher in an era of transparency. Perhaps it was best that the company kept this information under wraps before -- stability was a major issue with the old Office 365 anyway.
Taking an elegant page out of the Google Apps UI playbook, Microsoft decided to implement a common 2-pronged navigational structure across the entire Office 365 web interface. The left hand sidebar menu plays host to a number of "primary areas" that control sub-tasks within major sections. And straddling the top of the service is a one-click quick access navigation bar, akin to the prominent "black bar" that gives away most pages within the Google sphere. The resulting experience is clean and connected; users never have to guess at where they are, or how they can quickly change between major features in Office 365.
The new universal nav bar, which straddles the entire Office 365 web experience, takes a solid cue from Google Apps' playbook.
Even little things, like the redesigned Domains section of the control panel, make the experience that much more pleasant. You can quickly see which domains are active and functional on your account, and take corrective measures to fix issues that may be affecting mail flow or accessibility on any of them. The old interface was a bit wonky in this regard, forcing you to trial-and-error your way through multiple supporting links just to find the settings page you wanted. Minor gripes, but from the perspective of a seasoned IT person, these are the small things that make up a grander picture.
Office 365's Bread and Butter: Exchange Online
If there was one change I had to choose which stands above the rest in importance, it's that Microsoft finally got serious about competing in the browser-based cloud email wars with Google Apps. The poor excuse for a web interface that existed in the pre-2013 Office 365 was putrid. However, in true Microsoft fashion, they redeemed themselves after an extended rookie season with a now gorgeous Outlook Web App experience that pulls straight out of the refreshing Outlook.com closet.
The pre-2013 Outlook Web App that users endured. The goal was to bring the entire Outlook experience into the browser, as illogical as that seems. The result? A clunky, chopped-up interface that didn't even run well in its preferred browser -- Internet Explorer.
I have a bone to pick with Microsoft in that it still does not allow for custom login pages like Google Apps offers. I am not bemoaning the appealing default login page by any means, but customers who know about Google's custom login page capability scratch their heads in disbelief over something so trivial. Nonetheless, I easily get around this caveat by creating a pseudo-custom login link, which merely entails a CNAME record that points to Office 365's direct OWA webmail address at mail.office365.com (see the sketch below).
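For illustration, here's a minimal sketch of that workaround. The webmail.example.com hostname is hypothetical, and the snippet leans on the third-party dnspython package purely to sanity-check that the vanity record resolves where you expect -- the actual fix is just the one-line CNAME shown in the comment:

```python
# The DNS record itself (BIND-style) is all the "configuration" needed:
#   webmail.example.com.  3600  IN  CNAME  mail.office365.com.
#
# Below is an optional sanity check using dnspython (pip install dnspython).
import dns.resolver

def verify_owa_cname(hostname: str = "webmail.example.com") -> None:
    """Confirm the vanity hostname is a CNAME for Office 365's OWA endpoint."""
    answers = dns.resolver.resolve(hostname, "CNAME")
    for record in answers:
        target = str(record.target).rstrip(".")
        status = "OK" if target == "mail.office365.com" else "unexpected target"
        print(f"{hostname} -> {target} ({status})")

if __name__ == "__main__":
    verify_owa_cname()
```

Hand out the vanity address to users and they land on the standard OWA sign-in page, no custom branding required.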
Gripes aside, the power of the new web interface for Outlook Web App is striking as soon as you get into your account. The simplest items, like even making a new email message, were not well placed in the pre-2013 interface. Take a look at the old and new interfaces (the screenshots above and below are examples) and tell me in which version you can clearly find the button for making a new mail message.
The new 2013 Outlook Web App simplifies placement of the most basic tasks people were trying to complete via the browser. Yet the resemblance that the new OWA bears to Outlook 2013 and Outlook.com for that matter is very well thought out. People used to the older 2010 style are not left in the dark, yet 2013 users will feel right at home. It's the best of both worlds as far as I'm concerned.
The improvements start in the inbox, but permeate across the rest of the functions of Outlook Web App with crisp refinement. Once again, the Google Apps-esque new top navigation bar creates a unified path for movement between all of your critical areas. Gone are the disorganized, clumsy choices of the pre-2013 OWA release.
Calendar, for example, now has awesome dropdown reminders that pull from the common top bar of the screen, instead of blasting you in the face as the old release did. The calendar in OWA now also supports multi-calendar overlays a la Google Apps, which is refreshing. Working with multiple calendars is now a breeze.
This is not to say that rough edges don't exist. Sharing calendars from the online interface is still a bit of a chore compared to what native Outlook provides, which is a shame because Google Apps has been doing online shared calendars in less than three steps for years now. I have found this out firsthand after trying to set up shared calendars between coworkers at businesses we support.
Not only does the share have to be initiated by the first party, but the party receiving the share also needs to actively add the other calendar manually within their "My Calendars" area. Not the end of the world, but it would be nice to see Outlook-style ease of calendar sharing.
Simple is the new "cool" in the 2013 Outlook Web App interface of Office 365. All browsers work equally well; speed is a non-issue; and might I say, I prefer it over native Outlook. It replicates about 75 percent of what desktop Outlook can do. OWA isn't a second-rate experience anymore.
Contacts has been aptly renamed to "People" as Outlook Web App takes more cues from the social infusion that Outlook 2013 has received. Which is not a bad thing. The address book still works as it did in the past, but searching capabilities have been tweaked to a certain extent for faster results and less waiting.
An interesting new addition is the ability to connect to your LinkedIn contacts and have them stream into your People list. I have not tested this yet, but Microsoft has some decent information about this capability on their support website. I personally prefer to keep my contacts list clean and clear of social accounts, but that's up to you.
The options area for Outlook Web App has also received a feature boost. Not only is the visual layout much cleaner than the old, dated design, but you can handle tasks related to the mobile devices connected to your 365 account, as well as control how conversations are displayed in reading view. Inbox rules have gotten considerably more powerful, and you can replicate about 80 percent of the rule functionality found in native Outlook, which is a welcome change.
On the whole, is Outlook Web App finally worth using? Most definitely. Outlook Web Access from Exchange 2003/2007 and even 2010 to an extent was always considered a second (or third) class citizen when it came to basic tasks. So much so that customers would always laugh at me when I brought up even the thought of going into OWA.
The tables have turned, however, and the fact that you can use a non-IE browser and have a quick, seamless experience with most of the features your inbox has to offer is radical when looking back on the history of Outlook Web App since its inception. I have numerous customers who don't even use Outlook anymore because OWA suits their needs just fine. Microsoft has known OWA was a laughable mess since it came out, and with the new 2013 release it has created a browser-based experience worth a serious look.
Exchange Online Administration proves 'On-Premise' is Dying
While the end-user side of Office 365's email experience is commendable, IT professionals are likely concerned about the concessions they'll make by moving email administration to the cloud. Specifically, colleagues of mine in the tech industry are always curious about what nitty-gritty administrative tools or functionality they give up by moving to 365. The newest Exchange Online release and its respective Exchange Admin Center prove to be reliable tools which act as the online gateways for playing email domain God.
The million dollar question always is: how much of the traditional Exchange administrative experience does Office 365 replicate in the cloud? If I had to put a number on it, I'd say around 80 percent. Of course there is still going to be a feature disparity in some areas between Exchange Online and traditional Exchange, but the differences are dropping with each subsequent release from Microsoft. In the pre-2013 versions, I estimate that only about 55-60 percent of the features were common across platforms, so Redmond has been making steady progress here. I wouldn't be shocked if there is a 1:1 likeness between the cloud and on-prem versions in the next major update.
The innate power within Exchange Online administration is shown above, with what kind of mail flow rules can be configured across a domain. Anything from segregating messages from particular senders to applying per-message modifications based on set criteria can all be accomplished in a few clicks. I'm not missing traditional Exchange one bit.
One of the sour spots with the out-of-box experience in 365 is the still relatively useless built-in migration capabilities for email moves. While Microsoft's documentation highlights how you can choose from either Staged or Cutover migrations and the caveats that come with each, one of the biggest things that irks me is the fact that you cannot migrate on-premise Exchange boxes that use self-signed SSL certificates. Strange indeed, because nearly all small businesses we help with Office 365 moves are using Exchange and self-signed SSL certs -- going with third party certs is generally expensive and unnecessary (security purists would likely disagree, however).
For customers who have legal retention and/or archiving capabilities purchased for their domains, the online Exchange Admin Center allows for seamless management of those aspects in the same interface. These functions were usually relegated to third-party services, since Microsoft never had a clean way of doing archiving on a mass scale internally on Exchange. This notion has been turned on its head with 365, as Microsoft offers unlimited archiving on its higher end 365 plans. This puts Office 365 on par with Google Vault for easy administration of users' email archives.
Powerful, Accurate Spam Filtering in the New Office 365
Mail filtering deserves a brief discussion, since this is an area that Office 365 has traditionally fumbled in for as long as I can remember. When I consulted with customers who were on the old pre-2013 release, the agony they relayed to me about poor detection rates for their inboxes was just overwhelming. Under its old moniker of Forefront Online Protection for Exchange, the platform was quite pitiful in its spam detection levels. False positives were common, and too much junk would get through to inboxes. This was one of the big items that forced me to knock Office 365 in my previous head to head with Google Apps.
Spam filtering, and malware filtering on the same note, has gotten a big boost in the form of Exchange Online Protection (EOP) which comes enabled and bundled with every Office 365 email inbox. Microsoft's reliance on Spam Confidence Levels (SCL) in performing spam checks on incoming mail has gotten amazingly accurate, and might I say on par with what Google Apps offers through its notoriously powerful anti-spam engine.
Even though my analysis of the differences is completely non-scientific, the customer feedback since the 2013 release in February has been indicative of the improvements made. Junk is hitting the spam folder an extremely high percentage of the time, and false positives are now near nonexistent. Night and day difference? I'd say so.
Exchange Online Protection, the spam/malware filtering backbone in Office 365, is disgustingly accurate. Unlike the Forefront-powered sibling it replaced, EOP is now very capable at filtering spam and malware without any third party solution. This reduces cost, complexity, and administration time for customers looking to make the move to 365.
I have heard of customers who had malware outbreaks due to infected messages that passed through their pre-2013 Office 365 inboxes, but this is a thing of the past. Exchange Online Protection brings with it numerous enhancements for keeping malware out of end users' hands, including the implementation of multiple scanning engines. Microsoft claims that its EOP scanning engines ask for new definition updates every hour, which is better than most client PCs we handle, which pull only two to three updates per day at best. Microsoft is deeply serious about changing its tainted security image with the new Office 365.
I also want to add that these benefits from EOP extend to any users who are viewing their email through Outlook on the native desktop interface. EOP's deep integration with 365 at the datacenter level means that the desktop, mobile, and Outlook Web App interfaces all take advantage of the filtering provided by EOP. This is one of the numerous reasons why I think Office 365 has on-premise and hosted Exchange beat by a wide margin. If you are looking for the most tight-knit security solution for your cloud email, Office 365 sits at the top of the heap right next to Google Apps.
Office Web Apps is actually Worth a Look Now
If you had a bad taste in your mouth from the previous iteration of Office Web Apps, you're not alone. Microsoft has made considerable changes to how the apps work, and how accessible they are, in making the service more attractive as an alternative to the trend that started with Google Docs. The pre-2013 versions of Office Web Apps had one big, glaring flaw: documents could not be created natively from the web. This meant you HAD to have a copy of Office installed that worked with this functionality (namely Office 2010) just to access the capability at all. Even then, your ability to make substantial edits was fairly limited. It was clear Web Apps were not ready for prime time.
The newest iteration of Office Web Apps allows you to create documents right in the web browser. This alone should stoke adoption of the service quite a bit in the newest release, as people without the full Office suite installed will be able to use light web versions of Office which replicate a majority of the core basics from the bread and butter apps: Word, Excel, PowerPoint, and so on.
Word Web App replicates a decent portion of what full Word 2013 has to offer. However, take a look at how slimmed down the ribbons are between both variants. Word Web App is notably missing a few key tabs (top) such as Review, Design, References, and Mailings compared to full Word 2013 (bottom).
Another key difference is the renaming of what used to be called SharePoint Workspace to simply SkyDrive Pro. In reality, this is not the same SkyDrive that consumers are using with 7GB of free storage. SkyDrive Pro is a feature limited to Office 365 for Business accounts that provides the same underlying experience as consumer SkyDrive, but with added functionality for sharing files between coworkers within your organization.
Microsoft even released a native Windows SkyDrive Pro app a few weeks back, which means that you can take advantage of storing your data on this great new service without being tied to the new Office 2013 suite. Until now, Office 2013 was the only version of Office that offered integrated SkyDrive Pro usage with Office 365 accounts.
Creating documents through Office Web Apps right from the browser? Yes, this is now an option by default - which means you aren't tethered to the full Office suite anymore.
Would I drop my love of Google Docs for Office Web Apps? Personally, I think Google Docs is still the more mature product when it comes to browser-based document editing, especially in the realm of simultaneous multi-user editing. But that doesn't mean Office Web Apps aren't capable in many respects. Co-authoring of documents by multiple parties is now becoming more of a reality, as Microsoft realizes that people are yearning for Google Docs-like capabilities in their Office documents. Microsoft posted a video showing off the current co-authoring abilities in a sample PowerPoint Office Web App file, which was pretty neat. Let's hope these functions keep growing, and eventually match (and outpace) what native Office can do.
If Office 365 is to continue growing into more than just a bona-fide email platform, and into a full blown online workspace solution along the lines of Google Apps and Podio, then this is one of the areas Redmond needs to continue pushing the boundaries on.
Unified Messaging with Lync Online: Promising, but still Maturing
Is Lync, in its current iteration within Lync Online, the silver bullet to unified communications? No -- but the direction Microsoft is heading definitely assures me that they will get there eventually. The new additions to Lync Online are plentiful, and many of the oddities that the Lync platform exhibited in the pre-2013 release have been carefully cleaned up.
One of the biggest gripes I had with the old Lync was its downright embarrassing Lync Web App experience. In true Office 365 fashion for web browser users of the pre-2013 version, the experience was cut-rate at best. Windows users got about 50 percent fidelity with what Lync client attendees saw, and it was even worse for Mac users -- our own internal testing suggested it was likely never intended to work properly.
Mac users had to rely on Lync for Mac 2011 from the Office suite - something not all Mac colleagues have access to for one reason or another. In all, the Lync meeting experience pre-2013 was a test in patience and technical aptitude to a large extent.
Those days are gone. The new Lync Web App is full featured and plays host to a bevy of capabilities which the old iteration could only dream of offering. One of the biggest introductions is full IP audio and video, as well as desktop sharing. Meeting management is also on par with what the Lync client offers, in that meeting hosts can now work with participant lists and also schedule meetings via the web. The old Lync allowed for web scheduling, but it was pretty pathetic -- I tried it for numerous company meetings we attempted to hold over Lync, and gave up.
The fact that any web browser can be used now is also quite refreshing. Internet Explorer was the browser of choice before, which was another way of Microsoft engineers forcing people to use inferior technology just to take advantage of their cloud. Giving Chrome parity with IE now levels the playing field, and shows Microsoft is committed to continuing its path towards open standards and accepting that there is a life beyond IE.
One of the other things which people should take note of, and something we get asked about quite a bit at FireLogic, is whether Lync Online can replace your current PBX in exchange for a cloud-hosted Lync-powered alternative. For the time being the answer is "no", as Microsoft has just shut down its short-lived Lync hybrid voice solution it was running with partner JaJah Voice. Microsoft still requires that Lync Server be deployed in your local server room in order to take advantage of the "full" Lync Enterprise Voice experience.
Seeing that Microsoft is kicking dust in the face of its own Exchange product by way of Office 365, it's a shame that fully cloud-hosted VoIP is still not possible. This is one of the reasons we are moving customers over to cloud hosted VoIP from RingCentral while Microsoft gets its PBX-to-Lync transition plan together. Microsoft has reportedly put an 18-month timeframe on getting fully cloud-hosted PBX going with Lync, but this remains to be seen.
I won't spend many words on it, but the Lync desktop client has gotten a number of new features as well in the 2013 release. Numerous changes to contact display options have been added, along with tabbed conversation views and one-click video calls to boot. Further Outlook integration has also been baked in, for those who are tied to Microsoft's desktop-powered experience.
OneNote has even gotten a layer of integration with Lync in the form of shared notes which can be co-authored during a Lync meeting. While I would opt to just use Office Web Apps to achieve this, it's nice to see the desktop side getting some multi-user attention.
Out-of-box Mobile Device Management
Many people have no idea that Office 365, through Exchange Online, provides quite an impressive array of management features which can be leveraged to keep a fleet of mobile devices secure. Much of this is powered directly via Exchange ActiveSync, the protocol Microsoft developed nearly a decade ago to provide secure access to most data offered up through Exchange services. While it doesn't offer advanced functionality like remote app installation or GPS location of lost devices, if your organization is looking to cover the basics on the cheap, look no further.
For starters, you can easily specify default policies on what kind of password security is required on mobile devices. Complex passwords can be enforced, along with the number of sign-in attempts allowed before a device is wiped, as well as forced encryption. At one of the last customers we moved to Office 365, a fleet of new Galaxy S3 phones was being deployed to replace an aging Blackberry Enterprise Server system with Blackberry phones. Forced encryption worked admirably and ensured that each smartphone was completely safe from data tampering in the event that a remote wipe could not be launched in time after a phone was stolen or lost.
Are you looking to drop your own BES environment in exchange for Office 365? You're also in luck, as most subscriptions provide full support for pre-BB10 devices (OS 7.1 and below) via Blackberry Business Cloud Services for syncing and device security. This was a concern for a few customers who were tired of administering BES but did not want to lose the capabilities they enjoyed with managing their fleet of Blackberries. No need to dump those Blackberry phones in order to transition to Office 365.
Creating policies to control password requirements and encryption enforcement, to name a few, is very easy in the new Office 365 control panel. The above screenshot shows off all the available features for locking down mobile devices, and these capabilities continue to grow as 365 matures.
Going further, however, Office 365 allows you to set policies on what kind of devices can connect to your domain's email accounts based on pre-set baselines. Where would this come in handy? For example, if a business wanted to ensure that only company-issued phones were allowed to sync to the corporate 365 domain. I see this being very useful for large enterprise or government entities moving to 365, where top-down deployment of smartphones is still quite common for security reasons.
Remote wipes can be issued both by administrators and end users, which is critical for situations where IT personnel cannot get involved fast enough to prevent data leaks. The remote wiping capability works very well. In my tests wiping a Galaxy S3 through the administrative panel, the process kicked off within three minutes of the request and completely rid the test phone of all remnants of its previous information.
In all, the MDM features cooked into Office 365 out of the box are quite powerful and will meet the basic needs of most organizations. Due to this, unless customers have some advanced needs, we are relying on first-party MDM via Exchange Online for 365 customers going forward.
With so many Plans, which Way to Go?
In some ways, there are so many choices in the Office 365 family that your head can start to spin. Microsoft has created a dizzying array of SKUs for the new Office 365, and an increasingly important part of the sign-up process is narrowing down exactly what service level to choose. There is little room for making mistakes here, too: since Microsoft bars you from moving between platform levels (e.g. Small Business to E1) without starting from scratch, getting onboard at the correct subscription is critical to avoiding future headaches.
For most businesses, here are all the available plans:
There is a scattering of other niche plans, such as Lync Online and Project Online, but those are for specific situations. As I stated above, I am not very fond of either the Kiosk plans or the Small/Midsize Business plans. For one, the Kiosk plans are not entitled to any phone support from Microsoft, which businesses need to be mindful of before going on the cheap. And what specifically deters me from recommending the Small Business/Midsize plans is their standing as true "silo" plans -- in that they cannot be moved up to E plans or down to Exchange Online-only plans with ease.
The above chart does a pretty good job showing off the differences between the K and E plans, along with pertinent pricing. For the most part, all of the Enterprise level plans share common 25GB inboxes, Lync Online connectivity, and SkyDrive Pro space for document storage. E2 through E4 plans are given the ability to author documents through Office Web Apps, and E3/E4 plans have advanced features like Office desktop suite rights and archiving functions. If you want a full in-depth comparison of every single feature in the various plan levels, this TechNet blog post goes into painstaking detail for you.
I gave Microsoft a lot of heat in my previous writeup on 365 last year regarding the lack of 24/7 support for the now-dead P level plan it offered. My argument was that Microsoft was offering phone support for Exchange Online users at $4/month, but P plan subscribers at $6/month were left to support forums run mostly by community volunteers. It just didn't make sense any way you cut it, and I'm glad Microsoft has ditched that plan altogether and now provides phone support for all of its major tiers except the dirt-cheap Kiosk plans. For businesses making a move to the cloud, having that hand-holding when issues crop up is indispensable.
A lot of customers also ask me whether paying $20/month to get Office 2013 download rights is worth it. That's a question whose answer changes depending on who you are, and how much value you get from the higher-end apps of Office Professional Plus. Personally, I rarely use much of the Office suite outside of Word, Excel, and PowerPoint (although I am fond of Publisher for various marketing needs), so as a business owner, paying $20/month just to have those rights is not a deal-maker. However, if your daily needs entail usage of some of the higher-end products, this may be a consideration worth making.
Best-in-class Security, Stability rounds out 365's Value Proposition
At first glance, you may think that pricing as cheap as Office 365's must come with some concessions -- namely, in the area of security and data privacy. Microsoft pulls no punches here, and rightfully so. With Office 365 becoming a go-to service for a number of large government entities and businesses alike, 365 can't cut any corners in ensuring that organizational data is 100 percent safe.
I pitted Office 365 in a head-to-head comparison of security standards and compliance levels against some of the largest hosted Exchange providers (chart originally from my Office 365 v Hosted Exchange article in March). Hands down, Office 365 has the others topped.
You can most certainly read up on all of the various areas that Microsoft is concerned about when it comes to your data. As an IT professional myself, I've told customers time and time again that there is little we could employ in our budgets to bring our on-premise defenses to the same levels as what Microsoft has going in its data centers and cloud infrastructure. Surely there are many skeptics who will claim otherwise, but I challenge them to make good on their word. For all intents and purposes, Microsoft's unrelenting drive for top notch security in 365 is unmatched when put against hosted Exchange or on-premise Exchange.
Security & compliance is an important discussion to have when looking at prospective email providers. I have been equally impressed with Google's stance towards tough security with Google Apps over the last few years, and it's good to see Microsoft following suit and even raising the bar.
Medical institutions, for example, can rest easy that Office 365 has full stated HIPAA compliance for its services. Besides Google Apps, the only other major provider I found to be offering this was Intermedia with its Hosted Exchange product. No matter what kind of legal requirements your organization has, Office 365 seems to have all of its bases covered.
For K-12 education customers, rest assured that Microsoft also has everything in place to meet the stringent requirements set forth by FERPA laws. This is an important criterion to be mindful of for school districts moving to cloud email, as I am only aware of Google Apps and Office 365 meeting these compliance requirements.
In terms of uptime guarantees, Office 365 is held to a 99.9-percent Service Level Agreement (SLA), which equates to no more than 44 minutes of downtime per month, or under 9 hours per year. Microsoft's commitment to stability is extremely reassuring. And yes, while outages have been reported in the media over the last year or so, they are becoming less common and affecting smaller subsets of customers even when they do occur. Overall, from my own experiences with clients, Office 365 has been fairly rock solid since December or so of last year.
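Those downtime ceilings follow directly from the 99.9 percent figure; here's the conversion as a quick sketch (the monthly number lands at 43-44 minutes depending on the month length you assume):

```python
# Converting a 99.9 percent uptime SLA into allowable downtime.
SLA = 0.999

minutes_per_month = 30 * 24 * 60  # 43,200, using a 30-day month
minutes_per_year = 365 * 24 * 60  # 525,600

down_month = (1 - SLA) * minutes_per_month
down_year_hours = (1 - SLA) * minutes_per_year / 60

print(f"Max downtime per month: {down_month:.0f} minutes")    # ~43 minutes
print(f"Max downtime per year:  {down_year_hours:.1f} hours") # ~8.8 hours
```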
Tips for a Successful Migration to Office 365
There are numerous pain points we see customers hit while contemplating moves to Office 365. Here are some of my biggest recommendations for the planning phase of a cloud email move:
Microsoft's About-Face with 365 is a Win-Win
If it weren't for the February 2013 release of Office 365, I would have started writing the platform's obituary by now. Google Apps has been on a tear of rapid innovation and progress, and until recently Office 365 was anything but. I'm glad to see that Microsoft took both the community's, and my own, complaints to heart and revamped a service that now has exciting potential to forever change the way we view business email and online collaboration.
Is Office 365 a perfect solution? Far from it -- but not any worse than Google Apps, I will say. It's leagues better than what it represented a mere half year ago. I pinpointed numerous shortcomings in my exhaustive review, and still think Google Apps is a better product for some situations. However, Microsoft has shown it can make great strides in turning a crummy, half-baked solution into a solid and stable ecosystem for today's complex businesses. Here's hoping that Redmond can keep the polish coming so that Office 365 can come recommended without any reservations.
Photo Credits: James Thew/Shutterstock
Derrick Wlodarz is an IT Specialist who owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with 8+ years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, and cloud hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Some of Microsoft's greatest battles aren't being fought in the open, contentious field of constant public opinion and media coverage. If there's one thing Microsoft has always done better than the competition, it's blowing open new areas of opportunity and running with the ball on the sly. Apple and Samsung can keep their tactical flags limited to consumer electronics; Microsoft has far greater potential as a rising star in the cloud arena. The war started with its drive to push email to the cloud with Office 365, and the next leg of the battle sits with Windows Azure and XaaS dominance.
If you're under the impression that we are not yet in the era of massive, prevalent 'big data', you're wildly mistaken. Our data needs are already climbing to astronomical levels, with IBM stating that 90 percent of the data in existence today was created in just the last two years. Not surprisingly, much of this growing data is being tossed into virtual environments, whether on-premise via a VMware or Hyper-V driven route, or my personal favorite: cloud-hosted virtual machines.
Before I get too far into my thoughts about Microsoft's Azure service as a VM platform, let me say that I have been using the service on a limited basis with a few customers since about October of last year. For industry watchers out there, Microsoft quietly took Azure IaaS out of beta formally a few weeks ago. And along with attaching a nice SLA uptime guarantee for VMs (99.95 percent, currently), the company is serious about taking on the giant Amazon in a price war that has no foreseeable end in sight. If you thought Microsoft rolled out Azure as nothing more than a pet project, guess again.
As a consultant to many small and mid-size businesses in the Chicago area, I can say the move from physical servers to cloud alternatives is nothing less than hot right now. And in my opinion, cloud-hosted virtual machines are a better fit for smaller firms (25 employees and under) for many reasons. Just a few of the biggest benefits over on-premise virtual machine environments include:
There are other aspects that I'll discuss as I progress, but in general, my initial impressions with Azure thus far are pretty stellar. No, the service has not been without hiccups, but compared to the number of quirks I ran into with Amazon's EC2 during my testing over the same period, I'll take Azure any day of the week.
It's simple, very competitively priced, and affords the flexibility to handle whatever virtualization projects your organization or customers may be going through. Microsoft even offers Linux boxes on its cloud; something commendable for a company that publicly prefers you to keep Windows Server running in your network closet.
Leveraging Azure means you have access to some of the fastest pipelines to the backbone of the internet. Microsoft's data center infrastructure is arguably one of the best in the world. The above speed test from a production Azure VM (on Server 2012) speaks for itself.
Amazon and Microsoft aren't the only ones in the cloud hosted VM arena these days. RackSpace got into the game some time ago, along with numerous smaller vendors like SoftSys, PayPerCloud, and MyHosting, to name a few. I'm not here to knock all the little guys, as I only have formal experience with SoftSys outside of Azure and EC2, but they can't compete with Azure in two primary areas: breadth of offerings and price points. In these two respects, Microsoft and Amazon have the market fairly cornered.
Google launched Compute Engine back in the middle of 2012 to compete in the cloud VM market. But since the service only supports Linux VMs at this point, and has numerous rough edges to boot, I don't consider Google a major name in this arena just yet. In my opinion, the service needs to add Windows VM support to compete in the same class as Azure and EC2.
Nothing like an Old-fashioned Price War
I'm not one to champion companies merely on price alone. I know there is more to the quality equation than just pricing. But budgets speak volumes these days, and when the bar of quality is equal, pricing talks. Microsoft needs to keep Amazon at bay if it plans to keep growing Azure into a mainstream service. In the same way that Microsoft has Hosted Exchange providers beat on pricing with Office 365 as a whole, I think the company has also created a two-horse race with Amazon when it comes to virtual infrastructure services.
It helps that Microsoft is dedicated to keeping costs in line with, or cheaper than, Amazon's competing EC2 and S3 products. With the announcement just a few weeks back slashing Azure pricing 21-33 percent, Microsoft provides assurance to the technical community that it wants to level the playing field and differentiate from Amazon on feature sets -- not on face-value price points. This has helped my own customers and me consider Azure a viable competitor, even though it's a fairly new entrant to the market.
Comparing prices between Amazon and Microsoft is a pain in the rear end, and not because of Redmond I might add. Amazon's EC2 website is so backwards that there is a single page outlining a bevy of price levels, and comparing the services apples to apples requires that you reference back to a glossary page of VM instance types. Perhaps Amazon wants to create a cloud of confusion and sucker you into submission.
In contrast, the Azure cloud calculator is a simple to use dynamic form that shows pricing levels on the fly depending on the number of VMs needed based upon the performance tier chosen. You can easily mix and match vanilla Windows servers along with Linux boxes, and toss in SQL instances, among other options. Simple, quick, and dirty - and as a technician and customer alike, I appreciate when companies embrace KISS.
On the raw pricing front, Microsoft seems to be undercutting Amazon's price levels just enough to make it worth mentioning. Here is a pricing comparison taken on May 7, 2013 based on the latest publicly available rates between each service for Windows Server virtual machines. Amazon's US East pricing was used, since this represents the generally cheapest price levels for Amazon cloud services in the USA.
As you can see above, Microsoft's similarly specced Medium Windows instance beats Amazon's by more than 12 percent. Before you jump on me over the difference in memory offered: even taking that into account, there is only a 6 percent difference in memory levels between the two providers. Pitting the two as numerical equals, this still means Microsoft's pricing is over 6 percent better than Amazon's. For an organization looking to run these VMs nonstop month after month, this small difference adds up. For those curious about Amazon's reasoning behind a unit of EC2 processing power, you can read up on its official FAQ page.
The same numerical advantage sits in Microsoft's court even if you size up to a beefier VM, from the respective "Extra Large" class that each provider offers:
If a customer wanted to run one of these Extra Large instances 24/7 for a production need, it would cost them $6,289.92 per year on EC2 compared to a mere $5,529.60 per year from Azure. That's a $760.32 savings for the year. Sum that up across every extra VM in use, and the numbers start adding up to a pretty penny.
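For the budget-minded, the math is simple enough to sanity-check yourself. Here's a quick PowerShell sketch using the annual figures quoted above; the ten-VM fleet in the last line is purely an illustration:

    # Annual cost math for one always-on "Extra Large" Windows VM, using
    # the May 2013 list prices quoted above (no volume discounts assumed).
    $ec2PerYear   = 6289.92   # Amazon EC2, US East
    $azurePerYear = 5529.60   # Windows Azure
    $savings = [math]::Round($ec2PerYear - $azurePerYear, 2)        # 760.32
    $percent = [math]::Round(($savings / $ec2PerYear) * 100, 1)     # 12.1
    Write-Output "Azure saves $savings per VM per year ($percent% cheaper)"
    Write-Output ("A ten-VM fleet keeps {0} in the budget annually" -f (10 * $savings))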
Management counts, and Azure tops EC2
Another area where Microsoft clearly has Amazon beat is administration and maintenance. If you're looking to spin up a quick VM on Amazon, you had better learn the sprawling instance glossary before chugging forward, or else you could be in for a pricing surprise if you choose an instance too large for your needs. The process won't become any more user friendly until Amazon takes a long, hard look at the number of steps involved in the instance creation process on EC2.
Microsoft's dashboard for Azure (known as the Portal) is visually balanced, easy to manage, and logically organized (bottom). In contrast, Amazon's EC2 dashboard is a cluttered mess of technical jargon and links loosely collected in a left-hand tree (top). No wonder I prefer working in the Azure Portal to the rat's nest Amazon provides.
Microsoft's portal, in contrast, is clear and concise from the get-go. Looking to create a VM? Click the large "New" option at the bottom left-hand corner of the Portal, choose your desired platform, and plug away at the options you need. Visual indications on screen let you know when your VM is being built, and when it is ready to go. No second-guessing of the setup process is necessary.
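For the scripting-inclined, provisioning is just as approachable outside the Portal. The sketch below uses the Windows Azure PowerShell module; I'm reproducing cmdlet and parameter names from memory of the 2013-era module, so treat the exact signature as an assumption rather than documentation:

    # Hypothetical sketch of scripted VM provisioning via the Windows Azure
    # PowerShell module (circa 2013). Names are reproduced from memory --
    # verify against the current module before relying on them. Assumes the
    # subscription was already imported with Import-AzurePublishSettingsFile.
    Import-Module Azure

    # Grab any Windows Server 2012 image name from the gallery
    $image = (Get-AzureVMImage |
        Where-Object { $_.Label -like "*Windows Server 2012*" } |
        Select-Object -First 1).ImageName

    New-AzureQuickVM -Windows -ServiceName "firelogic-demo" -Name "testvm01" `
        -ImageName $image -AdminUsername "fladmin" -Password "S0mePassw0rd!" `
        -InstanceSize "Medium" -Location "East US"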
The other issue I have with Amazon is that administration for its different services is illogically carved up into separate admin areas. Administering EC2 happens in a different console than S3 (cloud storage), which is different still from RDS (relational database service). Why can't they all play nicely together under a unified management portal like Azure provides? If I had to get a small business oriented on managing multiple services from Amazon AWS, I don't know how I would explain it all.
Amazon's answer to managing its numerous cloud services online, shown in its entirety above. Each service has a different control panel area to learn. Azure's portal, by contrast, allows you to manage all services from a common administrative interface, reducing complexity and streamlining tasks.
I know full well that most organizations won't spend all that much time in the online interface, but there are plenty of situations where you may need to flip VMs on and off regularly or create new test environments for various purposes. Whatever the reason, Microsoft's approach to cloud service administration is much more refined and "mature", even though Amazon has been doing this longer than anyone. Seniority doesn't always win out, it seems.
Performance and Reliability earn Azure Top Honors, Study says
Microsoft may be one of the rookies in the cloud services game, but it sure plays like a veteran, according to enterprise storage vendor Nasuni. The company released its second State of Cloud Storage report a few months back, and Azure gave the competition a licking in most areas, including performance and availability. Nasuni was pretty blunt about the findings: "Microsoft Azure has taken a significant step ahead of Amazon S3 in almost every category tested".
What made Nasuni so pointed about Azure's offering? The findings come from tests run between November 2012 and January 2013.
The only area where Amazon was able to top Azure was scalability, and only by a difference of 1.3 percent between the two. "Not only did Microsoft outperform the competition significantly during the raw performance tests, it was the only cloud storage platform to post zero errors during 100 million reads and writes", the report goes on to say. While Amazon was a strong second-place contender, Nasuni clearly believes that Microsoft's Azure service is serious about taking the lead in cloud storage and services, and staying ahead of the rest.
Azure's Possibilities seem Endless
This review focused solely on Microsoft's virtual machine offerings on Azure. But with all of the available flavors on the platform, the sky seems to be the limit for Microsoft in this newfound realm. The platform already plays host to such diverse offerings as cloud-hosted SQL, mobile services, and storage blobs, and Microsoft's appetite for moving us to the cloud is not getting any smaller.
Take, for instance, the possibilities surrounding the budding identity features in Azure, specifically within Hosted Active Directory. Today Microsoft has limited the scope of this functionality to federation with existing services and on-premise AD infrastructures, but imagine when -- not if -- we can ditch all remaining local AD servers and replace them with AD in the cloud. I know this is on Microsoft's plate, and I expect it will start rearing its head sooner rather than later.
Further proof leading me to believe this is the eventual future of Azure sits within a juicy nugget of news that came out recently surrounding a pet project reportedly in the works. Under the codename Mohoro, Microsoft is supposedly looking to branch into cloud-hosted VDI (virtual desktop infrastructure), a space dominated by players like Citrix right now. If Microsoft gets this initiative right, and builds it into Azure as a pay-per-use offering, VDI could become a lot less expensive and actually viable for the average small business.
Such VDI buildouts in Windows ecosystems have traditionally been reserved for the enterprise. Why? Expensive hardware; expensive software; hard-to-manage licensing; extraordinary planning efforts; and an entire upheaval of the way organizations handle their desktop technology today. Azure has already proven it can deliver the backend to handle diverse needs. I have no doubt it could make VDI for the masses an eventual reality very, very soon.
And if Windows goes the way of pay-per-use, a bevy of other Microsoft products will follow, namely the entire Office ecosystem that has held onto its localized, desktop roots. The current push by Microsoft to turn Office into a 365-driven platform doesn't change the fact that the software is still downloaded and installed locally on your Mac or PC (Office Web Apps being the cheeky exception).
If Azure has its way in the next 5-10 years, that could all be a thing of the past. Imagine merely selecting which Office app you need, on the fly, and having it streamed to your Azure-powered Windows desktop in a matter of seconds.
Microsoft gets it (mostly) right with Azure
Is Microsoft's Azure service perfect? Of course not. It has some rough edges, but nothing that makes it any less recommendable. For one, reserving static IPs for virtual machines is still a challenge when you delete VMs that are tied to specific addresses. And Microsoft is not immune to rookie mistakes with this baby of an offering, as was seen as recently as this past February. But as with any cloud service, there will be bumps along the way, and I guess that's part of the ride. Sit tight, and be prepared for the unexpected at times. It's not like Google Apps is coasting along unscathed by downtime, either.
If you are looking for a service that has flexibility for varying needs, solid pricing, and the trust and security that comes with the Microsoft name, I don't think you could ask for much more. Azure went from being nothing more than a value-added afterthought to a full blown primetime option that I have already deployed for numerous customers, and am actively setting up for others going forward.
It will be interesting to see how viable Microsoft's yet-to-be-released VDI initiative will be on Azure, and whether or not it takes off. Some studies already find that over 80 percent of organizations are either using or investigating VDI, and this number will surely continue to grow. As much as the industry tries to move away from Redmond, quality offerings like this will only keep reeling them back in. Hopefully others like Amazon can keep up.
Photo Credit: T. L. Furrer/Shutterstock
Derrick Wlodarz is an IT Specialist that owns Park Ridge, IL (USA) based technology consulting & service company FireLogic, with over 8+ years of IT experience in the private and public sectors. He holds numerous technical credentials from Microsoft, Google, and CompTIA and specializes in consulting customers on growing hot technologies such as Office 365, Google Apps, cloud hosted VoIP, among others. Derrick is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA exams across the world. You can reach him at derrick at wlodarz dot net.
Cloud workspace platform Podio introduced another round of fresh updates on Thursday, bringing exciting new functionality to the quickly evolving SaaS offering. Hot on the heels of a major UI facelift that was released back in late April, the newest refresh brings much requested real-time chat capability with online members of your various workspaces. For my company that uses Podio on a daily basis, these additions are definitely appreciated.
For those unfamiliar with the service, I provided a mostly positive in-depth review back in December of last year. For those who have never given Podio a spin, placing a label on what it "is" takes a little effort, since it is almost anything you want it to be. The product fills the gap of online task, project, and customer management, and is much cheaper and more flexible than any other mainstream CRM offering. It also gets "professional social" right -- something Yammer forces down your throat, but Podio makes feel like a natural fit.
Design Tweaks bring Subtle, Meaningful Changes to Podio
Podio has always had a fairly fluid online interface, but it was dogged by one large problem: the Workspace menu on the left-hand side took up nearly a quarter of the entire screen. While it may be useful to have a visual reference for which workspace you are in at any given time, your intuition takes over after a certain point.
So Podio got rid of this throbbing nuisance and introduced a sliding, animated left-hand drop menu. This was a much-needed change to the overall design of Podio, since working within workspaces requires more screen real estate, especially when you start working with very complex, data-driven apps. Getting back a full quarter of the screen is the equivalent of buying a larger monitor -- without spending a dime.
The Podio team also introduced some smaller, but still noticeable, UI tweaks to other aspects of the main screen. For one, the top blue nav bar now hosts an inbox (notifications) shortcut towards the right-hand side of the screen, instead of sitting oddly towards the center as it used to. Much of this was likely changed to coincide with the introduction of the new Chat feature, which hugs the right-hand side of the screen starting Thursday.
Search has also gotten some nice boosts, including some smart predictive search technology (straight out of Google's playbook) that also presents results in real time without the need to go to different screens. For multitasking hogs like me, this is a small but timesaving UI improvement.
If Social is Where it's at, then Live Chat is King
Easily the biggest new feature added to Podio in the last six months is the live chat turned on yesterday. For someone who is dug into a few different Gmail chat windows each day for work purposes, this is exciting to try out. As my team continues to spend more time organizing projects and customer workflow in Podio, the fact that we can establish chats not only with each other, but with invited customers, is a monumental change for the way we do daily business.
The new chat feature is not only 1-on-1, but you can have team chats with colleagues, or better yet -- full blown project team chats with customers and people from your company at the same time. If the users are a part of your workspaces, then you can use Podio as a full blown integrated chat platform built around the way you do work in the cloud. This is where Yammer falls short against Podio: in bringing external colleagues or customers into the conversation.
Chat brings some finer points with it as well. Conversations can continue even after your most immediate chat wraps up, because Podio treats them as free-flowing, ongoing threads instead of separate, disparate phone calls. And Podio's extremely powerful search capability extends to chats, so you can look back and find nuggets of information from previous conversations as easily as if they happened yesterday.
You can watch a short informative intro on Podio chat on YouTube.
Coming this Summer: Integrated Audio/Video Chat
If real-time text chat doesn't strike your fancy, Podio is working on full-blown audio and video chat within the product that should break down the communication barriers even further. Not much is known yet about the extent of these new rich multimedia features, but judging from first screenshots, the changes will turn Podio into a full-fledged, presence-heavy online workspace platform.
Could Google Hangouts or even Lync be getting a run for their money from Podio? It's too early to say, but I like what I am seeing. As a dedicated Podio customer building out most of my business processes on the platform, it's reassuring to see that newfound owner Citrix did not give the Podio team the forgotten child treatment. As soon as Podio rolls out its planned rich multimedia functionality, I hope to present a revised review of the product focusing squarely on real time collaboration capabilities -- something which I don't think has been this drastically changed since the rise of Google Docs.
Podio is a SaaS-powered workspace platform that runs 100 percent in the cloud and is completely free for companies or organizations of up to five people. External colleagues or customers you invite to join your workspaces are also completely free. For organizations over five people, the cost is only $9 USD per person per month. Pricing includes full access to the platform for internal workspaces and comes with access to the Android and iOS apps for mobile use. You can learn more over at Podio's website.
It's almost as if some in Congress forget that we've been down this path before. Garbage legislation, now under the moniker of the Marketplace Fairness Act, has been discussed in various guises and masks over the last 20 years or so. Streamlined Sales Tax. Remote Sales Tax. Distant Sales Tax. They've been tried, debated and debunked each time before.
But it's funny how larger-than-ever state budget deficits perk up the ears of slimy congressmen on the umpteenth attempt at an Internet sales tax. While proponents like J Marra, writing for BetaNews this week, are in favor of this bill, I stand tall against it, without hesitation.
Speaking from Experience
As a small business owner myself, already reeling from the hours wasted each year wading through red tape, I have every right to be fired up about this new effort. Even though my business only does taxable commerce in Illinois, I have numerous colleagues in the IT industry who would be affected by this legislation. Any time government imposes new taxes while claiming it is merely filling in the gaps, there is cause for alarm.
Remember when big government lied to us and claimed that Obamacare would save money for everyone across the board? It's the same flawed thinking that leads some to estimate costs rising for individual plan claims up to 32 percent under the massive legislation. Rule of thumb: the more government gets involved, the worse off we generally are.
Just about 20 years ago, the Supreme Court made a landmark ruling in Quill v. North Dakota that set an important precedent for today's debates. In that case, the court said, rightfully so, that obligating businesses to collect taxes on behalf of jurisdictions in which they have no presence was too burdensome to enforce and expect of them.
Yes, you can say that the case was rooted in the debate surrounding mail-order catalogs and their sales across state lines, but the basis of the argument is the same. Should businesses large and small be held liable for collecting taxes on any number of items they may sell to any person online in the vast United States?
Small Business Nightmare
If this junk legislation passes, it means that businesses with online sales across state lines will be liable for tax collection across roughly 9,600 different jurisdictions in the country. As if small business owners didn't have enough paperwork and red tape to wade through already to keep their businesses legal.
Proponents of this smelly pile wave away the tax liability mess by claiming simplified "tax calculation software" will be available to affected merchants. Reality check for Congress: QuickBooks has been around for over 20 years now, and taxes have not gotten any easier for US small businesses. I'm not sure how another band-aid on a broken, over-complicated tax system is going to make life any easier or the burden any less troublesome.
One of the bill's biggest backers, web giant Amazon, has come out in favor of the legislation. Not surprisingly, Amazon already has the vast, expensive technical infrastructure in place to handle such broad tax collection. So much so that it's even offering its expertise in the form of tax compliance services to other businesses. The online retailer has more to gain financially from this bill passing than just "leveling the playing field", as it claims publicly.
The 3,007 counties of the USA, shown above, are just a sliver of the jurisdictions in which small businesses would be liable for collecting taxes if this legislation passed. All because they survive by selling across state lines. Whatever politicians call the legislation, it's the same tax increase that has failed time and time again in Congress. (Image courtesy of: mapsfordesign.com)
It's also interesting to note that this bill has zero language addressing how states would force brick-and-mortar stores to collect taxes for purchases made in person by out-of-town residents. After all, the name of this bill is the Marketplace Fairness Act, and in the interest of fairness, shouldn't brick-and-mortars be held to the same legislation that is burdening their out-of-state competitors?
Five states -- Alaska, Delaware, Montana, New Hampshire, and Oregon -- collect no state sales tax at all. So if I were a Washington resident, I could make the short trek into Oregon and get away without paying any sales tax. Yet sitting back in my recliner, ordering from the same vendor in the comfort of my home, would yield a fully taxable purchase. That's fairness? Depends on who you ask, I guess.
Bill sponsors claim that brick-and-mortars cannot be held to the same standard because, presumably, it would be too difficult to question each buyer about what state and county they come from. Yet, hypocritically, they argue that imposing the same burden on online retailers is justifiable, because they can wrangle in some legal language promising cost-effective tax calculation software that will make the process seamless.
No Silver Bullet
If there's one thing I know as a small business owner, it's that nothing surrounding government legislation is as easy as it's portrayed. I don't care how much software you toss my way.
Here's a suggestion for a Congress hell-bent on raising taxes: how about focusing efforts on cleaning up our broken tax code instead? It's already been shown that Americans as a whole waste 6.1 billion hours annually merely complying with federal tax laws. At a standard 2,080-hour work year, that's the equivalent labor of roughly 2.9 million full-time workers! Talk about waste to the nth degree.
While no silver bullet plan has emerged as of yet, the attention Herman Cain received for his (flawed, but commendable) 9-9-9 flat tax plan was a step in the right direction, at least as food for thought. If Congress were discussing ways to reduce tax loopholes and administrative overhead and complexity, we wouldn't have to talk about ways to raise taxes on Americans to fill budget deficits.
While the Senate tries its best to ram through the Marketplace Fairness Act as fast as possible, I urge all commonsense Americans to not only sign the public eBay petition against this grimy monstrosity, but also reach out to their representatives and tell them why you are opposed to any new tax increases of this nature.
Legislation supporters claim this is about fairness to the mom-and-pops losing money to online retailers, yet the very ones who will be hurt most are the small-timers selling on the web, who will be burdened with more red tape, overhead, and administrative waste.
Fix the glaring mess we already have, Congress -- then perhaps we can discuss just cause for more taxation.
Photo Credit: Jane0606/Shutterstock
Many people considered this company irrelevant and dead years ago. Yet with nearly three million paying Internet service subscribers still on the books, this provider is anything but dried up -- yet. Internet access, among other subscription services, makes up a clear majority of its continuing sales and its greatest chunk of profits. Subscriber growth peaked back in 2002, but for this aging Internet heirloom, at this point it will no doubt take what it can get. Who the heck am I referring to?
Don't choke on your coffee, but it's none other than AOL -- namely, its dial-up Internet service division. It's hard to believe that in the year 2013 any company has more than a trickle of subscribers left on dial-up, but this attests to the sad state of broadband adoption in the United States. Of the estimated 74 percent of Americans who have Internet access at home (2010 figures), a full 6 percent are still on dial-up service. A myriad of issues affects broadband adoption, including lack of access, pricing, and reluctance to switch.
A full 19 million Americans sadly don't have access to any form of broadband. And in adoption rate per capita, our country ranks a miserable 15th globally -- behind the United Kingdom and South Korea, to name just a few. Much of the Internet is abuzz about Kansas City's recently completed rollout of Google Fiber, with its near-gigabit speeds delivered directly to the home.
Even with slow expansion into other small markets, like the recently announced Olathe, KS, the excitement over Google Fiber is premature by all reasonable measures. One giant (Google) is getting into the fiber game while another (Verizon) is slowly exiting after making similar market promises of "fiber to all" just a half decade ago.
While I'm all for nationwide fiber like the rest of us, as an IT consultant by day I know the tough realities of broadband penetration in the diverse urban and rural mix that is the USA. Here are some of the roadblocks that the pure optimists continue to overlook.
Verizon's Lesson with Fiber: a Messy, Messy Game to be in
Simple economics explains much of the problem with fast-paced fiber rollout stateside. In many ways, Google could (and should) treat Verizon's failed attempt to take FiOS nationwide as a cue to what it can expect from its own forthcoming efforts. As of the middle of last year, FiOS reportedly held a subscription base of roughly 5.1 million households. In comparison, the largest cable (coax) broadband provider in the USA, Comcast, holds somewhere between 17-19 million subscribers right now. No wonder Wall Street held Verizon's feet to the fire, ultimately forcing its hand on exiting this 'wild west' of a service buildout.
Most people clamoring for Google to expand Fiber don't know the half of what goes into a fiber network build. Before Google or Verizon can even start taking pre-orders for such service from households, there is a nightmarish mix of legal, financial, and technical boundaries that need to be overcome. Franchise agreements generally have to be negotiated with each and every municipality that is up for consideration. If other providers have exclusivity contracts in a given area, let the legal wrangling begin. And the mess only continues, as a myriad of permits are generally needed to install fresh lines in a small urban area -- including state and local permits, as needed, depending on which roadways and areas are controlled by a given authority.
No better example is needed than the delays Kansas City, Missouri is running into with its promised Google Fiber rollout. Local cable providers there are up in arms over Google's preferential treatment by local authorities. More importantly, there is a big rift over how the new fiber lines are to be installed in the community. The preferred, and cheapest, method is utilizing existing cable-ways via electric poles, but this requires (very) specially trained crews that are quite costly to hire. The current status page for the city shows the first neighborhoods getting service this month, but that timeline can already be taken with a grain of salt, given the lack of news on progress there.
Google Fiber promises gigabit speeds at lower prices than even high-speed coax cable providers. But is Google the savior to usher in nationwide fiber for all? Possibly, but due to numerous challenges, I doubt it.
The level of paperwork and overhead required in a fiber rollout pales in comparison to the cost complexities of actual installation. As seen above, while running fiber on electrical poles is ideal and logical, it is often met with fierce pushback from municipal higher-ups, alongside the many others who share those infrastructure pathways -- electrical companies, cable providers, and telephone companies, to name a few.
Another option, and one which is less susceptible to weather and the elements, happens to be underground fiber, but this has numerous drawbacks of its own. Digging permits, existing utility lines, and costs related to reconstruction after installation all dog these endeavors to the point where rollout either slows down indefinitely or gets abandoned altogether.
Sprawling suburban areas like my own backyard of Park Ridge, IL provide enough headache for companies like Wide Open West engaging in limited fiber expansion for key areas. I can only imagine what would happen if Google had to install underground fiber in the center of our next-door monster neighbor, Chicago.
Rural America is always an Afterthought
Over 14.5 million Americans, all located in rural areas, still have zero access to any form of broadband. Troubling indeed, and surely part of the reason why AOL can still lay claim to nearly three million subscribers. But if you had to pinpoint one reason why broadband penetration in the sticks is so poor, you wouldn't find a single culprit to point fingers at.
Cities and towns blame lack of provider willingness to expand; providers blame staggering costs to build out oodles of infrastructure for a scattering of residents; and rural Americans refuse to pay boatloads for sub-standard broadband when dial up is still relatively dirt cheap. In simple words, the entire situation has been anything but a win-win for any side.
The realities of expanding access in the vast heartland of America are mind-boggling. Many small towns sit dozens, if not hundreds, of miles apart from one another, and the cabling, manpower, and future upkeep for relatively limited return on investment is what keeps many providers from stretching their reach. The numbers just don't make sense any way you look at them. Even if a small rural town were connected with full fiber access to each neighborhood, the provider would have to outlay nearly all of the capital expenditure necessary to get it in place.
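To put rough numbers behind that claim, here's a back-of-the-envelope payback sketch. Every input is invented purely for illustration -- none of these figures come from Google's, Verizon's, or anyone else's books:

    # Hypothetical payback math for a rural fiber build. All inputs below
    # are invented for illustration, not actual provider costs.
    $costPerHomePassed = 2500    # construction cost per home passed (hypothetical)
    $takeRate          = 0.30    # share of passed homes that actually subscribe
    $monthlyRevenue    = 70      # subscription revenue per customer per month

    # Each subscriber effectively carries the cost of the homes that don't sign up
    $costPerSubscriber = $costPerHomePassed / $takeRate               # ~8,333
    $paybackYears = [math]::Round($costPerSubscriber / ($monthlyRevenue * 12), 1)
    Write-Output "Roughly $paybackYears years just to recoup construction, before upkeep"

Even under these friendly assumptions, the buildout takes nearly a decade to pay for itself -- and that ignores operating costs entirely.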
Surely, Google could charge new customers for a fair share of their homes' installation fees, but in order for customers to bite, the price would have to be right. There's no way that Google's fee schedule for Kansas City would hold true for any regular rural community. As opposed to larger urban and suburban centers that take advantage of the economies of scale that come with such locales, rural America would be forced to foot a larger sliver of the installation bill -- or expect Google to do so. If Verizon's FiOS is any lesson for us on fiber rollout, don't expect the latter to hold true in the long term.
While the 2010 initiative known as the National Broadband Plan was devised to solve these dilemmas, the program has raised more questions than it has answered. Wild cost estimates have yet to be hammered out (with some guesses putting such expansion at close to $350 billion USD), and numerous voices of opposition claim the NBP will only stifle innovation and growth, and ultimately lead to higher prices for rural areas. A federally backed plan to address broadband is definitely desirable, but in its current form, the NBP is not living up to expectations.
The Chicken or the Egg Dilemma: Subscribers or Penetration First?
This is probably the trickiest aspect of any proposed fiber rollout. Does there have to be a minimum number of potential subscribers in a given area before a provider will promise a fiber expansion? Or does the provider take an educated gamble and just build out blindly? The real answer lies somewhere between the two.
ISPs (like Google) of course want a solid commitment from a given percentage of customers in a community. But most Americans won't jump on the fiber bandwagon without letting the service mature a bit and outgrow its first-iteration bugs. This natural hesitation scares any potential fiber provider, as its success lies directly in pulling in as many first-generation customers as possible to sustain continued service growth.
Verizon's CFO made it pretty clear last year that the company was opting to raise prices in the face of limited customer adoption. What choice did Verizon have? It has spent heavily since 2005 to get FiOS rolling in numerous markets (primarily situated on or near the East Coast), and while it has no new plans for fresh expansion, it does have existing market commitments to uphold.
Fiber isn't something you can roll up and change in relatively short order, the way wireless cell networks moved from WiMAX and HSPA+ to LTE, for example. Fiber is a huge up-front expense, not unlike our crumbling copper telco network -- once it's in the ground, someone needs to support it one way or another, high prices or not. Those who made the early move either bite the bullet on higher pricing, or switch back to the older technologies they presumably left for that very same reason. Go figure.
Much of the opposition to fiber adoption could indirectly come from regular citizens who have no intention of switching service providers at all. Even in my broadband-plastered suburb of Park Ridge, our company FireLogic still has customers on dial up service who refuse to upgrade. They are under the illusion that "it just works" and they don't need anything better.
In many cases, the service is either on par with or cheaper than the lowest tiers of broadband, so the case against dial-up becomes even harder to make to these individuals. The same goes for customers on sub-standard DSL who will not consider faster options like cable. For many, what they have is what they intend to have for the long term. Fiber won't change their minds unless extreme price hikes make moving a necessity. The path of least resistance keeps services like AOL operating for a sizable minority of their subscriber bases.
But if raising prices is Verizon's way of battling limited uptake, doesn't this defeat the goal of cheap fiber for the masses? It most definitely does. It also allows market pricing from competing cable and DSL providers to stay artificially high. As if cable monsters like Comcast needed any more reason to keep prices inching upward. Premature "harvesting" of customers, as analysts are calling Verizon's FiOS moves, is no way to establish and sustain nationwide fiber expansion.
No one knows if Google will be forced into similar corners with its Google Fiber service if adoption doesn't meet expected levels. But if so, those clamoring for the search giant to bring its Fiber service nationwide could start to rethink their wishes.
Google Fiber for All? Lofty, but unlikely
In a perfect world, Google Fiber would blanket America. But if the above roadblocks (and Verizon's own troubles in fiber) are any indication, we need to tone down our expectations quite a bit. Even if Google had the motivation and financial backing to take the service across the USA, it would hit major urban markets first and then crawl outwards into the suburbs at a slow, painstaking pace. Our nation's biggest coax providers started their cable buildouts in the 1960s, and today we are still struggling to extend those networks past suburban areas. History tells an obvious lesson -- infrastructure growth in the United States is a tough-as-hell endeavor, and not for the faint of heart.
The most likely scenario for Google Fiber is a staggered introduction into new markets every few years. The biggest cities would act as the incubators, providing subscriber padding to back capital expenditures and to gauge capacity and upkeep needs for the longer term. Suburbs would then receive service capability once their neighboring saturated urban cities were solidly covered, and those out in rural America would likely be left out in the cold indefinitely.
Unless government subsidies make it financially feasible to bring above-ground fiber to these rural areas, I can foresee Google handling these customers just like Verizon is itching to: cutting off fiber expansion at some point and merely pushing high-speed cell coverage through some fashion of partnership with the likes of Sprint, AT&T, Verizon, or T-Mobile.
So while it's great to stay optimistic about Google's Fiber plans for the rest of America, let's keep our expectations in check. This entire discussion about nationwide fiber is a topic that feels more like a "been there, done that" style debate when it comes to American infrastructure. Everyone wants it, yet no one knows how to both pay for it and keep it sustainable for the long run. Here's hoping that Google can pull off a miracle, but like many, I'll believe it when I see it.
Nearly six months ago, I weighed in on the Google Apps vs Office 365 debate and let it be known that (at the time) I fully believed Google Apps was the better platform in many respects. Fast forward to February 27, and Microsoft showed why waiting for the second (or third) try on a given product is usually a good bet. In all honesty, I think Microsoft has been on the right track with Office 365 for four to five months now, introducing quality features and fixing the stability issues that plagued its reputation in the past.
I'll go so far as to say that the Office 365 ecosystem has been nothing short of respectable lately. My technology consulting company FireLogic has steadily recommended the suite as a reliable alternative to Google Apps for some months now, and the results have been extremely positive. Heavy Microsoft shops moving away from their legacy on-premise Exchange servers are itching for a new home, and the company finally has a cloud of its own that is living up to even my stringent expectations.
A big question a lot of customers are asking now is: why shouldn't we just move to hosted Exchange? And that's an entirely valid debate to have. I've worked with countless customers over the years who have been on a bevy of providers, from AppRiver to Intermedia to RackSpace, to name just a few. While the experiences were generally good to great, I just don't think they match the value-added entirety that Office 365 brings to the table now.
For Microsoft, time is generally on its side. Just two years ago, when Office 365 still flew under the Business Productivity Online Suite flag, Redmond's cloud suite was a hodgepodge: loosely connected by company name only, and lacking a majority of the big features that companies rely on from traditional Exchange. Microsoft well understood that the underwhelming collective it was selling to the masses fell short. Redmond came full circle last month when it unveiled a true toe-to-toe alternative to on-premise or hosted Exchange: the modern Office 365 for Business.
While price is certainly one factor where Office 365 reigns supreme over hosted Exchange offerings, it is not the only merit that takes it over the top. Businesses and organizations want the entire package -- security, stability, functionality, scalability, and elasticity -- in addition to a cost-effective bottom line. I fully believe that Office 365 is finally delivering on the promise that Microsoft's budding cloud vision entailed a few years ago.
Hosted Exchange Providers can't touch Office 365 on Price
As a technology consultant by day, I know that price alone should not decide which platform a business chooses. But cost does affect the bottom line, and it inevitably needs to be weighed heavily. This is one area where Office 365 just blows away the competition. The chart below exemplifies how well Microsoft has leveraged its tremendous weight in cloud economies of scale to bring Office 365 pricing down to extremely affordable levels.
A quick glance at the above comparison clearly outlines a few important items. Firstly, Microsoft's pricing for Office 365 (at the Email Only level) is dirt cheap -- a mere $4/person per month which equates to a lowly $48/person per year. That's nearly half the cost of the next cheapest provider, Intermedia. But the advantages on cost don't stop at the monthly price tag alone; Microsoft has the other big names beat on a few other key areas hands down.
First off, while ActiveSync support for mobile devices such as iPhone, Android, and Windows Phone is a given these days, BlackBerry support (pre-BB10) is not a universal privilege. Of the five providers showcased in my comparison, only one, Office 365, has native complimentary BlackBerry email/calendar/contacts support out of the box. The other providers tack on nominal, but still extra, monthly fees for BlackBerry usage, which could definitely sway a final decision. Much of big enterprise, and some small businesses, still run large fleets of BlackBerries, and having to pay a surcharge to continue using them is something many SMB owners don't want to hear.
When it comes to security compliance and certifications, Microsoft also has the third parties beat by a long shot. Other than Intermedia, which does post HIPAA compliance, the other providers are either not advertising their credentials (which I doubt) or simply don't have them under their belts. Office 365 boasts HIPAA and FISMA compliance, two important factors which make the suite a fully capable option for the public government sector and for healthcare-related institutions.
Of course, a few providers have Microsoft beat in some aspects. For example, AppRiver makes the slightly audacious claim that its hosted Exchange service affords "unlimited" email space. What the provider calls unlimited, we all know, is merely a glass ceiling that isn't publicized. Similarly, RackSpace takes the same approach in advertising a 100-percent uptime guarantee. I've heard from customers of theirs who have experienced outages, similar to this first-hand account, which RackSpace brushed off as not falling under its strict SLA terms. Whether or not those claims are outlandish is up to your own discretion, but I prefer Microsoft's down-to-earth, honest approach to advertised capabilities and uptime.
Businesses want Security, and O365 delivers
I'm not here to say that hosted Exchange providers aren't secure. That isn't the case by any means. But on a level playing field, Office 365 holds a sizable advantage: alongside the primary non-Exchange alternative, Google Apps, it has to be the single most secure email platform going. I put together another feature matrix highlighting the same providers I pinpointed on pricing earlier.
The important differentiating factors between Office 365 and everyone else happen to be EU Model Clauses capability, along with FISMA certification. The EU Model Clauses are a contractual framework created by the EU to establish international data transfer standards. While most major providers fall in line with the commonplace EU Safe Harbor guidelines, only Office 365 advertises Model Clauses capability to customers.
FISMA (Federal Information Security Management Act) is the other, much more important, credential Office 365 has achieved. Specifically, this makes Office 365 fully acceptable for usage in any U.S. Federal Government agency. Google Apps is the only other large cloud email service to have attained this level of security. And government acceptance of Office 365 is on the rise: Microsoft has inked major deals with the City of Chicago, the U.S. Department of Veterans Affairs, and the State of Texas, to name just a few. While this could be partly due to Microsoft's immense datacenter investment strategy and overall size, it still speaks volumes about the company's ability to handle some of the most delicate email needs within US borders.
Is this to say that hosted Exchange providers like Intermedia and RackSpace aren't suitable for small business? Not at all. They may well be fully capable of meeting your needs. But at nearly half the price of even the next cheapest option, Office 365 presents a true bargain when you take into account all of the security backbone you receive. The second-best options on the list are a tie between Intermedia and RackSpace; each is missing three security credentials that O365 holds.
On the Extras, O365 goes Above and Beyond
If we were to end our comparison here, Office 365 would go home a solid winner. But some companies look for options above and beyond the plain-Jane features described above. And this is where Microsoft's first party solution outshines the competition once more.
Extras in the realm of cloud hosted email systems come in many forms. But the most popular ones being advertised today include SharePoint access, Lync capability, and download rights for some or all MS Office applications. Indeed, the disparities between providers are fairly wide, with some providers offering some extras in areas that others don't, and vice-versa.
But the comparison easily shows that Office 365 offers the widest array of possibilities out of the box. To be clear and fair: my matrix in this area merely highlights whether a provider affords access to a given flavor of value-added service, and if so, what the associated cost happens to be. The chart isn't as apples-to-apples as the previous two, but it does show the possibilities across all of the providers listed, on various items that may be of importance to an organization.
Without a doubt, it's fairly easy to see why Office 365 is the most flexible offering here. While the lowest Office 365 Email Only plan does not provide all of the extras listed in the chart, some level of subscription to the suite provides each and every option you may desire. And that's the key benefit of Office 365: the choice is up to you. You can have as much or as little of the pie as you'd like -- a winning combination of cloud elasticity and scalability alike.
Microsoft doesn't win on every category above, mind you. On legal archiving, 365 takes a back seat to RackSpace Hosted Exchange, whose quite excellent add-on plan, at $3 extra per seat per month, provides full legal archiving and retention capability. In the grand scheme of the entire package that 365 provides, however, this is a small shortcoming that will likely be rectified as time goes on, seeing how Microsoft has every other provider beat on features and pricing for nearly every other aspect.
The rest of the positives Office 365 brings with it are pretty self-explanatory. None of the hosted Exchange providers provide downloadable Office suite rights; the few that offer anything just pony up standard Outlook licenses, which pale compared to having full Office download rights (for multiple devices, too) for just slightly more money through Microsoft. Office Web Apps -- browser-based versions of Word, Excel, PowerPoint, and so on -- are another exclusive to 365. Lync capability also comes cheapest through 365, with the next best bet being Intermedia at a steep $15/month per seat.
It's worth noting that SharePoint offerings from the hosted Exchange providers are also expensive, when available at all. Office 365 includes SharePoint Online access starting at the lowly $8/month E1 plan; the next closest option is Intermedia, which bundles SharePoint rights into its $11/month plans and higher. The same goes for Active Directory federation and/or SSO capability -- only Microsoft provides the purest SSO option, in the form of Active Directory Federation Services, which ties natively into your optional existing on-premise AD domain. Intermedia offers a third-party directory sync option, but at a slightly higher $11/month-per-seat starting point.
Is your business looking for future headroom with whichever direction you are heading in terms of email hosting? If so, I think Office 365 is the natural option that provides the greatest elasticity on features and scalability for your workforce. Price isn't everything, but Office 365's feature set and rock-bottom pricing is a tough option to say no to.
The Finer Points matter, Too
There are a few qualitative advantages that Office 365 carries over the various hosted Exchange providers, some of which may matter more to a prospective organization making the switch. It's critical to bring these to light, because too often we get lost in a pure price-versus-feature comparison. These considerations are equally important when moving from a legacy email system into the cloud:
Continuous, gradual improvements vs "Big Bang" upgrades. Office 365 is on a similar, but not as radical, development cycle as Google Apps. The entire suite undergoes continual, evolutionary upgrades as opposed to hosted Exchange, which delivers the same traditional approach to on-premise legacy Exchange platforms: what you see is what you get (and will have) until the next major release -- and even then you may not get moved without some friendly prodding of your provider. Most business owners I speak with prefer a consistent upgrade path that introduces new features and fixes slowly, as opposed to putting people through revolutionary shocks between major stepped releases.
What level of uptime transparency does the provider have? Like Google Apps' own Status Dashboard, Office 365 provides a very similar Service Health section within your control panel interface. I ran a quick check across a few of the hosted Exchange providers, and I couldn't find any public information on service status or outages. They may offer intra-control panel dashboards for service status information, but be sure to ask about this if you are considering someone else outside of Office 365 or Google Apps.
What level of integration do you want from your cloud email solution? It goes without saying that Office 365 is one of the most tightly-integrated cloud email offerings around, next to Google Apps. While some hosted Exchange providers offer bits and pieces of the same experience, none of them delivers the seamless, full-circle packaging across all aspects of the cloud services you may be subscribing to. This could be the difference between a simplified, easy-to-maintain platform and one with numerous moving parts that need to be separately controlled and serviced.
How much control do you want over your platform? While the Office 365 online control panel provides access to about 80 percent of the intricacies of the service, everything else can be reached directly via the powerful PowerShell command interface (see the sketch after this list). Hosted Exchange providers may be able to provide the same level of access, but generally they need to keep a tighter grip on the reins due to the very nature of how hosted Exchange works. Office 365's version of Exchange Online is built from the ground up for cloud usage, which makes it technically the sounder solution in most cases (but not all).
Don't get lured in solely by offers of free migration from hosted Exchange providers. Migrations can get costly depending on your current setup, but keep one important fact in mind: migration costs are one-time fees, while recurring monthly fees are just that -- recurring -- and last for as long as you are on the platform. As you add more users to the service, your monthly costs grow at the same pace. Hosted Exchange providers love to highlight their free migration services and sway the discussion away from their generally (sometimes substantially) higher monthly fees. Good business sense indeed, but as a consultant, I'm equally responsible for giving my customers honest, cost-minded recommendations for their ongoing needs.
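To make the PowerShell point from the list concrete, here is a minimal sketch of remoting into Exchange Online as it worked for 2013-era Office 365. The connection URI and cmdlets shown were the documented approach at the time; the specific mailbox tweaks are just illustrations, so adapt them to your own tenant:

    # Minimal sketch: remote PowerShell into Exchange Online (2013-era
    # Office 365). The connection URI and cmdlets were the documented
    # approach at the time; the mailbox tweaks below are just examples.
    $cred = Get-Credential          # your Office 365 admin credentials
    $session = New-PSSession -ConfigurationName Microsoft.Exchange `
        -ConnectionUri "https://ps.outlook.com/powershell/" `
        -Credential $cred -Authentication Basic -AllowRedirection
    Import-PSSession $session

    # The kind of knobs you won't always find in the web control panel:
    Get-Mailbox -ResultSize Unlimited | Select-Object DisplayName, PrimarySmtpAddress
    Set-Mailbox "jdoe@example.com" -ProhibitSendQuota 24GB

    Remove-PSSession $session       # tear the session down when finished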
The points above are definitely things every organization needs to consider beyond the traditional feature/price matrix. Since switching costs are inherent to any cloud provider move, it's always best to make the right decision up front and stick with a given platform for a number of years -- instead of hopping and flopping between providers.
Either Way you look, On-premise Exchange should not be on Your Shortlist
We can debate the differences between Office 365 and hosted Exchange all day, but one thing we can hopefully agree on is that on-premise Exchange simply doesn't have a place in the modern small business (50 seats or fewer). Microsoft already nudged the SMB market away from on-premise Exchange by killing off Windows Small Business Server, and I'm certain that even the new Server 2012 Essentials offering is merely a placeholder to buy small businesses time as they make their way to the cloud. Customers are even approaching us about moving Active Directory into the Azure cloud, which could bid farewell to servers in the small business workplace entirely.
The modern small business needs agility, flexibility, and a reduction in reliance on static servers that need painstaking ongoing maintenance to keep them operational. Are there situations where servers or on-prem makes sense? Sure -- but I fully believe those scenarios are few and far between in 2013, and even less so going forward, especially for the SMB market.
If your organization is planning a move to the cloud for its email needs, don't fall victim to price wars or steep promises alone. Do your homework, compare your options, and make your own informed determination. While I'm not writing off hosted Exchange entirely, most of my customers who have switched to Office 365 are quite happy on all levels, including price, features, and headroom for optional upgrades. Sit down with your trusted technology consultant and lay all the cards out on the table.
While I don't want to blindly champion Microsoft's first party offerings, with the new Office 365, they're truly on to something pretty great.
Photo Credit: 2jenn/Shutterstock
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over 7+ years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
We all know software vendors have vested interests that sway some of the decisions they make. When I heard that Microsoft was the real driving force behind a sly K-12 school privacy bill making the rounds in Massachusetts, I immediately smelled something rotten. While the public purpose behind the bill aims squarely at protecting student privacy, it's not hard to connect the dots back to Redmond, Wash.
Even though it's easy to see why Microsoft would prop up such a bill (to stem Google Apps' rise in the K-12 educational market), I question the long-term business sense of such dirty grandstanding. Microsoft's Office 365 for Education is already free for students and staff of any qualifying school district (just like Google Apps), and the suite is pretty darn good competition for Google on technical and functional merit alone. So what's the sense in playing dirty just to sign on a few more seats here or there based on misinformation?
Shady maneuvers like this almost always backfire in one way or another. Apple may have scored a legal victory against Samsung recently, but its behind the scenes efforts to get retailers to ditch Samsung devices on shelves only angers those who enjoy free competition and choice. And how well have Apple's attempts to keep popular Google apps, like Voice, out of users' hands turned out? They soured Apple's image, and in the end likely turned more users against the company than anything else.
School Privacy Bill with Microsoft Roots
At face value, the bill up for consideration in Massachusetts has some pretty agreeable stated goals. It aims to prohibit any company providing cloud services to schools in the state from using mined information for any commercial (read: advertising) purposes. Both Microsoft and Google compete toe-to-toe for K-12 district mindshare in the cloud email space, and Google Apps has steadily chalked up conversions over the last few years. My own former high school district and employer, Maine 207, was one of the first large high school districts in Illinois to make the move to Google's ecosystem, starting back in 2008.
Microsoft is oddly standing behind its efforts in backing the bill. Microsoft spokesman Mike Houlihan says, "We believe that student data should not be used for commercial purposes; that cloud-service providers should be transparent in how they use student data; and that service providers should obtain clear consent for the way they use data. We expect that students, parents and educators will judge any proposed legislation on its merits". Not to spill egg on Microsoft's face, but does this legislation even apply to Google Apps in the first place?
As a consultant to many school districts considering switches to cloud email, I have this discussion quite often. It's an area where I fully believe Microsoft and Google are on a level playing field, and I never recommend one service over the other on privacy issues alone. Lifting one provider over the other on this line item alone is like winning the game because the other team didn't show up to play -- satisfying, but it doesn't really prove anything, as both Google and Microsoft handle privacy concerns very well.
A quick check of Google's public Terms of Service for the Google Apps for Education suite speaks volumes about the way commercial interests (primarily ads) are handled for schools. Simply put, ads aren't displayed or used. By default, Google disables all ads for student and staff accounts; the only place ads are allowed is on alumni accounts, which don't technically qualify as student or staff accounts in the first place.
Microsoft, Google squirm around Different Facets of Email Scanning
The larger discussion at hand, especially in light of Microsoft's recent "Don't Get Scroogled" campaign, surrounds the fact that both companies have scanned, and still do scan, all of your email whether you like it or not. Microsoft's attempt to frame the debate strictly around advertisement targeting is intellectually dishonest to the nth degree. Since we now know that neither Google nor Microsoft targets ads at students or staff at schools, what exactly are they scanning for, then?
A lot of things, actually. Spam detection and control, for example, wouldn't be possible without contextual, precise filtering of incoming and outgoing messages. Does this constitute a breach of privacy? I don't believe so, and I have stood behind this belief for quite some time. Painting a vendor in a bad light because its algorithms decipher the spam level of email messages -- merely to keep our inboxes clean -- is a far stretch.
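To make the distinction concrete, here's a toy naive-Bayes spam scorer in Python -- a minimal sketch of the kind of statistical scanning described above. The word counts are invented for illustration, and no real provider's filter is anywhere near this simple.

import math
import re
from collections import Counter

# Tiny, made-up training counts: how often words appeared in spam vs ham.
spam_counts = Counter({"winner": 25, "free": 30, "prize": 18, "meeting": 2})
ham_counts = Counter({"meeting": 35, "report": 28, "free": 5, "lunch": 20})

def spam_probability(message):
    """Score a message between 0 (clean) and 1 (spam) from word frequencies."""
    log_odds = 0.0
    for word in re.findall(r"[a-z']+", message.lower()):
        # Add-one smoothing so unseen words don't zero out the estimate.
        p_spam = (spam_counts[word] + 1) / (sum(spam_counts.values()) + 2)
        p_ham = (ham_counts[word] + 1) / (sum(ham_counts.values()) + 2)
        log_odds += math.log(p_spam / p_ham)
    return 1 / (1 + math.exp(-log_odds))

print(spam_probability("You are a winner! Claim your free prize"))  # high
print(spam_probability("Agenda for the budget meeting"))            # low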
Funny that Microsoft should take the stance it does, since the company has some uses of its own for targeted email scanning. Straight out of its own Services Agreement:
For example, we may occasionally use automated means to isolate information from email, chats, or photos in order to help detect and protect against spam and malware, or to improve the services with new features that makes them easier to use.
Even though it is not serving up ads, isn't it fairly easy to foresee that Microsoft may be commercially gaining from your information? Wouldn't an overall better user experience ideally lead to more new customers of Office 365 for Education? I'm not a legal scholar, but you can come to your own logical conclusions.
Similarly, another snippet from the same policy reads:
When you upload your content to the services, you agree that it may be used, modified, adapted, saved, reproduced, distributed, and displayed to the extent necessary to protect you and to provide, protect and improve Microsoft products and services.
So again, if your definition of "commercial benefit" is strictly confined to advertising, then Microsoft using your data to steer the direction of its services could be a moot point. I happen to differ, and this is why I label the Scroogled campaign and the subsequent school privacy bill battle as intellectually empty in all respects.
Keep the Fight focused on Cost and Technical Merit -- Not Legal Battles
Consumers always win when competition is high and barriers to entry are low. It's a fact of our wonderful capitalist system, and the reason we're free to use Gmail or Outlook.com, Google Apps or Office 365 in our school districts, and all the other services these two great companies have to offer. I'm not penning this article to talk down Office 365 on technical merit -- in that area, it's a solid competitor and oftentimes a better option than Google Apps.
But let's not debase this cloud services battle into one of patent-law pickpocketing, dishonest privacy-rights campaigns, and backhanded attempts at keeping competition out of the K-12 market. If Microsoft spent as much time, energy, and money on showcasing the great benefits schools can enjoy through Office 365 (as it has quite wonderfully done in its Surface commercial campaign), then perhaps we wouldn't be discussing the legal filler in its terms of service.
Email scanning is here to stay for as long as we need spam protection. Unless, of course, you prefer wading through hundreds of junk emails each night. No, thank you. I'll give the cloud providers the benefit of the doubt here.
Photo Credit: Igor Zakowski/Shutterstock
The competition in the Google Apps backup market is steadily ramping up, with more than a few contenders jumping in lately to grab a piece of this newfound market. Just two months ago, I wrote about my (mostly) positive thoughts on Apps backup provider Backupify. But in order to do the competition justice, I decided to give the other popular alternative, Spanning, a run for its money.
Your choices don't stop at Spanning and Backupify, in case you're wondering. Google stepped into the backup arena with its first-party Vault solution early last year, which takes the crown as the most integrated option (for obvious reasons). Some of the junior vendors in this space include CloudAlly and SysCloudSoft; these two try to edge out Spanning and Backupify on pricing, but they are not yet as established, so it is tough to judge them on cost comparison alone.
For Google Apps administrators looking for a quality cloud backup solution, both Spanning and Backupify fit the bill at face value. Each provides varying degrees of data backup for the Apps platform, and both are quite cost-effective. But as I outlined in my writeup on Backupify back in December 2012, it does have a few shortcomings -- namely, no price discounts for the educational/nonprofit sectors, and some questionable limitations like no true Google Sites restoration and forced "all or nothing" restores as opposed to per-item selections.
Spanning is Well-organized and Visually Appealing
I know backup platforms aren't here to win fashion awards, but visual organization and a clean UI go a long way in creating a fully functional service. Case in point: Backupify doesn't have the worst UI in the cloud arena, but it surely wouldn't win any trophies from me. It took me a good 15-20 minutes to get acquainted enough with Backupify to truly understand how to navigate the entire interface and how it grouped backups, settings, and so on.
Spanning spared me the lesson in software navigation. In short order, I understood that moving around the backup service entails two simple concepts: first, knowing what level of your domain you are browsing (the uppermost "domain" portion of your account, or a particular user account), and second, the settings and restoration options available at that level. The contrasting color scheme goes a long way toward creating a simple experience that doesn't dumb down the interface, but saves me from having to dig into the FAQ section to get my bearings.
Even non-technical admins can make their way around Spanning. The interface as a whole is a pleasure to navigate and you're never left wondering about where to go next.
Spanning takes a more Google Apps-esque view of assigning backup rights to accounts in your domain. Whereas Backupify opts for a simple on/off approach to which accounts are backed up, Spanning uses the license method that Google Apps admins should already be used to (since Google Apps makes you purchase seats as licenses). I wouldn't say I necessarily favor one approach over the other, but you can quickly reassign licenses and see how many are left at a glance.
Another area where I like Spanning in the visual department is its Status History page, which shows a simple dot-matrix chart of how backups went for each user over the past 30 days. It's clever, simple, and intuitive, and it reduces the need for admins to waste time digging into status reports unnecessarily. In many ways, this page takes its cues from Google's own Status Dashboard, but I don't mind. The idea works well.
Checking on your backup stats couldn't be simpler. Green is good, anything else deserves a second look. Clean, yet effective.
There are other small touches that aren't crucial at the outset but become useful in more nuanced scenarios. For example, there's no way in Backupify to view how much backup space each user is taking up. In Spanning, this is easily done on a single page that displays the largest users on top, using simple bars of varying sizes to denote storage space used. Other nice touches persist throughout the service, like big, bright buttons for changing options and visual cues that hit you over the head when something is wrong.
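Building that kind of "largest users on top" view is straightforward once per-user storage totals are in hand. Here's a rough Python sketch; the accounts and figures are hypothetical, and a real service would pull them from its own backup metadata rather than a hard-coded dictionary.

# A quick sketch of a "largest users on top" storage view.
# The accounts and figures below are made up, for illustration only.
usage_gb = {
    "principal@district.org": 18.4,
    "teacher1@district.org": 2.1,
    "frontoffice@district.org": 7.9,
}

widest = max(usage_gb.values())
for user, gb in sorted(usage_gb.items(), key=lambda kv: kv[1], reverse=True):
    bar = "#" * int(40 * gb / widest)  # bar length proportional to usage
    print(f"{user:26} {gb:6.1f} GB  {bar}")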
Backup and Restore Capabilities are Top-notch
I have to bite my tongue a bit, because I gave Backupify some high accolades for its impressive capabilities -- that was, until I laid my hands on what Spanning offers Google Apps administrators. The level of features baked into the service is quite stellar, especially when you look at the bevy of options provided for restoration.
For starters, Spanning introduces some welcome settings that let you adjust things like backup exclusions per account. By default, the service prevents shared documents and calendars from being backed up to a third party's (the sharing recipient's) account, but as an admin, you may opt to change these settings for select users. In similar fashion, email under particular labels can be excluded from backups if it serves little value to the user or organization.
The crown jewel of Spanning's prowess is its Restore page, which allows each user's data to be plucked back from oblivion. Here the differences between Spanning and Backupify grow even greater, because Spanning allows for point-in-time restoration on top of multi-item restoration. Want to bring back a copy of a Google Doc from two weeks ago, as well as a Google Presentation from two days prior to that? With Spanning this is as easy as can be, since the interface allows for complex searches that are visually succinct and easy to follow.
Spanning lets you see your inbox the way it stood at any point in time. Talk about detailed restoration.
Another important difference is that Spanning can restore the entire folder structure of your Google Drive account. In Backupify, to the best of my knowledge from past tests, folder structure was not kept intact, and large, complex accounts became a miserable mess if a full restoration was needed after a data disaster. Spanning not only brings those files back but recreates where they sat, saving you endless hours of cleaning up your cloud storage drive. A sketch of the idea follows.
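Conceptually, point-in-time, structure-preserving restoration boils down to picking the newest version of each file at or before a chosen moment. This minimal Python sketch illustrates that selection; the data model (per-file version histories keyed by folder path) is my own invention, not Spanning's actual implementation.

from datetime import datetime

# Hypothetical version store: each file path maps to (timestamp, snapshot) pairs.
versions = {
    "Budget.gdoc": [(datetime(2013, 1, 2), "v1"), (datetime(2013, 1, 20), "v2")],
    "Plans/Q1.gsheet": [(datetime(2013, 1, 5), "v1"), (datetime(2013, 1, 25), "v2")],
}

def restore_as_of(point):
    """Pick, for every file, the newest version at or before `point`,
    keeping the original folder path as the key."""
    snapshot = {}
    for path, history in versions.items():
        eligible = [snap for ts, snap in sorted(history) if ts <= point]
        if eligible:
            snapshot[path] = eligible[-1]
    return snapshot

print(restore_as_of(datetime(2013, 1, 15)))
# {'Budget.gdoc': 'v1', 'Plans/Q1.gsheet': 'v1'}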
The level of control provided on restores really does matter. Think about it: do most users truly make a mess of their entire inbox or Google Drive account at once? Not likely. Most of the time, in my own consulting experience, end users lose single files or emails and want the ability to bring those select items back from the dead. Spanning simplifies this process down to its essentials, and I can't commend it enough for this capability.
An item that may or may not matter to some organizations is that Spanning can handle full restores of Google Sites within a domain. While Backupify does back up Sites, it can only offer exports of them -- not true restores. Many organizations I consult for are choosing Google Sites as an alternative to SharePoint these days, so this could be another tipping point in your own comparison search.
A colorful interface that has a dead simple menu structure. "It just works".
In keeping with its dedication to granular file control, Spanning goes a step further and allows for multi-file export. Want to download three files from someone's Google Drive? You can do it in just a few clicks. Spanning's online interface resembles a simplified Windows Explorer, so even the greenest Google Apps admins won't feel overwhelmed performing such tasks. Backupify has export capability across the various Apps, but its exports are primarily full-service -- the "all or nothing" mentality I mentioned previously, which is nice, but not great.
Spanning doesn't come without its flaws, however. While it lets you, as admin, select whether end users can or can't change their backup settings, it doesn't let you pick and choose which options users may adjust. It would have been nice to see Spanning allow admins to let users choose which mail labels get backed up, while still forcing full backups on each Drive account if necessary.
Also, while you can see at a glance how much storage space each user occupies on the service, you can't tell which services account for the used space. Spanning could take a few functional cues from FlashPanel on this point, since that service excels at showing how much data resides in both Drive and email at any given time.
Minor grievances aside, Spanning provides an all-around experience that exemplifies my wish list for a cloud backup provider. Some companies excel in function, some in form -- Spanning seems to have hit the mark in both areas, and for that, a much-needed kudos.
Competitive Pricing and Proven Security Credentials
Lots of companies are very particular about the security credentials of the auxiliary services my company recommends in the course of Google Apps and Office 365 consulting. And while I was pretty impressed with Backupify's claims about security, Spanning puts its money where its mouth is, having achieved full SSAE 16 audit status.
This accreditation certifies not only the level of security and the standards in place for Spanning's technology backbone, but everything else encompassing employee protocol, customer communication, and much more. If everything else leaves Spanning and the other backup providers neck and neck, this should be the thing that brings it over the top.
Before you purchase, Spanning provides the obligatory 14-day free trial, and getting signed up is a piece of cake. On the pricing front, Spanning has another clear win. The service costs a flat $40/year at face value ($35/year if you can find a discount code), and most importantly, there is no storage cap. Backupify's lowest-end tier places a 35GB cap on storage space per user, which could easily become a problem for execs who have 20GB in their email inboxes and another 20GB+ of Drive files to manage. Sure, you could step up to Backupify's $4/month plan, but Spanning still has it beat on cost for the same feature set, as the quick math below shows.
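Here's that back-of-the-envelope comparison as a quick sketch, using the price points published at the time of writing:

# Annual cost comparison at the published price points.
spanning_annual = 40.00          # flat per year, no storage cap
backupify_annual = 4.00 * 12     # the $4/month tier without the 35GB cap
print(f"Spanning:  ${spanning_annual:.2f}/yr")
print(f"Backupify: ${backupify_annual:.2f}/yr "
      f"(${backupify_annual - spanning_annual:.2f} more)")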
I also knocked Backupify for its lack of discounts for the educational and nonprofit sectors. Spanning, in contrast, has been offering a lofty 25-percent discount for these two important markets, which will definitely play a big part in many organizations' decisions. Google's own Vault may be the one platform that undercuts Spanning on cost, as Google offers free Vault service to all students at any K-12 domain that purchases Vault for its entire staff (at a cheap $10 per user per year). But given Spanning's feature set and multi-faceted capabilities, Vault is tough to recommend over it.
Whichever direction you head with your organization, remember that the cloud still needs to be backed up like any other on-premise system. While it's very true that Google Apps and Office 365 both have the technical underpinnings to prevent systemic failure or data loss, they do nothing to prevent user-induced data loss. This is where you need some kind of solution in place, ready to help restore files in times of need.
The cloud is great as a whole, but the need for data backup is something we just can't seem to shake. At least Spanning makes it foolproof.
Photo Credit: T. L. Furrer/Shutterstock
While Betanews isn't usually a place for political discourse, I'm going against the grain on this one. It's because I strongly believe the real answer to solving our serious gun crime problem in America rests in something most readers on this site tend to embrace: technology. More specifically, what we refer to as Big Data. I fully believe we have a data problem, not a gun problem. While the debate at large focuses on reaching the same end goal, the fingers point at the wrong solution.
Big Data, in my opinion, does have a spot in this debate. While Robert Cringely one month ago wrote why he believed just the opposite, I think we have more than enough examples of where Big Data has been helping more than hurting. If you listened solely to the press conferences politicians hold in Washington, you'd almost come to the conclusion that all the guns used in recent crimes pulled their own triggers. There seems to be a steady forgetfulness that nearly every recent mass tragedy was actually perpetrated by individuals with some form of mental illness. But this doesn't stir the headlines the same way gun debates do, so the topic gets swept to the wayside.
We've got a serious problem in America, and I don't think it lies in magazine round capacities or (mislabeled) assault weapons. It's that we have no reasonable technological backbone in the form of data collection and sharing that can track individuals at risk for committing these types of violent crimes. New York City successfully leveraged Big Data (CompStat) to fight its own crime problem; we've tracked sex offenders with ease down to the city block for many years already; and the USA has managed a national detailed terror "No Fly List" for nearly a decade now.
So why can't we effectively track mass data on potentially violent threats afflicted with mental illness? It's a very good question, and one that will only be logically tackled once we get over the largely theatrical gun control debate.
The Dead End that is New Proposed Gun Control
It's a shame that so much congressional energy is being spent on clarifying what visual aspects of guns constitute assault weapons that politicos like Sen. Dianne Feinstein (D-CA) want to ban. The biggest sham about the new proposed assault weapons ban, for example, is the established fact that most violent crime isn't even committed with so-called "assault weapons". Handguns are the most prevalent choice for mass shootings.
If you're hesitant to believe the true numbers, consider a sampling of six of the most violent recent school shootings in the United States.
Of just that sampling of six recent violent US school shootings, only two involved semi-automatic rifles (the center of the current assault weapons ban controversy). The rest involved some form of handgun, supporting the established statistics on US gun crimes at large.
A standard S&W M&P15 sporting rifle, commonly referred to as an AR-15. Contrary to reports, "AR" merely stands for "ArmaLite" -- the company which originated this style of weapon. And it fires only in semi-automatic mode, unlike a true "assault rifle", which is capable of fully-automatic fire. Gun paranoia fuels this kind of media misinformation.
While there is no strong correlation between assault weapons and recent mass school shootings, mental illness has a staggering presence among the suspects involved. Using the same sampling of perpetrators, look at how many had some form of publicly reported mental illness.
In some form or fashion, all of these individuals had one thing in common: mental instability. That's right, every single one of them. In fact, a majority of them had traceable histories of instability that were brought to light only after it was too late. I'd argue it's not a gun problem we're fighting at all, but a lack of coherent direction on information sharing and mental health access. Both sides of the issue are at fault to some degree.
New gun legislation is likewise doomed to fail at solving the core issue of gun access, because even the US Department of Justice concedes that among state prisoners jailed for gun crimes, 80 percent obtained their weapons through "street buys" or other "illegal sources". This raises the question: if criminals aren't following the law to begin with, what makes us believe new regulations will keep weapons out of their hands? If I'm connecting the dots right, these stricter gun laws will merely give the bad guys another leg up over law-abiding citizens.
If there's one thing I would recommend in light of all the above data, it's that broadening our concealed carry laws could have the biggest impact on keeping death tolls down when shock killers decide to strike. The evidence already points to lower murder rates in states that allow concealed carry compared to those that do not. And there's even movement to get teachers prepared to fight back, like the Utah educational workers who are training to carry concealed weapons in school -- something already legal in that state.
Renowned security expert Larry Correia said it best in a recent blog post on the gun control debate: "Gun Free Zones are hunting preserves for innocent people. Period".
The Data is out There, We just refuse to make sense of It
If there were a poster boy for the mess that is mental health information sharing in America, it would be Seung-Hui Cho, the 23-year-old troubled college student who took 32 lives plus his own on the campus of Virginia Tech in 2007. The signs that were present, and the data already available, were overwhelming. Lucinda Roy, for example, was co-director of Virginia Tech's creative writing program and experienced Cho's condition first hand. "[He was] the loneliest person I have ever met in my life", she told ABC News.
Roy attempted to get Cho help through official channels, but met resistance from higher-ups who cited "legal hurdles" to getting Cho the help he needed. The university even made public the mental health records it held on Cho, noting several discussions between Cho and school health specialists in which he described ongoing symptoms of depression and anxiety. Three therapists had the chance to talk with Cho before his murder spree, to no avail. They are not necessarily to blame -- the broken system is.
The more you look into the past lives of these school shooting perpetrators, the more commonalities you find in the signs and data. While not every shooter leaves behind such a vivid trail of mental illness evidence, most instances offer enough concrete data to say that "something" could have been done. The problem at hand is that we have no discernible way of making sense of all this data. How do we store it all? Where do we organize it? Who's to manage it?
These are the kinds of questions we should ask now, and work towards a solution once and for all. The terror tragedies of 9/11 forced Washington to get serious about a No Fly List, and the results have been impressive overall. Many recorded (and some likely unrecorded) instances of potential terrorism were stopped dead in their tracks thanks to this cohesive national database.
The New York Police Department also showed how the right technology can empower change -- specifically, reducing crime and predicting future problems, edging toward what the movie "Minority Report" made famous in its screenplay. GIS and database technology formed the basis for CompStat, a comprehensive system so powerful that it is now in use in numerous US cities such as Austin, San Francisco, and Baltimore. In fact, CompStat has been so effective that by 2001, one third of the country's 515 biggest police departments had adopted some iteration of its methodology.
Big Data isn't perfect, however, especially when government is involved. Take, for example, the FBI's spotty recent history in getting its multi-million dollar pet project, Sentinel, off the ground. That Big Data project went completely over budget, blew well past its projected launch date, and was eventually taken back in-house for final development. The in-sourcing effort further bloated costs and finally produced a usable data warehousing platform by 2012 -- over a full decade after the information mess that allowed 9/11 to take place.
Information Sharing is Key to Preventing the Next Sandy Hook
I'm not an expert in federal information systems, or even smaller state-level database platforms, and I don't have the single answer to what our next steps should look like. But I do know, as a rational American, that connecting the dots on the real issue with mental health information sharing isn't that difficult. The patterns exposed among a majority of these school shooting suspects should lend urgency to the question of how we plan to share the information already being collected, and how to best leverage it to help prevent the next disaster.
Sadly, until we get real about tackling our information sharing problem, the Adam Lanzas and Seung-Hui Chos will continue slipping through the cracks. The unfortunate part is that the breadcrumbs are already out there; without a technologically powered way to sift them, sort them, and make use of them at the proper levels, we can't do much. And so the political theater will continue in Washington, while the underlying cause of mass shootings gets swept aside, because gun control debates steal more headlines than mental health discussions ever could.
Want to get involved in making real change? Contact your congressional leaders and tell them to address the mental health information problem. Even banning every future assault weapon sale won't do a thing to prevent the next Sandy Hook or Columbine.
Photo Credit: mashe/Shutterstock
Even though Microsoft's lips have been sealed shut on the topic, the launch of Office 2013 (and the new Office 365) is imminent. Speculation is fueled by Office 2013's appearance on the company's Home Use Program (HUP) website, something which has customarily preceded most prior Office launches. If the show is about to begin, then all of these preparatory moves are quite the indicator.
Martin Brinkman provided a wonderful in-depth preview of Office 2013 this past summer, and the great majority of what he covered is still valid in the final bits. I've personally been using an MSDN copy of Office 2013 Pro Plus since late October and am quite pleased with the product. Ever since Microsoft dabbled with x64 capability in Office 2010, its developers have been building on the memory and security benefits that 64-bit provides, and the result is a smoother, safer Office experience. Microsoft posted a long TechNet article on the benefits of x64 Office 2013 last summer.
There's a surprising level of confusion out there, even among fellow IT professionals, about what the new flavors of Office entail. And rightfully so, with Microsoft making reference to Office 2013 and Office 365 interchangeably as of late -- even coining a new unified term, the "New Office". After some careful research on the web and doing some of my own digging, I've come to a pretty good understanding of where Microsoft is headed with everything Office-related. Here's the full scoop.
Office 2013 = Office 365 = "New Office"
Office 2013 is all but confirmed for a January 29 launch under three different-but-related product names. You may see reference to any of these monikers, depending on which route you're heading with your Office upgrade path. Thanks to Microsoft watcher Mary Jo Foley, we now have a pretty crisp picture of where all these editions fall into line.
The key distinction between the editions lies, first and foremost, in whether you plan on going the cloud-based subscription route or the boxed copy route. The more traditional physical copies of Office 2013 follow a pattern similar to previous releases, especially if you are familiar with Office 2010's flavors. For $140 you will be able to pick up the Home & Student edition; Home & Business will run $219; and Professional will cost $400. Office 2013 Standard and Pro Plus are available through the educational and Software Assurance channels only, meaning you won't be able to purchase these in any store.
Microsoft truly believes, however, that the cloud subscription route (Office 365) is where the public wants to go. I'm personally a boxed-copy kind of guy, so the subscription model is a tough sell, but I can't predict what others will think of it. Adobe paved the way for the pay-as-you-go model when it took Creative Suite to the cloud last year, and Microsoft has followed suit with Office.
Looking at Microsoft's pricing matrix, you'll notice how aggressive the company is with the monthly price tags, especially at the Office 365 Home Premium and University levels. Home Premium is probably the juiciest option of all, allowing up to five devices on a single subscription for under $9 per month. This is the edition Microsoft hopes will lure over people used to the boxed Home & Student edition, which has offered multi-user installation rights since version 2003. Microsoft has nixed this nicety in Office Home & Student 2013, presumably to bait home users into switching over.
Another item that confuses some folks is which editions of Office 365 will most resemble the existing O365 offerings aimed at businesses -- more specifically, the flavors that feature Exchange Online, Lync Online, and so on. Through various outlets, it has been confirmed that the Small Business Premium level and higher will be the ones containing downloadable Office 2013 copies alongside Microsoft's hosted Exchange Online, Lync Online, and the other business-centric items companies care about.
These new business-focused subscription levels will sit alongside Microsoft's current Office 365 lineup, which has price points for every comfort level. I'm personally curious to see where Microsoft takes the new Small Business Premium offering, since I called out the current Small Business (P1) plan pretty heavily in a previous op-ed on Office 365. One of my biggest gripes about the current P1 offering is its lack of 24/7 phone support, even though the Email Only plan (which costs $2 less per month) gets full phone support. Hopefully Microsoft listened to the complaints about this discrepancy and will make some changes later this month.
Current Office 365 Customers get Hit with Improvements as Well
One of the nice parts about the upcoming unified Office 2013/Office 365 launch: Microsoft is going to extend quite a few improvements to its current Office 365 customer base as well. Existing customers won't be left behind; they will be notified via their web portals about a looming "refresh" that introduces quite a few new features and enhancements. Seeing as I've already moved some of my FireLogic customers to Office 365, this will definitely be a pleasant surprise for them.
Some big things are slated to hit O365 during the next refresh cycle.
The new landscape of Office 365 is becoming clearer, as the lines blur around what Microsoft considers "traditional Office software" as we used to know it. Soon the "New Office" will be just that: a fresh, bold direction for a flagship product that has withstood the test of time. Microsoft proclaimed it was "All In" with the cloud a few years back; now we can safely say it wasn't joking around.
Want your Boxed Copy of Office? It will cost You (a Bit) Extra
If there's one way to stick it to those who prefer to get their software the old-school way, it's to tack on an artificial price premium. Not only that, but Microsoft is killing off one of the finer points of purchasing the boxed edition of Office (the Home & Student edition, specifically): every flavor now carries a single-user install limit. Seeing that Home & Student 2013 costs a few more bucks than 2010, it's more than a little upsetting.
It's fairly clear that Microsoft's reasoning behind the price premiums is to slowly carrot-and-stick users onto its Office 365 plans. Joe Wilcox does a good job of outlining the new price points for the boxed editions of 2013, and the results won't win over any fans who were planning on upgrading via DVD. License to license, that's a 180 percent increase between the Office Home and Student 2010 and 2013 versions, and 76 percent for Home and Business. Looked at differently, Microsoft nearly trebles the cost of Office Home and Student 2013 for anyone wanting three licenses.
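If you want to check that math yourself, here's the quick arithmetic, assuming the commonly cited boxed prices -- a $149.99 three-install Home & Student 2010 box versus the $139.99 single-install 2013 license (treat the exact figures as assumptions on my part):

# Per-license cost jump from Office Home & Student 2010 to 2013.
old_per_license = 149.99 / 3      # 2010 box allowed three installs (~$50 each)
new_per_license = 139.99          # 2013 allows a single install per license
increase = (new_per_license / old_per_license - 1) * 100
print(f"{increase:.0f}% increase per license")             # roughly 180%
print(f"Three 2013 licenses: ${3 * new_per_license:.2f}")  # nearly triple one 2010 box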
Was it a good idea on Microsoft's part to raise these prices so sharply? Yes and no. If you believe in the manifest destiny of Microsoft's Office 365 cloud path, then yes, winning over users via a price war may be the ticket. However, even among my own customer base, there are many users who refuse to pay for a subscription on a piece of software that has always been known for its boxed copy releases. Changing that attitude among those potentially upgrading will be a difficult sell.
SkyDrive is on a Bold Crash Course with Google Drive, Dropbox
Perhaps one of the most under-the-radar aspects of Microsoft's recent wave of updates is the continual push to drive SkyDrive adoption up and up. To a certain extent, it's working quite well. My Google Apps customers have loved Google Drive since it came out in mid-2012, but SkyDrive's "me-too" approach to the online storage market is a welcome competitive strategy. In my own informal testing, it's a solid alternative to Google Drive, and for already-heavy Office users, it may be a no-brainer.
It's nice to know that SkyDrive now offers 7GB of free storage space (topping Google Drive, which has only 5GB free), which is pretty commendable seeing as the popular Dropbox offers a measly 2GB. Office 365 Home Premium users get an even sweeter deal, because Microsoft tosses in a bonus 20GB of SkyDrive space per user. That storage stacks on top of the free 7GB, bringing you to a whopping 27GB at no extra cost. I have my reservations about moving to a subscription-based model for my Office needs, but this may be the freebie that brings me over the top.
And unrelated to SkyDrive, Microsoft announced 60 free Skype minutes per month on Office 365 Home Premium subscriptions. If you happen to be one of those folks who increasingly use Skype as a phone replacement, this could be of good value to you as well.
Will Microsoft's push to unify Office and Office 365 be a Winning Strategy?
It's anyone's guess whether the cloud model for Office is the right way to go. At the outset, yes, the pricing is darn attractive and the fringe benefits scream value, especially next to the boxed editions. But tossing freebies and likable pricing at users won't change the fact that Microsoft has a perception gap to hurdle. No matter which way you slice it, Office started life as a boxed product, and there is a sizable user base that will likely never see it any other way.
I'm going to give Microsoft the benefit of the doubt on this experiment. The critics claimed that Exchange users would never take their email to the cloud, yet Office 365 rose to glory by tearing down the notion of on-premise servers. If Microsoft has its way, more of its core software base will move to a subscription model, paving the way for a fresh new era at Microsoft -- perhaps the start of the company reinventing itself and preparing us for a future of paying monthly for the software we need.
Is this necessarily such a bad approach? While I'm personally an old-school guy when it comes to my software, I think if the price is right and the terms are customer-centric, then the cloud can actually be a good thing. As users we are already accustomed to paying monthly for our cell service and internet service, so adding Office rights to that mix may not be so crazy after all.
An excellent debate has been raging on the front pages of BetaNews for the past few weeks, and it's a topic I feel quite entrenched in. Seeing as my computer repair business FireLogic deals with customers of all types on a daily basis, I thought I should drop my own two cents in on the subject. Joe Wilcox has argued the death knell for the PC is just about here, while a few others, like Wayne Williams (and myself), dispute the notion with vigor.
I think this topic deserves some definite attention because there seems to be a perception out there that the rise in mobile devices such as tablets, smartphones and the like will completely eradicate the traditional PC. It's a touchy topic for my colleagues in the computer repair industry, and something that is frequently debated on the forums of a website dedicated to "our kind" over at Technibble.com.
Like many computer repair businesses, I've had to come to the realization that being multi-faceted is the only way to survive. The field of computer repair has changed drastically in just the past five years or so, and not because of the decline of the PC per se. The truth of the matter is that computer lifespans are longer today thanks to better hardware, better operating systems (yes, Windows 8 even surpasses 7 in my book), and the surplus of processing power newer machines ship with, which sits idle for the front half of their ownership. Whereas my customers were getting 3-4 years out of machines back in the early 2000s, they now push their PCs to 4-6 year replacement cycles without much sweat.
Getting by these days on traditional desktop/laptop repair alone is not possible -- at least in our suburban Chicagoland area in the USA. But we're a company of many faces today, knee deep in Office 365 and Google Apps consulting, networking/WiFi work, and many other new-age technologies that go beyond the traditional scope of shop-based computer repair. Our industry is evolving rapidly, and keeping up is just the name of the game.
While my own opinion stands directly opposite that of PC obituary writers such as Wilcox, I have also observed the purchasing and usage trends of my own customer base. Our company deals with a healthy mixture of residential, commercial, educational, and nonprofit customers, and we are often their first point of reference when they make buying decisions.
I fully admit that the acceptance of tablets and smartphones has been on a steady rise for the past two years. But I think we are mistakenly equating tablet/smartphone uptake with a direct drop in PC/laptop ownership. That, I say, is complete malarkey. Consumption is wonderfully handled on a 4-inch phone or 7-inch tablet, but none of my customers has dumped their PCs to go mobile full time. And I know exactly why.
We're in a PC Plus Era, not a post-PC Era
As exciting and fresh as the mobile era is, it ain't replacing the traditional PC (whether it be desktop, laptop, ultrabook -- whatever your chosen flavor of the day) anytime soon, and there are a few hard reasons why.
Much of this is merely a logical progression in the evolution of modern computing, just like many other products we rely on in daily life. When microwaves came out, we flocked to them because they made simple, short cooking tasks that much quicker. Yet I can't name a single person who got rid of their traditional oven in favor of one. The same goes for anyone who purchases a scooter for the daily commute to work: cutting down on gas costs is great, but you wouldn't dump your car unless you live in the heart of a big city. The same can be said for our newfound affection for tablets and smartphones.
I personally do a lot of email and browsing on my BlackBerry 9810. It comes with my line of work. But would I dump my PC in favor of just the phone? I doubt it -- and no, an Android or iPhone wouldn't change my answer. While I'm on the go, a smartphone makes sense. But in the comfort of my office or home, the smartphone gets put aside, and my trusted ThinkPad laptop is the only computing device I'd choose to spend more than 30 minutes on. In my view, non-mobile computing on a regular computer just makes sense.
And judging from my own observations of customers' habits, they tend to be in the same boat. Yes, a decent minority of them have purchased tablets such as iPads or Nexus devices, but they haven't eliminated their PCs in any way. They turn to their tablets and smartphones for the short-burst email responses or to spend a few minutes browsing the news, but when Microsoft Office isn't functioning, I'm still the first one to hear about it.
As odd as it sounds, some of these same customers even claim to have lost love for their tablets. Some had no idea that typing for extended periods on a touch screen was that difficult (and they find Bluetooth keyboards a pesky hassle). A few have mentioned that their tablets serve no real purpose outside of playing with cool apps, a novelty that eventually wore off. And so the tablets now sit unused in drawers while they go back to their laptop or desktop routine.
Some may call this a perception dilemma that formed before the purchase was ever made. I think some of it has to do with a media "hype bubble" around the niche these mobile devices serve and what their real purpose in your bag or pocket is.
Modern Computer Hardware and OS Software lasts Longer
Do I have formal numbers to back this up? No, I'll leave that to the so-called tech analysts. But if the trends I see in my day-to-day work with computer repair customers are right, modern computers (Windows Vista-era and newer) stick around longer. I'm only talking a few years longer, but for a PC industry built around the 2-3 year upgrade cycle, this is a shocking development for all sides involved.
Customers who dragged out Windows XP machines past five years were avoiding upgrades due to financial or other concerns. A Windows Vista or newer PC, by contrast, can likely run on for five, six, seven or possibly more years without much issue. And this is the untold trend I see in my customer base: solid, secure operating systems installed onto well-engineered hardware equals a darn long system life.
Another factor playing into this reality is that computer hardware manufacturers, from chipmakers like AMD and Intel to hard drive powerhouses Seagate and WD, have been building in exponential amounts of unused overhead for the past half decade. Let's be real here: the difference between a PIII and a P4 processor for everyday computing was vast. The difference between a Core 2 Duo and a Core i5? Not so much (note: I'm speaking about everyday computing, not gaming/CAD/etc.)
And the unused overhead extends to hard drives (how many average people will fill a 1TB drive in three years?) all the way down to memory (8GB for browsing the news and typing letters? Quite the overkill, if you ask me). Whereas ten years ago we had to spend extra on every corner of a new PC to get the performance we needed, these days it's the exact opposite. In most cases the average shelf PC from Dell or Lenovo will be enough to get your work done for the next five years, and then some as a hand-me-down for the kids.
The evolution of Windows is having just as much effect on this longevity question, in my opinion. Windows XP, while good for its era, quickly started to show its age in terms of security and performance limitations. Excluding the mess that was Windows Vista, Windows 7 and now 8 are quite exemplary at getting more bang per byte than was ever possible. Not only are they dozens of times more secure than a standard Windows XP machine, but the fact that both of these modern OSes can be easily installed onto Windows XP-era hardware is proof of concept in itself.
Everywhere I look, from my own computing needs to the patterns among my customer base, I see a fleet of modern computers outliving their brethren's lifespans without much effort. And that goes a long way toward explaining the sales figures portraying such a twisted landscape in the PC vs tablet/smartphone discussion.
PC sales don't tell the Whole Story
In a recent article on this topic, Joe Wilcox displayed some numbers provided by Gartner and IDC that explain the abysmal sales atmosphere of the PC landscape.
Worldwide PC shipments were down by the larger margin, coming in at -4.9 percent for 4Q 2012 compared to 4Q 2011, but the US numbers tell a similar tale. If you believe the two main points I discussed above, a lot of this makes more sense: individuals who already own newer computers aren't purchasing again this time around because their machines last longer, but also because they're now dumping money into companion devices to offload some of their consumption.
Ah yes, the PC Plus era starts to make a little more sense. Just as we complemented our kitchens with microwave ovens, we're now adorning our daily lives with mobile, handheld devices for quick, need-based consumption. And this is exactly why sales numbers alone won't tell the whole story: if computers are lasting longer on average, and we need fewer of them per household, then raw sales figures will of course skew towards a "post-PC", hello-mobile belief.
Yet the numbers aren't as grim as we make them out to be. While total US shipments were down a mere 2.1 percent year over year, there were more than a few bright spots on the map. HP came out with a 12.6 percent increase, and Lenovo posted a similar 9.7 percent bump year over year. Even Apple enjoyed a small but decent 5.4 percent bump. Dell and Acer have been on the decline for a few years now, so their drops aren't as shocking as they may seem.
Browser Usage Numbers show little Sway towards Mobile
I think a better determining factor of how devices are being used in daily life is a usage area near and dear to most people's hearts: web browsing. The claim is that mobile is overtaking the PC so heavily that we soon might not need computers for our daily lives. But judging from what the browser usage statistics show, the PC is far from dead, merely cohabiting with consumers' new mobile devices in this landscape.
Here's one graph, thanks to Wikipedia, that was put together by StatCounter, a web usage research outlet:
In the purple color of that chart, we can see the slow rise of mobile device browsing through the end of 2012, but the rise is far from dramatic. In fact, it shows mobile browsing creeping just over 10 percent, with deep sags from Internet Explorer and Firefox alike. Then look at the runaway pony in this race -- Google Chrome, of all things! Chrome trounced the gains made in the mobile arena by nearly four times over.
You may say, well, that's because replacing a software program is easier than switching your browsing habits to a mobile device. But isn't that the very argument at stake here -- that mobile is going to crush desktop usage soon? If these numbers indicate anything, it's that the desktop is still alive and strong. At this rate, mobile needs to sustain its roughly 4-point annual increase in usage year after year just to hit 50/50 parity with desktop usage within the next decade (from just over 10 percent now, a 4-point yearly gain takes about ten years to reach 50 percent).
Looking at the broader picture, mobile vs desktop usage on a worldwide scale shows very similar results. Here is another StatCounter graphic pitting the two head to head for a timespan of 2009 to 2012:
Again, I ask where we see this post-PC landscape residing, because it surely doesn't show itself on these graphs. Sure, a nearly 8-percent jump in worldwide web browsing by mobile devices over two years is sizable, but I am fairly confident these numbers will plateau as we reach a point where the market is saturated with "complementary" devices, not "replacement" devices. The mobile landscape seems to be growing at a steady 3-4 percent per year, which is good, but nothing that will usher in the demise of the PC anytime soon.
What does the Future hold? I can't say, but the PC will still be a Part of It
Some people may get away with using a tablet exclusively, as our own Joe Wilcox is going to attempt to prove. And some, likewise, may find that their smartphone can handle a majority of their consumption browsing. But the average user, at least the one I serve in my own day-to-day line of technology consulting, has yet to come to any rash idealistic conclusion of a post-PC era. At the end of the day, no matter how flashy or light their mobile device is, it's their desktop/laptop they are turning to in order to comfortably get their computing tasks done.
I think some new devices blur the line between the traditional PC and new-age mobile devices quite well. Some of my favorites, which I've opined positively about in the recent past, are the Chromebook and the Surface, from Google and Microsoft respectively. I personally believe people still have an attachment to the ease a traditional keyboard provides, while wanting to be part of the touch era, too.
For the foreseeable future, I think the computer as we know it will still be around and remain the platform of choice for most average consumers. Not until the day I can whip out documents as quickly and easily as on my laptop, with the comfort of a tactile keyboard and a big screen, will I even consider converting to the post-PC era. Because if the iPad and Nexus are the definition of post-PC, I'll gladly sit this evolution out.
Photo Credit: Peter Bernik/Shutterstock
While the cloud generally provides for better reliability than on-premise systems, having a solid backup plan is still a universal necessity. Cloud solutions like Google Apps and Office 365 have nearly eliminated the notion of data loss due to technological failure. The systems and processes in place that govern the storage of your important data with players like Google and Microsoft are rock solid. We can fault providers for service downtime any day of the week; but you'll be hard pressed to read about cases where they actually lost your data.
The biggest issue with data loss on cloud platforms lies within the acute problem of human error. We aren't perfect and will likely always be dealing with data loss stemming from incorrect clicks, mistaken deletion, and other similar circumstances. For this very reason, even with its inherent safety nets, the cloud needs a fallback of its own.
Customers of my technology company FireLogic are switching to Google Apps at increasing rates, and the need for a reliable go-to solution for backup is becoming a front-and-center concern. There are more than a few contenders seeking business from cloud newcomers, such as Spanning, Google Vault, and SysCloudSoft.
While all of the competing vendors have good offerings, Backupify is the one that has caught my attention most. A large number of fellow technicians in Google Apps community circles have recommended Backupify for some time now. After reading about its positives for so long, I decided to jump in and give the backup solution a run for its money. After loading it onto my company's own Google Apps production domain and my personal domain, I've got some definite thoughts about the product.
Signup and Configuration are Fool-Proof and Simple
Google Apps is a relatively easy product to navigate, but I have to say that Backupify takes the crown in point-and-click simplicity. The signup process took all of five minutes for either of my domains. Like most other add-ons for Google Apps, the service is loaded in through Google's official Marketplace. Newcomers need not worry, since installing Marketplace apps is as easy as working in the Google Play store.
After providing some basic contact info and a desired plan level, you are whisked through a straightforward wizard that merely asks you to select which users on your domain will be backed up. By default, the service assumes you wish to take advantage of a seemingly compulsory 15-day trial, which for most administrators shouldn't be a hard sell. Knowing that the company doesn't ask for a credit card number up front is a welcome departure from the common "bait and card" path many other vendors force upon you.
The elegance of the Backupify administration interface becomes evident as soon as you look at the dashboard overview of your domain. The "at a glance" information you can glean from the dashboard includes each user's storage quota usage, the last successful backup for each Apps service, and whether or not there are any outstanding errors with any service-level backups.
There are no cryptic commands to execute or interfaces bearing semblance to what Windows Server administrators are more used to. Everything that the product offers is controllable in a user-friendly fashion which even novice Google Apps admins can appreciate.
One of the best features of Backupify is the one-click export functionality available for any account backed up by the service. For example, if I wanted to pull down a complete copy of my own user account for archival purposes, I could do so. It's up to me whether I want just the contents of my Google Drive or my Google Contacts data, among other services. On the other hand, if I prefer to grab an all-in-one zip file of my entire user account (known as "All Services" in Backupify), that is also possible.
Google is definitely a clear leader in the up-and-coming "data independence" movement with its own Takeout offering, so it's nice to see that Backupify complements this with its own openness to data exporting. I prefer to do business with entities that don't take your company data hostage when you sign up, and Backupify definitely allays my fears in this respect.
Other simple administrative tasks are just as intuitive in the dashboard. The search bar filters in real time, finding the users you are looking for from just partial names. Adding accounts for backup is a simple two-column select-and-move process, and removing users from Backupify is just as easy: expand a name and hit the "Remove this Account" button to clear it away.
Working with Backups can best be described as Uniquely "Intuitive"
Not many things just make sense in life, especially in technology. But Backupify nailed how an administrator sifts, views, and restores files or information for an account. For example, a basic search on a user's account (as I did with the search term "recipe") presents the pertinent results across all backed-up files. Backupify shows a summary of the related file/data information for each item, and clicking into a result provides a deep rundown, from the last edit date to the item's exact path on Google's service.
Not only can nearly any individual item be restored, but it can also be downloaded directly to your computer (for legal e-discovery purposes, for example), or rolled back to any of the previous versions that Backupify retains. This fine-grained control gives administrators transparent oversight and helps restore files that users would otherwise consider lost forever.
If worse comes to worst and a user accidentally deletes, say, their entire contacts list in Gmail, rest assured that Backupify has one-button full-service restoration for all of the core apps it covers. This should let an admin breathe easy knowing they won't have to recreate a user's contact list themselves from manual backups. The time needed to push a full restore depends entirely on how much data a user has and what kind of bandwidth quotas exist for each respective service's Google API.
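Those per-service quotas are worth understanding: Google's APIs rate-limit heavy traffic, so any bulk restore has to pace itself. As a generic illustration (a minimal sketch, not Backupify's actual code), a well-behaved client typically wraps each API call in exponential backoff along these lines:

import random
import time

class ApiRateLimitError(Exception):
    """Stand-in for the quota-exhausted responses Google's APIs return."""

def call_with_backoff(request, max_retries=5):
    # 'request' is any zero-argument callable that performs one API call.
    for attempt in range(max_retries):
        try:
            return request()
        except ApiRateLimitError:
            # Exponential backoff with jitter, per Google's general API guidance
            time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("quota still exhausted after retries")

Multiply that enforced pacing across gigabytes of mail and Drive data, and the variable restore times described above make perfect sense.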
Backupify takes its Security and Privacy Very Seriously
The issues of user privacy and data security are usually high on the list of questions a customer has about any product I recommend, especially cloud-related ones. I did some extensive research on the kind of protocols Backupify employs to protect data, and I was pretty impressed with what I found.
For starters, all data transmission to and from your Google Apps account is done over a 256-bit encrypted SSL channel. Likewise, all of your interaction with the service over its web-based interface is encrypted with 256-bit SSL. Seeing that Backupify leverages Amazon's S3 cloud storage system for the data it saves on your behalf, all of your content is subject to Amazon's rigorous data center security standards as well. To say the least, Backupify takes cloud backup security quite seriously in every respect.
To verify the processes that Backupify uses, it paid security auditing firm Rapid7 to come in and handle penetration testing and other checks on its infrastructure. Bundle this with the SAS 70 Type II audits that Amazon regularly undergoes for its AWS data centers, and Backupify has a proven baseline for being considered a top-notch backup solution when it comes to user security.
And while it may be completely unrelated to its security protocols, it's definitely reassuring to know that Symantec has invested millions in Backupify very recently. This is the same Symantec that has a full suite of corporate security solutions including its Endpoint Protection antivirus offering and renowned VeriSign SSL certificate products. They say we vote with our dollars, and Symantec happened to place more than a few bets on Backupify with its recent investment.
The Bottom Line? Backupify is Very Affordable and Worth Every Penny
Even though I really like what Backupify has to offer, there are a few areas that could be improved. For example, the lowest-tier plan only performs one backup per account per day. In comparison to other providers like Spanning, it would be nice to get an additional one or two daily backups. Likewise, the 35GB storage cap on the lowest price tier may be nit-picking on my end, but it could be a concern for heavily used Apps accounts.
Similarly, Backupify doesn't publicly publish any discounts for educational or nonprofit customers. In contrast, Spanning happens to offer a nice 25 percent discount for these clients, and Google's own Vault service is completely free for students at any K-12 institution that provides Vault for staff (at a very reasonable $10/user per year). It would be nice if Backupify matched or beat its competitors' discounts to sweeten the deal for its own service.
Gripes aside, when you lay the chips on the table, Backupify has a unique product that serves its purpose well. Not only does it give administrators an easy way to restore accidentally deleted files for users, but it allows for complete oversight of the data being held by an organization, all in a concise, web-centric package. No command prompt, no previous training necessary -- if you can use a web GUI, you'll be right at home. And even as a techie at heart, I can appreciate this ease of use.
Backupify just last week introduced some timely updates to coincide with Google's own "winter cleaning". The Enterprise price tier of Backupify now allows for multiple administrators to have access to backup data and controls; audit logs can now give insight into all actions performed within a Backupify account; and subdomains can now be added in and managed without the need to create multiple Backupify accounts.
The Backupify service has three price tiers that Google Apps admins can choose from. The cheapest level, aimed at working professionals and small businesses, costs only $3/user/month and allows for 35GB of storage space and a single daily backup per Google Apps user account. The second tier is the Enterprise edition, which costs a buck more at $4/user/month, removes the storage cap, adds audit logs and multiple-admin capability, and provides three daily backups per user.
The uppermost tier is Enterprise+, which is shy on details, but it caps shared storage at 1TB across all users and includes all the benefits of the Enterprise tier. I'm not sure what situation would warrant an Enterprise+ account unless it comes down to sheer user volume and per-user discounts: starting at 248 users, the service runs $3.99 per user, so I envision big companies taking this to heart.
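If you want to sanity-check the math on those tiers before committing, a quick back-of-envelope calculation using only the prices quoted above might look like this (the 250-user headcount is just a hypothetical):

# Prices as quoted above, in $/user/month
TIERS = {
    "Professional": 3.00,   # 35GB cap, one backup per day
    "Enterprise": 4.00,     # no cap, audit logs, three backups per day
    "Enterprise+": 3.99,    # volume pricing, kicks in around 248 users
}

def annual_cost(tier, users):
    return TIERS[tier] * users * 12

for tier in TIERS:
    print(f"{tier}: ${annual_cost(tier, 250):,.2f} per year for 250 users")

At that scale, Enterprise+ undercuts Enterprise by only about $30 a year, which suggests the real draw is the 1TB shared pool and volume-deal flexibility rather than the sticker price.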
I hope to give Vault and Spanning a try one day, but until that chance arises, I'm wholeheartedly recommending Backupify to others, including my own Google Apps customer base. Aside from my minor qualms mentioned above, the service deserves a look from any organization looking to keep a second copy of critical Google Apps data for retention purposes.
Photo Credit: T. L. Furrer/Shutterstock
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over seven years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
The original story was not very newsworthy at face value. An obscure, hard-to-pronounce city in Germany announced that its experiments with one-time open source wonder OpenOffice had gone sour and it wanted its Microsoft Office back. Freiburg's city council recently released a draft resolution that covered numerous IT problems, but the item that raised more than a few eyebrows happened to be its frank disappointment with OpenOffice.
Among other things, the resolution had some pointed words about their OpenOffice experiences since 2007:
"In the specific case of the use of OpenOffice, the hopes and expectations of the year 2007 are not fulfilled... Therefore, a new Microsoft Office license is essential for effective operations."
In an attempt to save costs and try an alternative to Office, the city council voted in 2007 to allow a side-by-side installation of Office 2000 (for backwards compatibility's sake) and OpenOffice 3.2.1. Among the council's hopes was the assumption that development of the suite would continue apace and deliver new features and fixes. Those hopes fell flat after numerous years of trying to "wait it out" and see what happened. As the resolution accurately described:
"The divergence of the development community (LibreOffice on one hand Apache Office on the other) is crippling for the development for OpenOffice."
Of course, the open source community (especially Germany's own) has been up in arms ever since, calling this a smear of its efforts at large and nitpicking various aspects of the story. Yes, it is true that OpenOffice is currently at version 3.4.1, so the council was working on an out-of-date build. And yes, LibreOffice happens to be a (slightly) better alternative to the all-but-doomed core OpenOffice suite.
But I don't think either point really changes the fact that OpenOffice and LibreOffice are still slow; have too many document conversion shortcomings; and are honestly too bloated to be considered speed demons next to Office 2010 or 2013.
I'm not here to make a big stink about OpenOffice alone. It's quite well known that the suite doesn't win any speed or conversion crowns. And I know very well that there are numerous examples of great open source software, many of which I use all the time: VLC Media Player, Chromium, and WordPress, just to name a few. But for all the good examples, there are still the longstanding mediocre ones plaguing the software landscape: Firefox, Thunderbird, ClamWin Antivirus, and the list goes on.
The shining stars of open source notwithstanding, what's the core issue with the majority of open source software and its community?
OpenOffice/LibreOffice are poster children for 'open source stagnancy'
I have to be fair here. Just so the fact-finders don't go berserk on me, let's make it fully clear that Microsoft's Office suite has been around in some form for about 22 years now. In contrast, OpenOffice has only been on the map for about 10 years (some say the suite's previous life as StarOffice tacks another 15 years onto its life, but let's give them the benefit of the doubt.) But does this relatively short life give the suite a continual excuse for working just as poorly overall as it did back in v3.2.1 (which Freiburg, Germany just dumped)?
Let's be honest with ourselves. Making broad comparisons between the core OpenOffice suite and its cousin LibreOffice is like having a deep debate about the qualities of two similarly sub-par vehicles. Sure, each one will get you from point A to B. And if all you're looking for is just to "get there" then there's not much disagreement we can have in that respect.
But the office suite debate has a lot more at stake. A platform as integral to the everyday life of so many people, like members of a city council, has to perform to a certain level of expectations. From poor file conversion fidelity to consistent sluggishness, the complaints of the Freiburg city council aren't that shocking if you've used the OpenOffice suite yourself. I admit that my company FireLogic happens to recommend LibreOffice in cases where budget is a concern -- but I never make any promises of the suite as a full replacement for Microsoft Office.
Some readers of the technology news website NeoWin used this story as an opportunity to sound off on their own thoughts surrounding OpenOffice (and its shortcomings) at large. A user by the name of javagreen says:
"I'm a Windows *and* Linux user, and OpenOffice/LibreOffice don't come anywhere within a 1,000 foot radius of Microsoft Office. Different leagues altogether."
Fellow user Yogurtmaster didn't have a much better opinion of the alternative suite LibreOffice:
"LibreOffice is just as bad. They need to throw the entire thing away and start with code from today and build it into the future that makes sense, their UI is horrible and the compatibility that they have with Microsoft documents is horrible. The worst project in my opinion in open source."
And one user, norseman, provides some common sense advice for when Office alternatives make sense:
"OpenOffice and LibreOffice are fine for people that don't work with many other people or collaborate with the real world. If you are a student or a professional that wants to get published or do any real work, you're using Microsoft Office, pure and simple."
Not to join the chorus here, but my own experiences have left me with similar misgivings about both of the main Office alternatives. After 10 years of primetime development, and a 15-year backstory before that as StarOffice, how can OpenOffice/LibreOffice have made so little progress in such a fair amount of time?
I'm not just blowing hot air here -- I gave both alternative platforms a try at various points in time but could never make the switch. I've honestly sunk three or four times as many hours into using Google Docs instead of Office as I have into OpenOffice, mostly due to the same issues others gripe about.
I'm not tied to Office or Google Docs by any means, but they just work. They are fast, efficient, and compatible with most things I need to share with those I communicate with. The combination of the two got me through college and my professional life thus far. Their interfaces are relatively elegant and pleasing to the eye. I don't have to hunt through long 1990s-style dropdown menus to find what I need (usually through guessing, as OpenOffice has its own intricacies in item placement.) So why can't OpenOffice and LibreOffice get their acts together?
Are hardcore beliefs in 'technical purity' really more important than coming together to present a formidable challenge to Microsoft? For all the bad rap that Microsoft gets for its market position, it does have a quality product to offer with Office. A few users on the same NeoWin article happened to speak up in support of their thoughts on Office as a quality suite.
Primexx had this to say about the Ribbon UI:
"What MS did with the ribbon - surfacing some great but hard to find features - makes MSO a h*ll of a lot easier to use. That means less time wrestling with the program and more time working on the actual thing you're doing."
And another user, 68k, happens to agree:
"The ribbon makes a HUGE difference. It's pure genius in UI design."
While Microsoft and Google are introducing new features into Office and Google Docs at a record pace, the open source community is struggling merely to keep up with the times. Even with its newfound forefront status, LibreOffice has not made any huge leaps in the two years since its inception. Office document support is still pitiful at best; support for long, complex formulas is buggy; and the developers think that having a "quickstarter" load on Windows startup is still more important than cleaning up an aging, clogged code base.
Freiburg's angst with OpenOffice file conversion is well founded
The city council that prompted this uproar surrounding OpenOffice claimed that file conversion fidelity from Office was one of the biggest problems it had with the suite. I've personally experienced similar issues with the suite, so I decided to give this argument a valid, up-to-date test based on what the open source community recommended to Freiburg: just install the latest LibreOffice.
I did exactly that (with LibreOffice 3.6.4), and tested multiple documents both from my own collection and from online repositories to see if my results would be any different than in past usage. On some documents, LibreOffice did better than on others. In most cases, it had a tough time giving me 1:1 copies of what I had in Office. Google Docs generally did as well as or better than LibreOffice, and in some instances it beat LibreOffice in conversion quality by a long shot.
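For anyone who wants to repeat this kind of fidelity test at scale, LibreOffice's headless mode lets you script the conversions rather than opening each file by hand. A minimal sketch, assuming the soffice binary is on your PATH and a hypothetical test-docs folder of .docx originals:

import pathlib
import subprocess

SRC = pathlib.Path("test-docs")       # folder of original .docx files
OUT = pathlib.Path("converted")
OUT.mkdir(exist_ok=True)

for doc in SRC.glob("*.docx"):
    # Convert each file to ODF, then eyeball the result against the
    # original in Word to spot layout and formula fidelity losses.
    subprocess.run(
        ["soffice", "--headless", "--convert-to", "odt",
         "--outdir", str(OUT), str(doc)],
        check=True,
    )
    print(f"Converted {doc.name}")

The same loop should work in reverse (odt back to docx), which is handy for checking round-trip fidelity rather than one-way conversion alone.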
So, is the community just blowing hot air when it comes to qualms about file fidelity? As far as I can tell, serious problems still exist, based on my own testing on a Windows 8 laptop. Google Docs clearly had little problem opening the document, and in all but one or two spots in this complex, equation-filled file, Google Docs gave me a near replica of what Word 2013 showed.
I personally think the open source community behind LibreOffice and OpenOffice should take the experiences of Freiburg to heart and use this as a lesson that average people merely want functionality, not infighting and bickering about standards and code purity. There are relatively few governments giving open source the kind of love that Freiburg did, and if this bad reputation streak continues, fewer local and regional entities will be considering these suites anytime soon.
Open Source as a whole has an ongoing problem with focus and results
Say what you wish about Office, Windows, and even rival Apple's OS X for that matter -- but they have a few things in common which open source alternatives simply cannot claim. These products are all backed by solid development communities from their respective 'capitalist' backers, and they continue to evolve at a rapid pace.
Microsoft introduced the Ribbon UI in Office 2007 and has fine-tuned it to a large extent for the newest Office 2013 release. Similarly, Windows Vista reinvented the desktop experience back in 2006, and by 2012 we already had the monumental shift of Windows 8. In the same span of time, what can the community behind OpenOffice or LibreOffice claim? From everything I can gather, their biggest achievements surround the importing of Office OpenXML files and numerous pages of bug fixes. In reality, all of this effort has produced suites that are only marginally better than they were years ago.
Isn't open source a community built on developers that believe in a higher purpose than their counterparts at paid software vendors? Isn't the "love of the product" much more altruistic and deep-rooted in a commitment to quality, openness, and transparency in secure code writing? Why do these very aspects of open source seem to instead foster a continual feeling of stagnancy and petty arguing?
John Dvorak penned a pretty spot-on article back in 2007 titled "What's Wrong with Open Source Software?" that shed some light on why open source is anything but a guarantee of quality, speedy development. He correctly pointed out:
"How many open-source projects have you seen in which the code gets leaner and meaner rather than fatter and fatter? With all the great coders out there, how many projects include coding features and how many include coding optimization?"
John goes on to compare the Linux coding community to those he calls the "Mac aficionados" in how they both collectively despise the greatest of evils, aka Microsoft. But more importantly, he accurately pinpoints that most open source projects tend to drag on for an eternity unless they are guided by strong one-man leaders.
However, even those strong-armed leaders aren't always the best at keeping the ship on a solid course. It is quite ironic that Linux creator Linus Torvalds verbally assaulted graphics giant nVidia this past summer for its supposed inability to properly support the Linux community. I wonder how many in competing development circles have similar thoughts about the relative lack of consistent progress from the Linux community.
As John Dvorak sarcastically put it towards the end of his op-ed:
"So because nobody [in open source] is making any money, and because it's done for the Utopian oneness, there will be no complaining. If you complain, then you suck!"
I guess in a nutshell, he sums it up fairly well. Here's hoping that the open source community can learn from their money-making corporate counterparts and harness the power that open development theoretically should provide. Until then, I'm sticking with Office 2013 and Google Docs on my Windows 8 laptop, thank you.
Photo: chasmer/Shutterstock
The Pope may be making headlines in the tech world by opening his own Twitter account, but there's a much more worrying story flying under the radar this week. The Internet as we know it could be in trouble if representatives from free-speech oppressors such as China and Russia have their way at a UN telecom regulations conference starting this week in Dubai.
The 11-day conference is billed as a gathering of the world's top nations to discuss ways to update rules last touched in 1988 on oversight related to telephone networks, satellite networks, and the Internet at large. Proponents of the conference say that the Internet has changed so radically since the 1980s that it is now time for others to have greater say in how it's regulated and controlled.
But if you peek under the covers of this debate, the majority of these voices tend to be concentrated in countries that already practice some of the nastiest forms of Internet censorship to date. They have a fully vested interest in using the veil of a UN conference to push their own agenda of further restricting Internet freedoms for their own citizens, and giving other like-minded nations a better standing to unleash the same kind of oppression on their own netizens.
While the United States has sent a high-profile 123-person delegation with representatives from major tech giants like Google and Microsoft to help oppose any massive changes, it's hard to say how loud the voices of the Internet's repressive poster boys will be. Google has already posted a full landing page marking its opposition to any proposed changes, which you can even sign as an online petition.
I'm with Google, Microsoft, and most of the modernized Internet world on this one. Here are my top three reasons why the UN should keep its hands off any changes to the governance of the Internet as we know it.
1. How can state sponsors of Internet censorship lead positive changes to Internet rules? The question itself boggles my mind. To put this conference in perspective, you have to take a look at the parties who have major backing within the body in charge of the event itself. The organization, the International Telecommunication Union (ITU), is made up of numerous countries that have vastly different opinions of Internet freedom in contrast to the United States and the rest of the free world.
Leaked documents from the ITU's closed-door meetings to draft these newfound regulations contain language such as this excerpt:
"Each Member State reserves the right to suspend the international telecommunication service, either generally or only for certain relations and/or for certain kinds of correspondence, outgoing, incoming or in transit, provided that it immediately notifies such action to each of the other Member States through the Secretary-General."
The ITU Secretary-General himself, Dr. Hamadoun Toure, happens, conveniently, to work for Rostelecom, the largest telco operator within Russia today. And how does Russia fare on Internet freedom today? Fairly poorly, I'd say, seeing as just this past summer its legislature approved a law that is, for all intents and purposes, a "blacklist as you wish" power that is already being used. Sponsors argue that it is meant to keep the Russian Internet clear of items such as child pornography and suicide-related material. The Russian media watchdog Roskomnadzor, mind you, can blacklist any website it deems illegitimate without any court order.
Other big players in the ITU are states like China, which operates one of the largest and most complex firewall operations in the world, aimed at strictly limiting any opposition to its one-party political system. Eric Schmidt, of Google fame, has already claimed that the "Great Firewall of China" is bound to fail at some point; the question is when. But if the ITU and its ongoing telecom conference cede ground to these censorship monsters, what's in it for China to open up? Not much, after such a prospective stamp of approval.
2. The UN as a whole has a terrible record of enforcing peace and openness. Some of those supporting the cause of this 11-day UN conference point to the fact that the United States has unjustifiably had "too much control" over the Internet thus far. That's debatable, and I address it in my last point below, but why should one think that the UN is the proper party to promote the best aspects of Internet freedom with players such as Russia and China involved in these talks?
Keep in mind that this is the same United Nations that allowed for Sudan to grab a seat on the 47-member UN Human Rights Council this past summer. Not only does Sudan hold an atrocious record on human rights (not the least of it related to the Darfur genocide of the last decade, as briefed on the Human Rights Council's own website) but its own leader, Omar Al-Bashir, is still an internationally wanted war criminal who has a warrant out for his arrest by the International Criminal Court. Sensibility is something the chambers of the UN clearly lack.
But this example of moral failure is not the only one on the UN's pitted record. The organization spent all summer fruitlessly trying to persuade the Syrian government to end the civil war against its own people -- a conflict on which inaction has been continuously pushed by human-rights offenders Russia and China. And while it has been on the backburner in terms of world affairs lately, nuclear proliferation in North Korea can in some ways be seen as a failure of the UN's nuclear watchdog to contain the diabolical regime of Kim Jong-Un.
If the above doesn't scare you about any future regulation of the "open" Internet by the UN, I'm not sure what will. Unless, that is, you are already reading this article within the borders of a country that imposes Internet "cleanliness" on its own people, and you know what to expect by now from these players.
3. The Internet has flourished exponentially under the control of ICANN and the United States. All those people who complain about US control of the Internet's highest authority clearly fail to appreciate the positive progress the entire system has undergone in roughly the last 20 years. Certainly, ICANN has kept the reins of top-level DNS and domain regulation within US borders thus far -- but I don't necessarily see that as a bad thing.
I challenge anyone who thinks there is a better entity to control all of this to explain who that may be. Is the United Nations really the body most capable and with the best intentions of the world's netizens in its sights? If any of the above is an indication, I truly beg to differ. Or maybe we should offer to share control directly with the likes of China, Russia, Syria and others, who say that the world would be better off without so much US influence over the system. I ask these critics to point out what kind of restrictions the United States (through ICANN, among other authorities) has placed on Internet transmissions being sent to such oppressive corners of the world.
If the US had the same kind of false-hearted intentions that these repressive voices bring to the upcoming telco regulations debate, the Great Firewall of China likely would have been more akin to a Great Firewall of the Internet as a whole. I'll believe the true intentions of the voices within the ITU when I see them in plain sight; but the closed-door drafts being leaked as of late don't inspire much hope that these foreign bodies truly wish to enhance the viability of the open Internet. Instead, it seems that they want to use their voices as leverage for the notion that a global Internet society should be censored in the same ways they censor at home.
Don't take the misguided communications coming from the nations of the ITU as anything but hot air aimed at derailing everything you know about a tax-free, open, and peaceful place to exchange ideas and commerce (aka the 'net.) Because as China and Russia have already proven, give these governments an inch and they will gladly take a mile -- in the direction that suits them best.
Photo Credit: Edel Puntonet/Shutterstock
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over seven years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
Office 365 happens to be a product I think has a lot of potential. To be fair, it's Microsoft's second try at dedicated cloud-based email. Redmond first went toe to toe with Google Apps back in the days of BPOS (Business Productivity Online Suite), but they're distant cousins at best. With a few years' separation, Office 365 is Microsoft's answer to the growing threat Google Apps poses to Exchange.
The way I see it, Microsoft's torn internally. They are clearly still developing a wide range of Server and Exchange revisions on the usual upgrade cycle, but then signal a clear concession to the cloud by killing off Windows Small Business Server. While mixed intentions obviously represent the reality that they are innately a traditional software company, they realize that business is moving to the cloud whether they hold the leash or not.
If the death of SBS is any indication, it's that small businesses are slowly being bumped into the cloud. And just like the Coke vs Pepsi debate, businesses choose between two well-qualified giants. Continue in Microsoft-land with Office 365, or jump ship to the maturing email newcomer that is Google Apps? It's a tough question to answer, and one that small businesses are approaching my tech company FireLogic with on a near weekly basis.
I've been down this editorial road before. A few months back I wrote about why I believe Google Apps is winning over users. But my previous opinion piece focused on the larger picture of suite vs. suite. Lots of businesses look to these cloud giants solely for email. So I decided to pit the two in a cage match on this very topic, leaving the complementary fluff aside so businesses can form their own reasonable opinions.
Both Office 365's Outlook web client and Google Apps' Gmail app have various things to offer. One size doesn't fit all, and I'm sure that businesses may find comfort in the good parts of each offering. I'm not here to say which suite a company should ultimately go with. But from a purely feature and performance perspective on email clients alone, a few judgments can definitely be made from my own first hand experience.
If Outlook had a Cloud Baby, Office 365 would be It
This fact may be welcome news or prove to be a burden, depending on how attached you are to Outlook. I've spoken with numerous business owners who say that if they can't have Outlook, they'd rather not use email. It's usually that they just haven't seen the capabilities of Google Apps' Gmail, but also, realistically, because it's the only thing they've used for years. Either way, Microsoft has crafted the closest clone to what desktop Outlook looks like -- without the need for desktop software, of course.
While some aspects of Office 365's online email client are nuanced and clever, my biggest gripe sits with just how much of traditional Outlook got baked into Office 365. One of the worst aspects of Office 365 email is that, in comparison to how zippy I find Gmail, O365 is just awkwardly slow. In preparation for this article I compared two (for lack of a better term) obese email inboxes, one on each provider.
The Google Apps Gmail account showed consistent performance whether changing folders (labels, in native Gmail speak), responding to messages, or working with different aspects of the account in general. I tried to give Office 365 every benefit of the doubt I could and used multiple browsers, including IE 9, Chrome (stable) and Firefox 16. It was smoothest on Chrome, but still slow in many respects, especially when sifting through the primary inbox area. Even after being fully loaded, it had a tough time just scrolling through the 2,400-plus emails I gathered in the inbox. Gmail was as fluid with a full inbox as with an empty one.
Performance aside, it seems Microsoft visually just can't bear to cut ties with Outlook (good or bad is your call.) While I think Outlook is a decent desktop client (not great), when it comes to email, the cloud begs for a re-imagination of the UI. That's where Gmail offers a breath of fresh air by forgoing what we think of as a classic email interface. For example, one irritation that sticks out is how Office 365 forces every new email message to open in a new popup window. Sure, I could revert to using the traditional preview pane, but I think I should have a choice. Gmail allows for native inline message viewing or a horizontal/vertical preview pane, for the record.
It's not all a mess in Office 365 Outlook, however. I give it some high marks in the way menus and settings areas are organized. Compared to Gmail, which tends to feel crammed like a sandwich in some option screens, Office 365 divides settings logically by tasks and affords some cleanliness in overall organization and layout. And for those who are looking for a brisk cloud replacement to desktop Outlook with a small learning curve, Office 365 delivers.
Outlook diehards will find themselves right at home. But seeing as Google Apps allows for seamless syncing with desktop Outlook if needed, it's hard to say that I prefer a half-baked reincarnation of Outlook in my browser.
Gmail's Strongest Points: Spam Filtering, Speed and Flexibility
If you're willing to learn the ins and outs of the Gmail interface, then you've got some really nice features under the hood. One of the benefits of Google's handling of Apps is the fast-track development path that allows Gmail to evolve at a much faster pace than Office 365 Outlook. There's no comparing the two when it comes to new features and filling gaps in sought-after needs. Google is definitely in tune with what its users are asking for, and just by skimming its public update feed, you can see that "stale" is far from how to describe Google's stance on Google Apps.
I tend to prefer the fast development path. It affords quicker implementation of features that my FireLogic customers yearn for, and represents what a cloud service should be: agile & responsive. Lots of businesses I consult for are initially wary about such a quick development path but in the end it's all personal preference. I expect Office 365 to move faster in changes/fixes than what traditional Exchange/Office offer, but not as swiftly as Google. Microsoft's still fairly new to the cloud game, let's remember.
When it comes to spam filtering, arguably one of the hottest topics in email today, I see Google Apps' Gmail doing an overall better job than Office 365. I think a lot of this advantage has to do with Google buying the spam filtering experts at Postini a few years back. While the standalone Postini product is going away, the inner workings of this quality filtering system are essentially fully rolled into Gmail, and the results are evident. In my non-scientific testing and experience, Office 365 missed more than a dozen blatant items of spam while wrongfully flagging legit emails. Gmail handles spam as impeccably as anything I've seen on the market, with minimal false positives.
That's not to say Office 365 is bad. It's leagues better at spam filtering than traditional on-premise Exchange. Traditional Outlook users know that without some third-party app involved, spam becomes nearly uncontrollable. While I clearly chalk up spam control in favor of Google Apps, Office 365 is a comfortable second-place candidate.
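For readers curious what content-based filtering actually involves under the hood, here's a toy word-frequency scorer in the spirit of Bayesian spam filters -- purely a generic illustration, not how Postini's or Microsoft's filters actually work:

from collections import Counter

spam_words = Counter()
ham_words = Counter()

def train(words, is_spam):
    # Tally word counts separately for spam and legitimate mail
    (spam_words if is_spam else ham_words).update(words)

def spam_score(words):
    # Add-one smoothing keeps unseen words from zeroing out the score
    s = sum(spam_words[w] + 1 for w in words)
    h = sum(ham_words[w] + 1 for w in words)
    return s / (s + h)   # above 0.5 leans spam

train(["cheap", "pills", "act", "now"], is_spam=True)
train(["meeting", "tomorrow", "agenda"], is_spam=False)
print(spam_score(["cheap", "pills"]))   # ~0.67, leans spam

Production filters layer reputation data, sender authentication and user feedback on top of this kind of statistical core, which is exactly the sort of accumulated signal a Postini acquisition brings to the table.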
One of the other aspects of Gmail that I just adore is Labs. While these are turned off by default, they can be adjusted on a per-user basis or controlled at the domain level. These are features, adjustments, and other small fixes that Google tests on Gmail users on an opt-in basis. Many of the available Labs are very useful and handy in improving your inbox workspace.
For example, there is a Lab that allows you to move the chat box to the right side of your screen; another Lab allows for quick Google Maps previews of addresses inside an email. One of my favorites (which I turn on for customers by default) is called Undo Send which "takes back" emails a short time after they are sent. We all know of times when we really didn't mean what we said, or forgot an attachment. These are just some of the ways which Google lets you experiment with functionality to your liking.
By contrast, Office 365 Outlook is pretty much "what you see is what you get" when it comes to interface and layout. A few things, like email preview, can be adjusted but on the whole the product is fairly static in design and UI. Of course, tweak-aholics may just opt to connect to desktop Outlook, but this defeats many of the positives of what cloud-centric email provides (if you've ever encountered a busted PST file, you know my pain.)
Each Platform takes a Different Approach to Unified Communications
An important aspect of each platform is how it approaches unified communications, and each system is starkly different in this area. Microsoft clearly appeals to those who may have on-premise phone systems capable of tying into its backend, while Google Apps takes an a-la-carte approach in which users can take advantage of as little or as much as they please.
If you're looking for a seamless connection to your IP phone system via Microsoft's Lync, then Office 365 will suit your fancy. The platform allows you to choose between a variety of price points that allow you to take advantage of merely online-based live chat functionality or to tie in the entire phone system natively. I've seen offices in action that have this working, but keep in mind that Office 365 setup with Lync can get complicated quickly. The important take-away here is that you can go this route if you need to.
Google Apps' Gmail is an entirely different beast. It offers a very nice integrated chat system via Google Talk that essentially taps into the entire Google Accounts network (meaning Google Apps and Gmail accounts.) If you have a Google+ account tied to your email address, you can chat directly with connected friends in the same interface as you do with coworkers and business colleagues. Chat history (text form only) is automatically saved in your account for future reference, and can be searched/printed just like any regular email.
Where Gmail's "extended" communication functionality shines is in voice and video chat. Right from your browser, you can initiate a Skype-esque voice call to a fellow coworker, or opt for the bells and whistles of video chat through Google Hangouts. I'll be completely honest and say that this feature alone is worth the $50/person per year price tag on the service, since it provides a bevy of features which even Skype has a tough time competing with. You can do up to a 10 person live video meeting with friends or colleagues on the fly and even share desktops. I don't admit this in public, but my company has been doing virtual meetings using Hangouts for about a year now (we love it.)
Office 365 Outlook takes a very minimalistic approach to in-browser communications. There is a semblance of inter-company chat in Office 365, but it's very clunky and honestly not half as clean as what Gmail provides. It definitely seems like an afterthought by Microsoft and didn't get the attention it deserves. To be fair, you can install Microsoft's Lync desktop client to get most of what Gmail offers by means of live chat, but again, this does nothing to cut the desktop software umbilical cord. If Microsoft is serious about being "all-in" with the cloud, it needs to get this functionality into the browser and off the desktop.
Gmail offers the Best All-Around package, but Office 365 has its Place
Most businesses looking to make the jump to the cloud will likely find themselves best suited by Google Apps' Gmail. It was built from the ground up for browser-based email usage, and truly dumps the need to retreat back to desktop Outlook (unless you truly need it.) Between the benefits it affords around "extended" communications features, spam filtering and overall speed, it's a clear winner in my book.
But I'm not counting Office 365 out by a long shot. Microsoft is truly dedicated to the cloud era as is evident from their killing of Small Business Server. There is no reason that Microsoft can't fix the current shortcomings in Office 365 Outlook and bring their cloud platform into the next generation. Besides, if the new Outlook.com service is any beacon of hope, it's that Microsoft still has a few tricks up its sleeve.
Photo Credit: 2jenn/Shutterstock
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over seven years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
If you think you read my title wrong, take a second look. You'd think, from all the overblown attention the Modern interface is garnering, that I was going to focus another drab op-ed on that sole feature. Yes, the Modern UI is a radical change and will turn a lot of people off. But let's not forget that with every new Windows release come features that don't get the time of day. I think a few of these deserve a sliver of attention.
We've been down this road before. Let's not forget that the introduction of the Office ribbon menu system was considered shocking back in 2006, and years later a majority of users have accepted and embraced the changes. Apple received similar pushback on its radical iPhone design back in 2007. I truly believe the heat on Windows 8's Modern UI will come and go like the rest of modern tech's evolutionary moments.
As the fog surrounding the UI debate starts to dissipate, it's interesting to see the preliminary support Windows 8 is getting for the enterprise and business in general. In the newest issue of Redmond Magazine, Don Jones pens a timely piece highlighting some of the better aspects of Windows 8. Correcting a misconception that most pundits pass along, Jones explains how the Modern UI is nothing more than a "dashboard", much like the Dashboard feature in OS X. There is still one true desktop in Windows 8, easily accessible in multiple ways at any given time.
I happen to like Windows 8 a lot -- and I don't adore the Modern UI. But then again, it didn't take a complete reawakening for me to understand how to best utilize the user interface. Positive feelings about Windows 8 and qualms about the Modern UI can coexist (despite what some other tech pundits would have you believe.) If you think Windows 7 is fast, 8 is that much faster. In many areas, it happens to be cleaner and offers a value proposition that end users may actually notice. Windows Refresh, anyone?
I was also intrigued to hear about Microsoft's own internal transition to Windows 8 at a small tech briefing in Chicago a few months back. If any company has something to prove about the enterprise embracing Windows 8, it has to be big M by far. Since mid-July 2012, Microsoft claims to have moved 30,000 computers and roughly 30,000 employees over to Windows 8 (along with IE10.)
Impressive to say the least. I asked a Microsoft rep present at the aforementioned Chicago tech event how employees were dealing with the transition, and the answer was pretty honest. Smooth for most; rocky for others. Enterprise transitions are never easy, so it's good to see Microsoft eating its own dogfood very early on.
While the early adopters are clearly making strides toward Windows 8, here are some of my arguments for passing on Windows 7 in favor of 8.
1. Windows rollouts take 12-18 months; why fall further behind on your upgrade cycle? Most estimates peg a business migration from one Windows version to another at a 12 to 18 month timespan. Think about this purely from an ROI perspective for a second: if you haven't even begun a migration to Windows 7, where will you be time-wise when you finally finish? Keep in mind that Windows 7 came out in mid-2009, which means if you are pondering whether to begin a move to Win 7 just now, you won't be finished until about late 2013 if not well into 2014. Windows 7 will already have been on the market for 4-5 years at that point, meaning you will be eyeing your next Windows move in the not-so-distant future.
It's also prudent to be mindful of Microsoft's support lifecycles for Windows versions when planning migrations. Windows 7 loses mainstream support in January of 2015, and extended support cuts it off completely in January of 2020. At best, you're talking about an OS that will see only about 5-6 years of usage before it's time to get back to the rollout drawing board.
In contrast, if your company made the move to Windows 8, you would enjoy a much healthier lifespan. Mainstream support for 8 ends in January 2018 and extended support runs all the way until January 2023. If I were planning a Windows migration for my FireLogic customers, I would be taking a second (or third) look at skipping 7 entirely.
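To make the runway math concrete, here's a quick sketch using the support dates cited above (the kickoff and finish dates are hypothetical, assuming the 18-month worst case):

from datetime import date

# Extended-support cutoffs as cited above
EXTENDED_END = {
    "Windows 7": date(2020, 1, 1),
    "Windows 8": date(2023, 1, 1),
}

kickoff = date(2012, 12, 1)          # hypothetical migration start
finish = date(2014, 6, 1)            # 18 months later, worst case

for os_name, cutoff in EXTENDED_END.items():
    years_left = (cutoff - finish).days / 365
    print(f"{os_name}: ~{years_left:.1f} years of support after rollout")

That works out to roughly five and a half years of post-rollout runway on Windows 7 versus eight and a half on Windows 8 -- three extra years before the next trip to the drawing board.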
2. Windows 8 is faster than 7 in every respect. Here's another thing you may have missed in all the FUD surrounding Windows 8: it's a helluva lot faster than Windows 7. And Windows 7 is what I consider the best OS yet from Microsoft! How can it perform better in light of what most people see as bloat from the Modern UI?
It's got a few things going for it. First off, Microsoft stripped out a lot of what bloated Windows editions in the past. Things like DVD playback and other extraneous features that can easily be added on via third-party apps were dropped in an effort to slim down the OS. It's not surprising, then, that Windows 8 uses less memory pound-for-pound compared to Win 7. Couple Windows 8 with a decent SSD and you can turn that Vista-era snail of a PC into something almost reborn.
The reduced memory consumption and overall smaller footprint of Windows 8 means that aging corporate desktops can still keep ticking. If you were initially planning on replacing more of your PC fleet than you hoped, Windows 8 may save your company some decent cash. Testers claim that Windows 8 can run with as little as 128MB of RAM. While it's more proof-of-concept than anything, the takeaway from this should be that you may not have to toss that old hardware just yet.
In my testing, Windows 8 starts and shuts down so fast that I don't even bother putting it to sleep anymore. Cold boots are nearly as quick as sleep mode. Don't blink, because you may not even notice the new boot screen!
3. Microsoft's focus on security in Win 8 is readily apparent. If the performance aspects of Windows 8 aren't enough to sway you from 7, then perhaps all of the investment in security features will tickle your fancy. While there are too many to name here, a few of the most important ones must be mentioned. Secure Boot is a core feature of Windows 8 security that in essence locks down the OS initialization process to the point where rootkits and other popular malware no longer have a place to hide. Microsoft couples the OS with validated, secure firmware to help authenticate the boot process and get rid of the "back door" that has existed for so long.
Windows To Go is a new feature that replicates what we have come to know in the Linux world as Live CDs. How does this fit into a business' usage of Windows 8? This enables an IT department to hand out Windows To Go powered flash drives (not all flash drives are compatible though) to contractors and other short-term workers who need access to a standardized instance of Windows 8. In the past, IT had to provide the hardware and software for end users. Not so much anymore.
AppLocker is a feature returning from Windows 7, but improved over its first iteration. This simple blacklist/whitelist technology allows admins to create strict application policies for end users, which extends to Modern UI apps as well in Win 8. For businesses looking to truly lock down a common desktop environment, AppLocker has to be one of the greatest gifts from Microsoft.
4. Sick of managing printer drivers? Windows 8 does away with the old mess. This aspect of Windows 8 hasn't gotten nearly the press it rightfully deserves. But anyone who has tried to manage a modern print server knows that making end-users' lives easier entails a lengthy process of finding proper drivers, testing them, and deploying them centrally -- hoping nothing screws up in the process.
Microsoft realized the mess that we know as printer driver hell and built an entirely new backend for getting Windows 8 and printers to talk. The technical details are explained in an excellent but lengthy blog post. Toss out everything you know about print drivers to date. Starting in Win 8, printer compatibility is primarily achieved through the use of truly "instant" connections made via a modern "printer class driver framework". This radical shift was sped up due to the arrival of Windows RT, but was properly extended to the entire Windows 8 range.
Printing in Windows, until now, has been a 1:1 process behind the scenes, where a specific print driver was matched to a given printer. Drivers had to be specific per edition of Windows, and even down to the variations between x86 and x64 flavors. The new framework allows a common printing driver to support a near-endless array of printers old and new. Microsoft knows that not all printers will work with this new model and has built in full compatibility with all previous Windows 7 printer drivers. But going forward, Microsoft aims for what the introduction of USB was supposed to herald: true plug and play.
5. Multi-monitor support is finally done right. Corporate workers tend to use multiple monitors now to get their work done. It's a simple fact of life. At my previous school district IT job, the mere mention of getting a spare monitor for dual-screen usage caused some very public jealousy. Windows (even Win 7) wasn't perfect with how it handled multiple screens. Initial detection was always spotty; the taskbar never quite figured out how to span across all screens; and moving applications between screens was sometimes a chore when perfect placement was necessary.
Luckily, Microsoft has done a great deal to address the issues with multiple screen usage in Windows 8. Using multiple monitors shouldn't be a chore, and has been simplified in many regards. For example, you can now easily tell Windows 8 to span the common taskbar across all your screens. Customization of the various desktops is vast, with the ability to span large wallpapers or have separate wallpapers for every monitor. You can even move Modern UI apps over to different screens to your liking.
Other nagging issues, like losing track of which taskbar items belong to which instance of an app, have been addressed. You can duplicate open items in the primary taskbar, and also have the screen hosting a unique window show its respective taskbar item on that same monitor. The Start menu can be brought up on any screen, as can the common Charms bar. Microsoft spared no expense to get multi-monitor support down to a T in Windows 8.
Final Thoughts
If you judge Windows 8 on the introduction of the Modern UI start screen alone, it may tank at first glance. But I challenge those involved with enterprise (or even small to midsize business, for that matter) IT to give Windows 8 a second look. It's polished, speedy and built with security in mind. I've got nothing against Windows 7, but after giving Windows 8 a spin myself, my apprehension with Microsoft's latest desktop release is dwindling quickly.
You can grab the 90-day Windows 8 Enterprise trial over on MSDN and see what you think. Here's hoping you actually find some usefulness in the new Windows -- as I surprisingly did.
Photo Credit: Pete
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over seven years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
The cat's out of the bag, and we can all stop guessing what the Surface RT will cost. Microsoft's announcement confirms many things, namely that Steve Ballmer was spot-on with his estimates on Surface pricing roughly a month ago. The Surface RT is going toe-to-toe with the iPad down to the very last penny. That's a good thing.
One thing I'm curious about is how Surface will change the way K-12 looks at computing devices for the next generation of students. I've already penned my thoughts on why I believe the Surface could very well outshine the iPad in education. A big part of this winning equation has to do with the ecosystem that surrounds a given technology.
And I personally think Microsoft, not Apple, has a definite leg up in this area. Where iPad lacks, Surface intends to pick up. 1:1 education efforts in K-12 focus on providing each student that enters a given education level (namely high school) a single device to centralize their education upon. Yes, bid your goodbye to paper textbooks. eTextbooks are becoming the new norm.
iPad is the Forerunner but Largely Due to Lack of Effective Competition
Over the four years I spent at a prominent high school district in Illinois, I saw first-hand the enthusiasm that iPad received. It almost became a running joke for us IT folks at educational technology conferences, seeing how many lectures on classroom usage focused on the iPad. So much so that those in the IT crowds yearned for something new, fresh, and technologically more sustainable.
I'll fully give iPad credit where it is rightfully due. That credit sits solely in third-party application support and battery life. These are two areas where similar 1:1 trials with competing devices (namely netbooks -- yes, even I cringe at that defunct term) flailed at best. But after hearing about the 1:1 trial efforts of various districts dabbling in iPads, what I took away from these conferences were the technical concerns.
Centralized management. Security. User-level permissions. Ease of updates for apps and core iOS software. The list went on. And while those tasked with taking care of the devices (commonly teachers, as IT folks tended to stay clear of adding onto their already burdened task lists) tried to smile as they showed off their best efforts in classroom device management, it was clearly a love-hate thing at best. While the devices gave students an outlet for modern learning, I could see teachers were quietly wishing for a beacon of hope.
While it has yet to be seen what this potent combination of Surface and its surrounding ecosystem looks like in the wild, it's hard not to realize that Microsoft may be looking to help K-12 fill the void left by the iPad. Sure, Surface will be far from hitting any astronomical app counts in the Windows Store anytime soon, but this doesn't mean development won't pick up when the device's potential becomes more apparent. Keep in mind that the iPad didn't launch with any dizzying array of useful apps, and it took a good year and a half before truly exceptional content targeted at education started to hit the platform.
Surface + Active Directory = Easy Alternative to iPad Management
Up until now, most districts in the USA offered a top-down approach to device usage. "We" provide the device; "We" manage the device; and "We" keep the device in school at the end of the day. That model is being turned on its head in a few important ways.
After numerous years of trials, K-12 is realizing that consistent deployment and usage not only cuts costs in textbooks and shared devices, but also fosters better learning through student buy-in with the technology they use. As more and more learning shifts to the Internet as a primary medium of informational knowledge and sharing, a 1:1 experience for students is becoming ever more integral for the 21st century.
But with decentralization of technology comes a glaring problem: keeping everything updated, in order and secure. As my experienced reflections above show, IT departments at school districts aren't fully sold on the concept of iPad as the de-facto common device. A classroom of iPads is tough to manage -- but a school of 2,000 students bearing the iconic slate is a downright nightmare.
If Microsoft listens to the voices of the educational technology community at large, it will follow through on what is a clear winning combination: Surface in the hands of students and Active Directory in the hands of IT management. Updates could be centrally managed; security policies could be rolled out en masse; and a bloated iTunes-like application wouldn't ever be needed as an intermediary to handle all of the above. Not to mention this could all be pushed over already-standard 802.11g/n infrastructure, which means no downtime in managing the cords, cables, and "octopus headache" that some iPad trials are known for.
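To put that in concrete terms, here is a minimal sketch of the kind of centralized visibility Active Directory already gives IT staff -- nothing Surface-specific, just the plumbing any domain-joined fleet inherits for free. The server name, service account, and OU below are hypothetical placeholders, not anything Microsoft has announced.

```python
# Inventory every domain-joined student device over LDAP.
# dc01.district.example, the OU, and svc-inventory are all
# made-up names for illustration purposes only.
from ldap3 import Server, Connection, NTLM, SUBTREE

server = Server("dc01.district.example")
conn = Connection(server, user="DISTRICT\\svc-inventory",
                  password="changeme", authentication=NTLM,
                  auto_bind=True)

conn.search(search_base="OU=StudentDevices,DC=district,DC=example",
            search_filter="(objectClass=computer)",
            search_scope=SUBTREE,
            attributes=["name", "operatingSystem",
                        "operatingSystemVersion"])

for entry in conn.entries:
    print(entry.name, entry.operatingSystem,
          entry.operatingSystemVersion)
```

If Surface ships with full domain-join support, this same plumbing -- Group Policy included -- would extend to a fleet of student tablets with no new tooling.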
Of course, all of my predictions on the Surface are just that: hopeful thinking. Microsoft's Education department has either not realized the potential throng of Surface users in K-12 or is waiting to lay their plans out on the table. Either way, I'm itching to see what happens because I truly think if Microsoft doesn't capitalize on Apple's shortcomings, Google surely will over time with the already released Chromebook.
"All you need is web" (and Chromebook), Google claims
In a way, Google's blog post of the same title is a bit presumptuous and perhaps a tad arrogant. But I happen to agree with it. Over the course of my previous IT career working for a public high school, I saw the shift in progress with my own eyes. Say what you will about traditionally installed software, but to the large majority of students I served, Internet access was almost a necessity in one way or another.
One could argue that this may stem from the fact that a large portion of learning and research is shifting to the web. This is true in every respect. But look at all the auxiliary functions that I saw personally supplement (sometimes overtake) traditional software. Students commonly use Google Docs in some form for classroom work. Mind mapping websites such as Bubbl.us are completely free and allow students to work on concepts anywhere and anytime. Even large encyclopedic references that used to be centrally managed by our district are offloaded to new interactive online-only editions -- and this over the course of only four years!
I personally like the proposition that the Chromebook makes. Just as the Surface potentially affords a school district simple management through Active Directory, Google's Chromebook takes this same notion and simplifies it by a few degrees. Whereas a traditional IT department usually controls policies through Active Directory, a fleet of Chromebooks can be controlled by people with little technical background, such as teachers or even school administrators. That's because the core management responsibilities for Chromebooks are handled within the familiar cloud-based Google Apps Control Panel.
And while Microsoft's approach to Surface management is still an educated guess at best, Google's definitely not blowing any hot air with its native Chromebook capabilities. My technology company had the chance to help Dallas County R-1 Schools in Buffalo, Missouri roll out their initial batch of Chromebooks back in December 2011. By the time our 3-day training effort was complete, we had teachers (read: not IT folks) managing default applications, home pages, and other settings for their students' Chromebooks. Everything was controlled through a common web interface, accessible no matter where teachers were -- home or at school. Yet upper-level administrative control was never lost.
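For districts that eventually outgrow the point-and-click console, that same fleet data is reachable programmatically as well. Below is a sketch using Google's Admin SDK Directory API to list every enrolled Chromebook; note this is today's API rather than the exact interface those teachers used in 2011, and the credentials file and admin address are assumptions for illustration.

```python
# List enrolled Chromebooks via the Admin SDK Directory API.
# admin-credentials.json and admin@district.example are
# hypothetical; real use needs a service account with
# domain-wide delegation granted in the admin console.
from googleapiclient.discovery import build
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/"
          "admin.directory.device.chromeos.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "admin-credentials.json", scopes=SCOPES,
    subject="admin@district.example")  # act as a domain admin

directory = build("admin", "directory_v1", credentials=creds)
result = directory.chromeosdevices().list(
    customerId="my_customer").execute()

for device in result.get("chromeosdevices", []):
    print(device.get("serialNumber"), device.get("status"),
          device.get("osVersion"))
```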
Since Chromebooks have been around for about a year and a half now, the true dollars and cents are starting to become clear. Google and IDC co-sponsored a study on Chromebook device management, and the results were fairly staggering in comparison to traditional IT management of devices. Over a 3-year span, the Chromebook achieves a TCO of $935 per device. When it comes to installation, Chromebooks require 69 percent less labor. And the biggest eye-opener for educational IT departments has to be the fact that Chromebooks require 92 percent less labor in long-term support. With tightening school budgets, these are cost savings that are hard to ignore.
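To see how those percentages translate into actual dollars, a bit of back-of-the-envelope math helps. The $935 TCO and the 69/92 percent reductions come from the cited study; the traditional-PC baseline labor costs below are purely hypothetical placeholders I picked to show how the savings compound over a device's life.

```python
# Rough per-device labor savings implied by the study's percentages.
YEARS = 3
chromebook_tco = 935           # per device over 3 years, per the study

pc_install_labor = 100         # hypothetical baseline, per device
pc_support_labor = 200         # hypothetical baseline, per device/year

cb_install_labor = pc_install_labor * (1 - 0.69)   # 69% less install labor
cb_support_labor = pc_support_labor * (1 - 0.92)   # 92% less support labor

saved = (pc_install_labor - cb_install_labor) + \
        (pc_support_labor - cb_support_labor) * YEARS
print(f"Chromebook 3-year TCO (study figure): ${chromebook_tco}")
print(f"Labor saved per device over {YEARS} years: ${saved:,.0f}")
# With these baselines: $69 install + $184/year x 3 = $621 saved.
```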
It must be pointed out that districts that have already invested in the Google Apps for Education path will naturally have the most seamless integration of Chromebooks. Google Apps is the cloud-based email and communications suite that is completely free of charge for any K-12 school district. I personally believe Google Apps is much more mature than Microsoft's hobbled attempts over the years with its own cloud-based email suite Live@Edu, which has been haphazardly replaced by Office 365 for Education (which is not free, by the way). My former district was one of the first to move into Google Apps for staff and students, and it paid off nicely with simplified email and document collaboration, which our traditional Microsoft environment alone just didn't provide.
Will a device that considers the "web" its primary start screen succeed? Your guess is as good as mine. Chromebook's price point is definitely attractive for districts, especially when pitted against Surface (note: pricing on the non-RT Surface, the model that would carry the AD tie-ins I predict in this article, is still a mystery). Only the market will unravel the true story of 1:1 device rollouts across districts in the USA. If Microsoft and/or Google can clarify their advantages over Apple's pothole-filled path towards 1:1, they can easily help steer the debate in their favor.
I love the promise that both Surface and Chromebook have in terms of the ecosystems they offer. Apple may provide a name brand and an established experience, but it doesn't have the understanding of the Enterprise to be taken seriously. Now the underdogs just have to catch up to Apple where end users care: third-party apps.
Derrick Wlodarz is an IT professional who owns Park Ridge, IL (USA) based computer repair company FireLogic. He has over 7+ years of experience in the private and public technology sectors, holds numerous credentials from CompTIA and Microsoft, and is one of a handful of Google Apps Certified Trainers & Deployment Specialists in the States. He is an active member of CompTIA's Subject Matter Expert Technical Advisory Council that shapes the future of CompTIA examinations across the globe. You can reach out to him at info@firelogic.net.
The golden years of Apple's outright dominance in technical innovation are fading, and quickly at that. The iPhone 5 just launched to a deservedly ho-hum and lackluster reception, with many people asking the obvious question: that's it? For a company riding the high waves of Wall Street for more than a few years now, with earnings going through the roof quarter upon quarter, is this the best that a larger-than-life tech giant can bring us?
Maybe the naysayers are right in that Apple is the leftover shell of the monolith it once was (post-Jobs). Perhaps that internal drive to bring out the best in the technology it releases is starting to fizzle. I'd go as far as to argue that Apple never really has been as continually innovative as many people may believe. While Apple does have an uncanny ability to command the lead in sectors it enters, this doesn't necessarily mean the company is filled with technical Einsteins, as so many supporters clamor to believe.
Apple the Evolutionary Follower, Not Leader
Take Apple's staple desktop OS, Mac OS X, for example. While the company goes to great lengths to detail feature after feature improved by each release, take a look at the long-term picture of what OS X represents over a 10-year span, with where Microsoft's Windows OS stood relative to OS X at each point in time.
Do you see the trend? Microsoft has put forth larger, more monumental changes to the Windows platform release upon release than Apple can lay claim to for nearly the entire 10-year span of OS X's history. Niche features and apps aside, OS X is on the surface the same behemoth we first saw back in 2001. It doesn't take a genius to connect the dots and re-evaluate whether Apple's design improvements are as far-reaching and enduring as marketing claims them to be.
This doesn't only ring true in the GUI everyone uses. Take, for example, the aspect of internal OS security. We can all agree that Microsoft has made leaps-and-bounds improvements in security; in the 10 years since XP, Windows 7 stands tall as the Fort Knox of client OS examples. So much so that many experts outright claim that Windows is pound for pound more secure than OS X, and even the "impenetrable" iPhone is starting to gain the attention of malware writers. After years of pounding the "Macs don't get viruses" drum, Apple quietly admits that Macs aren't as bulletproof as once believed.
Playing Catch-Up in Smartphone Features Every Direction You Look
The innovation disparity doesn't just reside on the desktop side. With the recent release of the iPhone 5, Apple's smartphone revolution-turned-evolution grinds to a path of limited improvement, if not, dare I say it, mere catch-up. Just as with the desktop OS, have a look at the two major players in the smartphone arena right now, Apple and Samsung. Pitting three generations of the same model line of phone side by side, which competitor represents larger generational improvements in a like-for-like comparison?
Not only have Samsung's phones undergone appropriate physical changes to adapt to the smartphone needs of users at large, but the Android software powering them has also seen similar shifts, not just mere upgrades. Yet in the face of all the competition from Android (and soon Blackberry 10 as well) to win the hearts and minds of today's hungry users, Apple turned on the glitz a few days back to churn out a mediocre iPhone 5. I'm not the only one scratching my head at some of the decisions the company made with this new phone.
Near Field Communication? Nowhere to be found. Better battery life? Marginal at best. A true revolution in UI design? Apple was overly giddy in describing how it has -- gasp -- added a fifth row of icons to iOS 6. And screen resolution? The Galaxy S3, Lumia 920, and even upcoming Blackberry 10 devices all have Apple beat already. If the iPhone 5 feature set is an example of the golden company it represents in users' eyes, then I guess we can begin discussions about a possible OS Hall of Fame induction for Windows Vista.
But I guess iPhone users really don't care about the specifics that much; they've been more than comfortable on 2007-era 3G speeds for some time now, so the new iPhone 5 must be a godsend in some regards. I pretty much run my computer repair company FireLogic on a Blackberry Torch 2 with HSPA+, so 3G is a fairly distant memory for me.
Oh, and I almost forgot to mention Apple's ingenious (read: sarcasm) new connector design, which debuts on the iPhone 5. Instead of adhering to industry standards and using an affordable Micro USB connector like everyone else, Apple implemented a new proprietary connection that will require a wonderful $29 adapter. The only winner in this idiotic proposal is Apple, which will rake in hundreds of thousands of dollars from senseless adapter purchases. Yet the Apple fiends will continue to support the march toward proprietary-dom while shunning the likes of Microsoft and Android.
Apple's Best Trait? Corporate and Legal War of Attrition
If Apple isn't the bastion of technical improvement, then what does it have going for it? A great corporate engine that allows the company to smash competition through sheer size, if nothing else. A fresh example of this corporate grind attitude surfaced just this week, with news reports that Apple is suing a Polish online grocer over far-fetched allegations that its logos and website addresses are too close to the tech giant's. This is just a small blip on Apple's ever-growing war map. Some victims happen to be innocent bystanders, as in this latest Polish grocer suit, but some are clearly intended targets in Steve Jobs' not-so-secret thermonuclear war on Android.
Putting aside the fact that Apple defeated Samsung in the courtroom via questionable jury deliberations, the bigger issue here is whether Apple was truly justified in its positions against Samsung. It's already well known that the jury itself wasn't as interested in technical details as it was in sending a message to Samsung. But in the larger picture, if Samsung really did copy Apple's intellectual design, where does that leave Apple in its own actions over the better part of its public history?
Tech entrepreneur and angel investor Mark Cuban took to Twitter shortly after the ruling was announced and quipped a few interesting things. "If the IBM PC was created in this patent environment there would be no Apple. They would have sued them out of existence". Probably the best message was, "Dear Apple, Xerox PARC called, they want their interface back". Funnies aside, Cuban is entirely right on the money. Apple is clearly set on auto-pilot for all out legal destruction of the competition.
Let Apple have its day. While the company seems to have taken the home court advantage and run with it, I doubt future legal battles will be as clear cut as this one. And I really do hope that the courts push back on Apple's outright war against competition. For as much heat as Microsoft has received over the last decade for its dominance with Windows, I think Apple should be taken to task on the same grounds for its doings in the smartphone arena.
Only time will tell, I guess. Perhaps 50 years from now, as Malcolm Gladwell predicts, Steve Jobs will be almost completely forgotten in most respects and Apple will be clutching at a WebOS-esque standing in the smartphone world. For now, let competition ring so the technical leaps of Apple's marketplace foes can become as evident as they should be. And even more importantly, it's time that the courts show Apple that legal teams alone can't drown out the competition.
If you still have a MySpace account, you likely fit into one of three groups: you forgot to formally delete your account; you are trying to advertise your small-time band to a couple dozen hardcore leftover users; or you log into MySpace right after you finish signing into AOL Desktop, merely as a matter of old habit.
But I'm not interested in singling out those still using the service, as the droves of users who have dumped the website outright far outnumber the faithful by now. I'm outlining something I'd call the "MySpace Effect".
Last year, MySpace shed as many as 10 million users in the single month of February. That's more than a spring cleaning; it's more akin to a spring wipeout. While direct comparisons between the dreadful downfall of MySpace and the current outlook for Facebook and Twitter may not be entirely applicable, some general extrapolation can be made from the early signs of things (soon) to come.
If Facebook's loss of 2 million users over the past six months is telling about anything, it's that the social media craze of yesteryear has peaked -- or is even on the slight decline already.
Facebook and Twitter are "Me-centric" Havens
When I was in college, Facebook still had a purpose. These were the days when the service was designed for college users only, and there was a sense of exclusivity for the site. But it wasn't the VIP club aspect that kept people like myself coming back daily. It was the tightly knit community feeling that existed among users back then. There was no such thing as Timelines to create online effigies of a life scrapbook. Advertising on the website was non-existent, and it was the discussion between classmates that kept you interested at next login -- not the useless updates about all five of your closest buddies checking in at the dive bar down the street.
Facebook and Twitter can boast about user numbers all they want, but those figures don't solve the core problem their products are causing: social media depression. More acutely, the American Academy of Pediatrics dubbed this new disorder Facebook Depression.
While CEO Mark Zuckerberg's brainchild takes name credit for the problem, I see clear lines between similar behavior affecting both obsessive Facebook and Twitter users. Utah Valley University came out earlier this year and solidified these initial connections between heavy social media use and health problems. Its study linked low self-esteem to the very users who have a hard time logging off.
I've personally dropped my social media usage by fair margins. The Twitter account I had over a year ago was deleted after my dissatisfaction with the service, and my Facebook account now exists solely to keep tabs on my company FireLogic's presence on the network (if a personal account weren't necessary to handle this, I'd likely shut down that side of my presence too). I can see how people entrap their lives in the endless cycle of playing social catch-up. Your friends do something fun and exciting, so you have to one-up them and display it to the world. They, likewise, see the updates and get caught in the same euphoric reaction, and the downward spiral continues without end until one eventually gets diagnosed with the newly dubbed (and still relatively murky) Facebook Depression.
Are the 2 million users who logged off Facebook in the early part of 2012 a group that has grown disenfranchised with the "game" of keeping up with friends? Or are they merely jumping ship to other services like my favored Google+? Or perhaps, another scenario: are they just logging off from social media altogether? It's hard to say, and while there aren't any clear numbers to back up any hypothesis yet, I'd wager that it's a healthy mixture of some or all of the above.
Twitter's Content Comes from Under 1% of Its User Base
Never mind the fact that research firm Pear Analytics came out with a study in 2009 that pinned over 40 percent of all tweets generated as "pointless babble." Even if four of every 10 tweets mean nothing to anyone, the Twitter faithful will still cling to the 60 percent of tweets that have some value. A pair of Rutgers University professors beg to differ, however; in 2009 they further showed that 80 percent of users on Twitter are merely on the service to promote their own self-centered activities: what they're wearing, where they're going, and what they're thinking. Maybe Time magazine got it right in 2006 with its Person of the Year special; perhaps it is all about us in the end.
If Twitter is supposed to give every user the power to disseminate useful information, then even more troubling to its long-term survival is the fact that 50 percent of all tweets generated come from only 0.05 percent of its users. How is this so, you might ask? Simple. Twitter's initial appeal of "me-power" eventually wears off for most users, and they end up using the service to follow the legions built before them -- namely, the Twitter upper echelon that controls the bulk of information shared on the service.
The "big four" as they call them include media, organizations, bloggers and celebrities. If this is truly the case, is Twitter even necessary as an information survival tool in the first place? I myself figured out that Google Reader still did a better job in delivering the news that I cared about, hence part of the reason I dumped Twitter altogether.
What does such a top-heavy service like Twitter truly offer for the long term? Once the big four start flocking to new avenues of user interest, I think Twitter will be joining the MySpace emigration party. The above numbers clearly carve out a picture of a service that is anything but the idealistic "information nirvana" that propelled its claim to fame.
Google+ Offers a Breath of Fresh Air for Users
I'm not the only one who feels optimistic about the hopes for Google+ as a viable alternative to the aforementioned. While I'm not as active on the service as I'd like to be, perhaps that's precisely why I enjoy it when I do log in: there's no feeling of being left out. There's almost an unwritten rule of the land on Google+ which keeps the update feeds fairly clear of photo dumps from overzealous vacation trips, status updates from wild bar nights, and the other useless banter Facebook and Twitter are flooded with. I'm not sure where the different thinking actually lies. Are users flocking to the service as a means to escape Facebook-ness? Can't say, but whatever it is, it's working.
Google+ must be doing something right, though. While Facebook's users are starting to log off, and Twitter continues to get hammered by its "big four" categories of tweeters, Google+ is winning the hearts and minds of American users. Of all the social media heavyweights out there, the only service that can take top honors next to Google+ is Wikipedia. In the truest sense I wouldn't consider Wikipedia on the same level as Google+, but definitions aside, the bigger problem here is that Facebook and Twitter are languishing at the bottom of the satisfaction survey. In fact, the only service that ranks lower in satisfaction than Facebook is MySpace, and that's merely due to a technicality, since the website isn't being measured for satisfaction any longer (if you're curious why, check the second paragraph of this article).
Where is the disparity, then, between the supposed upticks in user counts and downticks in satisfaction? I'd argue that while both Twitter and Facebook are still growing their user bases in new territories overseas, the formerly entrenched stateside faithful of both websites are starting to feel differently about their appetites for established social media. While Pinterest has garnered some of the attention of a meandering (mostly female) user base, its satisfaction numbers still lag a fair margin behind those of Google+.
A more respectable argument would be that a good portion of these users leaving Facebook specifically are tired of being stuck in a constant uphill battle of social competition. At a certain point, one must ask: is it worth it? Waking day to day to merely get back onto the hamster wheel of virtual social movement?
Judging from the referenced statistics coming out about user behavior on both Facebook and Twitter, I think permanent separation is becoming a more tenable option for a growing majority.
The term "death spiral" has been so overused in discussions about Blackberry-maker Research in Motion that one must ask: are the tech pundits crying wolf too often, too soon? Do a targeted search for "Research in Motion death" on Google and you will easily see that this rush to judgment started all the way back in early 2010. Like the doomsday naysayers of yesteryear, RIM's date of demise is anything but solidified (to some pundits' shock).
The short-term future for RIM is a rocky road indeed. With its face-saving Blackberry 10 OS release being pushed back another quarter into early 2013, the smartphone giant temporarily has little glitz to match the other big boys. Samsung's instant hit -- aka the Galaxy S3 -- has already touched down. Google's got its latest iteration of Android, Jelly Bean, cooking for its flagship devices, including the Nexus and the S3. And the iPhone 5 rumor mill just can't take a week off as of late.
Let's also not overlook the fact that RIM has announced upwards of 5,000 layoffs of its core staff to help trim costs and restructure internally. But does all of this add up to a near-term death notice for the BlackBerry goliath? I doubt it -- unless things get much worse, much faster, that is.
Perfection First, Market Release Second for Upcoming BB 10 OS
There's no denying that RIM's overzealous approach towards the development cycle of its most profound OS update to date is hurting its PR game. But even so, let's face it: RIM's never been a marketing genius and likely never will be. Its products fill gaps that Android and Apple devices have yet to fully figure out, and that's why RIM has a big shot here with its BB 10 reboot.
So far, they seem to be hitting the mark. Even the most hesitant analysts give BB 10 high marks for first impressions of the new platform. BB 10 ditches the Java backbone of most BlackBerry OS versions to date and opts for the stable and steady Unix-like spine that comes with the QNX-powered ecosystem RIM purchased a short time ago. This upheaval in the underlying architecture will bring numerous functionality changes, like an arguably best-in-class modern touch keyboard.
Multi-tasking is also being taken to a new level with what RIM calls "Flow view" integration that tears down the walls when it comes to using different apps at the same time. From everything RIM has shown off thus far, BB 10 will hopefully meet our hefty expectations. It's good to see that their recent BB 10 preview at BlackBerry World in May lifted more than a few brows.
In RIM's defense, however, this raises the question: what's so wrong with getting their new BB 10 devices in perfect order before rushing to market? The Apple legions won't admit it, but by all realistic measures, Siri has been a sputtering mess since release. Apple's own cofounder didn't hesitate to weigh in on the sub-par effectiveness of Siri.
And the company has started defending itself against what is likely the first of many lawsuits surrounding misleading claims about what Siri can truly do. And should we even get into specifics about the sour taste left in everyone's mouths by the short-lived MobileMe? Being first to market is great -- as long as you have a quality product to back up the fanfare.
All Doom and Gloom? The Numbers Say Otherwise
No, I'm not delusional. I'm fully aware that RIM's current financial situation isn't anything to brag about, nor is its now-stagnant BlackBerry 7.1 device offering. But outside the circle of raw market-share numbers for all the smartphone brands, RIM's outlook isn't as bleak as some make it out to be. For example, BlackBerry has continued its strong hold on the British market and has actually topped the charts two years in a row as the leading seller of smartphones in the United Kingdom.
RIM likewise recently announced that it hit 78 million subscribers globally (up from 70 million in September 2011). And a recent CNET interview with RIM's managing director for the USA, Richard Piasentin, brought to light the fact that RIM still controls smartphone sales to 90 percent of Fortune 1000 companies -- companies which, according to Piasentin, are very loyal to RIM.
The US Government is also hesitant about Android- and iOS-based devices due to the stringent security needs that only BlackBerry has been able to meet thus far. The DOD a short time ago approved many BlackBerry 7 devices for official usage, and the governments of Australia and New Zealand followed in similar fashion. If BlackBerry is safe enough for the leader of the free world, RIM still has some decent hope left for the ride into BB 10 next year. While government purchases don't provide the kind of consistent sales upswings that smartphone makers need to stay viable, this relatively secure source of solid income should help keep the company's coffers afloat until BB 10 reignites the brand.
And while the US market may be a tough region for BlackBerry as of late, other parts of the world still flock to RIM's devices. Just over a week ago a report came out from SRG that placed the BlackBerry PlayBook tablet in second place only to the "weakening" iPad in Canada. The Middle East is giving RIM a nice push as well, with a reported 140-percent increase in subscribers compared to a year ago. One of the primary reasons for RIM's continued success in the Middle East is consumers' love of BBM, the integral messaging service that was only recently replicated by Apple in the form of iMessage.
RIM believes in the BlackBerry brand so heavily in its AMEA markets that it recently announced it is opening its own BlackBerry stores across those regions. Dubai will get a store, as will Kenya, Nigeria, and roughly 4,000 locations in Indonesia alone. Apple already has a budding presence in these territories, but if RIM holds to its word, it could outpace Apple's own well-honed retail strategy.
BlackBerry Still Capitalizes on Core Strengths: Battery Life, QWERTY Keyboard, Messaging Heaven
People call me crazy when they find out what kind of smartphone I use. "Are you stuck in the dark ages?" That's probably one of the best responses as of late when I mention that I proudly carry around a BlackBerry Torch 9810. Am I really alone in the characteristics I adore in a smartphone? A physical keyboard, excellent battery life, best-in-class messaging, and portability that doesn't kill the pocket in my pants?
Outside of the Verizon-specific Motorola Pro+ Android phone, no one has realized that there is still a class of users that don't care much for horizontal sliding keyboard phones or pure touchscreens. Nor do I have much appetite for what is on the top-10 free apps list of Apple's iTunes Store. I live in my email, Evernote, mobile web browser, and the occasional Pandora. The rest is all a nicety, and BlackBerry handles my core needs without issue.
It's quite entertaining to turn the question back on friends who flash their iPhones or big-screen Androids but are endlessly stuck recharging on power cords when my BlackBerry's battery is just ticking down to about 50-percent empty. My computer repair company FireLogic depends on my ability to be 100-percent mobile and available when on the go, and I can't worry about when I'll next be near a power outlet.
And to those who think I've never given the other side a try, I actually have. I spent about a month on an iPhone 3GS a bit over a year ago and was not heavily impressed. Everything I relied upon for my daily work needs was slower or felt clunky. Granted, I did have access to oodles of free games from the App Store, something BlackBerry's App World can't touch in terms of raw numbers. But seeing how even Apple's flagship 4S still lacks 4G access of any sort, my HSPA+-capable BlackBerry Torch 9810 is still leagues faster when it comes to data rates. To top it off, no US version of the iPhone can even offer me WiFi Calling like all the BlackBerries I've owned in the past 3 years. Apple's iPhone offering in my eyes is nothing but 'old hat.'
Numbers Don't Paint the Whole Picture
Research in Motion still has plenty of game left in it. The company has been sailing slightly astray for the past few years, but with BB 10, I think it's got what it takes to get back on the right course. Let's not make the RIM survival discussion merely about numbers alone. Apple has been in this same position as RIM before, lest we forget.
Its own unsavory little anniversary is coming up on August 6 -- the day in 1997 that Microsoft helped keep Apple afloat with a generous $150 million investment. And look how far Apple has come since those dark days, when everyone thought the company was otherwise done for in the wake of a Windows-powered world. If history has anything to say about this mess, I'm not counting RIM out just yet.
Unlike most tech industry analysts that pit Google versus Microsoft in a paper-specs war each time they opine about these cloud email platforms, I’ve got two cents to offer on the subject from a slightly different -- and perhaps more down-to-earth -- perspective. I’m an IT consultant by day who is responsible for implementing, supporting, and training on each company’s product.
It allows me to have a better perspective on how end-users feel about these major cloud suites when “non techies” are at the wheel. And the things they tell me are often no-holds-barred; they rarely hold back. The bigger question most analysts fail to answer still stands: who’s winning the “hearts and minds” of those using these suites?
Before the fanboys go haywire over my assertions about each cloud platform, I will preface this article by saying that my computer repair company FireLogic fully supports both Office 365 and Google Apps. We have been working with customers utilizing each platform for a few years now, and while we tend to provide more professional training on Google Apps, we haven’t pigeon-holed ourselves into either heavyweight’s corner. We even happen to be Office 365 resellers at this point in time. My opinions in this op/ed are based both on the feelings of my customers and on my experiences in supporting these environments.
For the uninitiated, Google hit the ground running with its Google Apps suite for businesses back in 2006. The service started out as a vanilla clone of Gmail for those looking to move their email to the cloud, but quickly expanded into a full-blown suite offering a bevy of sub-products rolled in at no extra charge.
Microsoft took notice of this yearning for the cloud in the business sector and gave us BPOS (Business Productivity Online Suite), which launched a few years after Google Apps. Since then, Microsoft morphed BPOS into what we now know today as Office 365, which is the culmination of bringing together the power of Office Web Apps and Exchange Online. Each company has had its respective cloud platform in the wild for some time now, so it’s more than fair to pit them in an honest “in the wild” taste test.
Office 365 Cloud of Confusion
I strongly believe that Microsoft has a solid head on its shoulders. But when customers approach me with questions about which Office 365 price plan is best for them, I can rarely answer in a single phone call. The confusion begins at the very heart of Microsoft’s perception problem with how Office 365 is priced: navigating the labyrinth of service-level options is as much a test of patience as it is a fact-finding mission.
If you don’t believe me, have a look at the full pricing layout for the various Office 365 plans. Where does one begin? The main flavors of O365 are divided into six plans across three tiers, with another separate tier of two plans for “kiosk workers.” In total, that’s eight possible flavors of Office 365 to choose from. And you thought the menu of Office 2010 or Windows 7 editions was tough to wade through?
The confusion doesn’t end there. Have a peek at the FAQs page that Microsoft offers, which covers numerous other details about the plans. One big question from potential Office 365 customers is whether they can move between plan levels if their organization’s needs change. You’d think that would be an easy task in the cloud (Google Apps offers it at any time), but Microsoft says it doesn’t allow for this. In order to change plan “tiers” you need to go through the ungodly task of cancelling your old account and creating a new one under the fresh plan. Is that a sensible upgrade/downgrade path for customers? Everyone I’ve talked to doesn’t think so.
Furthermore, take a look at which levels within the various tiers get 24/7 support from Microsoft. Google solves this problem quite simply by following a common logic: free Google Apps users get no support; paid users get full support. Redmond instead offers support for nearly all plans except the Small Business and Kiosk tiers.
Fair enough, until you have a look at the price points of these plans. Note that the cheap $4/month Email Only plan gets support, but the more expensive P1 Small Business plan at $6/month gets no support. Of all the customers who would use Office 365, don't you think that small businesses would be the ones needing the most support Microsoft can offer? Given how Microsoft targets Office 365 towards small businesses without formal IT departments, this gotcha just doesn’t add up. To me, this is a continuation of the one-foot-in, one-foot-out approach of where Office 365 sits in Microsoft’s portfolio.
By contrast, customers rarely have any issues or questions about Google’s offering with Apps because the approach is simple enough. My smallest customers -- those with under 10 users who can forgo the 24/7 support and SLA -- sometimes opt for the Standard edition of Google Apps, while most business customers willingly agree that $50/user/year is well worth the admission price for the full-blown suite. If Microsoft wants to see more uptake on Office 365, it needs to get serious about thinking like a small business owner.
Whose Chair in the Cloud is More Comfortable?
Winning the hearts and minds of users goes beyond merely appealing to people’s ingrained assumptions about how their email platform should function. And this is where I believe Microsoft’s public thinking about cloud email is fundamentally flawed in some ways. The company believes that anyone testing the cloud waters coming from an Exchange environment automatically expects a duplication of look, feel, and functionality. If that were solely the case, then cloud platforms wouldn’t be in consideration in the first place. Making a move to the cloud inherently comes with flexibility and a differentiated approach to finding a replacement product. Leaving full-blown, on-premise Exchange and expecting to end up in the same boat riding in the cloud is the wrong approach. This is where IT professionals like myself come into play to help customers with tough decisions about the future of their communication backbone.
One particular client of ours who asked to be moved off of Office 365 and onto Google Apps complained about the complete dependence on Sharepoint Online and the desktop version of Office 2010 in order to fully utilize Office Web Apps. Unlike Google Docs (which the customer is enamored with), Office Web Apps take a three-step approach before a document can be edited anywhere through a web browser. One must:
1. Create the document in the desktop version of Office 2010;
2. Save it up to a SharePoint Online document library;
3. Only then open and edit it through the browser via Office Web Apps.
This customer’s previous IT consultant failed to disclose this disparity between perception and actual usage, and they quickly became frustrated. With Google Docs, they can utilize Drive or Google Cloud Connect if they wish to take a hybrid cloud approach to document management, or stick to merely creating and editing docs through the online interface for a full-on cloud experience. This strategy can be mixed and matched to the user’s heart’s content. If Microsoft expects Office Web Apps to become as integral to its suite as Google Docs is to the search giant’s platform, it needs to take the virtual leash off the desktop cousin.
This reliance on the desktop doesn't stop at Office Web Apps. How about the need for the Lync client in order to fully utilize instant messaging and voice/video conferencing within your Office 365 domain? This is another thing that irks our customers. Some of them use it; a majority install it and just let it sit idle after the novelty wears off and the hassle of switching interfaces gets to them. Google Apps integrates Chat within Gmail with voice and video, and you don’t need the separate Talk client for much of anything nowadays. Good move, Google.
Microsoft’s Legacy Background Not Entirely a Bad Thing
The few places where Google Apps flounders are where Office 365 takes the upper hand. Not surprisingly, most of these areas are Microsoft’s strong points, drawn from its deep knowledge of the enterprise IT landscape. For example, Google Apps utilizes a neat feature called User Organizations, which at the outset is a stellar inclusion but under the surface still needs a good deal of work. One blatant missing facet of this attempt to recreate the look and feel of Active Directory-like controls is that Orgs and Sub-Orgs are used only to control which services different groups of users have access to. None of the fine service-level control that Google Apps offers (save for a few areas like email security) can be ratcheted down per Sub-Org. Office 365 picks up the slack where Google still has a few things to learn on security, and it has the breadth of security functionality that admins are used to seeing. Is this a deal breaker? In some cases, yes.
Shared contacts is another area that I have experienced first-hand to be a sore afterthought on Google’s part. Google provides a nifty Shared Contacts API that can be scripted to work if you have the coding background to get it going; but a clean user interface for managing these shared contacts (external global contacts, to be exact) is sadly missing from the Google Apps Control Panel. Getting this feature to work the way some organizations are accustomed to takes some serious heavy lifting, either through Google Apps Script or through the use of excellent Google Apps Marketplace tools like the free SherpaTools Directory Manager. But then again, Microsoft relies on its partners for much of the functionality we have become used to folding in via third parties -- so I can’t harp on Google too much.
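For the curious, here is a minimal sketch of the sort of scripting the Shared Contacts API demands in lieu of that missing UI: it posts one external contact to the domain-wide directory over the GData-style feed. The domain, contact, and OAuth token are placeholder assumptions, and real use requires properly authorized domain-admin credentials.

```python
# Add one external contact to the domain's shared directory.
# DOMAIN, TOKEN, and the contact itself are fictional placeholders.
import requests

DOMAIN = "example.com"
TOKEN = "placeholder-oauth-token"  # must belong to a domain admin

entry = """<atom:entry xmlns:atom="http://www.w3.org/2005/Atom"
               xmlns:gd="http://schemas.google.com/g/2005">
  <atom:category scheme="http://schemas.google.com/g/2005#kind"
                 term="http://schemas.google.com/contact/2008#contact"/>
  <atom:title>Jane Vendor</atom:title>
  <gd:email rel="http://schemas.google.com/g/2005#work"
            address="jane@vendor-partner.example" primary="true"/>
</atom:entry>"""

resp = requests.post(
    "https://www.google.com/m8/feeds/contacts/%s/full" % DOMAIN,
    data=entry,
    headers={"Content-Type": "application/atom+xml",
             "Authorization": "Bearer %s" % TOKEN,
             "GData-Version": "3.0"})
print(resp.status_code)  # expect 201 Created on success
```

Tools like SherpaTools essentially wrap calls like this one behind a friendlier front end.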
Google Apps for the Masses, Office 365 for Niche Users
This sums up my feeling about how end users perceive the two suites. Google Apps brings an interesting breath of fresh air to the cloud platform market, while Microsoft feels like it’s forcing tie-ins back to its desktop roots in order to make a cloud presence. For those with a strong reliance on big M’s desktop apps like Office 2010, Office 365 may be the better fit. But for the “rest of us” looking for a new way to manage email and take the jump into the cloud, Google Apps bakes a prettier cake that tastes pretty good, too.
I really do want to fully appreciate Office 365 and what it has to offer, but it seems that at every turn, Microsoft throws a prickly thorn into its own side with all the roundabouts and catch-22s that represent its showing in the cloud email atmosphere. If Microsoft claims that it’s truly invested with a “We’re All In” approach to the cloud infrastructure sphere, it needs to put a better foot forward with Office 365. Google Apps innovates on a rapid-fire basis, and the users I support continuously see Office 365 as nothing more than a “me-too” product not ready for prime time.