Tom Petrocelli's take on technology. Tom is the author of the book "Data Protection and Information Lifecycle Management" and a natural technology curmudgeon. This blog represents only my own views and not those of my employer, Enterprise Strategy Group. Frankly, mine are more amusing.
Monday, August 23, 2010
Computer Industry Goes Zoom Zoom
Wednesday, August 18, 2010
Piling on the Dell/3Par News
- 3Par would have eventually hit the wall. The hardware industry is a game of numbers. Big volume plus low cost equals great margins. You need market share and manufacturing prowess for that. A company the size of 3Par would have eventually been eaten alive by the big boys, or faded into irrelevance. That would have been the slow death.
- The deal provides a nice Return on Investment for 3Par investors. I like it when people make money in startups. It provides fuel for more startups and gives hope to the rest of us entrepreneurs. Now, if you all want to swing some of that cash my way…
- I bet Dell really wants 3Par. 3Par could have gotten bought up by someone who just wanted them out of the way. That would have been sad for the industry. There is a better chance that some of what makes 3Par unique will continue to live on at Dell. It’s nice to be loved.
- 3Par employees can get great deals on Alienware computers. I'm just speculating, but wouldn't that be cool? Those babies are hot! If that's not in the term sheet, then amend that puppy now.
- US$1.15B is a lot of money. Dell is going to have to sell a lot of storage to make that back. That's especially hard to do when the 3Par message has often been that you could buy less storage at a cheaper price and get the same functionality. I get the "less is more" messaging for a startup, but you all have to make back a big pile of money now.
- Dell's bought a lot of storage companies but still doesn't have a cohesive storage message. This is actually a good-not good thing. On the one hand, you don't think of Dell as being in storage the way you do, say, HP or EMC. They've bought up a boatload of storage companies, but it's like Yahtzee - all tossed in an incomprehensible pile. On the other hand, the scrappy 3Par people are really good at new marketing. If they stick around (and Dell should make it worth their while to stick around) they could have a positive effect on Dell's overall storage marketing. If they're allowed to, which brings us to…
- They can't use what makes 3Par special. People think that companies like 3Par are about technology. Not really. They are about ideas. The simple audacity of 3Par is part of what makes it successful. That rarely translates well in a big company. Just because Dell wants 3Par doesn't mean they know what to do with them. The impact of the creative folks who have been driving the company will be diluted once they are just a cog in the Dell machinery.
- On some level, this has to annoy EMC, Dell's big storage partner. The more meat Dell adds to the storage stew, the less tasty it is for EMC. I keep wondering how long EMC will put up with this. Dell clearly wants to create a business that competes with EMC. An ugly breakup would be bad for Dell since EMC could probably crush them in the enterprise storage segment. My guess is that the only reason this has yet to happen is that Dell has not gotten its act together enough to really get in EMC's way. Maybe this is what EMC needs to go buy a server company and finally become the full-service provider that they should be. Some of those Taiwanese computer companies have good SOHO servers that would fit in well with Iomega and Mozy. Just sayin'…
Thursday, October 22, 2009
eBox Shebop
Back in the day, Linux was mostly a geek toy. You had to compile the kernel from source and install all the applications including the GUI by hand. Even by Windows 3.1 standards, it was very technical and primitive. In those days, Linux's best attributes were that it was free and basically UNIX. A lot has changed since then. Linux has become a viable UNIX replacement in servers, helping to fuel the rise of a great many Internet companies. It has also tried, with limited success, to become a desktop operating system and rival to Windows and Mac OS X.
One of the biggest holdups to widespread adoption of Linux has been installation and configuration of software applications. Linux distros seem to subscribe to the philosophy that real men hand-edit configuration files. It's the command line that separates the men from the boys. Linux is like a techie version of a sports car. It's about proving something. I just won't say what that something is. Package managers have done a lot to streamline installation, but configuration has always been a black art. There are entire books written to help trained system administrators tackle Samba configuration, not to mention every other major Linux package.
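To make that concrete, here is a taste of the hand-editing involved - a minimal, hypothetical Samba share definition of the sort you would type into /etc/samba/smb.conf yourself (the share name, path, and group below are made up for illustration):

    [global]
       workgroup = OFFICE
       security = user

    [shared]
       comment = Office file share
       path = /srv/samba/shared
       read only = no
       valid users = @staff

Multiply that by printers, home directories, and guest access, and you can see why the books run to hundreds of pages.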
This might be fine for the hard-core sys admin. It makes them feel superior to the rest of the morons out there. It doesn't work, however, for the vast majority of people milk-fed on Windows installation and GUI-based configuration. Even when there are decent configuration tools (which often, in an awesome display of irony, have to be installed and configured separately) and package managers, everything is piecemeal. To set up a user on a box requires configuring many different applications using different tools, some only available from the command line. All of this has been holding back adoption of Linux as a commercially viable alternative to the Windows hegemony.
The good news is that this is changing. A fairly new distribution called eBox has solved many of the problems that have plagued Linux server installation and maintenance. Perhaps it's fair to say that eBox is a mega-distro. It is based on Ubuntu Server, which is itself Debian-based. What sets it apart is the comprehensive web-based management tools. They allow single-screen configuration for many typical tasks that a sys admin faces. For example, you can set up a user account along with associated file shares, email accounts, and groupware configuration all from one place. This is even better than Windows, which still relies on wizards to walk you through the process.
Think of it this way. Old Linux is like shopping on a busy city street. You have to walk from store to store to get what you want. Windows Server is like a department store. Everything is in one place, but you still have to go from department to department. eBox is like a personal shopper. Everything comes to you.
One of the best features of eBox is the initial package installation. It groups packages into functions like networking, security, communications, office (basically file and print services), and infrastructure. This makes it easy to configure a server for specific purposes such as an office file server or a network gateway. The documentation clearly shows where to place the different types of servers in your network to get maximum safety and effect.
eBox is not perfect by any means. Installation (in a virtual machine, I admit) was difficult. Not difficult in the sense of hard to do, since it walked me through every step. Difficult because it hung up a bunch of times. The root password is not obvious. I wasn't asked for one, and it isn't the same as the initial admin account's password, as it is in most Linux installations (Ubuntu's convention is to lock the root account and route everything through sudo). That severely restricts what additional software I can install on the server.
While the selection of packages is good, there is no database server package. I know that PostgreSQL is installed, but configuration for it is not included in the web-based configuration system. Application or database server and developer packages would be a good idea. Virtualization packages would also be nice in the future. Maybe a "cloud" package, although I suspect that way madness lies.
eBox is an important step forward in making Linux a viable alternative for enterprises of all types, but especially for the small and medium-sized business (SMB) market. With limited or no IT resources, SMB organizations need easy setup, configuration, and management. That was hard to deliver using Linux before eBox.
There is one truly unfortunate aspect of eBox: its name. It shares said name with a small PC product from Taiwan. It's bound to cause confusion. I suggest changing it as soon as possible.
Disclosure: The eBox software was provided for free. Of course, it's provided to anyone for free. Just download it from their web site. So, I guess this doesn't really count, but why mess with the FTC?
Thursday, June 22, 2006
I'm Running Out of Network Connections
This new crop of small desktop apps - widgets, gadgets, and their kin - has created an incredibly rich and useful desktop environment. All of these apps let me do the little things that used to require a trip to a web page or loading some monstrosity of an application. It has also created a problem I've never encountered before on a desktop machine - I'm running out of network connections.
There are lots of other applications that want to get to the network too. Practically every piece of software wants to check for updates (although I turn this off for most applications). Add on all the communications most of us use now, including VoIP/Skype, IM, and e-mail, and you have major contention for network resources. There are limits to the number of TCP/IP sessions that Windows allows on each network connection. The more small apps I have trying to get to the Internet, or even my own network, the more contention there is for those connections. Network adapters, especially the cheap ones they put in most desktop computers, have their own limitations. Even if those limits are high, there is still limited bandwidth to work with, and building up and tearing down network connections takes CPU and network adapter resources.
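If you want to see the ceiling for yourself, here is a rough sketch (Python, with "example.com" standing in for any real host) that simply opens TCP connections until the operating system says no. Every polling widget holds sockets the same way this loop does, just more politely:

    import socket

    conns = []
    try:
        for _ in range(100000):
            # "example.com" is a stand-in host; each connection consumes an
            # ephemeral port plus adapter resources until the OS refuses more
            conns.append(socket.create_connection(("example.com", 80), timeout=5))
    except OSError as err:
        print(f"gave out after {len(conns)} connections: {err}")
    finally:
        for c in conns:
            c.close()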
Now, I might be an ubergeek who wants to have all of these little doodads loaded all the time, but most of them are becoming normal features of the desktop environment. Couple this with the growing use of AJAX on web sites, and even the average person will soon find themselves running out of a precious resource that they didn't know was limited, or even there. In a sense, we are back to the days when most of us would get those cryptic messages from Windows because we were running out of memory. Fine for the computer savvy of the world but mystifying to the average Joe.
Now, some of the problem is the applications themselves. When they encounter an overloaded network connection, they act like something has gone terribly wrong. Rather than wait for resources to become available, they spill out an error message in techno code. The upshot is that normal computer users may start to see these apps as more harm than good and shy away from them.
Better application design would also help. The various desktop scripting programs should include a network management capability that takes this contention into account. Small-app designers should also work out a better scheme for building and releasing network connections. Some applications are constantly building and tearing down connections, which is hard on a system.
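For instance, here is a minimal sketch of the "build once, reuse" pattern (Python; the host and path are hypothetical stand-ins): poll a status URL over one persistent HTTP connection instead of tearing a socket up and down on every check.

    import http.client
    import time

    # "updates.example.com" and "/status" are hypothetical stand-ins
    conn = http.client.HTTPSConnection("updates.example.com")
    try:
        for _ in range(10):
            conn.request("GET", "/status")
            resp = conn.getresponse()
            resp.read()  # drain the body so the connection can be reused
            time.sleep(60)  # one poll per minute, all over the same socket
    finally:
        conn.close()

One socket, one ephemeral port, and far less setup and teardown churn on the CPU and the adapter.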
For my part, I'm going to try an additional Ethernet card and spread the load out a bit. Most folks can't do that. A little more discipline may go a long way to ensuring that this new approach to software doesn't die on the vine.
Thursday, June 01, 2006
VoIP Has a Way To Go But It's On The Way
Vonage deserves the beating its stock is taking, in a way. They don't make money. Instead, they lose tremendous amounts of money. That's the problem with an IPO. Once public, you're not judged on your potential, concept, or technology. Just on your numbers. Their numbers stink, so there you have it.
My hope is that VoIP as a whole doesn't get painted black because of this. Perhaps VoIP will be judged by the Skype experience instead. Perhaps, but life and business are cruel. Memories are short, and folks usually remember the last stupid thing they heard. That's too bad, because VoIP is a truly revolutionary technology. It is already transforming the way we communicate and holds the promise of finally bringing about communication convergence.
The holdup is that VoIP, unlike the PC or the Internet, is really a bunch of technologies wired together and pushed to their limits. That means the various technologies don't always play nice together. You have broadband networking (one set of providers), usually a bunch of SIP servers, and the traditional phone system (another set of providers) that all must work together. This creates all kinds of interface and provisioning problems. The result is that call quality can vary from better than a landline to worse than a bad cell phone connection.
I've been experiencing this firsthand over the course of the last month. I decided to finally drop my traditional landline in favor of a VoIP provider. The benefits were certainly there. What I got was:
- costs less than half of traditional phone service;
- network services like caller id and voice mail for free;
- free long distance in North America and cheap international calling; and
- did I mention it's half the cost of traditional service?
I also get the warm and fuzzy feeling that my expensive broadband connection is being used for something other than low-bandwidth e-mail or small, stuttering renditions of 30-year-old TV shows. Welcome Back Kotter! It's like you never left.
There are tradeoffs, however. VoIP is not as plug-and-play as vendors make it appear. Hooking up the adapter to the network is a no-brainer, but getting it to work right is not. A couple of calls to tech support at least got the connection stable enough to use.
I'm still trying to find a way to get the call quality to be consistently good. I'm making progress and new friends in my provider's tech support department (as well as a few enemies, I think). I say "progress" because call quality is no longer consistently bad. It's now inconsistently good or bad, depending on whether you are a "glass is half empty or half full" type. I still experience dropouts, echoes, and static. Just not all the time. So, sometimes the connection is good and sometimes it's lousy.
The problem seems to be a mismatch between my ISP and my VoIP provider. The latency in my ISP's network is more than the VoIP system can handle. I don't usually notice it, since Internet applications are engineered for high latency and are more concerned with bandwidth. VoIP is apparently more like storage: sensitive to latency. This is the killer problem that must be overcome. If VoIP needs low, or even just predictable, network latency, then it will have problems in the very SOHO market it targets. Neither cable ISPs nor DSL providers guarantee quality of service, especially to a home. Yet VoIP needs a guaranteed minimum QoS. That's a problem that needs fixing before they lose the market.
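You can get a crude read on what your connection offers a VoIP call with nothing fancier than ping. The sketch below (Python; it assumes a Unix-style ping on the PATH, and the target host is only an example) reports average latency plus jitter, the variation in latency that voice traffic, unlike web browsing, cannot shrug off:

    import re
    import statistics
    import subprocess

    # assumes a Unix-style "ping"; the target host is only an example
    out = subprocess.run(["ping", "-c", "20", "8.8.8.8"],
                         capture_output=True, text=True).stdout
    rtts = [float(ms) for ms in re.findall(r"time=([\d.]+)", out)]
    print(f"mean latency: {statistics.mean(rtts):.1f} ms")
    print(f"jitter (std dev): {statistics.stdev(rtts):.1f} ms")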
All in all, I'm quite happy with the VoIP experience. It has dramatically cut my costs while giving me services that I never could have afforded before. Time will tell but I'm betting that VoIP providers will work out the call quality problems. It just might take awhile.
Friday, December 16, 2005
Okay! Now I get VMware
At first I was mystified. Okay, running Linux in a box on my Windows desktop was cool, but was it useful? I struggled to find a reason for it. Sure, there are a lot of developers out there who do cross-platform work, and this saves them from having to keep multiple machines. Everyone knows that multi-boot machines are a pain in the neck because you have to constantly reboot to switch between systems. Still, how many people is that really? A handful.
So, as I'm goofing around with VMware Player, it dawns on me. With the right scripts, I can run legacy operating systems and, hence, legacy software on my computer. Cool! That old DOS program that I like but can't run under Windows XP is once again usable. Perhaps in the future, I can even run the Intel version of Mac OS X (a fantastic operating system) and Windows together.
Even that is not the real reason that VMware and its kin (such as QEMU) matter. Instead, what these emulators represent is upgrade insurance. There is nothing worse than upgrading to a new OS (for example, Vista) and discovering that your old applications don't run. Heck, I just upgraded to Windows XP SP2 and found one of my old favorite applications isn't working right anymore. With virtual machine technology, that's not nearly as big a concern. That also explains why Microsoft has its own VM technology. No more refusing to upgrade because of that one old application that you absolutely rely on. No need to upgrade all your applications for the new OS. Run the old stuff in a box if it won't run native.
Hopefully, this makes up for the 2000/XP Command Prompt program that looks like DOS but doesn't run DOS programs well. In fact, shouldn't this be part of the OS? Sure should. Not that Microsoft is going to listen to me. Okay, maybe they will make it part of the server OS, but that will miss the point entirely.
In the meantime, maybe my new favorite folks at VMware will give us some legacy operating system scripts and such. Then I can run cranky old programs on my new computer. Huzzah!
Friday, August 05, 2005
Open Source Fails for Enterprise Applications
Let's, for the moment, set aside the whole "what is open source and what is not" debate. Sure, some products marketed as open source aren't really, and open source is not necessarily the same thing as free. That doesn't take away from the usefulness of the model. It is a detail that still needs attention.
It is more useful to discuss where open source is useful and where it is not. Clearly, open source has shown that it is attractive as infrastructure. Linux and the Apache web server, to name two popular packages, are the backbones of many corporate systems.
On the desktop, open source makes infinite sense. Applications like Firefox, Thunderbird, and OpenOffice are popular because they are multi-platform and evolve features more quickly than commercial applications. Even more interesting is that the development agenda tends to coincide closely with the end user's agenda. Developers are not hampered by the need to make money.
When you step up to enterprise applications, open source's advantages dim. Look at a typical open source enterprise application, such as ERP or CRM. These are complex applications that rely on a host of infrastructure products, and practically none support commercial products such as Oracle. The true cost of these products is not in the license fees but in maintenance, development, and administration.
I'm constantly amazed at how long the list of required software is for an open source application. Commercial Enterprise applications almost always ship with most of the software they need. The list of non-bundled software is short, usually only a database and operating system, sometimes a development framework.
Contrast that with a typical open source version. One product I saw required Linux, MySQL, the Apache web server, Tomcat, Perl, PHP, and three other open source products I hadn't even heard of. You had to get all of that in place, tested, and fully operational before installing the application.
That's an incredible amount of work. It also doesn't plug into the infrastructure of many companies, which rely on Windows and Oracle. I can understand the need to first install the database and maybe a development framework, but all the other packages? I can't see too many companies wanting to do that unless they have already made a commitment to these applications and packages. This complexity makes maintenance cost more, not less.
On top of that, who is going to support it? The application vendor will provide support for their product, but support for the ten other pieces of software is either going to come from a bunch of other vendors or from "the community". That won't give a system administrator a warm and fuzzy feeling, for sure.
The idea of enterprise open source is great. The packaging is all wrong. Vendors need to make all products available in a single install, expand support for non-open source infrastructure (especially databases), and reduce the sheer number of contingent packages. Otherwise, this will be great for hobbyists but never for the datacenter.
Meanwhile, I'll keep using Firefox and OpenOffice, which install everything I need nice and easy.
Tuesday, July 26, 2005
While the hardware integrates, the software decouples
Why is this possible? Because software has been pulled apart into components, even operating system software. The GUI is disassociated from the application frameworks, which are further decoupled from the device interfaces and hardware. What began in the 1980s with the first PC is now reaching its logical conclusion - complete separation of software functions. In the Windows world, we have a very mutable GUI sitting atop frameworks such as .Net or J2EE, on top of the core Windows XP OS which, in turn, is separated from hardware through device drivers and the HAL. Everything is abstracted and, hence, changeable and extensible.
This is very good indeed. We as consumers of computing products can now choose the best underlying OS for our needs, with the type of framework that fits our applications, and the GUI that works best for us. These can be put into different types of devices that suit our needs for performance, size, and features.
Of course, this tends to strip away many of the advantages touted by manufacturers of operating systems. I won't buy an Apple based on the GUI when I can have that GUI on the platform of my choice. If I can choose a Windows application layer (like Mono, an open source .Net implementation) for my Linux box, then I can have my favorite applications on any platform I choose. I don't need to choose Wintel.
For users of computer technology (and proponents of open source applications) this will bring more choices at lower cost. Good for us. Now, if vendors can only find a way to make money in that environment...