Tom Petrocelli's take on technology. Tom is the author of the book "Data Protection and Information Lifecycle Management" and a natural technology curmudgeon. This blog represents only my own views and not those of my employer, Enterprise Strategy Group. Frankly, mine are more amusing.

Showing posts with label systems. Show all posts

Monday, August 23, 2010

Computer Industry Goes Zoom Zoom

You would think that last week’s announcement that Dell was acquiring 3Par for US$1.15B was news enough. Ha! Intel then raised eyebrows by announcing the acquisition of McAfee for US$7.6B. Now comes Monday morning, and HP raises the stakes against Dell by sending in its own, bigger bid for 3Par. It’s nice to be loved. Somewhere in all this, Hitachi Data Systems announced that they had acquired the intellectual property and core engineering team of Parascale, a cloud software company. Too bad for them. What should have been a sweet announcement was lost in all the noise.
So, what the heck is going on here? On the one hand, this is actually not that surprising. Computer tech companies tend to throw off lots of cash so they have a lot sitting around for acquisitions. Most of these big companies can thus afford to buy expertise or market share. This is especially true when you are coming out from the bottom of the market. Best to build up the arsenal before the economy really picks up.
This is an industry with a tradition of letting smaller companies blaze trails in new technology and markets, then get their payoff from a big company. In the long run this is cheaper and less risky for the big companies and profitable for the small ones. More unusual are the Googles and Microsofts that start in a garage and end up behemoths. That’s the myth of computer tech, not the reality. What is not a myth is that deal making gives folks like me something to talk about. So here’s the talking-about part.
Intel-McAfee Makes for Secure Communications
The Intel-McAfee deal has a lot of pundits scratching their heads. It’s a lot of money for a company with a big consumer business. McAfee’s revenue would barely be a rounding error for Intel. In 2009 Intel’s revenue was 18.5 times McAfee’s (~US$35B vs. US$1.9B). $1.9B is nothing to sneeze at, but it will be a long time before a McAfee revenue stream makes up for the money Intel paid for it. What McAfee has going for it is lots of core security technology. More importantly, it’s spread across all aspects of the digital world – web, mobile, desktop, and server. Combine that with Intel hardware and chips and you have a much higher revenue-generating business than McAfee alone. It’s like having your cereal with fruit and milk. It’s part of a complete breakfast. It also positions Intel well for the long term. This is an example of the Gestalt principle – the whole is way better than the sum of the parts. Besides, people said similar things about EMC’s RSA acquisition and that has worked out well for them, right?
3Par Bid Up by HP
I wasn’t that thrilled about Dell’s acquisition of 3Par, except insofar as it worked well for the 3Par folks (nice folks). I’m both more and less thrilled about the HP bid along the same lines. It’s better for 3Par financially, so I’m more thrilled. It makes less sense for HP, though. Unlike Dell, they have a coherent storage story, a reputation and brand going back decades, as well as an extensive product line. Do they need 3Par? At least with Dell, 3Par would be a prominent part of the lineup. They might have even kept their name, like EqualLogic did. With HP, they will be absorbed. It’s hard to see what this deal adds to the HP product mix that they can’t get or build more cheaply. I doubt they really need 3Par’s customer base. Perhaps it’s just a way to keep Dell from becoming a serious competitor in storage. Perhaps. Generally, I don’t like this for HP but do for 3Par investors. It will be interesting to see how high this one gets bid up. There could be crazy amounts of money tossed around here.
HDS Goes Parascaling Up In The Clouds
The cloud is about software. It sells hardware but doesn’t exist without software.  Parascale provides software that makes storage and servers into clouds. I don’t know enough about Parascale to say if it worked or was particularly good software. Assuming it worked just fine, then this is the kind of technology play that I like. It adds immediate value, helps move hardware, has broad, future potential in an emerging market, and is a deal that is easy to do. It’s kind of conservative but conservative often pays the bills.
Bye Bye to OpenSolaris
There were also a bunch of other, smaller announcements. One that is significant was that Oracle will be dropping support for the OpenSolaris project. This is sad since there was a vibrant community around OpenSolaris. It was not, however, unexpected. Oracle has nothing to gain by supporting an open Unix product. In the end, this will be good for the open source community. There are already too many Linux and Unix projects and variants diluting the talent pool. Do we really need OpenSolaris and FreeBSD and OpenBSD and NetBSD and Darwin and so on and so on? Not really. So, while I understand how this bothers some people and generates a lot of “what else will Oracle kill?” questions (don’t worry, it won’t be Java or MySQL – they generate revenue), it’s really for the better. Time to move on.
I must admit, all this activity is exciting. It’s rare that this industry gets a week like this. Deals are usually more evenly spaced out. It’s like NASCAR for computer geeks.

Wednesday, August 18, 2010

Piling on the Dell/3Par News

Whenever some news comes out about an acquisition, everyone chimes in. It’s like kids playing little league football. Someone tackles the kid with the ball and all the other kids pile on.  I promised myself I wouldn’t do that. I lied. Hey, if you can’t lie to yourself, who can you lie to?
But really, I follow the storage segment but don’t claim in-depth technical knowledge anymore. I’m too interested in technology and business strategy to dive into the deep technical details. I can make a thin provisioning joke, but that doesn’t mean I have the kind of encyclopedic knowledge of the segment that folks like Marc Farley (of 3Par – ready to buy that boat?) or Chuck Hollis of EMC have. Sticking to what I know, here are some thoughts.
Why it’s a good thing (in list form):
  1. 3Par would have eventually hit the wall. The hardware industry is a game of numbers. Big volume plus low cost equals great margins. You need market share and manufacturing prowess for that. A company the size of 3Par would have eventually gotten eaten alive by the big boys. Or faded into irrelevance. That would have been the slow death.
  2. The deal provides a nice Return on Investment for 3Par investors. I like it when people make money in startups. It provides fuel for more startups and gives hope to the rest of us entrepreneurs. Now, if you all want to swing some of that cash my way…
  3. I bet Dell really wants 3Par. 3Par could have gotten bought up by someone who just wanted them out of the way.  That would have been sad for the industry. There is a better chance that some of what makes 3Par unique will continue to live on at Dell. It’s nice to be loved.
  4. 3Par employees can get great deals on Alienware computers. I’m just speculating but wouldn’t that be cool. Those babies are hot! If that’s not in the term sheet then amend that puppy now.
Why it’s not a good thing (also in list form):
  1. US$1.15B is a lot of money. Dell is going to have to sell a lot of storage to make that back. That’s especially hard to do when the 3Par message has often been how you could buy less storage at a cheaper price to get the same functionality. I get the “less is more” messaging for a startup but you all have to make back a big pile of money now.
  2. Dell’s bought a lot of storage companies but still doesn’t have a cohesive storage message. This is actually a good-not-good thing. On the one hand, you don’t think of Dell as being in storage the way you do, say, HP or EMC. They’ve bought up a boatload of storage companies, but it’s like Yahtzee - all tossed in an incomprehensible pile. On the other hand, the scrappy 3Par people are really good at new marketing. If they stick around (and Dell should make it worth their while to stick around) they could have a positive effect on Dell’s overall storage marketing. If they’re allowed to, which brings us to…
  3. They can’t use what makes 3Par special. People think that companies like 3Par are about technology. Not really. They are about ideas. The simple audacity of 3Par is part of what makes it successful. That rarely translates well in a big company. Just because Dell wants 3Par doesn’t mean they know what to do with them.  The impact of the creative folks that have been driving the company will be diluted once they are just a cog in the Dell machinery. 
  4. On some level, this has to annoy EMC, Dell’s big storage partner. The more meat Dell adds to the storage stew, the less tasty it is for EMC. I keep wondering how long EMC will put up with this. Dell clearly wants to create a business that competes with EMC. An ugly breakup would be bad for Dell since EMC could probably crush them in the enterprise storage segment. My guess is that the only reason this has yet to happen is that Dell has not gotten its act together enough to really get in EMC’s way. Maybe this is what EMC needs to go buy a server company and finally become the full-service provider that they should be. Some of those Taiwanese computer companies have good SOHO servers that would fit in well with Iomega and Mozy. Just sayin’…
Ultimately, this is very good for 3Par, its investors, and many of its employees. Making honest money always is. Whether Dell gets its $1.15B out of the deal remains to be seen. They need to develop a simplified but cohesive product line. Better storage marketing would also help. 3Par people can help, but will they be allowed to? Wish I knew.

Thursday, October 22, 2009

eBox Shebop

Back in the day, Linux was mostly a geek toy. You had to compile the kernel from source and install all the applications including the GUI by hand. Even by Windows 3.1 standards, it was very technical and primitive. In those days, Linux's best attributes were that it was free and basically UNIX. A lot has changed since then. Linux has become a viable UNIX replacement in servers, helping to fuel the rise of a great many Internet companies. It has also tried, with limited success, to become a desktop operating system and rival to Windows and Mac OS X.

One of the biggest holdups to widespread adoption of Linux has been installation and configuration of software applications. Linux distros seem to subscribe to the philosophy that real men hand edit configuration files. It's the command line that separates the men from the boys. Linux is like a techie version of a sports car. It's about proving something. I just won't say what that something is. Package managers have done a lot to streamline installation but configuration has always been a black art. There are entire books written to help trained system administrators tackle SAMBA configuration. Not to mention every other major Linux package.

This might be fine for the hard core sys admin. It makes them feel superior to the rest of the morons out there. It doesn't work, however, for the vast majority of people milk fed on Windows installation and GUI-based configuration. Even when there are decent configuration tools (which often, in an awesome display of irony, have to be installed and configured separately) and package managers, everything is piecemeal. To set up a user on a box requires configuring many different applications using different tools, some only available from the command line. All of this has been holding back adoption of Linux as a commercially viable alternative to the Windows hegemony.

The good news is that this is changing. A fairly new distribution called eBox has solved many of the problems that have plagued Linux server installation and maintenance. Perhaps it's fair to say that eBox is a mega-distro. It is based on Ubuntu Server, which is itself Debian based. What sets it apart is the comprehensive web-based management tools. They allow single screen configuration for many typical tasks that a sys admin faces. For example, you can set up a user account along with associated file shares, email accounts, and groupware configuration all from one place. This is even better than Windows which still relies on wizards walking you through the process.
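That fan-out is the interesting part, so here is a purely illustrative sketch of the idea. None of these names come from eBox itself - the `create_user` function and the service steps below are my own stand-ins for the separate tools a sys admin would otherwise drive by hand:

```python
# Hypothetical sketch of the "one screen, many services" idea behind eBox.
# Each decorated function stands in for one service's configuration step.

SERVICES = []

def service(fn):
    """Register a per-service provisioning step."""
    SERVICES.append(fn)
    return fn

@service
def file_share(user):
    return f"smb: created share /home/{user}"

@service
def mailbox(user):
    return f"mail: created mailbox {user}@example.org"

@service
def groupware(user):
    return f"groupware: enrolled {user} in default calendar"

def create_user(user):
    # The single entry point fans out to every registered service,
    # the way one eBox screen drives many underlying config files.
    return [step(user) for step in SERVICES]

if __name__ == "__main__":
    for line in create_user("tom"):
        print(line)
```

The point is not the code but the shape: one action, many config files touched consistently, instead of the admin visiting each tool in turn.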

Think of it this way. Old Linux is like shopping on a busy city street. You have to walk and walk to lots of individual stores to get what you want. Windows Server is like a department store. Everything is in one place but you still have to go from department to department. eBox is like a personal shopper. Everything comes to you.

One of the best features of eBox is the initial package installation. It groups packages into functions like networking, security, communications, office (basically file and print services), and infrastructure. This makes it easy to configure a server for specific purposes such as an office file server or a network gateway. The documentation clearly shows where to place the different types of servers in your network to get maximum safety and effect.

eBox is not perfect by any means. Installation (on a virtual box, I admit) was difficult. Not difficult in the sense of hard to do, since it walked me through every step. Difficult because it hung up a bunch of times. The root password is also not obvious. I wasn't asked for one, and it isn't the same as the initial admin account's password, as it is in most Linux installations. That severely restricts what additional software I can install on the server.

While the selection of packages is good, there is no database server package. I know that PostgreSQL is installed, but configuration for it is not included in the web-based configuration system. Application or database server and developer packages would be a good idea. Virtualization packages would also be nice in the future. Maybe a “cloud” package, although I suspect that way madness lies.

eBox is an important step forward in making Linux a viable alternative for enterprises of all types, but especially for the small-medium business market. With limited or no IT resources, SMB organizations need easy setup, configuration, and management. That was hard to deliver using Linux before eBox.

There is one truly unfortunate aspect of eBox: its name. It shares said name with a small PC product from Taiwan. That's bound to cause confusion. I suggest changing it as soon as possible.

Disclosure: The eBox software was provided for free. Of course, it's provided to anyone for free. Just download it from their web site. So, I guess this doesn't really count but why mess with the FTC.

Thursday, June 22, 2006

I'm Running Out of Network Connections

Over the past year I have been adding all kinds of neat widgets, gadgets, AJAX-driven pages, and other Web 2.0 stuff to my working environment. Google, which used to be a simple search page, now displays my calendar, news, stock quotes, and other information, all updating automatically. Each piece of the page is its own little AJAX application and operates independently. Same goes for my desktop widgets, courtesy of Yahoo! Widgets, formerly known as Konfabulator. My Google desktop, which used to be about searching for files on my hard drive, now has a bunch of little software applications called gadgets. These gadgets either float free like Yahoo! Widgets or attach to the "sidebar" docking bar. Microsoft Vista (when it finally comes out) is also supposed to have some similar little applications.

This has created an incredibly rich and useful desktop environment. All of these apps allow me to do the little things I used to have to go to a web page for or load some monstrosity of an application. It has also created a problem I've never encountered before on a desktop machine - I'm running out of network connections.

There are lots of other applications that want to get to the network too. Practically every piece of software wants to check for updates (although I turn this off for most applications). Add on all the communications most of us use now, including VoIP/Skype, IM, and e-mail, and you have major contention for network resources. There are limitations to the number of TCP/IP sessions that Windows allows for each network connection. The more small apps I have trying to get to the Internet, or even my own network, the more contention there is for those connections. Network adapters, especially the cheap ones they put in most desktop computers, also have limitations. Even if those limits are very high, there is still limited bandwidth to work with, and building up and tearing down network connections takes CPU and network adapter resources.

Now, I might be an ubergeek who wants to have all of these little doodads loaded all the time, but most of them are becoming normal features of the desktop environment. This, coupled with the liberal use of AJAX on web sites, means even the average person will soon find themselves running out of a precious resource that they didn't know was limited or even there. In a sense, we are back to the days when most of us would get those cryptic messages from Windows because we were running out of memory. Fine for the computer savvy of the world but mystifying to the average Joe.

Now, some of the problem is the applications themselves. When they encounter an overloaded network connection, they act like something has gone terribly wrong. Rather than wait for resources to become available, they spill out an error message in techno code. The upshot is that normal users of computers may start to see them as more harm than good and shy away from them.

Better application design would also help. These various desktop scripting programs should include a network management capability that takes this into account. Small-app designers should also work out a better scheme for building and releasing network connections. Some applications are constantly building and tearing down connections, which is hard on a system.
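One such scheme is a shared connection pool: widgets borrow an open connection, use it, and hand it back, instead of each one building and tearing down its own socket. The sketch below is a simulation - the `Pool` class is mine, and the connection objects are stand-ins rather than real sockets - but it shows why reuse keeps the connection count down:

```python
# A minimal sketch of the connection discipline argued for above. Instead of
# each small app opening and closing its own socket per request, requests
# borrow from a small shared pool. Connections here are simulated stand-ins.

class Pool:
    def __init__(self, size):
        self.size = size
        self.idle = []
        self.built = 0  # how many connections were ever actually created

    def acquire(self):
        if self.idle:
            return self.idle.pop()  # reuse an idle connection
        self.built += 1
        return object()  # stand-in for opening a real socket

    def release(self, conn):
        if len(self.idle) < self.size:
            self.idle.append(conn)  # keep it warm for the next caller
        # else: a real pool would close the surplus connection here

def fetch(pool, n_requests):
    for _ in range(n_requests):
        conn = pool.acquire()
        # ... use the connection to fetch the widget's data ...
        pool.release(conn)

pool = Pool(size=2)
fetch(pool, 100)
print(pool.built)  # 1: a hundred sequential requests reuse one connection
```

A hundred connect/disconnect cycles collapse into one long-lived connection, which is exactly the kind of restraint that would keep a desktop full of gadgets from exhausting the network stack.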

For my part, I'm going to try an additional Ethernet card and spread the load out a bit. Most folks can't do that. A little more discipline may go a long way to ensuring that this new approach to software doesn't die on the vine.

Thursday, June 01, 2006

VoIP Has a Way To Go But It's On The Way

I've been watching the Vonage IPO fiasco with great interest. The company is becoming memorable as one of the worst IPOs in history. It opened to lackluster interest and dived almost immediately. Ever since, it has continued to trade in a narrow but ever decreasing band. Today it sits just a bit over $12, down roughly 25% from its opening. Yowzer!

Vonage deserves it, in a way. They don't make money. Instead they lose tremendous amounts of money. That's the problem with an IPO. Once public, you're not judged on your potential, concept, or technology. Just on your numbers. Their numbers stink, so there you have it.

My hope is that all VoIP doesn't get painted black because of this. Perhaps VoIP will be judged by the Skype experience instead. Perhaps, but life and business are cruel. Memories are short and folks usually remember the last stupid thing they heard. That's too bad because VoIP is a truly revolutionary technology. It is already transforming the way we communicate and holds the promise of finally bringing about communication convergence.

The hold up is that VoIP, unlike the PC or Internet, is really a bunch of technologies wired together and pushed to their limits. That means that the various technologies don't always play nice together. You have broadband networking (one set of providers), usually a bunch of SIP servers, and the traditional phone system (another set of providers) that all must work together. This creates all kinds of interface and provisioning problems. The result is that the call quality can vary from better than a land line to worse than a bad cell phone connection.

I've been experiencing this first hand over the course of the last month. I decided to finally drop my traditional landline in favor of a VoIP provider. The benefits were certainly there. What I got was:

  • costs less than half of traditional phone service;
  • network services like caller id and voice mail for free;
  • long distance in North America that is free and international calling that's cheap; and
  • did I mention it's half the cost of traditional service?

I also get the warm and fuzzy feeling that my expensive broadband connection is being used for something other than low bandwidth e-mail or small stuttering renditions of 30 year old TV shows. Welcome Back Kotter! It's like you never left.

There are tradeoffs however. VoIP is not as plug and play as vendors make it appear. Hooking up the adapter to the network is a no brainer but getting it to work right is not. A couple of calls to tech support at least got the connection stable enough to use.

I'm still trying to find a way to get the call quality to be consistently good. I'm making progress and new friends in my provider's tech support department (as well as a few enemies I think). I say "progress" because call quality is no longer consistently bad. It's now inconsistently good or bad, depending if you are a "glass is half empty or half full" type. I still experience drop outs, echoes, and static. Just not all the time. So, sometimes the connection is good and sometimes it's lousy.

The problem seems to be a mismatch between my ISP and my VoIP provider. The latency in my ISP's network is more than the VoIP system can handle. I don't usually notice it since Internet applications are engineered for high latency and are more concerned with bandwidth. VoIP is apparently more like storage: sensitive to latency. This is the killer problem that must be overcome. If VoIP needs low network latency, or even just predictable latency, then it will have problems in the very SOHO market it targets. Cable ISPs don't guarantee quality of service, especially to a home. DSL providers don't guarantee QoS either. Yet VoIP needs a guaranteed minimum QoS. That's a problem that needs fixing before they lose the market.
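To see why latency, not bandwidth, is the killer, it helps to run the numbers. The ITU-T G.114 guideline puts good interactive voice at roughly 150 ms of one-way delay; the component figures below are my own illustrative guesses, not measurements of any particular ISP:

```python
# Back-of-the-envelope one-way latency budget for a VoIP call.
# The 150 ms target comes from ITU-T G.114; the per-stage numbers
# are illustrative assumptions, not measured values.

ONE_WAY_TARGET_MS = 150  # G.114 guideline for comfortable conversation

budget = {
    "codec (packetization, 20 ms frames)": 20,
    "jitter buffer": 40,
    "access network (cable/DSL)": 30,
    "internet backbone": 35,
    "PSTN gateway": 10,
}

total = sum(budget.values())
print(f"one-way delay: {total} ms (target {ONE_WAY_TARGET_MS} ms)")
for stage, ms in budget.items():
    print(f"  {ms:3d} ms  {stage}")
print("within budget" if total <= ONE_WAY_TARGET_MS else "over budget")
```

With these guesses the call squeaks in at 135 ms, leaving only 15 ms of headroom. An ISP whose latency wanders by more than that, which no consumer ISP promises not to do, blows the budget on some calls and not others - which is exactly the "inconsistently good or bad" quality I'm living with.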

All in all, I'm quite happy with the VoIP experience. It has dramatically cut my costs while giving me services that I never could have afforded before. Time will tell but I'm betting that VoIP providers will work out the call quality problems. It just might take awhile.

Friday, December 16, 2005

Okay! Now I get VMware

I started goofing around with VMware. There has been an awful lot of noise about virtual machines and how they will change everything. I was compelled to download VMware's VMPlayer and see how it worked.

At first I was mystified. Okay, running Linux in a box on my Windows desktop was cool but was it useful? I struggled to find a reason for it. Sure, there are a lot of developers out there that do cross platform work and this saves them having to have multiple machines. Everyone knows that multi-boot machines are a pain in the neck and you have to constantly reboot to use them. Still, how many people is that really? A handful.

So, as I'm goofing around with the VMPlayer, it dawns on me. With the right scripts, I can run legacy operating systems and, hence, legacy software on my computer. Cool! That old DOS program that I like but can't run under Windows XP is once again usable. Perhaps in the future, I can even run the Intel version of Mac OS X (a fantastic operating system) and Windows together.

Even that is not the real reason that VMware and its kin (such as QEMU) matter. Instead, what these emulators represent is upgrade security. There is nothing worse than upgrading to a new OS (for example, Vista) and discovering that your old applications don't run. Heck, I just upgraded to Windows XP SP2 and found one of my old favorite applications isn't working right anymore. With virtual machine technology, that's not nearly as big a concern. That also explains why Microsoft has its own VM technology. No more refusing to upgrade because of that one old application that you absolutely rely on. No need to upgrade all your applications for the new OS. Run the old stuff in a box if it won't run native.

Hopefully, this makes up for the 2000/XP Command Prompt program that looks like DOS but doesn't run DOS programs well. In fact, shouldn't this be part of the OS? Sure should. Not that Microsoft is going to listen to me. Okay, maybe they will make it part of the server OS but that will miss the point entirely.

In the meantime, maybe my new favorite folks at VMware will give us some legacy operating system scripts and such. Then I can run cranky old programs on my new computer. Huzzah!

Friday, August 05, 2005

Open Source Fails for Enterprise Applications

I have this real affection for open source software. The whole idea of legions of programmers working together for the sheer joy of creating is very attractive. It speaks to the geek in me. What bothers me is the near-religious fervor of the open source community coupled with this "open source everywhere" attitude. I'm sorry folks, but open source doesn't work for everything.

Let's, for the moment, set aside the whole "what is open source and what is not" debate. Sure, some products marketed as open source aren't really and open source is not necessarily the same thing as free. That doesn't take away from the usefulness of the model. It is a detail that still needs attention.

It is more useful to dialog about where open source is useful and where it is not. Clearly, open source has shown that it is attractive as infrastructure. Linux and the Apache web server, to name two popular packages, are the backbones of many corporate systems.

On the desktop, open source makes infinite sense. Applications like Firefox, Thunderbird, and OpenOffice are popular because they are multi-platform and evolve features more quickly than commercial applications. Even more interesting is that the development agenda tends to coincide closely with the end-user's agenda. They are not hampered by the need to make money.

When you step up to Enterprise applications, open source's advantages dim. Look at a typical open source Enterprise application such as ERP or CRM. These are complex applications that rely on a host of infrastructure products. Practically none support commercial products such as Oracle. The true cost of these products is not in the license fees but in maintenance, development, and administration.

I'm constantly amazed at how long the list of required software is for an open source application. Commercial Enterprise applications almost always ship with most of the software they need. The list of non-bundled software is short, usually only a database and operating system, sometimes a development framework.

Contrast that with a typical open source version. One product I saw required Linux, MySQL, the Apache web server, Tomcat, Perl, PHP, and three other open source products I hadn't even heard of. You had to get all that in place, tested, and fully operational before installing the application.

That's an incredible amount of work. It also doesn't plug in to the infrastructure of many companies, which rely on Windows and Oracle. I can even understand the need to first install the database and maybe a development framework, but all the other packages? I can't see too many companies wanting to do that unless they have already made a commitment to these applications and packages. This complexity makes maintenance cost more, not less.

On top of that, who is going to support it? The application vendor will provide support for their product but support for the ten other pieces of software are either going to come from a bunch of other vendors or "the community". That won't give a system administrator a warm and fuzzy for sure.

The idea of Enterprise open source is great. The packaging is all wrong. Vendors need to make all products available in a single install, expand support for non-open-source infrastructure (especially databases), and reduce the sheer number of contingent packages. Otherwise, this will be great for hobbyists but never for the datacenter.

Meanwhile, I'll keep using Firefox and OpenOffice which install everything I need nice and easy.

Tuesday, July 26, 2005

While the hardware integrates, the software decouples

In my last post I spoke of the trend toward hardware integration. What's interesting is that the opposite is happening in software. Proof of that is my PC. I have a Wintel computer that looks and behaves very much like an Apple OS X machine. By using various skinning and modding applications, I have transformed the Microsoft Windows interface into a fair approximation of OS X, down to the Finder and the cool toolbar at the bottom. The last of the great tests of software uniqueness - look and feel - has been swept away.

Why is this possible? Because software has been pulled apart into components, even operating system software. The GUI is disassociated from the application frameworks, which are further decoupled from the device interfaces and hardware. What began in the 1980s with the first PC is now reaching its logical conclusion - complete separation of software functions. In the Windows world, we have a very mutable GUI sitting atop frameworks such as .Net or J2EE, on top of the core Windows XP OS which, in turn, is separated from hardware through device drivers and the HAL. Everything is abstracted and, hence, changeable and extensible.

This is very good indeed. We as consumers of computing products can now choose the best underlying OS for our needs, with the type of framework that fits our applications, and the GUI that works best for us. These can be put into different types of devices that suit our needs for performance, size, and features.

Of course, this tends to strip away many of the advantages touted by manufacturers of operating systems. I won't buy an Apple based on the GUI when I can have that GUI on the platform of my choice. If I can choose a Windows application layer (like Mono, which is an open source .Net implementation) for my Linux box, then I can have my favorite applications on any platform I choose. I don't need to choose Wintel.

For users of computer technology (and proponents of open source applications) this will bring more choices at lower cost. Good for us. Now, if vendors can only find a way to make money in that environment...