Tom Petrocelli's take on technology. Tom is the author of the book "Data Protection and Information Lifecycle Management" and a natural technology curmudgeon. This blog represents only my own views and not those of my employer, Enterprise Strategy Group. Frankly, mine are more amusing.

Showing posts with label web development. Show all posts

Friday, May 25, 2007

All You Need is Love!

It's true. All you need is love. So, do we really need another scripting language? Ever since Perl and JavaScript showed us that we could crank out code quickly by using a simpler, non-compiled language, scripting languages have proliferated like nuclear weapons in the 1960s. Besides the old languages such as C Shell, AWK, Tcl, or sed, and the middle-period ones like JavaScript, VBScript, Flash ActionScript, and Perl, we now must contend with a whole host of new ones.

In the last few years we have seen the emergence of Ruby, Python, and PHP. All have their adherents who defend them with nearly religious fervor. PHP has become one of the most widely deployed server-side scripting languages, while Ruby is often used for interfaces that provide a rich user experience (in other words, Web 2.0 web sites). We can also add to the mix JavaServer Pages (JSP), a scripting language for Java servlets (popular amongst the Tomcat crowd), also for user interfaces. On top of that, JavaScript has been extended and integrated into the AJAX model for pretty much the same purpose.

Now, we can add a couple of new languages or environments to the mix. Some have been pushing Groovy, another servlet-oriented language. Despite the obvious proliferation of Java-oriented scripting languages, Sun has seen fit to launch yet another one, JavaFX. Microsoft has also announced its own scripting effort, Silverlight, despite having ASP.NET already.

Why do we need all of these languages? My take on it is that software companies need to have an entry in this space. Most scripting languages have come from the user community, with JavaScript being a notable exception. Unlike compiled languages, which typically have steep learning curves and need extensive tools, scripting languages are easy to learn and implement. By creating new scripting languages, many of which are similar to proprietary compiled languages, these companies extend the franchise in the part of the business that makes them money and neutralize the user scripting languages that don't. This explains Sun's and Microsoft's latest announcements.

I also think the open source and hobbyist communities feel compelled to develop new scripting languages because they simply don't like something about the existing ones. It usually starts with the need for a language that solves a very narrow problem and expands into a general scripting language. Python and Perl come to mind here.

The result is too many languages that offer few benefits over existing ones. Most are similar to another language but not exactly the same. This leads to a large number of mistakes that actually slow down coding. Anyone who writes both PHP and JavaScript will know exactly what I mean. It also makes for a confusing landscape when trying to choose which languages to use for a project. What I want is a language that can be implemented as compiled code, client-side scripting, or server-side scripting. It should start with a standard computer language syntax like C++ or Java. That way I only have to learn one language and can use it for everything. I can also start junior programmers on the more stripped-down client-side scripting version and work them up to the full, middleware-enabled, component-oriented version used in my high-volume server applications. This is something for Sun to consider since they, more than anyone else, have the capability to deliver this breadth of options based on the same core language, specifically Java. Microsoft might be able to accomplish this if they standardize on C# syntax.

Scripting languages are a very good idea. They allow for rapid, iterative development, are exceptionally useful for user interface programming in browser-enabled applications, and make it much easier to deploy small applications quickly and with few resources. They also helped recreate programming as a hobby, something that was on the verge of disappearing as professional programming became more complex. We just have too much of a good thing and need to pare it down a bit. That way we can get on with writing code and not learning new computer languages.

Friday, February 23, 2007

Back To The Future with Google Apps

I am generally a proponent of Software as Service (SaS). To me, the advantages are clear. With central control you eliminate the costs associated with updating and distributing client applications. It's also much easier to share information that resides in a central repository. Even client-server applications, where data is stored and managed in a central database, have the problem of client distribution. Strip away the GUI and make it a web page and you will achieve cost savings. SaS makes the most sense for enterprise applications such as MRP, CRM, and other three letter acronym systems. It also works well for social networking. What I don't get is SaS for office productivity applications, such as word processors and spreadsheets. I'm not even convinced of web-based e-mail except as an adjunct to a client-server environment or as a low-end, free, consumer product.

Google has been rolling out free versions of its word processor and spreadsheet products for months. They hope to entice corporations, big and small, to use this service instead of buying standalone productivity applications such as Microsoft Office. Google, like Yahoo and Microsoft, has been selling premium on-line e-mail packages for quite some time and thinks office applications dovetail nicely into this business. The e-mail services have been popular with individuals and small businesses because of their low or no cost. The fact that they lack the features of Thunderbird or Outlook matters very little to the technophobic, low-usage public that doesn't want to install and, more importantly, configure an e-mail application. That calculus won't hold, however, for businesses of more than a few people.

What Google seems to be recreating is the ancient IBM PROFS suite. PROFS was developed by IBM in the 1970s and deployed on mainframes throughout the US Government. It had an integrated e-mail and calendaring system and was often used for word processing as well. PROFS got killed by client-server e-mail systems in the 1980s, though it lingered on under the name OfficeVision for quite some time until finally replaced by Lotus Notes and Domino.

PROFS became irrelevant for the same reasons that Google Docs and Spreadsheets are a tough sell for me. They can't create the kind of user experience you need in a word processor or produce the range of features desired in a spreadsheet. Even with all the new techniques for enhancing user experience, these applications are slow, quirky, and lacking in features when compared to established office suites. Even more importantly, you need to be tethered to a high speed network connection. That eliminates the ability to work on an airplane, secluded beach, or anywhere else that you can't get a broadband connection. A secure and reliable broadband connection at that. Add to that all the normal problems of network applications such as network congestion and overloaded servers – again the problems of mainframe applications – and you have to wonder why we seem to be going backwards in time.

There are also some special problems associated with SaS for office applications. For starters, you have to feel comfortable having your intellectual property and trade secrets housed off-site by a different company. That pretty much killed the storage service provider market five years back. It scared the pants off corporate security analysts to have someone else control many classes of corporate data. Will spreadsheets be okay somehow? What about privacy? Will there be problems if I write a performance review or termination letter in Google Docs?

A key argument in the sales pitch for office SaS is that the cost of applications such as Microsoft Office is high. True enough. The latest version of Office is very expensive not only to buy but to deploy. You have to believe that the benefits of the revamped Office interface will pay off enough to overcome the costs of retraining and supporting confused workers.

There are alternatives that don't involve a feature-poor service. You can avoid the high purchase costs of upgrades or even new deployments by using open source applications such as Thunderbird for e-mail and the OpenOffice.org productivity suite. There are a number of lower-cost commercial suites as well, such as the storied WordPerfect. All of these are perfectly good for the modern office. Besides, no one says you have to update your word processor just because Microsoft has a new version. With on-line applications, you may not get the choice.

Ultimately, SaS makes sense when there is a natural advantage to being connected. In that case, you are going to be networked anyway. Intranet portals, order entry, customer management, workflow management, and group calendaring all make sense as services. To be useful, they have to have access to a central data repository, so you might as well make the whole system centralized. Productivity applications are designed around individual work and need to be available when there is no network.

Google thinks they can take us back to the days of PROFS. Nice idea, but I doubt it will work. Centralized mainframe office applications suffered a quick death for a reason, and those reasons haven't changed. While Google may get a bunch of consumers to use these applications and sell ads around them (nothing wrong with that), they won't get a sufficient number of corporate clients to make it viable as a Word or Excel killer. If I were Microsoft, I would be more worried about the open source community than Google's on-line apps.

Wednesday, October 11, 2006

Eating My Own Cooking

Like so many analysts, I pontificate on a number of topics. It's one of the perks of the job. You get to shoot your mouth off without actually having to do the things you write or speak about. Every once in a while, though, I get the urge to do the things that I tell other people they should be doing. To eat my own cooking, you might say.

Over the past few months I've been doing a fair bit of writing about open source software and new information interfaces such as tag clouds and spouting to friends and colleagues about Web 2.0 and AJAX. All this gabbing on my part inspired me to actually write an application, something I haven't done in a long while. I was intrigued with the idea of a tag cloud program that would help me catalog and categorize (tag) my most important files.

Now, you might ask, "Why bother?" With all the desktop search programs out there, you can find almost anything, right? Sort of. Many desktop search products do not support OpenOffice, my office suite of choice, or don't support it well. Search engines also assume that you know something of the content. If I'm not sure what I'm looking for, the search engine is limited in its usefulness. You either get nothing back or too much. Like any search engine, desktop search can only return files based on your keyword input. I might be looking for a marketing piece I wrote but not have the appropriate keywords in my head.

A tag cloud, in contrast, classifies information by a category, usually called a tag. Most tagging systems allow for multidimensional tagging wherein one piece of information is classified by multiple tags. With a tag cloud I can classify a marketing brochure as "marketing", "brochure" and "sales literature". With these tags in place, I can find my brochure no matter how I'm thinking about it today.
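The multidimensional idea can be sketched in a few lines of JavaScript. This is a hypothetical reconstruction, not the code from my actual program; the names are made up for illustration:

```javascript
// Minimal multidimensional tagging: one file, many tags,
// and the file stays reachable through any of them.
function TagStore() {
  this.byTag = {};   // tag -> list of file paths
  this.byFile = {};  // file path -> list of tags
}

// Attach any number of tags to a file.
TagStore.prototype.tag = function (file, tags) {
  this.byFile[file] = (this.byFile[file] || []).concat(tags);
  for (var i = 0; i < tags.length; i++) {
    var t = tags[i];
    if (!this.byTag[t]) this.byTag[t] = [];
    this.byTag[t].push(file);
  }
};

// Look up all files carrying a given tag.
TagStore.prototype.find = function (tag) {
  return this.byTag[tag] || [];
};

var store = new TagStore();
store.tag("brochure.odt", ["marketing", "brochure", "sales literature"]);
// Now find("marketing"), find("brochure"), and
// find("sales literature") all lead back to brochure.odt.
```

However I happen to be thinking about the document today, one of its tags will get me there; that's the whole advantage over keyword search.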

Tag clouds are common on Web sites like Flickr and MySpace. It seemed reasonable that an open source system for files would exist. Despite extensive searching, I've not found one yet that runs on Windows XP. I ran across a couple of commercial ones but they were really extensions to search engines. They stick you with the keywords that the search engine gleans from file content but you can't assign your own tags. Some are extensions of file systems but who wants to install an entirely different file system just to tag a bunch of files?

All this is to say that I ended up building one. It's pretty primitive (this was a hobby project after all) but still useful. It also gave me a good sense of the good, the bad, and the ugly of AJAX architectures. That alone was worth it. There's a lot of rah-rah going on about AJAX, most of it well deserved, but there are some drawbacks. Still, it is the only way to go for web applications. With AJAX you can now achieve something close to a standard application interface with a web-based system. You also get a lot of services without coding, making multi-tier architectures easy. This also makes web-based applications more attractive as a replacement for standard enterprise applications, not just Internet services. Sweet!

The downsides: the infrastructure is complex, and you need to write code in multiple languages. The latter makes for an error-prone process. Most web scripting languages have syntax that is similar in most ways but not all. They share the C legacy, as do C++, C#, and Java, but each implements the semantics in its own way. This carries forward to two of the most common languages in the web scripting world, PHP and JavaScript. In this environment, it is easy to make the small coding mistakes that slow down the programming process.
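A concrete example of the kind of trap I mean, sketched in JavaScript with the PHP behavior noted in comments:

```javascript
// In PHP, "+" is strictly numeric and "." concatenates:
//   $s = "Count: " . $n;   // concatenation
//   $x = "2" + "3";        // 5 -- numeric strings are coerced
// In JavaScript, "+" does double duty, so habits carried over
// from one language quietly produce bugs in the other.
var n = 3;
var s = "Count: " + n;              // "Count: 3" -- concatenation wins
var x = "2" + "3";                  // "23", NOT 5
var y = Number("2") + Number("3");  // 5 -- what a PHP habit expects
```

Neither behavior is wrong; they're just different, and switching between the two files a dozen times a day is exactly when the difference bites.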

Installing a WAMP stack also turned out to be a bit of a chore. WAMP stands for Windows/Apache/MySQL/PHP (or Perl), and provides an application server environment. This is the same as the LAMP stack but with Windows as the OS instead of Linux. The good part of the WAMP or LAMP stack is that once it's in place, you don't have to worry about basic Internet services. No need to write a process to listen for a TCP/IP connection or interpret HTTP; the Apache web server does it for you. It also provides for portability. Theoretically, one should be able to take the same server code, put it on any other box, and have it run. I say theoretically because I discovered there are small differences in component implementations. I started on a LAMP stack and had to make changes to my PHP code for it to run under Windows XP. Still, the changes were quite small.

The big hassle was getting the WAMP stack configured. Configuration is the Achilles heel of open source. It is a pain in the neck! Despite configuration scripts, books, and decent documentation, I had no choice but to hand-edit several different configuration files and download updated libraries for several components. That was just to get the basic infrastructure up and running. No application code, just a web server capable of running PHP which, in turn, could access the MySQL database. I can see now why O'Reilly and other technical book publishers can have dozens of titles on how to set up and configure these open source parts. It also makes evident how Microsoft can still make money in this space. Once the environment was properly configured and operational, writing the code was swift and pretty easy. In no time at all I had my Tag Cloud program.

The Tag Cloud program is implemented as a typical three tier system. There is a SQL database, implemented with MySQL, for persistent storage. The second tier is the application server code written in PHP and hosted on the Apache web server. This tier provides an indirect (read: more secure) interface to the database, does parameter checking, and formats the information heading back to the client.

As an aside, I originally thought to send XML to the client and wrote the server code that way. What I discovered was that it was quite cumbersome. Instead of simply displaying information returned from the server, I had to process XML trees and reformat them for display. This turned out to be quite slow given the amount of information returned, and just tough to code right. Instead, I had the server return fragments of XHTML, which were integrated into the client XHTML. The effect was the same but coding was much easier. In truth, PHP excels at text formatting and JavaScript (the client coding language in AJAX) does not.
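The difference is easy to see in a sketch. None of this is my actual code; the markup, names, and panel id are illustrative. With raw XML, the client has to loop over the parsed data and rebuild display markup itself; with a server-built fragment, the client does a single assignment:

```javascript
// Path 1: server returns data as XML. After parsing it into
// objects, the client still has to rebuild markup like this
// before anything can be displayed.
function formatTags(tagList) {
  var html = "";
  for (var i = 0; i < tagList.length; i++) {
    html += "<span class='tag'>" + tagList[i].name +
            " (" + tagList[i].count + ")</span>";
  }
  return html;
}

// Path 2: the server (PHP, which is good at text formatting)
// returns the XHTML fragment ready-made, and the client simply does:
//   document.getElementById("cloudPanel").innerHTML = fragment;
// One assignment, no tree-walking -- which is why I switched.

var fragment = formatTags([{ name: "marketing", count: 7 }]);
```

Path 1 duplicates on the client the formatting work the server could have done once, in the language better suited to it.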

While returning pure XML makes it easier to integrate the server responses into other client applications, such as a Yahoo Widget, it also requires double the text processing. With pure XML output you need to generate the XML on the server and then interpret and format the XML into XHTML on the client. It is possible to do that fairly easily with XSLT and XPath statements, but in the interactive AJAX environment this adds a lot of complexity. I've also discovered that XSLT doesn't always work the same way in different browsers, and I was hell-bent on this being cross-browser.

The JavaScript client was an exercise in easy programming once the basic AJAX framework was in place. All that was required was two pieces of code. One was Nicholas Zakas' excellent cross-browser AJAX library, zXml. Unfortunately, I discovered too late that it also included cross-browser implementations of XSLT and XPath as well. Oh well. Maybe next time.

The second element was the HTTPRequest object wrapper class. HTTPRequest is the JavaScript object used to make requests of HTTP servers. It is implemented differently in different browsers and client application frameworks; zXml makes it much easier to have HTTPRequest work correctly in each of them. Managing multiple connections to the web server, though, was difficult. Since I wanted the AJAX code to be asynchronous, I kept running into concurrency problems. The solution was a wrapper for the HTTPRequest object to assist in managing connections to the web server and encapsulate some of the more redundant code that popped up along the way. Easy enough to do in JavaScript, and it made the code less error-prone too! After that it was all SMOP (a Simple Matter of Programming). Adding new functions is also easy as pie. I have a dozen ideas for improvements, but all the core functions are working well.
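The concurrency fix can be sketched like this. It's a hypothetical reconstruction of the wrapper idea, not my actual class; "transport" stands in for the zXml-based cross-browser request call, made injectable here so the queuing logic is easy to exercise on its own:

```javascript
// Serialize asynchronous requests so overlapping calls can't
// trample each other's state: one in flight at a time, the
// rest queued until it completes.
function RequestQueue(transport) {
  this.transport = transport;  // transport(url, callback) does the fetch
  this.queue = [];
  this.busy = false;
}

RequestQueue.prototype.request = function (url, onDone) {
  this.queue.push({ url: url, onDone: onDone });
  this.next();
};

RequestQueue.prototype.next = function () {
  if (this.busy || this.queue.length === 0) return;
  this.busy = true;
  var self = this;
  var job = this.queue.shift();
  this.transport(job.url, function (response) {
    self.busy = false;
    job.onDone(response);
    self.next();  // start the next queued request, if any
  });
};

// With a fake transport plugged in, the queue runs without a
// server; in the real program this slot would hold the
// XMLHttpRequest call.
var results = [];
var q = new RequestQueue(function (url, cb) { cb("ok:" + url); });
q.request("/tags", function (r) { results.push(r); });
q.request("/files", function (r) { results.push(r); });
// results now holds the responses in request order.
```

The wrapper also gives you one place to hang the redundant setup and error-handling code that otherwise gets copied around every call site.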

The basic architecture is simple. A web page provides the basic structure and acts as a container for the interactive elements. It's pretty simple XHTML; in fact, if you look at the source, there's almost nothing to it. There are three DIV sections with named identifiers, which represent the three interactive panels. Depending on user interaction, the HTTPRequest helper objects are instantiated and make a request of the server. The server runs the requested PHP code, which returns XHTML fragments that are either for display (such as the tag cloud itself) or represent errors. The wrapper objects place them in the appropriate display panels. Keep in mind, it is possible to write a completely different web page with small JavaScript coding changes, or even just changes to the static XHTML.

The system has all the advantages of web applications with an interactive interface: no page refreshes, no long waits, no interface acrobatics. It's easy to see why folks like Google are embracing this methodology. There's a lot I could do with this if I had more time to devote to programming but, hey, it's only a hobby.

At the very least, I have a very useful information management tool. Finding important files has become much easier. One of the nice aspects of this is that I only bother to tag important files, not everything. It's more efficient to bake bread when you have already separated the wheat from the chaff. It's also good to eat my own cooking and find that it's pretty good.

Friday, December 02, 2005

I got the new Firefox. Yawn!

I just installed the newest version of Firefox (v1.5), my most favorite browser. This was touted as a major release with all kinds of improvements. Unfortunately, most of the improvements are under the hood. It seems like the same ole Firefox to me - which is not a bad thing. The bad thing is that some of my extensions aren't working anymore, especially Bookmark Synchronizer.


Adblock, one of my favorite and most important extensions, is still working, but differently. Instead of the neat little tab that used to let me block Flash ads, I have to go through new gyrations and use the Overlay Flash feature. It's not a big deal, but not a positive improvement either. I also don't understand why all the search plug-ins I had loaded simply disappeared. It would be less of an annoyance if the plug-in site were accessible, which it isn't right now.


Many of the new features, such as the ability to rearrange tabs, have always been available via extensions like Tabbrowser Extensions. Others are just not obvious. Perhaps they have benchmarks that show that the back and forth buttons work faster but I don't see it. It's never been much of a problem for me so it's not something I would notice.

What I like best is the new error message handling. Since the dawn of the Internet, error messages have been less than useful. If you didn't know what the rather common 404 meant, you would just sit there perplexed. Firefox 1.5 seems to interpret errors better and actually provide useful feedback. This is very timely indeed, since most of the Mozilla web sites are timing out, probably because they are being hammered.


Ultimately, this looks like a minor release and not the major one that Mozilla has been touting. Maybe that's why it's still 1.5 instead of 2.0. To quote Dom DeLuise in History of the World: Part I (a really funny movie), it's "Nice. Not thrilling, but nice." For all the hype, there should be something more... innovative.

Friday, September 16, 2005

Web Clipping with Jeeves

In my line of work (as a technology analyst), you need to keep articles and information that you find on the Web all the time. Company announcements, new regulations, news, you name it. I need to keep it. As a business owner who does a lot of business on the Internet, it's good to be able to store all those order forms as well.

In the past I had to save web pages to my hard drive, and I found it filling up with useless pages. Managing them was a bit of a pain as well. Worse yet, if I wanted access to these web pages while traveling, I had to make copies of them on my laptop, which predictably has a much smaller hard drive.

The best solution is web clipping. With web clipping, you keep a reference or copy of the page in an application that helps organize the pages. I've mostly used Sage, an application that hooks into the Firefox browser as an extension. It makes it easy to save and manage clippings but still suffers from the problems of clogging my hard drive and lack of portability.

So, it was with great excitement that I greeted the arrival of Web based clipping applications from Yahoo and Ask Jeeves. Both plug into your browser and give you a little toolbar to save pages to their servers. The toolbars have a bunch of other functions, mostly to help access their web sites, but the primary reason I care about them is the clipping. Now I can save my clippings to a web site where I can retrieve them whenever I want.

After a bit of goofing around, I have come to the conclusion that the Ask Jeeves version (part of the MyJeeves beta) is the best. Why? Because it is so easy to use. I can save clippings, file them in different folders, even attach a set of my own notes. All with the click of a button. Better yet, I can access my clippings wherever I am, assuming I have an Internet connection. I can also save my bookmarks/favorites for access on the road.

The other services are pretty good too, although not all the bugs are worked out yet. At least for now, MyJeeves is where it's at (pun intended).