Today is a good day to code

Dirty Tricks in Web Advertising


Contrary to what most people believe, web advertising is in its infancy. Many companies are still trying to figure out what works and what doesn't. Their experiments are understandable: they are trying to reach an audience that spans all known geographic, ethnic, social, economic, racial, religious, ideological, and moral boundaries. Phew! That was a mouthful. And still newer marketing demographics and sub-demographics are being created while advertisers are figuring out how to target the old ones. How on Earth is a marketing / web development studio supposed to get a grip on all of it? The answer is elusive, but first I will say what won't get the job done, then we'll explore some ways to get it done.

The way advertisers won't get a grip on web niches is by using dirty tricks. These include, but are not limited to, pop-ups, pop-unders, JavaScript pop-ups, unwanted JavaScript redirections, Flash pop-ups, spam email, and tacky, poorly designed banner ads. Let's look at these one at a time. There has never been a time in the history of the internet when unsolicited pop-up advertisements were a good thing. Early on this was forgivable, because the internet was new and this was a new way to reach people. Once people began to hate this method of advertising, though, and demonstrated it by installing software to block pop-ups, it should have stopped, right? Wrong. Instead, web marketers began to circumvent users' defenses with pop-under ads: advertisements that come up and hide behind your top browser window, waiting until you close your browser. Great idea, right? Wrong. That is like letting the one advertising exec with the awful ideas in the office get a shot at a limited run of ads. For example, he comes up with a new cola ad featuring an overweight child pouring a bag of sugar with the cola label into his mouth, under a moniker reading "Cola: making a big America even bigger." This runs in limited fashion despite the passionate pleas of every focus group it is exposed to. Cola sees a radical drop in its sales numbers, but promotes this guy to creative director anyway, thereby putting the ads on billboards all over the country. Eventually Cola goes out of business, a smoldering ruin of its former greatness.

That should never happen in real life. That is the absurdity of trying to irritate users into adopting your product: it just doesn't make sense, and it will end up bankrupting a company. But it didn't stop there. The anti-pop-up software got smarter and was better able to block pop-unders and JavaScript pop-up windows. Now, there is always going to be an element of shadiness associated with some companies; that is as true in reality as it is on the web, hence unwanted redirections. But there were, and are, legitimate companies using these tactics. Surely by now these companies have gotten the message that users don't want a bunch of pop-ups littering their desktops; and they have. The problem is that in an effort to be less invasive, they have adopted CSS and Flash pop-ups. Talk about dense! People don't want to wait to get to their content. These are barriers, just like splash pages. People will click away.

Spam email is probably the most reviled thing the internet has ever produced, yet companies continue to send it, and they put their "click here to remove yourself from our list" link in something like 6pt font at the bottom of the email, surrounded by disclaimer text. Most users at this point aren't even looking at the garbage that comes across in their email. They either delete it immediately, or they look at the ad and remember the vendor so that they never, ever buy anything from them again.

Tacky banner ads are the least of the evils described in this article, but they can be just as distracting as pop-ups. Flashing, excessively animated, or audible banner ads are no-nos. If you want people to be able to view your website at work without their bosses going nuts, you should make it look professional so that it blends in with the rest of their applications, not draw attention to itself so that they get a reprimand for spending too much time on the net.

So, now that we have explored how not to advertise on the net, let's see how to advertise. When I go to Froogle


Safari and Standards Compliance


Apple, with Safari 2.0, has taken a major step toward standards compliance and is largely taking a leadership role in this area, alongside its outstanding support for the Java runtime. I have heard some griping about Apple using KHTML, the rendering engine behind KDE's Konqueror browser, as a base, then running away with the open source code once they had figured it out and not giving anything back to the OSS community.

While I am extremely happy that Apple has made their browser Acid2 compliant, and they may have one of the fastest CSS rendering engines around built into AppleWebCore, it is pretty upsetting that they would not share these advances with the developers working on KHTML so that it could also pass the Acid2 test. I can understand keeping some things close to your vest for security reasons, but I can hardly believe that changes to the way pages render in a browser could compromise system integrity. This appears to be a situation in which Apple wants to be the most standards-compliant platform on the market. That would be fantastic from a business standpoint, since many in the scientific and mathematics communities would probably prefer technology that adheres to standards, the better to communicate information between offices, regions, and countries. I can understand that Apple wants to distinguish its platform from others, and I love that they are using standards compliance to do it; however, I feel it breaks the spirit of open source / corporate collaboration not to give something back to the KHTML community.

Speaking of Safari, I noticed a bug recently while writing some JavaScript for it. I have a script that sets the tabindex for a number of input fields, and it works properly, except that Safari persists in scrolling the main browser scrollbar instead of the scrollbar of the div with overflow: auto. I had noticed something similar back in Safari 1.2: if you put a Flash element within a scrollable div, Safari would lay the Flash element on top of all your other content while scrolling, even if it was above or below the div. All other browsers, even IE 6, handle this properly, scrolling the div with the tabbing. This is a pretty big bug if they want to promote standards-compliant web development and accessibility. I'd like to see it fixed in Mac OS X 10.4.1, but after browsing the message boards elsewhere, I'd say they already have their hands full, so I am not supremely hopeful.
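For the curious, here is roughly the kind of script I mean. The container id is invented, and the onfocus handler is one possible workaround that scrolls the div by hand so the window scrollbar is left alone; this is a sketch, not the exact code from my project:

    // Roughly the tab-index setup described above; 'formScroller' is a
    // hypothetical div with overflow: auto and position: relative (so
    // that each field's offsetTop is measured against the div itself).
    var container = document.getElementById('formScroller');
    var fields = container.getElementsByTagName('input');
    for (var i = 0; i < fields.length; i++) {
        fields[i].tabIndex = i + 1;
        fields[i].onfocus = function () {
            // Scroll the div, not the window, to keep the field in view
            if (this.offsetTop < container.scrollTop ||
                this.offsetTop > container.scrollTop + container.clientHeight) {
                container.scrollTop = this.offsetTop;
            }
        };
    }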

Microsoft is promising that its IE7 browser will be standards compliant, but just how standards compliant is really the question. I think that Microsoft has learned the error of its proprietary ways. Sure, it will continue to bundle its software with everything anyone buys from them, but I don't think they will continue to cripple other products to make theirs look better. They seem to have given up on their own version of DHTML and are happy with XHTML; I noticed that their home page even validates now. It makes sense for Microsoft to go the standards route too, and with no shortage of developer feedback, they have almost no excuse not to.


Internet Explorer 7 Won’t Make the Grade on Acid


As the market leader and pace-setter for which technologies make the cut on the web, Microsoft has a responsibility to create the most standards-compliant browser possible, even at the risk of breaking legacy sites built specifically for IE. Microsoft has always wanted developers to target its own unusual flavor of the web. Whether it is building extra padding into block-level elements regardless of how the CSS padding attribute is used, or allowing oddities like the color attribute on TR elements, developers have always had to consider the quirks of IE when building anything for deployment over the web.

I'm sure that IE 7 will be much improved over IE 6 as far as standards compliance is concerned, and some of those oddities I truly enjoy, like being able to give a TR an ID attribute and specify a header style for my tables in a stylesheet. But at the same time, if we don't have web standards, we'll devolve into fragmented development languages like it was 1995 all over again. IE 6 actually had excellent standards compliance when it came out, but times have changed, and there are advanced features, like page-break-after, that I'd love to use more widely. Part of the reason I love building intranet applications for Mac-only shops is that I know they will be using Safari 2.0, an excellent browser based on KHTML, the open source engine from the Konqueror browser bundled with many Linux distros. It supports most if not all of CSS 2 and should pass the Acid2 test with ease. Also, by developing to XHTML 1.0 Strict, I know that my site will degrade gracefully on everything from mobile devices to old 3.0 browsers. Sticking to ECMAScript likewise keeps most backward compatibility and lets developers write reliable JavaScript that works across all compliant browsers in the same fashion.

I agree with Hakon Lie that Microsoft should really take more time and make sure they nail this one, not just for right now but for the future, since we all know they may never release another web browser; they are convinced that Avalon will change the face of web applications and render the web browser superfluous. We've heard that one before. Remember ActiveX? I hope that everyone calls on Microsoft to get IE 7 to pass the Acid2 test, not so that it will support some bizarre standard that makes all our lives harder, but so that developers can be sure that applications they build today will still look and work the same five years from now. C'mon, Microsoft, please?

Next Explorer to fail Acid Test – CNET


Configuring ColdFusion MX 7 and Apache


Another issue I kept coming across while configuring Apache and JRun4 on the XServe G5 was that the virtual hosts didn't seem to be resolving; the same site appeared to collect all the hits. After several hours of troubleshooting last night, I finally found the culprit.

When the JRun / Apache bridge is configured, a small module is built and plugged into Apache that allows it to process ColdFusion templates from within its default web root. This functionality is great: it allows a user to serve up .jsp, .php, and .cfm files from the same folder. A single modification is needed in JRun to let web users get to your files without having to add /cfusion to their URL requests. In the JRun administrator, under “Application Server” > “Summary,” you will see a section titled Web Applications. Under this header there will be two apps if you have JRun and ColdFusion set up correctly: “CFMX RDS Application,” which we are not going to touch, and “Macromedia Coldfusion MX,” which we are going to change. If you click on the name “Macromedia Coldfusion MX,” you will see a simple screen showing the current context path for the application, which should be “/cfusion” or something similar. Change it to “/” and your templates will run from the root domain.

With this process, however, there are a couple of caveats. You may have to copy all of the ColdFusion JavaScript files into a cfusion subdirectory of your application's folder if you are using ColdFusion form validation. Also, the images for the administrator will not appear when you work with the administrator. Accessing the administrator is not quite as straightforward as you might expect, either: it obviously no longer lives at “/cfusion/CFIDE/Administrator/index.cfm,” but at “/cfide/Administrator/index.cfm.” Make sure to make the “cfide” lowercase or it will not work.

Once you have this working, there is one more trap. If you already have applications loaded into the “JRun4/servers/cfusion” directory, and they happen to have the same folder names as the ones in your Apache web root, then when you call your templates the server will not know which ones to pick, which has the effect of causing long nights of hair-pulling while you figure out why your file changes have no effect on the operation of the server. The resolution is simple: do not use JRun's servers directory to run your web applications; use the Apache web root instead. You will have to delete any files common to the application's folder within the JRun servers directory and the Apache web root. Basically, just delete your web application from the JRun application folder and keep it only in Apache's web root, if you haven't already.

My issue was that both locations had the same index.cfm file. The virtual root was resolving properly, but a cflocation tag in the index.cfm within my JRun servers directory was being chosen over the same file in my Apache web root. Once I deleted the version of the application in the JRun folder, the issue disappeared and the server behaved correctly.

The moral of the story: don't leave superfluous files around your server; they will always come back to haunt you in the end.


What is this Y!Q stuff?


You may have noticed the Y!Q links everywhere on my site. Y!Q is a new beta product from Yahoo! that lets people perform web searches constrained by selected content from the page they are searching from. The content that goes to Yahoo! is selected by the publisher and targeted to return even more relevant results than would be possible by going directly to the search engine.

When a user visits a search engine, the system has no background about the person with which to constrain the results, and that makes searching difficult. For example, if I knew someone was from Washington State and they typed in the word apple, I could assume they might be looking for apple wholesalers, apple growers, or apple trees. If someone from California searched for apple, I might return the company. This is only possible if you know something about the person searching, which is why personalized search has been receiving more focus of late.

I prefer the context-based approach, because then I don't have to provide any personal information for the search engine to give me what I want. It knows what I'm after just from the content of the web page I am searching from.

I'll be honing the ColdFusion parsing scripts to send the best possible content to Yahoo!. I'll be removing words fewer than four characters long from the article, to get rid of word fragments and words that carry little meaning, like 'the.' I hope to have the best, most relevant results, because Yahoo! is offering $5,000 in their contest. Of course there had to be some motive for me to use this beta program!
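My production scripts are ColdFusion, but the filtering idea is simple enough to sketch in a few lines of JavaScript; the function name and the markup-stripping regex are just for illustration:

    // Keep only words of four or more characters so Yahoo! gets the
    // meaningful terms from the article (a JavaScript sketch of the
    // idea; the real version runs in ColdFusion on the server).
    function extractContext(text) {
        var words = text.replace(/<[^>]+>/g, ' ')  // strip any markup
                        .split(/\s+/);             // split on whitespace
        var kept = [];
        for (var i = 0; i < words.length; i++) {
            if (words[i].length >= 4) {
                kept.push(words[i]);
            }
        }
        return kept.join(' ');
    }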

I suppose that in its final iteration Yahoo! will create some type of advertising revenue-sharing model similar to Google's AdWords. They seem to be hoping it will generate more clicks because of its usefulness to the user. It is still kind of buggy: in all browsers other than Safari 2.0, a semi-transparent overlay pops up when the Y!Q link is pressed, while in Safari it takes you to Yahoo!'s relevant-results page. Hopefully they will fix this soon; I'm pretty sure it has something to do with the changes Apple made to Safari's JavaScript engine. Also, since I am trying to automate this, sometimes a stray character gets into the string and causes Y!Q to return something invalid. I hope this will help with your searching.


JoostBook – Joost to Facebook Interface Widget


Since I'm in love with Joost, I have been thinking about good applications I could write for the platform. Before I get into the widget / plugin itself, let me just say that my experience communicating with the Joost engineers through their joost-dev Google group, and their allowing early access to their SDK, has been outstanding. I have rarely come across a more open and generous group. Typically SDK guardians are very secretive about discussing future features, and are usually quite arrogant about the possibility of a developer finding an undiscovered bug. None of this has been the case with the Joost SDK staff.

If you don't want to read the details about how I built it and just want to use it, you can get it here: JoostBook: Joost / Facebook Interface. You will need Joost and a Facebook account to get started.

Now, about the widget. First, the installation is a little weird because of the level of control Facebook insists on. To use the SDK you have to authenticate, and if an unauthenticated request is made, the response is the Facebook login page. This makes for some unique error-catching conditions.

Second, we web developers often take for granted that the DOM will have a listener attached to it and will automatically refresh when anything in it changes. I know the Joost engineers are working on it, but for now it doesn't refresh, so while you can create new XHTML elements and modify existing ones with JavaScript, you are currently best off hardcoding all of your objects up front and changing their contents. Injecting XHTML using innerHTML doesn't really work well at the moment either. I suspect much of this is because there is a bridge between the 2D world of XULRunner / Mozilla and the 3D world of the Joost interface, and I'm sure there is a lot of complexity between the two.
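In practice that means something like the following; the element id is hypothetical, but the pattern, declaring the element in the XHTML up front and only ever swapping its text, is what currently works:

    // A placeholder element that exists in the widget's XHTML from the
    // start; we only change its text, never inject new markup.
    var statusLine = document.getElementById('statusLine'); // hypothetical id
    function showStatus(message) {
        // Swap the text node instead of assigning innerHTML, which is
        // currently unreliable in the Joost runtime.
        while (statusLine.firstChild) {
            statusLine.removeChild(statusLine.firstChild);
        }
        statusLine.appendChild(document.createTextNode(message));
    }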

So basically, once you have downloaded Joost and installed the plugin, the first thing the widget does is check whether you are logged in. If you aren't, it has to show you the Facebook login page in an iframe so that the XULRunner browser can be cookied. After that, the widget should work as you would expect. You may have to log in a lot, and if you aren't logged in, obviously the plugin can't update the JoostBook Facebook application.

Writing the Joost plugin was the easy part; getting the Facebook side to work was the hard part, mostly because the error handling is terrible. Since Facebook doesn't show you the 500 errors your server is throwing, and doesn't log them, you have to find other ways to check whether your server is behaving properly. I spent a lot of time combing my own logs for errors.

The install process is a little weird too. For example, in Firefox 2.0.0.8 on Windows XP, when I clicked on the Joda file linked in the page, the browser tried to open it as if it were some kind of markup file, and the joda naturally looked like garbage; I had to right-click and save. Perhaps if I had used a joost:// link it would have worked, but more research is in order. I didn't really try it in IE, because most readers of this blog use Firefox, but it should work the same way.

Having to install the application in Facebook can be a little confusing as well. The installation itself isn't difficult; it's the concept of installing two applications that work together that is hard. At least there is no particular order in which you need to install them; worst case, the JoostBook plugin will just keep showing you the Facebook login page whenever you run it in Joost.

At any rate, it was a fun experience, and I still think the guys at Joost are onto something. I'm slightly less psyched about the Facebook platform, but I'm still excited about it.


New Internet Explorer 7 to Allow More Customization


I love the ability to add functionality to Firefox. Right now I have the web developer tools so I can check out a page's stylesheets, JavaScript, block-level elements, etc. I have an IP tool installed so I can see the IP address of the site I am currently visiting, along with the Gmail notifier and the PageRank tool, all incorporated into my browser. Most of these modify the status bar at the bottom of the browser and are completely innocuous. Internet Explorer has always supported plug-ins, but they were limited in their ability to change the user's browsing experience, relegated to toolbars and the like. That is about to change.

Similar to the new Google dashboard, Internet Explorer will allow small web applications to be installed in the browser. It will allow a user to modify the web pages they are viewing, or create a new download manager using the .NET languages; the implications seem to be pretty huge. There is just one problem: security.

One of my biggest fears with a heavily extensible Internet Explorer is that people will be able to use it to compromise the security of the operating system. We have heard time and time again that in Longhorn, ahem, Vista, users will be able to run Internet Explorer 7 in a sandbox of sorts, or a least-privileged user account, preventing would-be hackers from compromising the system. That is great for Vista, but what about Windows XP Service Pack 2? Don't get me wrong, I think Microsoft has done as much as can be expected of anyone in patching a completely insecure OS, and they did it in record time too. Still, there have been plenty of bulletins about compromises and exploits in Windows XP SP2, some regarding Internet Explorer. If you give individuals the ability to distribute code that a user can install, it is possible, by definition, to compromise that user's system.

I'm sure Microsoft would be quick to point out that it isn't their fault someone installed software that let hackers have their way with all their files, but at the same time it is very easy to misrepresent a piece of software to a computer novice who is using Windows. Just look at how far Gator / Claria has gotten sneaking software onto systems. While the ability to customize one's web browser is cool, Microsoft should consider passing on this potential nightmare. It is reminiscent of Microsoft's touting of ActiveX and how it was going to obliterate the line between desktop software and internet applications and change the way we all use our computers. Well, it changed the way we all use our computers: we all need anti-virus / spyware / malware filters that sniff out those ActiveX controls and disable them. Most of us in the know, if we have to use Windows, turn ActiveX off altogether.

I think that Microsoft should not include this feature at all, even for toolbars, unless the plug-ins are reviewed and signed by Microsoft. That is the only way to be sure users aren't getting malware. If a plug-in isn't signed by Microsoft, the OS should refuse to install it. It should be that simple. Of course that makes developing for IE more difficult, but Microsoft could release an open developer's version of IE in which the plug-in verification could be disabled, allowing all plug-ins to be installed. Everyone in the software business knows that features move boxes, but Microsoft should keep its eyes on the prize of security. They really need to win their reputation back, and integrating more sketchy features is not the best way to do it.

IE Extensibility – From the IE blog


Big Iron (Mainframes) and the World of Tomorrow


There was an article on CNET yesterday espousing the need for developers to pick up mainframe development, and for schools to reinstate their mainframe classes. While I don't think anyone should waste their time learning a mostly dead technology, it makes sense to learn from the applications developed on mainframes and take those lessons with a grain of salt.

Right now I am working on converting a legacy mainframe application, implemented in the 1970s, into a web application. The real issues stem from the current business process around that mainframe. The database, probably some RDBMS variant, is normalized in such a way that it makes more sense to keep that structure than to try to reinvent the wheel. What has been surprising is that it also makes sense to maintain most of the data presentation layer.

The people who use the current system get a ton of data from a very small amount of screen real estate. Mainframe systems were usually text-based and limited in the number of characters that could be stored in a field, and therefore displayed. Much of the business process that grew out of these limitations revolves around using codes, with cheat sheets to figure out what the codes mean. This has the side effect of shielding somewhat sensitive information from outsiders and customers. Codes as shorthand for more detailed information also let experienced users transfer a large amount of knowledge in a very short time, similar to the way we use compression to zip a text file into a much smaller file to be expanded later. When users enter a code, they are compressing their idea into a few characters that the person on the other end can decompress and understand.

I have been more fortunate than most, because I have access to one of the original architects of the system, and I believe that an understanding of the business environment and the system architecture is more important than knowing the actual code. Most people looking to hire individuals who understand the mainframe are really looking for people to disassemble their applications and rebuild them as web applications.

I do intend to maintain the look of the existing mainframe screens, but I plan to replace the current cheat sheets with simple JavaScript hover events that display descriptions of what the codes mean. I like this approach of blending the old with the new, since it creates a sustainable bridge between legacy users and incoming users who may not have had the same experience.
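Something like the sketch below is all it takes; the codes, descriptions, and element ids are invented for illustration, and the browser's native title-attribute tooltip stands in for a fancier hover layer:

    // Map the legacy codes to plain-language descriptions (these codes
    // are made up for illustration).
    var codeDescriptions = {
        'A1': 'Account active',
        'C3': 'Credit hold',
        'T9': 'Transferred to regional office'
    };
    // Give each code cell a tooltip that appears on hover, replacing
    // the old paper cheat sheets.
    function attachCodeHints(cells) {
        for (var i = 0; i < cells.length; i++) {
            var code = cells[i].innerHTML.replace(/^\s+|\s+$/g, ''); // trim
            if (codeDescriptions[code]) {
                cells[i].title = codeDescriptions[code];
            }
        }
    }
    // e.g. attachCodeHints(document.getElementById('statusTable').getElementsByTagName('td'));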

The CNET article further implies that mainframes still sport some advantages over server-based applications. That may be true to a degree compared with deployed desktop applications, but mainframes have no advantage when it comes to web applications. Still, people who know COBOL, FORTRAN, and other venerable languages can command a premium for their knowledge in the few shops that feel that maintaining these mainframe applications and their hardware is somehow better than replacing them. But it is only a matter of time until those shops accept that paying an ever-increasing amount for maintenance and upgrades is more expensive than bringing someone onboard to convert the application to the web. So I see no future in the mainframe; that said, some great applications were developed for them, and the ones still running are probably more robust than average.

Much of the methodology I follow when constructing a database or organizing code was implemented for the first time on big iron, so I actually feel privileged to be able to work with it. It's almost like looking into a time machine: you can see and feel the business environment of the past, which, even though it may seem the same, is vastly different from the business climate today.

Learn COBOL today!

What is a mainframe anyway?


Internet Explorer 6 Hangs with Multiple Connections


At work we are using the Demis map server, which by itself is an incredible application. We built a Flash-based client on top of it to let people see images overlaid on the vector data digested by the map server. One issue we observed was that the application tended to hang, or stop responding, when a user asked for many images to be shown on top of the vector map and then navigated away from the current screen. Since I had seen the code, and it was a mess, with JavaScript setting cookies that ColdFusion was supposed to read and pass to Flash, and images standing in for checkboxes, I automatically suspected the code. However, the problem was deeper than that.

The code needs to be rewritten, no doubt; there are many efficiencies to be had. But that didn't explain the hang. I combed over the server, watching its behavior while a user worked with the application. The map server stresses the machine, because it needs a ton of I/O, and it would spike the CPU frequently, but no process went to 99% CPU utilization, and the server responded to other clients even when one of them was hung up. It was pretty clear, then, that the problem wasn't the server. To take this logic a little further, we built a load test using wget, saving the results to a file. We looped over the calls as fast as we could, and we never caused the map server to hang. It performed as expected.

The next logical step was to look at the possibility of corrupt files. We did notice that we could get the map server to crash when we fed it corrupt files, but we found no evidence that the files we were using in production were corrupt in any way. At this point we were plenty dejected; we had spent something like 35 hours over a couple of days working on this problem and had nothing. We performed a fresh ColdFusion install on a different server, built a server with better hardware, and reinstalled the map server application multiple times; nothing made a difference. We even improved the network bandwidth available to the client. Still nothing. At that point I was down to two suspects: the code or the client.

To test this theory I commented out all of the Flash calls on every page and went through the application trying to make the system hang. I couldn't do it, so I had effectively narrowed the cause down to the Flash movie. I started to go through what the Flash movie was doing and what could make it fail. The Demis people told us that they had seen hangs when the map server wasn't responding while the Flash player was parsing XML. This led me to try the application in Firefox, and lo and behold, it never hung. It worked like a charm. The only problem was that our client was set on Microsoft Internet Explorer.

I started on the arduous task of removing all XML parsing from the Flash code; when I tried it, it still hung. I was truly disappointed, but I rethought what was happening with the XML: it was making server calls, and I realized I could have up to 8 concurrent connections open. At the time I thought nothing of it, but then I started trying to find out what was different between Internet Explorer and Firefox. I happened upon an article on MSDN about a known bug: Internet Explorer will hang for 5 minutes when there are 2 persistent connections to a server and rich content is downloaded. I had found my culprit. It turned out I had to add 2 keys to the registry, MaxConnectionsPerServer and MaxConnectionsPer1_0Server. I set the latter to 8 and the former to 24, in hexadecimal. Both keys need to be DWORDs.

That allows 8 connections for HTTP 1.0 and 36 connections (0x24) for HTTP 1.1. The HTTP 1.1 guidelines recommend that there be only 2 connections, but if Firefox wasn't adhering to that, why should I? I added the keys under HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings and it worked like a charm. Everything was perfect. Talk about looking for a needle in a haystack; I'm still amazed that I found it.
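For reference, here is the whole change expressed as a .reg file you can merge; the values are hexadecimal DWORDs matching the numbers above. Double-check the key path on your own machine before applying it:

    Windows Registry Editor Version 5.00

    [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
    "MaxConnectionsPerServer"=dword:00000024
    "MaxConnectionsPer1_0Server"=dword:00000008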

The purpose of this entry is to make sure no one else has to go through the week I just went through. Generally no software should be in front of the client before it is ready, but in this case we already had a client. Hopefully this will help anyone out there who is experiencing hangs in Internet Explorer. Darn Microsoft and not fixing bugs for almost 3 years!

*EDIT Make that 8 years, since IE 8 appears to still suffer from the same problem!*

Here are some helpful links that might be better at explaining than I am…

Wininet Connection Issue

IE Hang Issue


Macromedia / Adobe Flash and AJAX: Companions or Adversaries


One of the hottest new things in web development right now is pretty old. JavaScript is taking the world by storm through XMLHttpRequest. My question is: isn't this exactly what Flash MX was designed to do?

I have only been working with Flash for about three and a half years, and one of the first things that drew me to it was the ability to get and post to other pages without a page refresh. Flash was designed to do this from the beginning. With the ColdFusion Flash gateway, developers can even directly access CFCs and other template pages. The question, then, is: do we really need AJAX?

I think so. One of the benefits of AJAX is that it makes it possible to create standards-compliant web pages that depend more on the resources of the client and less on the server. Back in the nineties it was much better to rely on the server, because servers often had more computing power, but desktops are now very powerful, and most can handle the rigors of sorting and validating data. These are probably some of the more banal uses of AJAX, but they are things that should be handled by the client, not the server.
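For anyone who hasn't seen it, the core of the technique is only a few lines; the URL below is a placeholder, and the ActiveX fallback is what Internet Explorer 6 requires:

    // The heart of AJAX: fetch data without a page refresh, then let
    // the client do the sorting or validating. The URL is a placeholder.
    function fetchData(url, callback) {
        var xhr = window.XMLHttpRequest
            ? new XMLHttpRequest()
            : new ActiveXObject('Microsoft.XMLHTTP'); // IE 5/6 fallback
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4 && xhr.status == 200) {
                callback(xhr.responseText);
            }
        };
        xhr.open('GET', url, true);
        xhr.send(null);
    }
    // e.g. fetchData('/data/customers.cfm', function (text) { /* sort, validate, render */ });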

There will be some overlap between AJAX and Flash. Many in the AJAX camp claim that AJAX is much lighter than Flash as far as bandwidth is concerned, and I can see that poorly designed Flash will take more bandwidth than well-designed Flash. It is possible to draw components with ActionScript, which puts the drawing entirely on the client, with the Flash movie being mostly just compressed script. If AJAX needs graphics, it has to send them via CSS during the initial download; afterwards those images are available as long as they stay in the browser's cache. It is even possible in AJAX, as it is in Flash, to have the initial page appear while components are still downloading.

I think that for some projects AJAX will be the technology of choice, while for others Flash MX will be optimal. Personally, I believe that for most of the jobs you could do with AJAX, Flash will be the faster solution because of its well-designed IDE. Flash is now a platform, and the Flash Development Environment is the tool; Macromedia is going to embrace Eclipse to try to get Java developers to see the benefits of creating web applications with Flash. In the long run I think Flash is a good bet, and AJAX is something of a fad that will become a less compelling choice as bandwidth becomes more plentiful. I like a lot of what is happening with AJAX, and hopefully the developers of Flash will keep working toward accessibility, but in the end well-designed Flash applications are hard to beat. They don't need screen refreshes; the Macromedia components are well designed and will often take XML as their data source; and the applications allow more interface flexibility than traditional CSS, although this is changing. Overall they lead to a better user experience.

So why do I bash Flash constantly? My negativity where Flash is concerned comes from having endured many, many very poor Flash websites and applications that use Flash just because it moves. The developers often spend little or no time on the ActionScript, and they don't plan for low-bandwidth users. Many Flash developers believe that dial-up and ISDN / mobile users don't matter, and that is simply bankrupt thinking. Developers should plan and build for the least common denominator. A light design can still be a good design, and is often, in my opinion, the best design. AJAX pushes developers toward better practices through its complexity, but I don't believe complexity is ever a good solution to a problem. Perhaps with the introduction of AJAX tools and an IDE this complexity could be tamed. We are already seeing the beginnings of AJAX in web applications, and they are quite impressive, but most of the impressiveness comes from the fact that they are doing it without Flash, not from the applications themselves.

The fact is that over 90% of the web's users have the Flash plugin, and it is a relatively small and fast download. If you want to design really solid applications, take everything you have learned about minimal design and apply it to Flash development. Perhaps then Flash can turn its negative image around and become a real tool for business solutions.

About AJAX
Flash Remoting LiveDocs