Today is a good day to code

The Biggest Trick that Jeff Bezos Ever Pulled

Posted: April 19th, 2012 | Filed under: Amazon, Apple, Companies, Facebook, Google

Boa Constrictor

For reasons unknown, it seems that the tech media completely fails to give Jeff Bezos and Amazon the recognition that they deserve.  I believe that this is due to a deliberate strategy executed by Amazon to quietly grab as much mind and market share as they can.  If they continue on their trajectory, they may become unassailable; in fact, they may be already.

There are blogs and podcasts called things like Apple Insider, This Week In Google, Mac Break Weekly, etc… I have yet to hear about any blogs or podcasts about what Amazon is doing week-in and week-out, but in many ways it is much more interesting.  Amazon now handles 1% of consumer internet traffic, pushing all of it through its near ubiquitous compute cloud infrastructure.  They are rapidly and efficiently dismantling existing retail.  Amazon is probably on their way to completely owning web commerce.  Amazon has mass amounts of data on what people have, want, and will want based on what they own and buy.  Through their mobile applications they are gathering pricing signals from competitors so that they can use their own cluster computing prowess to adjust pricing on the spot.

What is shocking is that, despite this proficiency, no one discusses how absurdly dominant Amazon has become.  Everyone just treats Amazon running all internet commerce and large swaths of its infrastructure as “the way it is.”  Amazon is more a force of nature at this point than a company.

It isn’t just the tech media that doesn’t give them the credit they deserve, major tech companies aren’t either.  Google and Apple seem ready to laugh off the Kindle Fire while Amazon soaks up more signals.  Microsoft doesn’t even try to match them.  Google’s commerce efforts look half-baked compared to what Amazon does, and they show no signs of trying to do better.

With the bitter rivalries we constantly hear about between Apple and Google, Microsoft and Google, Microsoft and Apple, etc…, it is absurd that no one has started a podcast about Amazon.  Fifty years from now technology changes will have toppled Apple, Google, Facebook, and Microsoft, but I’d bet that Amazon will still be around.

Jeff Bezos and his company wield algorithms and data more effectively than anyone else in the industry, despite all the credit we give Google for search.  Their suggestion and comment-filtering algorithms are, bar none, the best around.  Amazon is integrated into the fabric of our lives to a degree that no other tech company has matched.

Amazon will keep doing what Amazon does best: being ruthless, being efficient, executing better than anyone else, and staying ahead of the curve.  As long as we keep ignoring them, they are doing their job.  The greatest trick Amazon ever pulled was convincing the world that they didn’t exist.  They have convinced the world that they are just retail.


Google’s Vision of the Future is Correct… But They May Not Be The Ones Who Implement It

Posted: January 8th, 2011 | Filed under: Companies, Google, Lifestyle

On a drive from Colorado to Las Vegas this past week, my daughter and son were in the back seat of our car using my daughter’s netbook; she recently turned 7 years old, so I bought her a netbook and am starting to teach her how to code.  My son wanted her to change the video they were watching, and she began to explain to him how the internet works.

She told him that all of her stuff was on the internet (emphasis mine) and that the movie they were watching was the only one on her netbook.  She explained that her computer was barely useful without the internet, that the internet came from the sky, and that her computer needed a clear view of the sky to receive it.  And since we were in the car and the roof was obscuring said view, she concluded, they couldn’t get the internet and couldn’t change the movie.

Listening to this conversation gave me a bit of pause as I realized that to my children, the internet is an ethereal cloud that is always around them.  To me it is a mess of wires, switches, and routers with an endpoint that has limited wireless capabilities.  When I thought it through, however, I realized that my kids had never seen a time when someone had to plug in their computer to get to the web.  Plugging in an ethernet cable is as old school as dial-up.

Once that sunk in, I understood that the Cr-48, Google’s Chrome OS netbook, is a step in the right direction.  And while I am very enthusiastic about several aspects of Google’s (and, in all fairness, others’) vision of a web-based future, I do not feel that the current approach will work.

A centralized system where all of a user’s data lives, and through which all of their communications pass, is not an architecturally sound approach.  As the number of devices each user owns goes up, the number, size, and types of connections will stress the servers exponentially.

It is already incredibly difficult to keep servers running at internet scale; we need entire redundant data centers to keep even small and simple web-scale endeavors running.  When you take a step back you realize that a system like Facebook is barely working; it takes constant vigilance and tending to keep it running.  It isn’t like a body, where each additional bit adds structural soundness to the overall system; instead, each additional bit makes the system more unwieldy and pushes it closer to breaking.

Google is another example of a system near the breaking point.  They are obviously struggling to keep their physical plant serving their users, and like Facebook they have so far been clever enough to meet each challenge and keep everything running.  But looking at the economics of it, the only reason this approach has been endorsed is how wildly lucrative it has been to mine usage patterns and the data generated by users.

I don’t think this will continue to be the case as the web reaches ever larger and larger groups of people. I don’t think any particular centralized infrastructure can scale to every person on the globe, with each individual generating and sharing petabytes of data each year, which is where we are going.

From a security and annoyance perspective, spam, malware, and spyware are going to be an ever-increasing and more dangerous threat.  With so much data centralized in so few companies with such targeted reach, it is pretty easy to send viruses to specific people, or to gain access to specific individuals’ data.  If an advertising company can use a platform to show an ad to you, why can’t a hacker or virus writer?

The other problem, currently affecting Google severely with Facebook next, is content spam: the parking pages you come across when you mistype something in Google.  Google should have removed these pages ages ago, but their policy allows them to exist.  Look at all of the Stack Overflow clones out there; they add no real value, existing only to serve Google AdSense against Creative Commons content.  What is annoying is that, because of the ads, they take forever to load.  Using a search engine like Duck Duck Go, things are better, but this is likely only because it is still small.  DDG also says that it will not track its users; that is awesome, but how long will that last?

It is possible for a single altruistic person to algorithmically remove the crap from the web in their search engine, but eventually, it seems, everyone bows to commercial pressure and lets it in in one fashion or another.

Concentrating all of the advertising, content aggregation, and content itself in a couple of places seems nearsighted as well.  The best way to make data robust is to distribute it; making Facebook (or Google, or Apple, for that matter) the only place where you keep your pictures is probably a bad idea.  Maybe it makes sense to use all three, but that is a nuisance, and these companies are not likely to ever really cooperate.

It seems to me that something more akin to Diaspora, with a little bit of Google Wave, XMPP, the iTunes App Store, and BitTorrent, is a better approach.  Simply put, content needs to be pushed out to the edges, into small private clouds that are federated.

This destroys most of the value the incumbents have concentrated around advertising, but it creates the opportunity for the free market to bring its forces to bear on the web.  If a particular user has content that is valuable, they can make it available for a fee.  As long as a directory service exists that allows people to find that content, and the ACLs for that content are stored with, and under the control of, the creator, that individual’s creation cannot be stolen.

Once the web is truly pervasive, this sort of system can be built.  It will, however, require new runtimes, new languages, new protocols, and new operating systems.  This approach is so disruptive that none of the existing large internet companies are likely to pursue it.  I intend to work on it, but I’m so busy that it is difficult.  Fortunately, however, my current endeavor has aspects that are helping me build skills that will be useful for this later, such as the BEAM/Erlang/OTP VM.

The benefit is to individuals more than to companies; it is similar to the concept of a decentralized power grid.  Each node is a self-sufficient generator, and the system is nearly impossible to destroy as long as there is more than one node.


Stephen Wolfram’s Computational Knowledge Engine

Posted: March 9th, 2009 | Filed under: artificial intelligence | 2 Comments

Ars Technica has no faith.  They are already saying that Wolfram’s knowledge engine will fail, based, I’d imagine, on the complete and utter disaster that Cuil and other would-be Google challengers have been.  Here’s why I think the computational knowledge engine can be a success.

First of all, it’s Stephen Wolfram, who truly shouldn’t be underestimated.  He is also not claiming that it can cure cancer; really, he isn’t saying what it can do, or what its ultimate goal is, except that it is going to answer simple questions.  I don’t understand why this is impossible.  Technology is clearly accelerating at a near exponential rate.  The improvement in technology and science that occurred between 1997 and 2000 was probably matched again by June 2002, and so on.  If that is accepted, then you have to believe that at some point soon we should get to an intelligent system that can answer a simple question like “what color is the sky,” not by looking it up in a database, but by actually reasoning out the answer.

I think that Ars isn’t giving these guys enough credit.  I can’t wait to see what they have cooked up.


A Response To: “The CSS Corner: Using Filters In IE8”

Posted: February 23rd, 2009 | Filed under: Companies, CSS, Microsoft

Well, the IE team has posted an excuse for why IE 8 will not handle widely used CSS 3 features.  The reason: it’s hard, and it was a stretch goal.  Instead we are left with a slightly more standard implementation of the filter CSS attribute, -ms-filter, as opposed to filter.

Furthermore, the IE team claims that they are doing this so that “web authors do not have to rewrite their stylesheets”. 

OK.  Let’s look at this objectively.  It is indeed hard; building a web browser from scratch is no joke.  I have tried several times, and I am still trying to build a web browser.  I have tried this in C++, Java, even Ruby.  It is always hard.  Most of the difficulty comes from trying to render pages that aren’t formatted properly.  Right or wrong, that is how the web is currently built.  However, I have a radical solution, and I apologize in advance for the shout: *USE WEBKIT*.  Why is this a problem?  It would be easy to use the standard msie7.dll or whatever for pages that need the *broken* button in IE 8, then use a new WebKit-based renderer, mswebkit.dll, for pages that are standards compliant or not using that strange IE 7 tag.  If Multiple IEs works, this would be completely possible.

Let’s take a quick look at why Microsoft might not want to do this.  Google uses WebKit, and Apple uses WebKit.  As far as the technical difficulty goes, many lesser organizations have implemented a WebKit-based browser from the WebKit source without hiring a million developers.  I think that an organization like Microsoft should be able to build a browser using or based on WebKit within a few months.  I wish Microsoft could occasionally be more like Google and throw out the product managers and just build what the world wants.  I don’t understand why they can’t consider this.

Now, about the sentence, so that “web authors do not have to rewrite their stylesheets.”  I am a web author, and I will not rewrite my stylesheet.  IE users, I am sorry; you will just have to live with a broken layout.  I do not have the time or the interest to rewrite my cool, cutting-edge web applications to work with 10-year-old technology.  They said this stuff was originally written for IE 4, which came out for the PC in 1997!  Come on, advance!  I will not write anything for IE.  I will make sure my applications function and that none of the tasks a user would do in them are blocked, but I am not going to try to give them rounded rects or opacity if IE doesn’t support web standards.  That sentence alone indicates Microsoft’s hubris; note the “have.”  If it were Mozilla, they would have said that web authors don’t *want* to rewrite their stylesheets, not that they would ever have that problem.  Microsoft is still pretending that IE is relevant as far as developer mindshare goes.

Microsoft does some amazing things, but as far as the web is concerned, it is pretty much off my radar.  Users, please, please upgrade your browser to Chrome, Firefox, or Safari.


NBC Still Doesn’t Get It

Posted: February 23rd, 2009 | Filed under: Hulu, Media, NBC

I was wondering how long it would be before NBC started to bite the hand that feeds it with Hulu.  I still think that eventually NBC will completely kill it, with help from Comcast.

First of all, I was amazed that Hulu was allowed to exist, and once it did, I started to count the days until it was killed.  Not that I don’t love Hulu, I do.  I think that Hulu, Joost, and other sites that blend big media programs with net content are awesome.  It allows me to watch television again.

For several years, I didn’t care what was on TV; I didn’t really watch TV.  All I did was watch Netflix and YouTube when I wanted to consume video.  I know that I am not the only one who doesn’t have time to sit down and watch my favorite TV shows in primetime.  Not to mention that I don’t even know when most of the shows I like air.  Before Hulu and Joost, I didn’t even care; I had just stopped watching.

What I can’t understand is why NBC seems to not understand that no one wants to watch TV at preset times anymore.  Not to mention that if I am going to be advertised to, I don’t want to pay for the “privilege” to watch shows at a time of my own choosing.

As far as Comcast is concerned, I can’t believe it is a mistake that every time I watch a show on Hulu (now that Comcast has run my local ISP out of business and bought it at bargain basement prices), all I see are US Military and Comcast ads.  Comcast, I am not going to pay more for your cable package, I am not going to pay for your “on demand,” and as soon as there is an alternative that can provide some semblance of decent speed, I am not going to pay for your compromised internet.  They claim they are prioritizing packets to ensure network integrity, but Hulu is much slower than it used to be, even while speed tests show something like 14 Mbps burst downloads.  That doesn’t make sense.  I went from 4.5 Mbps down to 6 Mbps with 14 Mbps burst, and it is slower?  I have all this speed, but I can’t use it for anything… Fail…

By removing its content from Hulu affiliate sites, NBC is proving that it doesn’t get that consumers want to consume video in the way they want to consume it.  I am seriously considering just buying this stuff from iTunes and being done with it.  CBS gets it and is doing a good job; the only problem is that they just don’t have the content.

I think they must believe that if people can’t get NBC content anywhere except on TV, they will just sit in front of the Boob Tube and watch it, but they are wrong.  People will stop knowing about the shows and will begin to look for alternatives like video games, or short indie programs readily available on online-only networks like ON Networks, Revision3, etc…  I already consume way more video podcasts than TV shows anyway; it wouldn’t take much for me to drop TV entirely.  What would that do to NBC’s ad revenues?  Comcast needs to get a clue and realize that it is a dumb pipe; it needs to forget about the coax business and get into the TV-over-IP business.  If they want to compete, how about creating their own quality content to win the ad business, instead of crapifying my internet connection and spamming me to try to get me to embrace their dying business model?

NBC will never get it.  They need to just go away.  I like a few of their programs, like Battlestar and Heroes, but I am not sure that it is worth the effort, especially with iTunes and Netflix around.  If they take that away, well I just don’t know, perhaps I’ll have to write and produce my own Sci-Fi stories.

NBC (Hulu) Removes content from Boxee


I Finally Found a Real-World Use For AJAX

Filed under: JavaScript, Programming, Uncategorized

I'm working on this project now that makes heavy use of standard JavaScript, Flash, ColdFusion, and HTML. The project is using the TCallFrame JavaScript function to control the Flash movie. Because of the nature of the project, the timing between the movie refresh calls makes using a refreshing iFrame impossible for the text information that I need to display. Likewise, the text needs to be displayed using complicated styles and colors, controlled by ColdFusion components, that Flash can't handle easily.

So I was faced with a dilemma. I couldn't refresh an iFrame without throwing off the timing of the TCallFrame request, some JavaScript cookie writing (I didn't write the original application), and the Flash remoting calls. Basically, nothing should happen until the cookie is written, which, most of the time, it isn't.

It eventually dawned on me not only that the application needed to be strengthened, with some type of cookie listener at the very least and a small-scale rewrite at the most, but also that I only really needed the application to think that there was no iFrame refresh; an AJAX call using an invisible iFrame could work.

So I began working on it and it evolved into a two-part system. On one hand, the div that contained the text to be returned from the ColdFusion components needed to be continuously refreshed, which I found out after struggling with another timing issue. Eventually I realized that the variable that needed to be set in my parent window was empty at the time of the screen refresh because the iFrame hadn't finished loading yet.

So what I had to have happen was for the iFrame to make a JavaScript function call to the parent once the variable was written. Once I had that accomplished, it was easy to have the checkboxes that called TCallFrame also call the function to refresh the iFrame. It worked wonderfully, but unfortunately it still caused a timing issue with the main movie; it delayed the writing of the cookie by a hair. Still, it was cool to see that I had an actual use for AJAX.
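The pattern described above, where the hidden iFrame notifies its parent only after its variable is actually populated, can be sketched roughly like this. All names here are illustrative, not from the original application, and the iFrame's onload is simulated as a plain function call:

```javascript
// Minimal sketch of the child-notifies-parent timing fix described above.
function createParentWindow() {
  var win = { latestText: null };
  // The parent exposes a callback for the hidden iFrame to invoke once
  // the iFrame has finished loading and its variable is really set.
  win.onFrameReady = function (value) {
    win.latestText = value;
  };
  return win;
}

// In the real page this would run in the hidden iFrame's onload handler.
function frameFinishedLoading(parentWindow, value) {
  parentWindow.onFrameReady(value);
}
```

Because the parent never reads the value until the iFrame calls back, the empty-variable race described above can't occur.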

I have actually found other uses for it, but none as clear cut as that. Flash just wasn't flexible enough. I'm not reversing myself; I still think that Flash should be most developers' first choice when it comes to remoting, but in a pinch AJAX is all right!


Why Flash is Still Oh, So Wrong for So Many

Filed under: JavaScript, Programming, Uncategorized

I have recently come under some minor pressure from various factions about why, while knowing Flash fairly well, I am always reluctant to design and build a site featuring the technology. My history with Flash is pretty much the same as most other developers'. The first versions of this very site three or four years ago were made entirely in Flash, as were many of my customers' sites. Flash seemed like the way to go. It rendered the same in every browser, fonts weren't an issue, and it allowed an incredible amount of freedom to create.

So why, then, were my sites so problematic? The first issue was one of bandwidth. I had music and lots of motion on these sites. They were extremely interactive and eye catching. The problems came up when users had to come to my site using dial-up. When they hit the site and saw the loading bar, the first thing they did was click back and go on to another site. My WebTrends reports illuminated this for me. My next step was to go more minimal, which is my favorite thing to do, but then I wondered why I was using Flash at all, because now the motion was mostly gone, and so was the majority of the interactivity. I was using Flash simply for the z-index, and I was finding that I could do this with CSS. So, not to be deterred, I did another redesign that kept the motion and interactivity but minimized the huge bitmap graphics that were giving me the long download times. Instead, I used vector graphics. These were much smaller, but now I had a new problem. If my clients didn't have at least a Pentium 4 running at greater than 2 GHz, my site ran slowly, so slowly that it was almost unusable.

The next issue was that in all the time I had my site, I could never find it using search engines. I discovered that search engines couldn't index my site because they couldn't see through the Flash. To the spider, my site looked like a huge GIF in an HTML file with some meta tags. In other words, it looked like nothing. I tried alt tags, noscript tags, etc… but nothing helped. Finally, I decided to design an alternate site for dial-up users using good ol' XHTML and CSS. I found that as soon as I uploaded the file, the search engines had me, and no one ever visited my Flash site anymore.

Suffice it to say that I took my Flash site down. Later, I would redesign my site again so that it would adhere to web standards and could render even faster for all users. That site is this one, and it is the first that I am happy with. I am enjoying some minor success with getting listed on search engines and blog aggregators, and life is good.

I don't hate Flash any more than I hate Allen wrenches or crowbars. It is a tool, and typically you try to use the right tool for the job. It seems to me that many web developers, however, are trying to use a sledgehammer to staple two pages together. It just doesn't work. In some cases Flash is OK. In corporate settings, Flash is an excellent tool for presentations, product demonstrations, and promotional materials delivered through the company intranet or from the presenter's local hard drive, as long as it doesn't have to be delivered over the web.

There are a few cases where it is perfectly reasonable for designer / developers to build flash-only web sites for people. Art sites, such as photography showcases can benefit from Flash and its fantastic bitmap compression. Flash photography sites can often download faster than their HTML / CSS counterparts due to smaller image sizes. Some product demonstrations can benefit from Flash and its interactivity. Many cellular phone providers have used Flash to great effect in this regard. Simple branding banners contained within standard HTML / CSS pages with limited motion and interactivity can be excellent, as long as the text of the page is available for the user to read while the Flash is loading.

Still, designers and developers need to ask themselves: what exactly am I trying to do, and who is my target customer? I have had a very hard time making a solid business case for Flash on most of my ecommerce and business sites. Flash, like ColdFusion and chess, takes only a minute to learn and a lifetime to master. There is a lot to Flash, and a good designer knows how and when to use it to make a site look more professional, or to enhance content that may otherwise appear bland. However, beginners tend to develop only in Flash because it addresses many of the apparent problems with XHTML / CSS: browser incompatibility, having to learn JavaScript, etc. Someone with limited knowledge of ActionScript and no knowledge of HTML is able to open Flash MX 2004 and create a website. Many designers use Flash exclusively for this reason.

It seems that XHTML / CSS / JavaScript is having a renaissance. With the proliferation of blog sites and better browser support for web standards, many Flash sites are starting to look tired, and compared with the relatively quick response of the HTML sites, many users are deciding to click away from the loading screens in favor of a site with similar content or products that is designed in standards-compliant XHTML. Not because they love web standards, but because to the user the XHTML site works better and they don't have to wait. I have actually heard designers say that they don't care if dial-up users can't access the site; it has to be beautiful. This thinking is bankrupt; probably 80% of the country is still using dial-up. Broadband is still frequently ridiculously expensive, and until this changes Flash will be limited mostly to design and car sites, while the bulk of the web is built using XHTML.

I'd actually like to see that change. I'd like to see 3 Mbps synchronous connections standard in every home across the country, and Flash sites loading instantly, but the reality is that it won't happen within the next 5 to 10 years. At least not until the garbage cable companies decide to charge reasonable rates, build better fiber backbones, and provide adequate DNS resources.

In the meantime, I'm quite happy with CSS / XHTML. It does everything I used to do with Flash, but it does it faster and is more accessible. Hopefully more designers will build standards compliant sites, and will realize they can be every bit as beautiful as Flash sites. Check out csszengarden.com to see other great CSS designs.


Microsoft IE Developer Toolbar

Filed under: JavaScript, Programming, Uncategorized

I didn't even know about this, and it has been out for about a month. Microsoft has heard the cries from web developers used to using Firefox's developer toolbar extension. While it is often pretty easy to validate your pages using Firefox, see how your block-level elements are behaving, and look at the DOM of your page using the Firefox extension, it has been almost impossible with the awful lack of tools for Internet Explorer. They have finally addressed this.

The new IE Developer Toolbar has almost everything its Firefox counterpart has, except for a strong JavaScript debugger. This is very upsetting, especially considering the lame debugging that is built into IE today, but given the relative dearth of tools for Internet Explorer, anything is welcome.

I have found the toolbar to be extremely useful. The DOM inspector is wonderful in that it highlights the selected item, if visible, to indicate which item's properties you are viewing. If you have to build applications or websites for Internet Explorer at work, I hope you are designing for Firefox at home. No, I guess you always have to design for Internet Explorer, so you will love the new toolbar. I'd suggest that you download it and install it right away.

IE Developer Toolbar


Preventing Comment Spam With Spamhaus in ColdFusion and Java

Filed under: ColdFusion, Programming | 3 Comments

Recently I turned comments on again for my blog, but I started getting hammered with spam comments so I looked into trying to figure out how to stop spammers.

Most people rely on some type of image-based spam prevention.  Performance-wise this is probably the best solution; the problem comes when people with poor eyesight visit, or when actual humans come to spam your site.  An image test doesn’t prevent that scenario.

A client of mine got me to look into SPF for protecting everyone else from someone masquerading as us.  That somehow led me to Spamhaus again.  I had always thought of using them for mail spam filtering, but what I didn’t know is that you can use them for web submission protection too.  On their site, I learned that implementing their blacklist filtering is really easy.  Basically, Spamhaus keys its blacklist on the visitor’s IP address, so it doesn’t matter whether that host is sending email on port 25 or visiting you on port 80: if the address is a known spam source, the lookup will return a result for a web request just the same.

Basically, you have to send a request to Spamhaus’ zen DNS zone.  If it returns a value, then the visitor is a spammer, or at least is listed at Spamhaus as one.  The method is to reverse the octets of the IP address; for example, if the IP address is 2.3.4.5, you send a DNS request to Spamhaus for 5.4.3.2.zen.spamhaus.org.  For Java and ColdFusion I use InetAddress, but there are methods in every language to perform these tasks.
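The octet reversal itself is trivial in any language; as a quick sketch, here is the same name construction in JavaScript (the function name is my own):

```javascript
// Build the reversed-octet lookup name for Spamhaus' zen zone:
// "2.3.4.5" becomes "5.4.3.2.zen.spamhaus.org".
function zenLookupName(ip) {
  return ip.split(".").reverse().join(".") + ".zen.spamhaus.org";
}
```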

To do this in ColdFusion you could use code that looks like this:


<cfset address = CGI.REMOTE_ADDR />
<cfset addressArr = listToArray(address, ".") />
<!--- Reverse the octets: 2.3.4.5 becomes 5.4.3.2 --->
<cfset newArray = ArrayNew(1) />
<cfset newArray[1] = addressArr[4] />
<cfset newArray[2] = addressArr[3] />
<cfset newArray[3] = addressArr[2] />
<cfset newArray[4] = addressArr[1] />
<cftry>
    <cfset inet = CreateObject("java", "java.net.InetAddress") />
    <!--- Resolves only if the address is on Spamhaus' list --->
    <cfset inet = inet.getByName(arrayToList(newArray, ".") & ".zen.spamhaus.org") />
    <cfset hostName = inet.getHostName() />
    <cfif hostName NEQ "">
        <cfreturn />
    </cfif>
    <cfcatch type="any">
        <!--- Lookup failed: the address is clean, so do nothing --->
    </cfcatch>
</cftry>

To explain: the cftry / cfcatch block is there because, if the visitor’s address is clean, the DNS lookup will throw an error, since it won’t be able to get a response.  If the address is listed, however, the lookup will complete without error.

So the way this code works is that it takes the visitor’s IP address, splits it into an array, copies it into a new array, and then reconstitutes it into the IP address string but backwards.

It then creates an instance of InetAddress and calls the getByName method, passing in the constructed address, and waits for a response.  If it doesn’t get one, it does nothing.  If it does get one, and the result is something other than an empty string, such as 127.0.0.2, it returns and does not allow whatever it is protecting to execute.

If you are using Java the code could look like this:

try {
    // requires java.net.InetAddress and java.net.UnknownHostException;
    // backwardsIp holds the reversed address, e.g. "5.4.3.2"
    InetAddress inet = InetAddress.getByName(backwardsIp + ".zen.spamhaus.org");
    // this should return an IP, but it really doesn't matter what it returns
    String hostName = inet.getHostName();
    if (hostName.length() > 0) {
        System.out.println("Visitor is dirty");
        return;
    }
} catch (UnknownHostException e) {
    // the lookup failed, so the address is not on the list
    System.out.println("Visitor is clean");
}

So far this is proving to be a good way to check for bad commenters. The problem is that if you run a high-traffic site, Spamhaus will want to charge you for the use of their list. At that scale you would want a local copy anyway for performance reasons. I think the cost may be worth the saved hassle, though.
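One refinement worth noting: when an address is listed, Zen returns an address in the 127.0.0.0/8 range, and the last octet indicates which Spamhaus list matched. A small helper could decode it; the code ranges below are my reading of the Spamhaus documentation, so verify them before relying on this:

```java
// Decodes a Zen DNSBL return address such as 127.0.0.2 into the list
// that matched. The ranges are assumptions based on my reading of the
// Spamhaus docs; check the current documentation before use.
public class ZenCodes {
    static String listFor(String returnAddr) {
        int code = Integer.parseInt(
                returnAddr.substring(returnAddr.lastIndexOf('.') + 1));
        if (code >= 2 && code <= 3) return "SBL";   // Spamhaus Block List
        if (code >= 4 && code <= 7) return "XBL";   // Exploits Block List
        if (code >= 10 && code <= 11) return "PBL"; // Policy Block List
        return "unknown";
    }
}
```

For comment filtering, a PBL hit (dynamic end-user address space) may deserve gentler handling than an SBL hit, since plenty of legitimate visitors browse from PBL-listed ranges.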


Adobe ColdFusion MX?

Posted: December 31st, 1969 | Author: | Filed under: JavaScript, Programming, Uncategorized | Tags: , , | No Comments »

Now, I am almost never one to stand in the way of business progress; however, this doesn't seem to be a good day for application developers. There are those who believe that with more capital everything gets better, but this developer is not one of those people. Does Adobe have better marketing than Macromedia? Arguably, no. Has Macromedia done a great job of marketing ColdFusion? No, they haven't. Will a combined Macromedia and Adobe do a better job than an unacquired Macromedia? Probably not. I don't think that Adobe will put the resources that are needed behind future ColdFusion development. It is just too far from their core business competency. So just where will ColdFusion go?

It makes sense for Adobe to sell off ColdFusion and Flex, kill Freehand, GoLive, and ImageReady, and roll Dreamweaver and Fireworks into their Creative Suite in their place. It also makes sense for them to continue Breeze development as that is well within their abilities. Flash will probably thrive under the combined company as should RoboHelp, etc…

I think that if Microsoft is paying attention, and I believe they are, it makes sense for them to acquire ColdFusion from Adobe and combine it, Flex, and XAML into one ubiquitous language. It wouldn't be too hard to map ColdFusion to XAML and vice-versa. The benefit to Microsoft is that they could phase out ASP altogether and embrace the tag-based ColdFusion as their web development language of choice. After all it is in line with their corporate vision which is apparently to let web developers make desktop applications as easily as they currently build web applications.

While I don't particularly relish the idea of Microsoft owning my current development language of choice, they do know a thing or two about marketing code, and it wouldn't be difficult to have it run on top of the .net framework and Java so that it could be portable. It would of course be a little faster on the .net framework. Besides, Microsoft ColdFusion just sounds better than Adobe ColdFusion. Having used the Visual Studio beta for C#, I really like it, and I could get used to it being my development environment for ColdFusion. It would also be nice for Microsoft to release a VisualWebStudio for Mac and PC centering around ColdFusion. While we are speculating, it would also be nice to have a .net framework for Mac and Linux, but this could take a while.

So, let's assume that Adobe has a clue about what they have in ColdFusion. They could begin to use it to develop their own desktop application language around Flex and the standalone Flash player. This is why Redmond's ears will be perked up today, and for the next couple of years. Adobe has an interest in delivering 3D over the web, and Flash makes a good vehicle for this. It would be possible to expand the Flash player into a Flash runtime and use ColdFusion as the language to create all sorts of juicy applications that spanned the web and the desktop. They would then be in a position to deliver a rapid development environment for desktop applications and would compete squarely with Java and Microsoft in this space, albeit with a much better interface aesthetic.

I hope the latter is what will happen. I believe that competition in every aspect of technology is good for consumers and for the business overall. Still, either way ColdFusion has either a very bright future or a very convoluted one. I find it interesting that none of the analysts looking at this acquisition are looking at ColdFusion. I guess that is because it isn't the primary business driver for Macromedia, and Adobe is all about graphics; ColdFusion, though, is my primary concern.

There is a third option, and one which looks really good to me. It is possible that Adobe will simply allow ColdFusion to languish until the product, in that form, atrophies and dies. That would be bad, but there is already a movement to create an open source version of ColdFusion. It would be sweet to see this, because the language would become more robust, more object oriented, and a lot faster. It would also be more secure, with all of those eyes on the code. ColdFusion could become an underground hit, much the way PHP has been getting a lot of attention recently.

Enter Apple. Has anyone been paying attention to the widgets in the new Tiger release? Does anyone get how important this is? Web developers can create really sexy looking desktop applications as widgets using JavaScript and CSS. This has massive implications, since there is already a significant installed base of JavaScript developers, and many of them happen to be pretty good at CSS. JavaScript has been seeing a revival of late, and I expect that it will continue. Soon, delivering cool applications over the web to Mac users will be easier than learning C# and doing the same for Microsoft users. Enterprises will feel good about building enterprise applications that use these widgets to communicate with Java applications on the back end. Many people are switching to the Macintosh because, as with Linux, they feel more secure than they do running Windows, and this is another reason for businesses to embrace the Mac, although many of them don't realize it yet.

All of this will marginalize the need for PC users to upgrade to Longhorn. Microsoft already has a tough sell to businesses, given the stagnation of hardware sales and the poor business case for upgrading. Most large organizations are still running Windows 2000, and Microsoft is going to tell them that they have to upgrade every system company-wide in order to run this? If they can get away with it, I'd expect most organizations to switch to Macintoshes instead, because of their lighter IT demands and more granular controls over user access.

Microsoft has to get XAML right, and it makes sense for them to buy their only real competition, which is ColdFusion, especially now that it is owned by an ally that is almost incapable of understanding it or its fanatical developer base (of which I am proud to be a part). They would probably let Microsoft have it for a song, and ultimately ColdFusion would be a more robust language with wider appeal. This would be a good thing. But Microsoft really needs to get their act together today if they hope to sell even one copy of Longhorn Server. If CF were bundled in the IIS package with it, I would most certainly upgrade, and I think most developers who don't have an irrational hatred of Microsoft would too, if it were a serious effort to make both CF and IIS better.

My major gripe with Microsoft is that they make consistently boneheaded business decisions, missing the boat entirely in some places and jumping out with an idea that is ten years ahead of its time in others. I don't hate them for obscure philosophical reasons; in fact, I don't hate them at all. I just think they aren't getting the best out of their products or their developer community, and aren't offering their customers what they want.