With the purchase of my new router a week or so ago, I realized that since I could now actually forward my ports instead of just opening them, I could run my site off of a home webserver, giving me nearly unlimited control over the site, the server, and even the operating system itself. That was a tantalizing incentive, and more than enough motivation to devote the time and effort to get it working.
First off, the bandits over at Optimum Online block port 80, the default port for HTTP traffic. To unlock it, you have to sign up for their upgrade package at an extra $10 a month. I realize this is a minuscule amount of money, but it's the principle of the matter. They are extorting money from their customers by deliberately blocking ports tied to specific functionality, which gives them indirect control over what a home set-up is capable of. I for one would like my capabilities to be limited only by the extent of my own knowledge of computer technologies. I shouldn't start working with a new piece of software or a new protocol only to discover later that it was fruitless, because my ISP disagrees with the assumption that, having already paid for an internet connection and all the hardware associated with it, I should have full and unmitigated access to the internet over any and all of my ports as I see fit. You'd think that when a plumber connects a water line to your house, and you pay for the pipes and the water, you'd be allowed to run it out of whatever sink you wish... But I digress.
Bypassing Optimum's extortion scheme took a bit of resourceful thinking. I wound up daisy-chaining free domain names to get a smooth redirection going: I registered a second DNS name and bound it to the webserver, so that browsing to domain.com redirects you to domain2.com:8080, the port the server actually runs on.
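One simple way to wire up a redirect like that (just a sketch; domain.com, domain2.com, and port 8080 stand in for the real names, and many free DNS services can also do this server-side with a URL-forwarding record) is a tiny stub page served at the first domain that uses a meta refresh:

```html
<!-- Placeholder page at domain.com: immediately sends visitors to the real server -->
<html>
  <head>
    <!-- "0" means redirect after zero seconds -->
    <meta http-equiv="refresh" content="0; url=http://domain2.com:8080/" />
    <title>Redirecting...</title>
  </head>
  <body>
    <p>If you are not redirected automatically,
       <a href="http://domain2.com:8080/">click here</a>.</p>
  </body>
</html>
```

The visitor types the memorable name, and the browser quietly hops to the non-standard port.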
While I was designing the site, I made the extra effort to make it conform to the XHTML 1.0 Transitional standard set out by the W3C (World Wide Web Consortium). However, due to the rather unconventional redirection procedure, the W3C validation tool cannot access my server, so I validated the site by copying and pasting the code into the validator's direct-input form. I figured I'd let it be, because the average visitor to the site won't care, and those who do can easily copy and paste the source code themselves; rest assured, it is standards-compliant XHTML.
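For reference, the doctype and root element the validator checks against look like this, a minimal XHTML 1.0 Transitional skeleton:

```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
  <head>
    <title>Minimal valid page</title>
  </head>
  <body>
    <p>Hello, web standards.</p>
  </body>
</html>
```

Every tag closed, every attribute quoted, everything lowercase: that's most of what separates valid XHTML from tag-soup HTML.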
In other news, this web server isn't currently hardwired into the router; it's wireless. For some reason my wireless PCI card is supported by Linux out of the box, while the integrated Ethernet controller on the motherboard is not. Odd, yes, but a cheap fix. Tomorrow I'll rummage through a garage full of old computers looking for a card, and if for some odd reason I don't find one, they run for around $10 at my local Microcenter. A hard line has become a requirement at this point, as there are so many things going on on this computer: it's a dedicated BitTorrent box, a print server, a web server, a Samba server, and soon to be more. That's a lot of bandwidth to push over a Wireless-G connection, so the card is imperative.
edit: W3C validation is now fixed. The buttons successfully forward the page to the validation tool, which will tell you the code is web standard.
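The trick behind those buttons: the W3C validator accepts uri=referer, which tells it to validate whatever page the link was clicked from, so the same badge markup works on every page. It looks roughly like this:

```html
<!-- W3C validation badge; "uri=referer" makes the validator check the referring page -->
<p>
  <a href="http://validator.w3.org/check?uri=referer">
    <img src="http://www.w3.org/Icons/valid-xhtml10"
         alt="Valid XHTML 1.0 Transitional" height="31" width="88" />
  </a>
</p>
```

That sidesteps the access problem entirely, since it's the visitor's browser, not the validator, that first loads the page.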