Stories by Paul Venezia

Tutorial: How to create your own VPN

If you need to encrypt traffic from your computer or mobile device, you have many options. You could buy a commercial VPN solution, or you could sign up for a VPN service and pay a monthly fee. Or, for less money, you could rent a Linux VPS (Virtual Private Server) anywhere in the world and roll your own VPN on it, getting the use of the VPS in the bargain.

Opinion: It's high time to ditch the fax machine

There are only two types of technology that I absolutely hate with a passion: printers and faxes. Printers are obviously the bane of IT. With all those driver packages for every operating system version (usually about 150 times the size of the actual driver file itself), a predilection for jamming, and of course those ever-popular toner explosion scenarios, I'm still scarred by memories of printer disasters.
But I can accept that printers exist because, yeah, sometimes things need to be printed out. Faxes, however, should be banished to the land of RLL drives and the 5.25-inch floppy. Faxes have no need to exist today, yet they're still all over the place. It's maddening.
Consider what a fax machine actually is: a little device with a sheet feeder, a terrible scanning element, and an ancient modem. Most faxes run at 14,400bps. That's just over 1KB per second — and people are still using faxes to send 52 poorly scanned pages of some contract to one another. Over analog phone lines. Sometimes while paying long-distance charges! The mind boggles.
Here's what a fax should be: a little device with a sheet feeder, a reasonably solid scanning element, an Ethernet cable, and no modem whatsoever. It should just be a network scanning function. That's it. You drop a paper document in the feeder, run a small applet on your computer (or on the device itself) that drags the resulting scan into a nice clean (and possibly encrypted) PDF on your system or network, at which point you use any number of methods to send it to someone. Heck, be "old-fashioned" and email the thing. It'll get there in 1/100 the time of the fax, it won't cost money or tie up a phone line, and it will result in far better quality than a low-res scan compressed to squeeze through a data path with the same bandwidth as a piece of baling wire.
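For a rough sense of how little plumbing that workflow needs, here is a hedged sketch using a Linux box, a sheet-fed scanner, and a few stock tools; the scanner device name and the recipient address are placeholders, not anything the column specifies.

    # Scan a page with SANE, wrap it in a PDF, and email it. No modem involved.
    scanimage --device-name="$SCANNER" --format=tiff --resolution 300 > scan.tiff
    tiff2pdf -o scan.pdf scan.tiff
    mutt -s "Scanned document" -a scan.pdf -- recipient@example.com < /dev/null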
Yet the ancient fax machine survives.
One of the big reasons that people resist mothballing the fax machine is that some items need signatures. I can understand that, but there are already a dozen ways to digitize your signature and apply it to documents. You could also "sign" on your touchpad. These days a signature is all but worthless as an actual traceable indicator of identity, as borne out by the hundreds of times I've signed a POS credit-card machine or delivery service signature pad. Even I couldn't tell you that the resulting signature was mine seconds after I'd signed it. They all look more or less like a random horizontal line that may or may not start with what might be a P. Or an R. It doesn't seem to matter.
Nonetheless, I can understand that traditionalists might demand faxes for contracts and other documents that need an ink signature. OK, fine, let's limit it to that. But no! According to GreenFax, there are over 200 billion pages still faxed every year, and you can be damned sure they're not all contracts. Informational faxes fan out from tons of organizations to tons of other organizations, with a computer on the sending side generating the faxes and pushing them out through a fax modem. It's like a tiny bubble of 1995 surrounds every fax machine. A computer could easily be tasked with emailing the same information or even (gasp!) updating an RSS feed.
True, email is not what it once was, thanks to the bottom-dwelling villainous scumbags who persist in sending billions of spam messages every year. But I'd venture that for person-to-person document transfers, email is far better than faxing, which can easily result in illegible pages, printing errors, faxing errors, security problems on the receiving side ("hmm, this looks interesting"), and a bevy of other problems.
Smart companies don't bother with paper-based fax machines and instead opt for document centers that offer scanning in addition to fax capabilities, although they're still stuck accepting incoming faxes. If they're really smart, they'll employ a dedicated fax server with four, six, or two dozen analog lines attached and the ability to take incoming faxes, turn them into PDFs, and email them internally (or, sadly, ship them directly to a printer). That's still better than a cranky old fax machine sitting in the corner with a pile of forgotten incoming faxes in the hopper. Heck, you can check out any number of companies (like GreenFax) that handle all the fax-to-email capabilities you'd possibly want.
We've been promised the paperless office for decades. Every time someone jettisons a fax machine, that promise gets a little closer to reality. Every time a company omits fax numbers from its business cards and website, it gets even closer. Every time someone sends a 50-page analog fax of a document they just printed out from a PDF on their desktop, it gets further away.
If something as appallingly stupid as the fax machine can live on, it makes you wonder how we make progress at all. Old habits die hard. It just goes to show you: Bad technology generally isn't the problem; it's the people who persist in using that technology rather than embracing far superior alternatives.

Opinion: Addicted to IT - quitting is not an option

It's that time of year when I find myself gazing wistfully out the window, taking in yet another beautiful day from the confines of my office and wondering if maybe I shouldn't give it all up and become a shepherd. Alas, my computing chariot awaits, and I turn back to my keyboard.
During the decades I've spent in the deepest corridors of system and network architecture, I used to wonder how anyone could do anything else. This field has it all: intrigue, mystery, and constant problem solving. It's engaging work that (in most cases) produces clear results and a sense of accomplishment.
There's no end to the ways that a clever brain with the right idea can not just succeed, but provide tools, frameworks, and solutions that help thousands or millions of people. Add to that the "new toy" culture that outfits IT with sleek and speedy servers, big storage, and the oohs and aahs of virtualization, and it seems less like drudgery and more like a fantastic puzzle that you get paid to solve.
The other side of that is all the hours spent in windowless data centers, the overnight maintenance windows, the frustration when things aren't going right, myriad compliance headaches, and the pressure to find a solution to an emergent problem as soon as humanly possible, driven by people who have no idea of the complexity of the task. Yet for me, it was always easy to overlook the bad in favour of the good, especially when a pallet of new servers and network gear showed up, ready to be built out into a new data center or server farm.
As time passes, I find I don't enjoy the thrill of the chase quite as much as before. It seems that finding and fixing system and network problems isn't as exhilarating as it once was, and the idea of tackling a large data center build conjures up thoughts of the beaucoup headaches involved in little details like wiring and equipment procurement. Don't get me wrong, I still love the smell of a new data center in the morning, but I'm now less inclined to stay up until 2am tweaking things until they're just right.
So when I look out that window and see a robin perched on a tree gently swaying in a warm breeze, I begin to wonder just what life would be like if I wasn't in the IT game any more - if I didn't have to keep up on every new development and trend, if I wasn't constantly bombarded with new technologies at every turn, and instead spent my days making sure that the sheep didn't range too far. I could sit in the shade under that very tree, my brain free of all worry and concern over massive projects, the roar of the air conditioner absent from my daily reality, the very idea of checking my email every few minutes an absurdity. How simple, how stressless that seems - how perfect.
When my brain snaps back to reality, I understand completely why I do what I do - because I have the type of brain that needs constant feeding, constant exercise. The life of a shepherd may seem hopelessly romantic and enticing when labouring under a build deadline or frantically searching for the cause of an abrupt network crash, but I'd be a fool if I thought that I'd last more than a day or two without some knotty problem or project to occupy my head.
That's why I'm in IT. As exhausting as it can be, it provides rich mental rewards if done right, and to many that's addictive. I've had this addiction for so long, I'm way past therapy or treatment. I'm a lifer.
But maybe I can split the difference now and again. I think I'll spend the next few hours under that tree - with a laptop. There are storage migrations to perform, after all.
Venezia is InfoWorld's Deep End columnist

Opinion: How security became mission impossible

It's been quite a month for network and computer security folks. Sony's network was hacked — what, a half-dozen times? I've lost count. Then apparently everything from the CIA's website to your grandmother's embroidery blog was successfully compromised. It's almost like someone wants to prove a point.
The fact is that, even with the proliferation of computer and network security tools, it's easier than ever to compromise a network. Couple the economic downturn, which has resulted in the layoffs of thousands of skilled IT workers, with willy-nilly implementations of highly public internet applications and frameworks — plus the extreme effectiveness of today's hacking tools — and you have big problems.
It does require skill to crack into a corporate server, but that expertise need only be possessed by a few people who write the tools that let anyone halfway adept get into the game. After all, known exploits are known exploits, and if you can figure out how to use public proxies or other anonymising tools and fire up a few apps, you too can get in on the fun.
On the other side of the fence, there are — and should be — a whole bunch of very worried CSOs. These execs should be worried not only because their networks are going to be targeted, but because they know they simply aren't equipped to deal with the problem. Simply put, the weapons on the other side are more effective than theirs. They're outgunned and outnumbered.
Part of the problem with working in IT security is proving your worth. Bean counters can easily dismiss IT security as a money pit because "we haven't been hacked." Proving that negative when budgets are tight can be challenging. I've actually heard the argument that IT security staff should be laid off because "who would bother hacking into our company?" Of course, the answer is obvious: hordes of 14-year-old kids armed with underground hacking tools and boredom. Or, if you're unlucky, a Russian criminal organization that decides you're worth hacking after all.
The best way to try to protect against attack is to hire competent security people — but also to make sure that new projects are not rushed into production in an effort to meet some kind of deadline defined by nontechnical management. That's exactly how big security holes are created and how everything falls apart very quickly.
Also, make sure you're conducting regular internal and external security audits from highly reputable firms. These audits should cover everything from external penetration testing to training employees to avoid social engineering ploys. In addition, regularly scan for rogue access points and keep close tabs on what goes into and out of the datacentre — and what's actually in there. After all, a SheevaPlug looks like a wall-wart power supply and could be doing all kinds of nasty things while plugged in behind a desk where nobody ever looks.
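As one hedged illustration of that "keep close tabs" advice, a small script run from cron can compare the devices answering ARP on a subnet against an approved inventory; arp-scan, the interface name, and the inventory file are assumptions for this sketch, not tools named above.

    #!/bin/bash
    # Flag any MAC address on the local subnet that isn't in the approved list.
    # Run as root; arp-scan needs raw socket access.
    KNOWN=/etc/approved-macs.txt   # one approved MAC per line (assumed file)
    IFACE=eth0

    arp-scan --interface="$IFACE" --localnet |
      awk '/^[0-9]+\./ {print tolower($2), $1}' |
      while read -r mac ip; do
          grep -qiF "$mac" "$KNOWN" || echo "Unexpected device: $mac at $ip"
      done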
I know this advice sounds like your dentist admonishing you to floss three times a day and brush five, but it's good practice, even if these measures won't protect you from a few thousand loosely organised teenagers armed with Low Orbit Ion Cannon and IRC.
Let's face it, protecting an Internet-connected network of any size is no simple task, and it'll only get harder. If you've never been compromised, it's probably not because your security is all that great; it's because you haven't been noticed — yet.

XP will be around a while yet, despite Windows 7

According to a ScriptLogic study, 60 percent of all companies surveyed said they will not be moving to Windows 7 any time soon. Thirty-four percent said they'd probably deploy by the end of 2010, but even that number may be optimistic. This means that by 2011, for the first time ever, a 10-year-old operating system will still be the most-used desktop OS.
Of course, Microsoft's licensing means that this unfortunate fact won't cut too deeply into the company's bottom line. While Microsoft's OS market may be stagnating, hardware is hardware, and it will fail and need replacing. That's when they'll manage to sell you yet another licence that can be downgraded to XP.
As the recent support extension for XP shows, Microsoft does see that users aren't falling all over themselves to upgrade to Windows 7, just as they weren't for Vista. The fact that many seem to hail Windows 7 as a far better OS than Vista doesn't really make a difference – the real problem isn't that Vista or Windows 7 aren't ready for the enterprise, it's that for the vast majority of business cases, Windows XP with Office 2000 is all that's necessary – possibly for quite some time.
After all, why do you think that Office 2007 had a massive UI change? Because that was one of the only ways to differentiate it from Office 2003. The back-end stuff, like support for the OpenDocument Format, could have been added to Office 2003 as it was to Office 2007. Office 2007 was basically a "New and improved!" sticker on Office 2003.
As far as business desktop computing goes, standing pat is a novel idea. For the past 15 years companies have been upgrading constantly, moving from Windows NT 4 to Windows 2000 to Windows Server 2003 on the server side or, on the desktop, from Windows 95 to 98 to 2000 to XP. And that's where they sit.
As a consultant during those fiery days, I was on the front line, and it was upgrade or die – the inherent problems with NT and 2000 on the servers, and with Windows 95, 98, and 2000 on the desktops, made yearly upgrades essentially a requirement. If I had a dollar for every Windows NT-to-2000 migration, or Windows 2000-to-2003 migration I ever did – actually, I have more than a dollar for each one. Never mind.
The reality is that in any industry that grows as fast as business computing has, there will come a point of "good enough". That's where we are right now. The vast majority of ISV applications in use support XP and still don't officially support Vista. Nine-year-old XP is still the sweet spot.
I recently spoke with an IT manager who was budgeting for an Office 2010 upgrade from Office 2003. I casually asked him what features he had deemed important enough to justify a US$100,000 budget item. He thought for a minute and admitted that he couldn't think of a single one. So I asked the logical follow-up: Why are you buying it? He had no answer for that either. The $100,000 line item disappeared. He is also sticking with XP.
This isn't Microsoft's fault, necessarily, unless you believe that they caused this by finally coming up with a somewhat stable and secure desktop OS (a statement best taken with a huge grain of salt). The company certainly did delay far too long in releasing Vista, which was late, slow, buggy, expensive and essentially DOA, and then compounded the issue by hyping Windows 7 a year before its release. This provided some cover for the Vista debacle, but also ensured that several more years would pass before most companies would move beyond XP – a modern-day example of the Osborne Effect. The economic downturn just solidified this situation.
The past 15 years have been a whirlwind of innovation, expansion, invention, and production. The next 15 will be the same – but not in the same places. The corporate desktop is mature in both hardware and software. Same for servers and network architecture – very few companies actually need 10G. The new frontiers are portable productivity, interconnectivity and virtualisation. Unfortunately for Microsoft, they're far behind in those categories. After all, it's hard to quickly move in new directions when your saddlebags are full.

Are sealed-in laptop batteries a good idea?

When Apple introduced its new MacBooks recently, it touted a doubled battery life -- but noted that the laptops' batteries were sealed into the case, not user-swappable as is the norm on laptops.

Windows 7 first impressions and Microsoft's XP quandary

After downloading the Windows 7 beta last week and tossing it on a VM, I finally made the move and selected it as my default Windows installation. Normally, my Windows desktops are VMs running under VMware Workstation on my big Linux workstation, and I RDP into them from my Macs. However, I have an IBM IntelliStation Z Pro running Vista that's my 'big box' Windows install. It has been powered off for a few weeks now. Instead, I opted to take an HP 2710p tablet and turn it into my physical Windows 7 box, at least to start with.
There was nothing special about the installation, except for the fact that it thankfully took only a few clicks. Much to my surprise, however, when the 2710p booted post-installation, just about every piece of hardware was accurately detected and available. A few pieces missed the cut, such as the fingerprint reader, but Windows 7 helpfully pointed that out and even gave me a link to download the Windows 7 driver and software for it. That was very handy.
The new taskbar is supposedly the killer feature of Windows 7, and it's certainly an improvement over every other taskbar implementation Microsoft has introduced. Unlike some others, I don't see it as besting Apple's Dock, but it's definitely better than anything found in Windows XP or Vista. However, the fact that clicking an open application's icon in the taskbar doesn't actually bring that application's window into focus if it has multiple open windows drives me nuts. You have to click the icon, then select the window you want — too much mousing around.
After reading Tim Sneath's "Bumper List of Windows 7 Secrets," I opted to place the taskbar to the left of the screen, and I find it handier on smaller screens.
The eye candy when hovering over active applications is nifty, and it makes window selection simpler to some degree, with the caveats noted above. Microsoft even introduced single-app window selections, a la Mac OS X's Cmd-~ switcher, but it's really annoying to access. To only switch between windows in the active application, you must hold down the Ctrl key and repeatedly click the icon in the taskbar. This is less than useful. There's an Alt key and a tilde key on this keyboard — use them.
Fast application switching is a problem on Windows now and will continue to be in the future. The issue is the Alt-Tab method of switching between open windows, which is only useful if you have a small number of open windows. When you have dozens, it's fairly useless. When this switching method was first introduced, there simply wasn't enough horsepower to have dozens of windows open anyway and so it didn't matter. Times have changed, however. Apple's method of using Cmd-Tab to switch between applications, and Cmd-~ to switch between windows in that application is a much better design.
All the Vista-compatible applications I've tried have successfully started and run under Windows 7 beta, including several that were problematic under Vista, such as the ShoreTel client. There are the normal UAC annoyances, but they can be disabled just as in Vista if you like to live on the edge. As with all my Windows installations, I paid the performance penalty and installed a virus scanner alongside Windows Defender to keep an eye out for viruses and malware.
This being a tablet, I was able to test those features too. It's much the same as with Vista, no problems there. Of course, with my newfound use of the left-hand taskbar, I had to move the Tablet Input Panel to the right-hand side of the screen. It might be a little faster than Vista, but I haven't noticed any significant increase in performance on this laptop and I'm not going to run benchmarks on beta code.
So Windows 7 is very nice and all and perhaps it is the best iteration of Windows yet, but there really isn't that much there, other than the improved taskbar. When Apple released Leopard, there were significant feature additions and improvements, such as Spaces, Time Machine and so forth. That type of step forward seems to be missing from Windows 7. In addition, there's still no compelling reason for businesses to switch from Windows XP and that's Microsoft's biggest problem of all.
For the majority of business users, Windows XP and Office 97 or 2000 are more than enough to handle day-to-day business tasks. In this economy, upgrading functional server and desktop operating systems isn't in the budget, especially when there's no business reason to do so. This isn't a secret, since the corporate response to Vista was lukewarm at best. Those who jumped on the Vista bandwagon early certainly aren't going to be thrilled to make another leap to Windows 7 so soon. I do know that I'm in absolutely no hurry to move away from Windows XP as the corporate desktop except in some extreme cases — it's just not worth it.
This leaves Microsoft selling to home users and power users, while also relying on the OEM installations to push copies of Windows. The problem there is that by working on Windows 7 so soon after Vista's launch, Microsoft has effectively orphaned Vista — why buy a copy now when Windows 7 is right around the corner?
In a nutshell, did Microsoft peak with Windows XP?

Low-end storage benefits both home and business

I can recall a time when my 45MB Priam RLL hard drive was too massive for comprehension. It was huge: 5.25 inches, full height, as loud as a siren, and a great addition to my 386SX/16. Ahh, the good old days.
Nowadays, 45MB is a reasonable size for, say, the EULA on most large commercial software packages. Terabyte hard drives are becoming the norm, and the future looks bright ... but who's controlling all that storage? In most cases, people at home are attaching large drives to their PCs or Macs and sharing them from Windows or Mac OS. This is a pretty simple solution, but it isn't really the best idea, especially when PC-introduced viruses or other corruption can destroy all your music, photos and videos in the blink of the proverbial eye. Those of us who are tech-savvy know the value of backups, as does anyone that's already had the misfortune of losing years' worth of data to an errant drive head and a scratched platter. But what does it take to back up a terabyte drive in your house? Another terabyte drive, of course.
These days, if you're buying large storage of any sort, you really must insist on at least RAID1. Those big external drives that are actually two physical disks in a RAID0 stripe are a really bad idea. It doubles the available space, sure, but it also doubles the risk of complete data loss, because when one disk goes, all the data on both drives is gone. I've documented my trials with cheap reliable storage over the past few years, including my now-defunct 3ware-powered central server, up through my current central file server, the Synology CS-407 that I reviewed earlier this year. I'm coming up on nearly a full year of continuous service from that CS-407, with no problems to report — which is obviously the ultimate goal of these devices.
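The column doesn't prescribe a particular product or platform, but for the do-it-yourself crowd, here is a hedged sketch of what "at least RAID1" looks like on a Linux box using mdadm software RAID; the device names are placeholders, so check yours before running anything destructive.

    # Build a two-disk RAID1 mirror, put a filesystem on it, and mount it.
    mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
    mkfs.ext3 /dev/md0
    mkdir -p /srv/storage
    mount /dev/md0 /srv/storage
    cat /proc/mdstat   # watch the initial sync; either drive can now fail without losing data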
In the past year, more and more people have realised the importance of small-scale redundant storage. Vendors such as Synology are benefiting from this awareness and are trying to meet the demand for easily configurable, reliable personal central storage. LaCie, NetGear, Buffalo, and Cavalry are all offering home NAS systems in a variety of configurations. The upshot is that if you have more than one PC in your home (or any of the more modern console gaming systems), you should be skipping the USB and FireWire drives and heading straight for the home NAS appliances.
Synology is also branching out into the SMB market with a few new devices that offer the same platform as its home units, but with rackmount and redundant power options. That means it's possible to plunk down only a few thousand dollars to get a multiterabyte storage server that supports Microsoft AD authentication and CIFS, NFS, AFP, FTP, and HTTP file sharing, all wrapped in a RAID5 package with SATA drives. It certainly won't hold a candle to a high-end SAS storage server, but it's also about one-tenth the cost for reasonable performance. The big storage vendors are also seeing this lower-end push; NetApp, for example, is currently steering its StoreVault line to the SMB market.
In the larger datacentre, these devices may not be suitable for mission-critical storage due to their lower performance compared with their higher-end cousins, but they can still be used in a wide variety of applications that don't require high throughput. These include disk-to-disk-to-tape backups; storage for PC and laptop images; nearline backups of highly available central storage; a holding ground for test/lab VMs from Xen, VMware, or any other virtualisation platform; a catalogue of music to be pumped at midnight during planned maintenance on datacentres; whatever — there's always a need for big, cheap storage in every organisation.
At the end of the day, no matter what the EMC sales rep says, you don't need to store backups of ex-employees' mailboxes on high-end Fibre Channel storage arrays — a Synology RS408 is likely to be more than adequate.
Speaking of the Synology RS408, I'm currently testing one in the lab now for an upcoming review. All I can say so far is that the new AJAX web interface for the Synology NAS line may be the most attractive, navigable and usable appliance interface I've ever used. It really is impressive.
So as the high-end consumer and low-end enterprise storage offerings merge, and you suddenly realise that you've exhausted the storage on yet another external hard drive, remember that friends don't let friends run RAID0. The same goes for colleagues and especially IT directors.

Know-how makes Unix work a breeze

One item on my to-do list one day last week was migrating an aging Fedora Core 3 server to new hardware running CentOS 5.2. At first glance, it seemed to be a pretty straightforward task. If the old server had been running just a single app or service, it would have been simple, but the reality was that this server was running eight FLEXlm licence servers, several small web applications, and a smattering of network telemetry tools; it also functioned as an NIS slave, served as a loghost, and hosted a local CentOS yum repository.
I had an edge here due to the simple fact that I had built this server four years ago, but that was far from making it a slam dunk. Many, many things had changed at the OS level between Fedora Core 3 and CentOS 5.2 (which is basically RHEL 5), the number of services provided by the server had quintupled, and I had only a five-minute window to make the swap.
However, these are Linux boxes. That makes all the difference.
The first order of business was building the new box. Using rpm, I built a list of installed packages from the old box, formatted in such a way that I could pass the list directly to yum on the new box, pulling from the yum repository on the old server. Within five minutes (with only a few tweaks for packages that had changed names or been eliminated), I had all the packages I'd need installed on the new server. They ranged from compat libs to MySQL to ypserv.

I then rsync'd the /var/www/ tree, the /usr/local/licenses tree, and the /var/yp tree, and pulled over the ntp, snmpd, nrpe, yp.conf and ypserv.conf files, among others. All of those services fired up without complaint. I then rsync'd the custom tools from /usr/local/bin and /usr/local/sbin, along with all the custom /etc/init.d startup scripts for the FLEXlm licences, brought over the required /etc/httpd/conf.d/ includes, and added the necessary crontab entries to the new box. I copied over the various NFS entries from /etc/fstab and wrote a quick script to make all the directories needed to mount those shares.

Since the licences were bound to the specific MAC address of the original server, I added a MACADDR=xx:xx:xx:xx:xx:xx line to /etc/sysconfig/network-scripts/ifcfg-eth0 to spoof that MAC, and configured both interfaces to assume the IP addresses of the old server on reboot. A few modifications to the startup scripts with chkconfig, pulling over the original SSH keys, and an edit of /etc/sysconfig/network, and I was basically all set.
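For a flavour of the commands involved, here is a hedged sketch of the package-replication and sync steps; the hostname, the file list, and the spoofed MAC below are illustrative placeholders rather than the actual values from this migration.

    OLD=oldserver.example.com    # placeholder for the Fedora Core 3 box

    # Dump the installed package names on the old box and feed the list to yum on
    # the new box; expect a few misses for packages that were renamed or dropped.
    ssh "$OLD" "rpm -qa --queryformat '%{NAME}\n'" | sort -u > packages.txt
    yum -y install $(cat packages.txt)

    # Pull over the data and configuration trees the services depend on.
    for tree in /var/www /usr/local/licenses /var/yp /usr/local/bin /usr/local/sbin; do
        rsync -aH "$OLD:$tree/" "$tree/"
    done
    rsync -a "$OLD":/etc/{ntp.conf,snmpd.conf,yp.conf,ypserv.conf} /etc/

    # Spoof the old server's MAC so the FLEXlm licences keep validating.
    echo 'MACADDR=xx:xx:xx:xx:xx:xx' >> /etc/sysconfig/network-scripts/ifcfg-eth0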
I wrote a quick script on the old box to turn down the physical interfaces, stop all the licence servers, and re-IP the old server. When the cutover window arrived, I rebooted the new server, causing it to come up with the same name and IP addresses as the original, and simultaneously ran the turndown script on the old box. When the new box finished booting, all required services were running, all licence servers (save one) were active, and all the web apps worked. The old box was sitting at a different IP (with a MAC address of DE:CA:FF:C0:FF:EE), and the new box had successfully assumed all the responsibilities of the old box. The NIS maps pushed without issue, YP clients functioned normally, the various Perl and PHP telemetry tools were happy, and all was well except for one licence server that was old enough to have been compiled against glibc 2.2.5. Lacking the source, I poked around Google for a few minutes and found an open directory containing the required server daemons compiled against a much more recent glibc, and a few minutes later, that licence server was up and running.
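The turndown script on the old box amounted to something like the following hedged sketch; the init script names, the parking IP, and the placement of the junk MAC are assumptions for illustration.

    # Stop the FLEXlm licence servers (init script names are placeholders).
    for svc in /etc/init.d/flexlm-*; do
        "$svc" stop
    done

    # Take the production interface down, park the old box on a spare IP, and
    # give it a junk MAC so only the new box answers for the old identity.
    ifdown eth0
    sed -i 's/^IPADDR=.*/IPADDR=10.0.0.250/' /etc/sysconfig/network-scripts/ifcfg-eth0
    echo 'MACADDR=DE:CA:FF:C0:FF:EE' >> /etc/sysconfig/network-scripts/ifcfg-eth0
    ifup eth0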
The downtime for this cutover was less than one minute. Nobody using the bevy of NIS-bound servers and FLEXlm-licensed applications even noticed. In short, this was a major cutover that happened during prime time, and flew completely under the radar — exactly as it should be.
I spent more time double-checking my work than I did actually preparing the server for this transition. Had I gone for broke and not checked anything, I might have put two hours into this whole procedure from start to finish. As it was, it took around five hours of prep time (including a considerable amount of navel-gazing to run through mental checklists) to complete the whole transition. Nagios never even noticed.
This is how it should be. This is how these projects should go. This is why I'm a fan of Unix-based operating systems. To some, they make easy things hard, but to those of us who know how to bend them to our will, they make hard things easy.

Toybox: MacBook Air is light as, well, air

The MacBook Air isn't designed to be a desktop replacement system, and it doesn’t have desktop-like specs, unlike the MacBook Pro and other 2.4GHz-plus Core 2 Duo laptops on the market. Could I live without a bevy of ports and a DVD drive? Could I use the Air to do real work?

Asterisk emerges from VoIP’s wild open-source frontier

Nearly three years since Jon “maddog” Hall, the executive director of Linux International, predicted that “VoIP using an open source solution, such as Asterisk, will generate more business than the entire Linux marketplace today,” open source VoIP for the enterprise remains a wild frontier.

Anticipating the new Longhorn Active Directory

Just as Windows Server 2003 made significant improvements to Active Directory, Longhorn promises to follow suit. When AD was first deployed under Windows 2000, managing a Windows domain became much easier. With Server 2003, Microsoft kicked it up a notch, adding such functionality as group editing, simpler object editing and a more fluid management interface.

Sun adds management capabilities to JDS

The newest incarnation of Sun Microsystems' JDS (Java Desktop System) is a visually pleasing desktop loosely based on SuSE Linux, though it departs from standard Linux distributions in mildly annoying ways. That said, the biggest change in this revision is the management back end.

NetIntercept delivers scrutiny for less

True real-time network forensics implementations are generally a rich company's game. The high-performance devices are capable of deep-packet analysis, session recreation, reporting, and auditing -- and that generally comes at a high price.
