Posts Tagged ‘host files’

Using Host Files for UNC Paths

Friday, March 15th, 2013

Years ago I strongly advocated the use of host files for “better environment portability and mobility”. Over the years I’ve used this trick to simplify WCI patch installs, database migrations, and server upgrades.

But, one use of host files has always proven a bit troublesome – the UNC path for published content in WebCenter Interaction Publisher:
[Screenshot: Publisher UNC path alias in the published content settings]

Specifically, if you're trying to access a share called "publish" on a machine called PROD-FILESERVER, you can type this into Windows Explorer:

\\PROD-FILESERVER\publish\

If you are back-filling your database from production to a development environment, this name is also synchronized, and you certainly don’t want your Dev environment connecting to your Prod file server. The solution would be to use the host name aliases we’ve already discussed, so you could use a name like this:

\\WCI-NAS\publish\

… and then have WCI-NAS resolve to the respective file server in each environment.
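Concretely, each environment's hosts file carries its own mapping for the shared alias. A sketch (the IP addresses here are made up for illustration):

```
# Production hosts file (C:\Windows\System32\drivers\etc\hosts)
10.1.1.20    WCI-NAS    # resolves to PROD-FILESERVER

# Development hosts file (same path, different machine)
10.2.2.20    WCI-NAS    # resolves to the dev file server
```

The UNC path \\WCI-NAS\publish\ stays identical everywhere; only the hosts file entry differs per environment.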

This tweak, though, doesn’t work unless you use the trick found here: Disable Strict Name Checking.

Windows file sharing enforces "strict name checking", so connecting to a machine by a name other than its actual computer name is rejected by default. But that's exactly what we want – our Production, Development, and Staging environments all using \\WCI-NAS\publish\ as the file share. That way, when we sync the database between environments, we don't have to change the UNC paths all over the place.

In other words, to make this host file aliasing work with UNC paths, just add a DWORD value named DisableStrictNameChecking, set to 1, under HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanmanServer\Parameters (a restart of the Server service, or a reboot, is typically required for the change to take effect).
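If you prefer to script it, the same tweak can be captured in a .reg file you import on each server. This is just a sketch of the registry change described above; double-check the path against your own system before importing:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanmanServer\Parameters]
"DisableStrictNameChecking"=dword:00000001
```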

Changing the Server Name for Automation Server

Thursday, July 29th, 2010

It’s not without some controversy (OK, “spirited discussion”), but I’ve strongly recommended the use of host files to aid environment portability.  If you’re a believer in this “alias” approach, you’ll find that for some components, it isn’t very obvious how to set up those aliases.  This isn’t quite a host file hack, but serves the same purpose:  when you migrate the database from one environment to the other, you want to avoid having to change as many settings as possible.

One of these settings is the ALUI Automation Server: in "Select Utility: Automation Service", you get a list of servers running the Automation Service, and can set which administrative folders are associated with which Job (aka Automation) Servers.  If you migrate the portal database between environments, you might have one entry show up for "PRODPORTAL3" (in prod) and another for "PORTALDEV9" (in dev).  But then in the dev environment you have to re-register every one of the folders that was associated with the prod server.

What if you could just create an alias that worked in both environments?  Fortunately, you can, and the tweak is easy:  Just edit %PT_HOME%\settings\configuration.xml in both environments, and change the value below to be the same thing.  Then, when the automation server in either environment starts up, it’ll look for jobs registered with that same name:

<component name="automation:server" type="http://www.plumtree.com/config/component/types/automation">
  <setting name="automation-server:server-name">
    <value xsi:type="xsd:string">WCI-AUTOMATION</value>
  </setting>
</component>

Oh, and if you're a "UI" kind of person, you can achieve the same result by changing the name through the Configuration Manager.

Use Host Files for better WCI security, portability and disaster recovery

Tuesday, June 8th, 2010

When configuring a WebCenter Interaction portal, it’s highly recommended to use host files on your machines to provide aliases for the various services.

For example, instead of referencing Publisher's Remote Server as http://PORTALPROD6.site.org:7087/ptcs/, edit the hosts file at C:\Windows\System32\drivers\etc\hosts and add a line like this (IP address first, then the alias):
10.5.38.12 wci-publisher # IP address for Publisher in this environment
… then set your Remote Server to http://wci-publisher:7087/ptcs/.
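After editing the hosts file, it's worth a quick sanity check from a command prompt. Note that nslookup queries DNS directly and bypasses the hosts file, so use ping (which goes through the normal Windows resolver) to confirm the alias resolves to the IP you expect:

```shell
ping -n 1 wci-publisher
```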

I’m always surprised how many times the knee-jerk reaction to this suggestion is that this is a poor “hack”, or something worse like this:
“Host files??? Host files on local servers need to be avoid and you should use DNS in AD for the Portal servers. Host files, again, are an antiquated and unmanageable configuration in this day and age and, in my opinion, should only be used when testing configurations—not for Production systems. I haven’t seen host files used locally on servers in a decade…is that how you are configuring this portal system? If so, I would highly recommend you try to use the AD DNS instead.”

Yes, that’s an actual response from an IT guy who prefers telling others what idiots they are rather than actually listening to WHY this approach is being used.  In all fairness, most knee-jerk reactions are based on the reality that host files are more difficult to maintain across many servers than DNS entries on a single server.  But hopefully, if you’re reading this blog, you’ve got an open mind, and will agree with this approach once you see the list of benefits below.

Benefits of using host files in your portal environments:

  1. Security.  When you access a service through the portal’s gateway, the name of the remote server shows up in the URL: http://www.integryst.com/site/integryst.i/gateway/PTARGS_0_200_316_204_208_43/http%3B/wci-pubedit%3B8463/publishereditor/action?action=getHeader.  For most people, this isn’t a huge problem, but allowing the names of the servers to be published in this way can be perceived as a security risk.  By using host files, you’re essentially creating an alias that hides the actual name of the server.
  2. Service Mobility.  Take the NT Crawler Web Service, for example.  When you crawl documents into the portal, the name of the server is included in the document open URL.  Now suppose the NTCWS is giving you all sorts of grief and you decide to move it to another server.  If you use host files, you can just install the NTCWS somewhere else and change the IP address that the wci-ntcws alias points to.  This way, the portal has no idea the service is being provided by another physical system.  If you had used a machine name, all documents would get crawled in as new the next time you ran the crawler, because the card locations would have changed.
  3. Maintainability.  This one’s a pretty weak argument, but is based on the fact that most of the time, the Portal Admin team doesn’t have access to create DNS entries and has to submit service requests to get that done.  By bringing “DNS-type services” into host files, the portal team can more easily maintain the environment by shifting around services without having to submit “all that paperwork” for a DNS entry (your mileage may vary with this argument).
  4. Environment Migration.  Here’s the clincher!  Most of us have a production and a development environment, and occasionally a test environment as well.  Normally, code is developed in dev and pushed to test, then to prod, but content is created in prod, and periodically migrated back to test and dev, so those environments are reasonably in synch for testing.  This content migration is typically done by back-filling the entire production database (and migrating files in the document repository, etc.).  The problem is, all kinds of URLs (Remote Servers, Search, Automation server names, etc.) are stored in this database, so if you’re using server names in these URLs, your dev/test environments will now have Remote Servers that point to the production machines, and you need to go through and update all of these URLs to get your dev environment working again!  If, however, you use host files, then you can skip this painful step:  your Publisher server URL (http://wci-publisher:7087/ptcs/) can be the same in both environments, but the host files in dev point to different machines than the ones in production.  Cool, huh?
  5. Disaster Recovery.  This is essentially the same as the “Environment Migration” benefit:  When you have a replicated off-site Disaster Recovery environment, by definition your two databases are kept in synch in real-time (or possibly on a daily schedule of some sort).  If a disaster occurs and you lose your primary environment, you’re going to want that DR site up as soon as possible, and not have to go through changing all those URLs to get the new environment running with new machine names.  Of course, unlike “Environment Migration” (where your dev, test, and prod environments typically share the same DNS server), this argument is also slightly weaker.  Since the DR site will likely have its own DNS server, you could conceivably just use different DNS entries at the two different sites and all will work fine.
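Since the whole scheme depends on every server's hosts file defining the same set of aliases, it can be handy to script a quick check. A minimal sketch in Python – the alias names (wci-publisher, wci-ntcws, wci-nas) are just the examples from this post, and this helper is my own convenience, not part of WCI:

```python
# Sketch: check that a hosts file defines the WCI aliases an
# environment expects. Adjust REQUIRED_ALIASES for your own setup.

REQUIRED_ALIASES = {"wci-publisher", "wci-ntcws", "wci-nas"}

def hosts_aliases(hosts_text: str) -> set[str]:
    """Return every hostname defined in hosts-file text, lowercased."""
    aliases = set()
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        parts = line.split()
        if len(parts) >= 2:                    # IP followed by one or more names
            aliases.update(name.lower() for name in parts[1:])
    return aliases

def missing_aliases(hosts_text: str) -> set[str]:
    """Which required aliases are absent from this hosts file?"""
    return REQUIRED_ALIASES - hosts_aliases(hosts_text)
```

Run it against C:\Windows\System32\drivers\etc\hosts on each box before a migration and you'll know immediately if an alias was missed.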

So that’s it – hopefully you’re convinced that host files are the way to go for configuring ALUI / WCI portals; if so, stay tuned for helpful tips on how to set this up for various servers.  While Remote Servers are a no-brainer, configuring things like Automation Server and Search can be a little trickier.