Archive for June, 2010

Cool Tools 6: IE Web Developer 2

Thursday, June 24th, 2010

After upgrading from ALUI 6.1 to WCI 10gR3, all of our portlets looked … wrong.  The background color had reverted to blue, and the portlets were being cut off on the right side so you couldn’t see the toolbar buttons.  Strangely, this was only happening in IE, so we weren’t able to use Firefox’s FireBug.  Fortunately, IEInspector Software offers a similar tool called IE Web Developer 2.

Similar to FireBug, it offers basic HTTP tracing, JavaScript debugging, and, in this case, DOM/CSS analysis.  This allows you to highlight an item on the page – in this case, a portlet – and view all the styles applied to that item.  It also shows you where the style definitions are coming from:

Using it, we were able to determine that the CSS files had changed: a “table-layout: fixed” and a “background-color” definition had been added to the CSS for the column layouts.  Removing these definitions from the CSS restored the look and feel to the way it was prior to the upgrade.
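The offending rules looked something like the following sketch (the selector name and color value here are illustrative – the actual class names in the 10gR3 stylesheets will differ):

```css
/* Added by the 10gR3 upgrade -- deleting these two declarations
   from the column-layout rules restored the pre-upgrade look. */
.portletColumn {
    table-layout: fixed;       /* caused portlets to be cut off on the right */
    background-color: #d6e4f1; /* reverted the portlet background to blue */
}
```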

How did we update the hundreds of CSS files we had?  Well, that’s a post for another day…

Bug Blog 6: Fix Broken File Downloads in 10gR3 (Part Trois)

Sunday, June 20th, 2010

*sigh*.  You’ve upgraded from ALUI to WCI 10gR3, and your Knowledge Directory links got all screwed up, didn’t they?  HTML files now throw an open/save dialog, some documents don’t open, you can’t copy links by just right-clicking them and choosing “Copy Shortcut”, and IE throws a popup blocker when you click a link in the Knowledge Directory, doesn’t it? 

You got the “To help protect your security, Internet Explorer blocked this site from downloading files to your computer” blues, huh?

I’ve tried creating a Plumtree Filter, and that worked pretty well, but not quite enough.

I’ve tried tweaking the Portal’s Javascript files, and THAT worked pretty well, but again, not quite well enough.

So, today, my friends, third time’s a charm: rather than trying to fix this on the server side, we’re going to knock out this issue once and for all on the client side.  Check out those previous blog entries for a more detailed description of the problem, but basically, it comes down to the portal using a crazy, convoluted way of opening documents via JavaScript.

What we’re going to do today is stick some JavaScript in the footer of the page.  This JavaScript simply finds all those back-asswards links and replaces them with NORMAL <a href=xxx target=_new> links.  If you add it to the footer of your site (specifically, the footer used for the KD – but the code is smart enough to do nothing if it ends up on other pages), it should take care of the rest.  Just make sure you add it to the Presentation Template and not the Content Item, because you-know-who only knows what a mess the out-of-the-box rich text editor makes of JavaScript and adaptive tags.
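The general shape of that footer script is sketched below.  This is a minimal illustration, not the actual code: the `commonOpen(...)` href pattern is a placeholder, so inspect the links your portal version actually emits in the Knowledge Directory and adjust the regex to match.

```javascript
// Sketch: rewrite the portal's JavaScript-based document links into
// plain <a href="..." target="_new"> links.  The "commonOpen" name
// below is hypothetical -- match it to your portal's real markup.
function extractDocUrl(href) {
  // Pull the gatewayed document URL out of a link like:
  //   javascript:commonOpen('http://portal/.../document.doc')
  var m = /^javascript:\w+\('([^']+)'\)/.exec(href || '');
  return m ? m[1] : null;
}

function fixDownloadLinks(doc) {
  var links = doc.getElementsByTagName('a');
  var fixed = 0;
  for (var i = 0; i < links.length; i++) {
    var url = extractDocUrl(links[i].getAttribute('href'));
    if (url) {               // only KD-style links match, so this is
      links[i].href = url;   // a no-op on every other portal page
      links[i].target = '_new';
      links[i].onclick = null;
      fixed++;
    }
  }
  return fixed;
}

// In the footer, run once the page has rendered:
if (typeof document !== 'undefined') {
  fixDownloadLinks(document);
}
```

Because `fixDownloadLinks` only rewrites anchors whose href matches the script-call pattern, it is safe to leave in a footer that is shared across pages.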

The code after the break should be self-explanatory, but as always, if you’ve got a question or comment, feel free to post it here.  Also note that this code only fixes the download links; it doesn’t kill that “Open/Save” dialog box for things like HTML files.  For that you’ll still need the Plumtree Filter.

The dependencies of WebCenter Analytics

Wednesday, June 16th, 2010

Analytics straddles a weird middle ground between being a patch and a full release (it’s downloaded from a different location than a typical patch would be).

In the release notes, it looks like kind of a pain to install, since there’s mention of having to install other software before you can install Analytics:

And, given that the versions provided are significantly further back than what’s currently available (as of this writing, Hibernate is now at version 3.5.2 and Cewolf is now at 1.0), you might be inclined to hold off on the upgrade until the next release – I know I was initially.

The good news is that now that I’ve actually been through this upgrade, it turns out that it’s not as hard as it sounds.  Contrary to the release notes, you don’t need to “install [the apps] before you install Oracle WebCenter Analytics”.  You just need to download the zips, run the installer, and point the installer at wherever you’ve saved those files.  Pretty straightforward, actually…

Kill those pesky MSXML 5.0 warnings in ALUI and WebCenter Interaction

Saturday, June 12th, 2010

This is annoying, right? 


Ah, we’ve all seen that wacky IE warning: “This website wants to run the following add-on: ‘MSXML 5.0’ from ‘Microsoft Corporation’…”  Sure, you could just accept it and move on, like we’ve been doing for years:


… but your users were trained better than that, right?  They know not to accept an add-on unless they know what it does.  And, well, since even I have no idea what this does, let’s take a look at how to just get rid of this message.  I found this little gem of a fix in Oracle’s Support Center; apparently this has been a bug since Jan 2007:
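For context, the prompt is typically triggered by page script instantiating the MSXML 5.0 ActiveX control (`new ActiveXObject("MSXML2.DOMDocument.5.0")`).  This isn’t Oracle’s exact patch, but one generic way to sidestep the warning is to prefer the MSXML versions that ship with the OS and only fall back to 5.0 as a last resort.  The preference list and function names below are illustrative; the availability check is passed in as a predicate so the selection logic can be tested outside IE:

```javascript
// Sketch: choose an MSXML ProgID without defaulting to the 5.0
// control, which is what triggers the IE add-on warning.
function pickMsxmlProgId(isAvailable) {
  // Preference order is illustrative: 6.0 and 3.0 ship with the OS,
  // so 5.0 is tried only if nothing else is installed.
  var candidates = [
    'MSXML2.DOMDocument.6.0',
    'MSXML2.DOMDocument.3.0',
    'MSXML2.DOMDocument.5.0'
  ];
  for (var i = 0; i < candidates.length; i++) {
    if (isAvailable(candidates[i])) {
      return candidates[i];
    }
  }
  return null;
}

// In IE, "available" means ActiveXObject construction succeeds:
function activeXAvailable(progId) {
  try {
    new ActiveXObject(progId);
    return true;
  } catch (e) {
    return false;
  }
}
```

In the browser you’d call `pickMsxmlProgId(activeXAvailable)` once and reuse the returned ProgID wherever the page creates XML documents.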

Use Host Files for better WCI security, portability and disaster recovery

Tuesday, June 8th, 2010

When configuring a WebCenter Interaction portal, it’s highly recommended to use host files on your machines to provide aliases for the various services.

For example, instead of referencing Publisher’s Remote Server by its machine name, add an entry to the host file at C:\Windows\System32\drivers\etc\hosts, with a line like this:
wci-publisher #IP Address for Publisher in this environment
… then set your Remote Server to http://wci-publisher:7087/ptcs/.
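Putting that together, the hosts entries look like this (the IP addresses here are placeholders – use each service’s actual address in each environment):

```
# C:\Windows\System32\drivers\etc\hosts
# Format: <IP address>   <alias>        # comment
10.10.1.25    wci-publisher   # IP address for Publisher in this environment
10.10.1.26    wci-ntcws       # IP address for the NT Crawler Web Service
```

Because the alias, not the machine name, is what ends up stored in the portal’s Remote Server URL (http://wci-publisher:7087/ptcs/), moving a service to a different box only requires editing the one line above.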

I’m always surprised how many times the knee-jerk reaction to this suggestion is that this is a poor “hack”, or something worse like this:
“Host files??? Host files on local servers need to be avoid and you should use DNS in AD for the Portal servers. Host files, again, are an antiquated and unmanageable configuration in this day and age and, in my opinion, should only be used when testing configurations—not for Production systems. I haven’t seen host files used locally on servers in a decade…is that how you are configuring this portal system? If so, I would highly recommend you try to use the AD DNS instead.”

Yes, that’s an actual response from an IT guy who prefers telling others what idiots they are to actually listening to WHY this approach is being used.  In all fairness, most knee-jerk reactions are based on the reality that host files are more difficult to maintain on many servers than DNS entries on a single server.  But hopefully, if you’re reading this blog, you’ve got an open mind and will agree with this approach once you see the list of benefits below.

Benefits of using host files in your portal environments:

  1. Security.  When you access a service through the portal’s gateway, the name of the remote server shows up in the URL: PTARGS_0_200_316_204_208_43/http%3B/wci-pubedit%3B8463/publishereditor/action?action=getHeader.  For most people, this isn’t a huge problem, but allowing the name of the servers to be published in this way can be perceived as a security risk.  By using host files, you’re essentially creating an alias that hides the actual name of the server.
  2. Service Mobility.  Take the NT Crawler Web Service, for example.  When you crawl documents into the portal, the name of the server is included in the document open URL.  Now suppose the NTCWS is giving you all sorts of grief and you decide to move it to another server.  If you use host files, you can just install the NTCWS somewhere else and change the IP address that the wci-ntcws alias points to.  This way, the portal has no idea the service is being provided by another physical system.  If you had used a machine name, all documents would get crawled in as new the next time you ran the crawler, because the card locations would have changed.
  3. Maintainability.  This one’s a pretty weak argument, but is based on the fact that most of the time, the Portal Admin team doesn’t have access to create DNS entries and has to submit service requests to get that done.  By bringing “DNS-type services” into host files, the portal team can more easily maintain the environment by shifting around services without having to submit “all that paperwork” for a DNS entry (your mileage may vary with this argument).
  4. Environment Migration.  Here’s the clincher!  Most of us have a production and a development environment, and occasionally a test environment as well.  Normally, code is developed in dev and pushed to test, then to prod, but content is created in prod, and periodically migrated back to test and dev, so those environments are reasonably in synch for testing.  This content migration is typically done by back-filling the entire production database (and migrating files in the document repository, etc.).  The problem is, all kinds of URLs (Remote Servers, Search, Automation server names, etc.) are stored in this database, so if you’re using server names in these URLs, your dev/test environments will now have Remote Servers that point to the production machines, and you need to go through and update all of these URLs to get your dev environment working again!  If, however, you use host files, then you can skip this painful step:  your Publisher server URL (http://wci-publisher:7087/ptcs/) can be the same in both environments, but the host files in dev point to different machines than the ones in production.  Cool, huh?
  5. Disaster Recovery.  This is essentially the same as the “Environment Migration” benefit:  When you have a replicated off-site Disaster Recovery environment, by definition your two databases are kept in synch in real-time (or possibly on a daily schedule of some sort).  If a disaster occurs and you lose your primary environment, you’re going to want that DR site up as soon as possible, and not have to go through changing all those URLs to get the new environment running with new machine names.  Of course, unlike “Environment Migration” (where your dev, test, and prod environments typically share the same DNS server), this argument is also slightly weaker.  Since the DR site will likely have its own DNS server, you could conceivably just use different DNS entries at the two different sites and all will work fine.

So that’s it – hopefully you’re convinced that host files are the way to go for configuring ALUI / WCI portals; if so, stay tuned for helpful tips on how to set this up for various servers.  While Remote Servers are a no-brainer, configuring things like Automation Server and Search can be a little trickier.

There’s a WCI App For That 3: Automater

Friday, June 4th, 2010

Let’s face it: sometimes even the simplest actions in the portal can take WAY too many clicks.  Take creating a single community:  you need to pick a community template, create pages, create Publisher portlets and add them to the pages, create Collab Projects, set security, etc.  And many times, most of this information is the same for every new community.  Community Templates help, but it still requires a LOT of clicks.

Integryst’s Automater product tackles this problem by allowing you to script multiple portal actions, only prompting for the unique information each time.  For example, one client wanted to create a “Network”, which consists of 7 communities, dozens of portlets, and multiple admin and regular users.  It literally took over 100 clicks each time – and eventually there are going to be 100+ of these “networks”.  That’s a lot of clicks!

Not with Automater, though.  Administrators just enter the information that’s unique to each network, and those 100 clicks are rolled into one:

All kinds of actions can be scripted – and secured so that only certain groups have access to specific actions:


Analysis Paralysis

Wednesday, June 2nd, 2010