Using PsLoggedOn to troubleshoot Blue Coat Single Sign-On (SSO)

I won’t go into an introduction of what SSO is or Blue Coat’s implementation of it; there are some good KB articles on the subject. The following article (hopefully to be published soon on the Blue Coat KB) explores how SSO interacts with Windows and how to use the Sysinternals PsLoggedOn tool to troubleshoot it.


When using Windows SSO, authentication intermittently fails with:

“Last Error: The user could not be determined by the Single Sign-on agent.”

FTP clients will show the following error:


A policy trace on the proxy will show the same error:

“Last Error: The user could not be determined by the Single Sign-on agent.”

The proxy will not show anyone as logged in via SSO (https://a.b.c.d/Auth/User-Logins/Summary/Realm/ ):


This happens even though the user is actually logged into the domain.

Explanation and troubleshooting:

The reason this happens actually comes down to the inner workings of Windows. SSO has two modes: “Domain Controller Querying” (DCQ) and “Client Querying”.

DCQ uses an API provided by Microsoft called “NetSessionEnum”. Even though the user is logged in, BCAAA does not see the user as logged in, because this API itself does not report the user’s session.

A useful tool for checking which users the API is returning is PsLoggedOn, a Microsoft Sysinternals tool available from the Microsoft website:

This tool utilises the same API that BCAAA does when remotely checking the users logged in on a server (hence the “resource shares” heading in the screenshot below). In this particular example, we see that the tool returns the following:


Note that PsLoggedOn must be run as the same user that BCAAA runs as, and should be run on the domain controller without any arguments.

From the client PC, we then access a shared folder on the domain controller, and run the tool again on the Domain Controller:


Now we check the proxy and we see the users logged in correctly:


If we check the BCAAA debug logs (see the relevant KB article for more information on the SSO debug logs), we also see the correct user:

2011/04/14 09:30:41.034 DAVVAS CHILD

Why is the issue intermittent?

The Windows API does not always return the user. It seems that after a period of inactivity, the API “times out” the user. BCAAA mitigates this with a “time to live” (TTL) setting, controlled in the sso.ini file:
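As an illustration only — the section and key names below are assumptions from memory rather than taken from Blue Coat documentation, so check your own sso.ini for the real ones — the idea is a single time-to-live value in seconds:

```
; sso.ini (fragment, illustrative names only)
[DCQSetup]
; seconds to keep reporting a user as logged in after the
; domain controller query stops returning their session
TTL=900
```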


But if the TTL expires and the API still does not return the username, the error is displayed.

Why does it work more reliably when using IWA?

When using IWA rather than SSO, BCAAA does not use the same Windows API; it actually attempts a logon against the domain controller (using another API, NetLogon).

What can be done to mitigate this:

– Enable both Domain Controller Querying and Client Querying.

To do this, you first need to change the ProxySG configuration under Configuration > Authentication > Windows SSO > Agents:


This will cause BCAAA to not only rely on the API we spoke about previously, but also check who is logged in on the client (using yet another API, NetWkstaUserEnum). There are a couple of pointers here:

  • Advantages
    – More current data than DCQ

  • Disadvantages
    – Workstations must have port 445 open (possible security risk)
    – The Remote Registry service must be enabled (disabled by default on Vista and later)
    – The BCAAA service must run as a domain user / admin that can query the clients
    – The client query will fail if the workstation reports that more than one user is logged in

Again, PsLoggedOn uses the same API. To test, run PsLoggedOn on the domain controller as the same BCAAA user, adding \\w.x.y.z as an argument (where w.x.y.z is the client IP).
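As a sketch, the remote client query looks like this from a command prompt (w.x.y.z stands for the client IP, as above):

```
C:\> psloggedon.exe \\w.x.y.z
```

The list of users logged on locally at that workstation is exactly the data BCAAA’s client query relies on.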

If any of those requirements are not met, PSloggedon will show:


If all the above conditions are met, PSloggedon will show:


If multiple users are logged in, you will see multiple users under “users logged on locally” and the proxy will throw an error.

Some other useful links when troubleshooting BCAAA SSO:

Single sign-on failed due to Appliance Error (configuration_error)

Window SSO realm authentication failed, browser may received error message "The user could not be determined by the Single Sign-on agent."


SQUID + GreasySpoon : enhancing your proxy deployment with content adaptation

When comparing the two proxy solutions I am most familiar with, Blue Coat ProxySG and SQUID, the most striking difference is how easily the Blue Coat can change and modify the traffic passing through it. For the Blue Coat-savvy among you, adding a “Web Access” and “Web Content” layer in policy allows you to modify traffic by adding headers, cookies, notify pages, and so on. This sort of modification is what is known as “Content Adaptation”. A SQUID article explains the various options available to SQUID users for doing this:

It definitely doesn’t look very easy to do this. The easiest way I’ve found is using an Internet Content Adaptation Protocol (ICAP) server to modify the traffic for SQUID. I won’t go into much detail on ICAP; in a nutshell, the SQUID proxy sends traffic of interest (such as HTTP) over to the ICAP server, which parses it, modifies it, and sends it back to the SQUID server. This opens up a lot of opportunities for achieving the same sort of Blue Coat functionality I mentioned previously, such as adding headers, cookies, inserting company headers within a website, and much more.

The easiest and most flexible open-source ICAP server I’ve come across is GreasySpoon:

It requires some programming knowledge, so it’s not as easy for first-timers, but the upside is that the possibilities are endless, on top of it performing well and being cross-platform. If you are going to go through with setting this up, I advise reading through the website; they have some good documentation and script samples.

In this article I’ll be logging my test setup where I’ve used a CentOS 5 machine to host a SQUID proxy and a GreasySpoon server. As a test case, I wanted to instruct GreasySpoon to insert a header into YouTube server responses to force clients to use the HTML5 version of the YouTube site.

– Setting up SQUID proxy server

The first step is installing a SQUID proxy on the server. In order to include ICAP functionality you need a later SQUID version (3.x). The SQUID versions I found in the CentOS repositories were v2.x, so this necessitated building SQUID from source. The only two prerequisite packages I needed to download to do this were gcc and gcc-c++. From there, the process is quite standard:

  • Download latest package (v3.1 in my case)
  • Run ./configure --enable-icap-client to enable ICAP functionality on the SQUID proxy
  • Run make and make install
  • It should install successfully. Modify the squid conf file to your needs and start the proxy
  • Test the proxy by using a browser pointing to the proxy IP
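Put together, the build steps above look something like this from a shell (the version number and download URL here are illustrative; use whatever the current 3.x release is):

```
wget http://www.squid-cache.org/Versions/v3/3.1/squid-3.1.12.tar.gz
tar xzf squid-3.1.12.tar.gz
cd squid-3.1.12
./configure --enable-icap-client
make
make install
```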

– Setting up the GreasySpoon ICAP server

The installation of GreasySpoon is a breeze; the only thing to point out is to make sure you have the correct Java version installed, otherwise the built-in JavaScript engine will not be enabled, leaving you with no language to use to modify the traffic.

In my case, CentOS already had OpenJDK installed, but I ran into problems anyway because the JavaScript language was not selectable. I needed to download the Sun Java package and install that.

Download the tar.gz package from the GreasySpoon site, extract it, and modify the file “greasyspoon” so that the JAVA_HOME variable points to the Java home directory, in my case /usr/java/jre1.6.0/.

To start GreasySpoon, give the greasyspoon file executable permission (chmod +x greasyspoon), then start the server: ./greasyspoon start

You should now be able to reach the admin interface of the server by visiting http://localhost:8088 on the server.

To double-check, run netstat to make sure the server is listening on the ICAP port, 1344.

– Set up the SQUID + GreasySpoon interaction

This part of the setup informs SQUID to send traffic over to the GreasySpoon server. The basic instructions to do this are already explained here:

In a production environment you may want to modify this configuration a bit so not all traffic is sent to the ICAP server, usually only a subset of traffic should be sent.
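As a sketch of what this looks like — the directive syntax below is for SQUID 3.1, the service name gs_resp is arbitrary, and the GreasySpoon response-mode URL is the default one, so adjust to your setup — the relevant squid.conf lines are along these lines:

```
icap_enable on
icap_service gs_resp respmod_precache bypass=0 icap://127.0.0.1:1344/response
adaptation_access gs_resp allow all
```

To send only a subset of traffic, replace `allow all` with an ACL that matches just the destinations you care about.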

– Writing a GreasySpoon script to modify HTTP traffic to YouTube

In my case, I wanted GreasySpoon to check for the presence of a certain cookie (PREF=f2) and, if it is not there, instruct the client to set it using the Set-Cookie HTTP header. This cookie controls whether YouTube is displayed in HTML5 or not.

In the GreasySpoon admin interface, navigate to the tab “greasyspoon scripts” > response scripts > new script. Add a name and leave the language as ECMAScript. Here’s the script itself:

// This is a GreasySpoon script.
// --------------------------------------------------------------------
// WHAT IT DOES: force HTML5 version of youtube
// --------------------------------------------------------------------
// ==ServerScript==
// @name           youtube_HTML_5
// @status         on
// @description    force browser to request the HTML5 version of youtube
// @include        .*youtube.*
// @exclude
// @responsecode   200
// ==/ServerScript==
// --------------------------------------------------------------------
// Available elements provided through ICAP server:
// ---------------
// requestedurl   :  (String) Requested URL
// requestheader  :  (String) HTTP request header
// responseheader :  (String) HTTP response header
// httpresponse   :  (String) HTTP response body
// user_id        :  (String) user id (login or user ip address)
// user_group     :  (String) user group or user fqdn
// sharedcache    :  (hashtable<String, Object>) shared table between all scripts
// trace          :  (String) variable for debug output - requires log level FINE
// ---------------
var headerstring = "Cookie: ";
var c = requestheader.indexOf(headerstring) + headerstring.length;
var c1 = requestheader.indexOf("\r\n", c);
var Cookiestring = requestheader.substring(c, c1);

// If the PREF=f2 cookie is not present, ask the client to set it
if (Cookiestring.indexOf("f2") < 0) {
    responseheader = responseheader + "Set-Cookie: PREF=f2=40000000; path=/;\r\n";
}

Most of it is just comments, but pay attention to the @include and @exclude lines, since they control which sites the script will be applied to.

The rest of the comments describe which variables are available to you, the programmer, for use in the script.

The actual script (starting at headerstring = "Cookie: ") is an adaptation of a sample script from the site; it simply checks for the presence of the cookie and, if not found, sends the Set-Cookie header to the client.
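The same check can be sketched as a standalone function in plain JavaScript, outside GreasySpoon (the function name is mine, for illustration only), to make the logic easier to follow and test in isolation:

```javascript
// Sketch of the cookie check done by the GreasySpoon script above.
// Returns true when the Set-Cookie header still needs to be sent.
function needsPrefCookie(requestheader) {
    var headerstring = "Cookie: ";
    var c = requestheader.indexOf(headerstring);
    if (c < 0) {
        return true; // no Cookie header present at all yet
    }
    c += headerstring.length;
    var c1 = requestheader.indexOf("\r\n", c);
    var cookiestring = requestheader.substring(c, c1);
    return cookiestring.indexOf("f2") < 0; // PREF=f2 not set yet
}
```

One small difference from the sample: it guards against the request having no Cookie header at all, where a blind indexOf would otherwise compute a bogus substring.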

Save and enable the script.

That’s about it. You can see whether the script is being applied from the “Data” tab > logs > access logs section of the GreasySpoon interface.

This was just an example, to show that with a bit of persistence and programming, a free, open-source solution can match the functionality and flexibility of much more expensive commercial solutions. Of course, it’s not as easy to set up, use, and maintain, but I still think this is a fantastic tool and setup that gives any network admin great granularity of control over their proxy traffic 🙂

PS: GreasySpoon can also serve as a very flexible ICAP server for Blue Coat; all that’s needed is a Web Content policy rule that forwards traffic via ICAP to the GreasySpoon server.