BotDetect CAPTCHA ASP.NET Persistence Requirements & Options FAQ (BotDetect v3.0; deprecated)

This page answers frequently asked questions about BotDetect ASP.NET Captcha persistence requirements and options.


Is ASP.NET Session State Required for BotDetect Captcha?

Yes, ASP.NET Session State is a requirement for BotDetect.

Please note that it doesn't have to be the default InProc memory Session State – you can use any Microsoft or custom Session provider you like (database, state server, file system, etc.). BotDetect doesn't require a specific mode of persistence; it will work as long as it can save data somewhere.

Since ASP.NET provides a way to customize the exact mode of persistence and hook it up to the default Session object (as explained at http://msdn2.microsoft.com/en-us/library/aa479034.aspx), all persistence calls in the BotDetect code are made using the Session object.

Why Exactly is ASP.NET Session State Required for BotDetect?

BotDetect Captcha requires Session state to function properly. Let's examine the Captcha workflow to see why this is the case. To keep things simple, we'll disregard Captcha sounds and other non-essential elements.

  1. The user opens your form / the user's browser issues a GET Http request for Default.aspx (for example).

    Since the Captcha control is included in your page, its Html markup will be added to the Http response. The Html markup includes an <img> element representing the Captcha image.

    At this stage, no Captcha code has been generated yet, but various Captcha settings may already have been stored in the Session State if you set any Captcha properties (e.g. CodeLength) in your page code-behind.

  2. The user's browser parses the Html response, and when it comes to the <img> element, it issues another Http GET request for the Captcha image (BotDetectCaptcha.ashx?get=image...). The random Captcha code is generated on the server, then an image containing that code is rendered and sent to the user's browser.

  3. When the user types in the Captcha code and submits the page, a third Http request (POST) is sent from the user's browser to your server. The user's input is compared to the code stored on the server.

Basically, since Http requests and responses are stateless, the Captcha code and other BotDetect values pertaining to the current Captcha for each visitor have to be stored somewhere between requests - otherwise, how could we compare the user input with the correct Captcha code?

Viewstate and other forms of client-side persistence are simply not suitable for Captcha persistence, since they are inherently insecure - all data sent to the client can be read and tampered with by malicious clients, and storing the Captcha code on the client in any form would significantly lower the Captcha security.

This leaves us with server-side persistence. Or more precisely, considering the type of data stored and how it's used, we need server-side per-visitor short-term storage. While it would be possible to implement a custom storage mechanism that meets these requirements without implementing the ASP.NET Session provider interface, it would be a Session provider in everything but the name.
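As a conceptual illustration only (this is not BotDetect's actual implementation – the control performs the equivalent steps internally, and the helper and field names below are hypothetical), the server-side storage pattern described above boils down to something like:

```csharp
// Conceptual sketch; BotDetect handles this internally.

// Request 2 (image generation): create the code and store it server-side.
string captchaCode = GenerateRandomCode();              // hypothetical helper
HttpContext.Current.Session["CaptchaCode"] = captchaCode;

// Request 3 (form post): compare the user's input to the stored code.
string userInput = Request.Form["CaptchaCodeTextBox"];  // hypothetical field name
string expected = (string)HttpContext.Current.Session["CaptchaCode"];
bool isValid = String.Equals(userInput, expected,
    StringComparison.OrdinalIgnoreCase);
```

Since the code never leaves the server in any form, the only way for a client to pass validation is to actually solve the Captcha.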

Why doesn't BotDetect offer a client-side persistence option? After all, other Captcha services don't store everything in the Session, but rather transfer an ID or a hashed token to the client.

That's exactly how BotDetect v1.0 used to work, and we abandoned that approach because it was a major security flaw.

Captchas with any kind of client-side persistence can be bypassed easily. For example, if a hashed token of the Captcha code is kept on the client, a simple replay attack can reuse a single correct Captcha code thousands of times. Implementation details vary, but all forms of client-side persistence can be tampered with and attacked by malicious clients.

BotDetect uses server-side persistence (regardless of the actual medium used to store the data: server memory, state server, database, ...) by design - and we chose that approach because we believe it to be the right way to implement a secure Captcha.

Why does BotDetect require specifying the <sessionState> sessionIDManagerType attribute in web.config?

The custom SessionIDManager BotDetect uses is very simple, and shouldn't affect any other parts of your application.

The BotDetect CustomSessionIdManager Purpose

We found a problem with Captcha sounds on clients using certain combinations of browser, OS and media player versions (for example, Safari 5 on Windows with Quicktime 7, or IE 7.0 on Windows Vista with Windows Media Player 11), resulting in Captcha sound Http requests being sent without the ASP.NET Session cookie.

We implemented a small workaround that passes the SessionID (encrypted, to prevent Session hijacking) in Captcha sound Urls as an additional querystring parameter. This processing only happens for BotDetect Captcha sound paths, and minimizes the effects of any possible invalid input values.

Please note that the problem is caused by third parties, and we cannot fix the issue since we don't control the source code of that software. The best we can do is provide a workaround for the problem – and specifying the CustomSessionIdManager in web.config does the trick.

Integration Options

If this doesn't suit you for any reason, please note that you don't have to use the BotDetect CustomSessionIDManager. You can achieve the same effect (working around cookie limitations of some clients) by using Cookieless ASP.NET Session State, for example.
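For example, assuming an otherwise default configuration, a cookieless <sessionState> declaration (which removes the dependency on the Session cookie, and with it the need for the custom SessionIDManager workaround) could look like:

```xml
<sessionState mode="InProc" cookieless="true" timeout="20" />
```

Keep in mind that this changes all application Urls to include the SessionID, as discussed in the cookieless Sessions questions further down this page.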

Our ASP.NET application already uses a custom SessionIDManager. Will this cause a clash with the BotDetect custom SessionIDManager?

Since the BotDetect custom SessionIDManager type is public, you can always integrate it into your custom GetSessionID() implementation. Instead of calling

string sessionID = base.GetSessionID(context);

and then overriding the value based on your custom logic, you can simply call

BotDetect.Web.CustomSessionIdManager bdManager = 
    new BotDetect.Web.CustomSessionIdManager();
    
string sessionID = bdManager.GetSessionID(context);

and leave the rest of your code unmodified.

Can I use BotDetect ASP.NET Captcha in a Web Farm / Web Garden?

Yes, but keep in mind that you have to ensure proper ASP.NET Session state functionality in load-balanced scenarios.

When using the default Session configuration, for example, if the user loads the Captcha image from one server, but the code they type in as the solution is sent to another server, Captcha validation will fail even for people who type in the correct code. And if their Captcha sound request lands on a server other than the one that generated the Captcha image, the spoken Captcha code will be wrong.

There are two general ASP.NET Session state options appropriate for load-balanced uses.

Option 1: In-Process ASP.NET Sessions + Sticky Connections

You can keep ASP.NET Session state saved in the IIS process memory on each server, as long as you configure "sticky connections" on your load balancer. Session state can be kept in server memory only if all Http requests coming from the same user are routed to the same server.

This will depend on your load balancer settings and configuration options. If you are having problems, one step that could help with troubleshooting is to add a temporary debug Http response header with different values on different web servers (website properties in IIS manager, HTTP Headers tab, Custom headers). For example, X-Debug-Server: server1, X-Debug-Server: server2, X-Debug-Server: server3, etc. This way you can test the sticky connection settings and check whether requests are being switched between servers on consecutive requests.
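If you prefer configuration files over the IIS manager UI, the same debug header can be declared in each server's web.config (this applies to IIS 7 or later in integrated pipeline mode; the header name and values are arbitrary examples):

```xml
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <!-- use a different value on each load-balanced server -->
      <add name="X-Debug-Server" value="server1" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```

Remember to remove the header once you have finished troubleshooting, since it reveals information about your server topology.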

Please note that there are a few scenarios in which this is not an option, and you should use a single server for Session State persistence on all of your load-balanced servers (see Option 2 below).

Roaming Users

For example, AOL users connect through client proxies, and don't necessarily have the same IP address on subsequent requests. Depending on your load balancer settings, this might cause "sticky connections" to stop working for them, so their Captcha requests land on the wrong server, without the Session variables they need. This can be solved by the hardware load balancer adding a cookie of its own, or using a similar tracking mechanism that works even when users change their IP address and/or User Agent. Please check your load-balancer documentation to determine if this is an option.

Web Gardens

You should also check your Application Pool settings, namely the "Maximum number of worker processes" parameter (when editing application pool properties in IIS 6.0 Manager, it's the last setting on the second tab, "Performance") - it should always be set to "1". If you have multiple IIS processes running and incoming requests can be handled by any particular process instance, keeping Session values in process memory won't work.
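On IIS 7 or later, the equivalent setting lives in the application pool's <processModel> element; a hedged sketch of the relevant applicationHost.config fragment (the pool name is an example) would be:

```xml
<!-- %windir%\System32\inetsrv\config\applicationHost.config (IIS 7+) -->
<system.applicationHost>
  <applicationPools>
    <!-- "MyAppPool" is an example name; use your own pool's name -->
    <add name="MyAppPool">
      <!-- keep a single worker process, so InProc Session state works -->
      <processModel maxProcesses="1" />
    </add>
  </applicationPools>
</system.applicationHost>
```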

Option 2: Central ASP.NET Session Persistence Server

You can keep ASP.NET Session state on just one server for all load-balanced servers; if all servers store their state in a globally-accessible location, the problem is also solved. This means using either the StateServer or SQLServer session state mode, for example:

<sessionState mode="StateServer" 
  stateConnectionString="tcpip=192.168.1.5:42424"
  cookieless="AutoDetect" timeout="20" 
  sessionIDManagerType="BotDetect.Web.CustomSessionIdManager, BotDetect" />

or

<sessionState mode="SQLServer" 
  sqlConnectionString="data source=192.168.1.5;Trusted_Connection=yes" 
  cookieless="AutoDetect" timeout="20" 
  sessionIDManagerType="BotDetect.Web.CustomSessionIdManager, BotDetect" />

You will also have to configure either the ASP.NET State Service or the ASP.NET Session State Database on the server chosen to keep Session information (and of course, replace 192.168.1.5 in the above configurations with the correct connection strings). You can find more information about this topic in MSDN documentation about the available Session State modes.

External Resources

For more information on these topics, consult the following online resources:

How do I enable ASP.NET Session State in SharePoint server? I keep getting SessionTroubleshooting: SessionHttpModule not running errors.

BotDetect requires ASP.NET Session state to function properly, which should be enabled for your SharePoint site. Also, to ensure Captcha sounds work properly in all browsers, a custom SessionIDManager (implementing an optional but recommended improvement, as explained in the BotDetect FAQ) should be registered.

In the <system.web> section of the Web.config file, locate the <sessionState> element if it exists, or add it if it doesn't. Add the following attribute to the declaration:

<sessionState mode="InProc" cookieless="AutoDetect" timeout="20" 
  sessionIDManagerType="BotDetect.Web.CustomSessionIdManager,
  BotDetect, Version=3.0.16.0, Culture=neutral, 
  PublicKeyToken=74616036388b765f" />

If you want to use a different Session State mode or options, you can change any settings except the sessionIDManagerType – which should point to the BotDetect class as specified above, if you want Captcha sounds to work reliably.

Make sure the ASP.NET Session State HttpModule is allowed in the <system.web> -> <httpModules> section of the SharePoint Web.config file - if the following element is missing or commented out, please add it:

<httpModules> 
  <add name="Session" 
    type="System.Web.SessionState.SessionStateModule" />
</httpModules>

The same HttpModule should also be added to the <system.webServer> -> <modules> section:

<modules runAllManagedModulesForAllRequests="true">
  <remove name="Session" />
  <add name="Session" 
    type="System.Web.SessionState.SessionStateModule" />
</modules>

Make sure ASP.NET Session State is allowed in the <pages> section of the SharePoint Web.config file - the enableSessionState attribute value should be set to true:

<pages enableSessionState="true" enableViewState="true" 
  enableViewStateMac="true" validateRequest="false" 
  pageParserFilterType="Microsoft.SharePoint.ApplicationRuntime.
    SPPageParserFilter, Microsoft.SharePoint, Version=14.0.0.0, 
    Culture=neutral, PublicKeyToken=71e9bce111e9429c" 
  asyncTimeout="7">

We're trying to use BotDetect Captcha on a form that is <iframe>-d into another page that's on a different domain. The Captcha works in Firefox, but not in IE (version 6 or 7). Have you seen this problem before?

The Captcha doesn't work in this scenario because ASP.NET Session state is not being persisted properly for the user. The problem appears in IE because of the following:

  • On all requests after the first, the user is returned to the appropriate Session state using the ASP.NET Session cookie
  • Your Session state cookie is set for Domain #1 (with the form containing the BotDetect Captcha)
  • The main page (containing the <iframe>) is located at Domain #2
  • Cookies set for a domain different from the main page's are known as 3rd party cookies; they are a known cause of security issues, and are blocked by default in IE
  • When opening your page in IE 7, you can confirm this is really causing the issue by changing your IE settings to allow 3rd party cookies ( Tools > Internet Options > Privacy > Advanced ). When you make this change, the Captcha should start working properly

Solution

You should definitely modify your code not to use cross-domain cookies, since many users will have them blocked by default (and for good reasons).

The simplest solution is to configure your application (the form containing the Captcha) to always use cookieless ASP.NET Sessions - embedding the current SessionID in the Url path instead of a cookie.

You can do this by changing the <sessionState> element declaration in the web.config file (the same one where you registered the BotDetect HttpHandler). For example, if you are using a declaration like:

<sessionState mode="InProc" cookieless="AutoDetect" timeout="20" />

Please change it to:

<sessionState mode="InProc" cookieless="true" timeout="20" />

Note that this will dynamically change the Url of the form containing a BotDetect Captcha to include the SessionID path segment (something like /(S(vi1maxipelkbz2ahryodzirf))/), but shouldn't affect the Urls of your main page (unless you change that page's web.config file too).

After adding BotDetect CAPTCHA to my form, I've noticed that the first time I access the page, the URL modifies itself by adding a querystring: AspxAutoDetectCookieSupport=1. This querystring disappears on the next request, but is it possible to hide it completely?

The AspxAutoDetectCookieSupport=1 querystring is added automatically by ASP.NET during the cookie support detection phase. Since the <sessionState> cookieless attribute in the web.config file is set to "AutoDetect", the ASP.NET runtime tries to detect whether the user's browser supports cookies, and the querystring parameter is added during that process. If cookies are supported, the Session ID is kept in a cookie; if not, the Session ID is sent in the Url of all future requests by that user.

The only way to remove the querystring is to set the cookieless attribute to either "true" or "false" in your web.config file. But in that case, all Urls will be dynamically changed to include the ASP.NET Session identifier (cookieless="true"), or CAPTCHA validation will always fail for users who have cookies disabled in their browser (cookieless="false").

Since this is built-in ASP.NET behavior, you will have to decide which cookieless value best suits your application. You can read more about cookieless Sessions at http://msdn.microsoft.com/en-us/library/aa479314.aspx.

Why Exactly Does This Querystring Appear?

Any Session state implementation needs a way to recognize returning users, since Http requests from different users can come in any order, from different IP addresses, and even with changing user agents for some users. ASP.NET Session state can use two mechanisms to recognize that an Http request belongs to a particular previous visitor: cookies and the Url modification used in Cookieless Sessions. Depending on the "cookieless" attribute value, 3 cases are possible, each with a positive and a negative side:

  1. Cookieless=false. Cookies are always used to store the Session ID. There are no Url modifications at any time, but users who have cookies disabled won't be able to solve the Captcha.
  2. Cookieless=true. Cookies are not required or detected, but all Urls in the application are different for each new user, and Session hijacking is somewhat easier than with cookies.
  3. Cookieless=AutoDetect. Users with cookies will have clean Urls, and users without cookies will fall back to Url-modification. This option is the most robust (it works for the most users), but the price is a single extra redirect and the "?AspxAutoDetectCookieSupport=1" querystring used for cookie detection.

As you can see, no option is perfect for all use cases, and they all have specific advantages - which come at a specific price. These are the only 3 options with ASP.NET Sessions, so you have to decide on one of them.

I Need To Remove It, It's Messing Up My Google Listings!

While it would be theoretically possible to come up with a custom mechanism for recognizing returning visitors' Http requests as belonging to the same Session – which wouldn't use cookies or querystrings – this would hardly be practical (there is a reason why most web frameworks still use cookies).

Depending on your priorities, you can choose either option 1 (if only a small percentage of your visitors disables cookies) or 3 (if you want the most robust Sessions which work for the widest audience). The querystring used in case 3 is simply how ASP.NET works, and we are not aware of any way to remove it.

If your only concern with "?AspxAutoDetectCookieSupport=1" is duplicate content, you'll be glad to hear that to the best of our knowledge, it never causes any duplicate content penalties in Google or any other major index. Most websites on the internet built with ASP.NET have the exact same querystring, and they are not penalized for it. Google is quite good at recognizing the same page regardless of querystring variations (especially when there are only two Url variations involved).

If this is still a major concern for you, you can simply use the <link href="/bd3/doc/aspnet/faq/..." rel="canonical" /> element in your page <head> to tell Google which Url to use for indexing.

Adding BotDetect to the application broke my ASMX SOAP web service residing in the same project. What can I do?

Quick fix: replace <sessionState mode="InProc" cookieless="AutoDetect" in the Web Service project web.config with <sessionState mode="InProc" cookieless="false".

Explanation: ASP.NET tries to detect if the client supports cookies, and redirects it to either cookie-based or cookie-less Urls accordingly. This makes sense for regular browsers and improves the user experience, so we use it by default. However, it appears the 302 redirect it uses to achieve this (or some other aspect of cookie detection) messes up service action binding using the SOAPAction header.

While the SOAP client could theoretically be modified to handle even this redirection, it's simplest to disable ASP.NET cookie auto-detection and set the cookieless attribute to either "false" or "true".
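Assuming the default declaration shown in other examples on this page, the Web Service project's <sessionState> element would then end up as:

```xml
<sessionState mode="InProc" cookieless="false" timeout="20" />
```

With cookie detection disabled, no 302 redirect is issued, and the SOAPAction-based service binding works as expected.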

We currently use your BotDetect ASP.NET CAPTCHA control and I have a question regarding ASP.NET Cache. I have noticed a number of items being added to our web application cache named 'LBD_CaptchaRequestTracker_...'
Could you tell me why are these cache items created, and what are their timeout parameters?

The Cache values you mention are used for an optional Captcha security feature. Considering the way BotDetect works during normal user interaction, Captcha images and sounds should almost never reuse the same querystring:

  • captcha identifier ("&c" querystring parameter) stays the same for each Captcha control on each page in the application
  • instance identifier ("&t" querystring parameter) changes on each page load (postback or otherwise)
  • client-side timestamp ("&d" querystring parameter) changes on each Reload or Sound button click

Considering the above, the only visitors that will request multiple Captcha images or sounds with the exact same querystrings are simple bots.

When we detect such behavior, we don't have to bother generating the Captcha image or sound; we simply send an Http 400 Bad Request error response.

This way, we can avoid using system resources on Captcha generation for obvious bots, and we also limit the number of Captcha images or sounds such simple bots can access at all (a minor security improvement).

There are a few scenarios in which legitimate users will sometimes request the Captcha image or sound with a repeated querystring (for example, when re-opening the page from the browser cache), so we don't absolutely forbid repeated requests.

By default, we limit the number of allowed BotDetect Captcha image or sound requests with the same querystring to five requests, which should be enough for any special cases regular human users might encounter. If you want to change this number or turn off this filtering completely, you can use the following settings in the <botDetect> web.config section:

<captchaRequestFilter enabled="true" allowedRepeatedRequests="5"/>

The Cache values you mention are used to keep the request count for each Captcha querystring combination. Since this is low priority data (the worst that could happen if the cached value is lost is that a bot will get an extra image or sound), we only keep it for the same period as the configured ASP.NET Session timeout (20 minutes by default).

Please Note

The information on this page is out of date and applies to a deprecated version of BotDetect™ CAPTCHA (v3.0).

An up-to-date equivalent page for the latest BotDetect Captcha release (v4) is BotDetect v4 Captcha documentation index.

General information about the major improvements in the current BotDetect release can be found at the What's New in BotDetect v4.0 page.