
Improving ASP and ASP.NET Website Security - Part Three

Ideas for improving the security of ASP and ASP.NET web applications.

Part 1 | Part 2 | Part 3 | Part 4 | Part 5

Use emails to report web application errors

Building an error reporting facility into your web applications can improve site security. It also notifies the developer as soon as bugs arise, so problems can be fixed promptly and the application made more robust. If the error reports are sent by e-mail, the application gains a near real-time reporting system for errors and suspicious website activity.

An error reporting e-mailing function for classic ASP was described in this ASPAlliance article: An ASP Error Report Emailer Function.

Error handling in ASP.NET is much improved. The Application_Error subroutine in Global.asax is called whenever an unhandled error occurs in any page of the web application, and the Page_Error event is raised when an error occurs on an individual page. ASP.NET also offers improved error tracing, such as reporting the line number that raised the error (the application must be compiled in Debug mode for line numbers to appear in error reports).
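As a minimal sketch, an Application_Error handler in Global.asax can capture the last server error and e-mail it. This assumes .NET 2.0 or later (for the System.Net.Mail namespace); the sender, recipient and SMTP host names below are placeholders, not values from this article.

```csharp
<%@ Application Language="C#" %>
<%@ Import Namespace="System.Net.Mail" %>

<script runat="server">
    void Application_Error(object sender, EventArgs e)
    {
        // Retrieve the exception that caused Application_Error to fire.
        Exception ex = Server.GetLastError();
        if (ex == null) return;

        MailMessage report = new MailMessage(
            "webapp@example.com",       // placeholder sender address
            "developer@example.com");   // placeholder recipient address
        report.Subject = "Web application error: " + ex.Message;
        report.Body = "URL: " + Request.RawUrl + "\r\n\r\n" + ex.ToString();

        // "localhost" assumes a local SMTP service; substitute your own mail server.
        new SmtpClient("localhost").Send(report);
    }
</script>
```

On ASP.NET 1.x the System.Web.Mail namespace would be used instead, but the shape of the handler is the same.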

Once the error reporting e-mail function has been incorporated into the website, the e-mails can then be monitored in order to detect security issues. Depending on the way the application was coded, failed login attempts, attempted SQL injection attacks or other suspicious activity will often cause error reports to be generated.

Note that if the web application has a high level of traffic, it is advisable to build in a limit to the number of e-mail error reports that are sent in a specified time period. A variable within the ASP Application object can be used to keep a count of the number of e-mails sent in a specific time period.
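The throttling described above can be sketched in a Global.asax helper that keeps an hourly counter in the Application object. The limit, variable names and the SendEmail helper are all illustrative, not part of any real API.

```csharp
void ReportErrorByEmail(string details)
{
    const int maxEmailsPerHour = 20;   // illustrative limit

    Application.Lock();
    try
    {
        DateTime windowStart = (DateTime)(Application["ErrWindowStart"] ?? DateTime.MinValue);
        int sent = (int)(Application["ErrEmailCount"] ?? 0);

        // Start a new one-hour window when the previous one has expired.
        if (DateTime.Now - windowStart > TimeSpan.FromHours(1))
        {
            windowStart = DateTime.Now;
            sent = 0;
        }

        if (sent < maxEmailsPerHour)
        {
            SendEmail(details);        // hypothetical helper that performs the send
            sent++;
        }

        Application["ErrWindowStart"] = windowStart;
        Application["ErrEmailCount"] = sent;
    }
    finally
    {
        Application.UnLock();
    }
}
```

Locking the Application object ensures the counter is not corrupted when two requests fail at the same time.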

Check SQL Server user permissions

This is basic security advice, but a surprising number of developers embed the SQL Server system administrator (sa) account credentials within their application connection strings. This leads to two major issues:

  • The sa account credentials are visible to anyone who has access to the application's source code.
  • Should the website be compromised, the malicious user may be able to delete tables, drop databases and do all manner of other undesirable things.

It is, therefore, highly recommended that a new SQL Server user account be created for the Internet user. This user should only be given access to the objects they need. If they only need read access to a table, for example, then they should be given SELECT permission only, not INSERT, UPDATE or DELETE permission.

The use of stored procedures is highly recommended as a means of improving security because then the user only needs to be given EXEC permissions on the stored procedures they need to use.
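As an illustration of the advice above, the T-SQL below creates a low-privilege login and grants it only what it needs. The login, password, database, table and procedure names are placeholders, and the CREATE LOGIN/CREATE USER syntax assumes SQL Server 2005 or later (earlier versions use sp_addlogin and sp_adduser).

```sql
-- Create a dedicated low-privilege login and database user (names are placeholders).
CREATE LOGIN WebAppUser WITH PASSWORD = 'Str0ng!Passw0rd';
USE MyWebDatabase;
CREATE USER WebAppUser FOR LOGIN WebAppUser;

-- Read-only access to one table: SELECT only, no INSERT/UPDATE/DELETE.
GRANT SELECT ON dbo.Products TO WebAppUser;

-- When stored procedures are used, EXECUTE permission on the procedure
-- is all the user needs; direct table permissions can be withheld entirely.
GRANT EXECUTE ON dbo.GetProductList TO WebAppUser;
```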

Alternatively, it is possible to use Windows authentication for SQL Server access; for applications using anonymous access, the IUSR_machinename account could be configured as a SQL Server user and given the minimum level of object access.
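Under Windows authentication the anonymous IIS account is added as a Windows login instead of a SQL login. A sketch, again using SQL Server 2005+ syntax with placeholder machine, database and table names:

```sql
-- Add the IIS anonymous account as a Windows-authenticated login
-- (WEBSERVER is a placeholder machine name).
CREATE LOGIN [WEBSERVER\IUSR_WEBSERVER] FROM WINDOWS;
USE MyWebDatabase;
CREATE USER WebAnonymous FOR LOGIN [WEBSERVER\IUSR_WEBSERVER];
GRANT SELECT ON dbo.Products TO WebAnonymous;
```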

Be wary of exposing sensitive information through Index Server

Index Server on Windows NT servers and Indexing Services on Windows 2000 servers offer good "out of the box" functionality for building website search engines. Unfortunately, Index Server suffers from a few issues that can cause security problems on a server:

  • Index Server itself had a number of security flaws, which Microsoft resolved through subsequent service packs.
  • Since Index Server catalogs files on the file system, it is possible for content to appear in search results that you may not want.
  • Index Server is unable to differentiate between content files and website structure files. Consequently, it is possible for website include files and other structural files to appear in search results.

I have built an add-on for Index Server called the Index Server Companion that uses a web crawler to retrieve a website's content and make it available for cataloging by Index Server. The advantage of this approach is that because the website itself is crawled rather than the files on disk, the content of each page appears exactly as the end user sees it (i.e. all include files are included and ASP is interpreted), and there is no risk of unintentionally indexing content that should not appear in search results.

The other advantage is that the Index Server Companion obeys web server robots.txt files conforming to the robots exclusion protocol as well as the robots meta tag in individual website pages.
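For example, a robots.txt file such as the following (the directory names are purely illustrative) would keep any crawler that honours the robots exclusion protocol out of structural directories:

```
User-agent: *
Disallow: /includes/
Disallow: /admin/
```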

Microsoft's Site Server 3.0 offered similar web-crawling functionality through its Gatherer component, but unfortunately Site Server is no longer available. Some of the functionality has been carried over to Microsoft's SharePoint Portal Server, but sadly it does not do exactly what Site Server used to do.

