Hi, I’m Pete Warden (that’s not me in the photo!), and as a fun project, I decided to create a search mashup that let me search the way I wish I could.
While I was doing that, I ran into a big restriction on using XMLHttpRequest and AJAX: as a security measure, browsers only let a page make requests back to the same server it was loaded from. This obviously makes building a mashup of pages from other servers much more difficult.
The standard way of working around this is to route all requests through your own server, which fetches the external pages on the browser's behalf. There were several reasons I wanted to avoid this:
– It doesn’t scale with the number of users, since everything has to go through your server.
– A big goal of the project was to discover whether the pages found in the search were accessible to the user. That isn’t possible if a remote server is doing the checking.
– Setting up the server to act as a proxy requires at least some knowledge of scripting and Apache.
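For context, the server-side workaround described above usually looks something like the sketch below: a tiny endpoint that accepts a `url` parameter and fetches the remote page on the browser's behalf. This is a minimal illustration using Python's standard library, not MashProxy's actual code, and the handler and parameter names are my own invention.

```python
# Minimal sketch of a server-side proxy (illustrative, not MashProxy itself).
# The browser requests /proxy?url=<remote page>, and the server fetches that
# page and relays it back, sidestepping the browser's same-origin restriction.
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Pull the target address out of the query string, e.g. ?url=http://...
        query = urllib.parse.urlparse(self.path).query
        target = urllib.parse.parse_qs(query).get("url", [None])[0]
        if not target:
            self.send_error(400, "missing url parameter")
            return
        try:
            # The server, not the browser, makes the cross-domain request.
            with urllib.request.urlopen(target) as resp:
                body = resp.read()
        except Exception as exc:
            self.send_error(502, str(exc))
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the example quiet; a real proxy would log requests.
        pass


# To run it: HTTPServer(("localhost", 8080), ProxyHandler).serve_forever()
```

Note that every user's traffic funnels through this one endpoint, which is exactly the scaling problem mentioned above, and the fetch happens from the server's network vantage point rather than the user's.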
The main reason client-side proxies haven’t been done before is the security holes they can open up. Chris Shiflett has a great article covering the problems that would arise if XMLHttpRequest were opened up to allow cross-domain requests, which is equivalent to what MashProxy allows.
I’ll discuss the security policy I adopted in my next post, including the safeguards against abuse I’ve implemented and possible remaining problems.