Bike the Las Llajas/Rocky Peak/Chumash loop in Simi Valley


There are miles of wild hills north of Simi Valley, but few marked trails, since the area’s mostly still owned by oil and gas companies. We did some exploration on our bikes yesterday, and found a challenging 9.5 mile loop that heads through there. It starts off near the north end of Yosemite Avenue, winds up along the Las Llajas fire road, switchbacks up a hillside, connects with Rocky Peak fire road, and you can then take the Chumash single-track back to your car. Here’s the map.


From the 118, take the Yosemite Avenue exit and drive north for about 1.5 miles. Turn right onto Evening Sky Drive and go about another half-mile until you see a gate across a fire road on your left, together with a small information sign. Park on the road; there is usually plenty of room.

Take the fire road for a short distance until it joins the main Las Llajas route. At this point it’s paved, though inaccessible to vehicles. Turn right and head down the hill. You’ll now stay on this road for around 3 miles, as it winds along by the creek. It climbs gently, and there’s some good shade from oaks along the way when you need to rest. About 2 miles in there’s a first gate, which should be unlocked but kept closed.

At the 3.5 mile point, you should see another dirt road heading off to the right. If you miss that, you’ll come to a second gate festooned with ‘Keep Out’ signs, the entrance to Las Llajas ranch, and you’ll need to head back a few hundred feet.

This dirt road climbs up the hillside and links up with Rocky Peak fire road. A little way up you should see an old oil pump. Checking the gauges, it looked like there was still pressure underground, and the pump seems to be in good shape. It’s around a mile up to Rocky Peak road, and I found it a pretty brutal climb, especially in the sun.

When you reach the top of the ridge 4.5 miles in, the road connects with the Rocky Peak trail. It looks like you could head towards Oat Mountain if you turned left, but to loop back to the trailhead turn right. There’s a short climb and drop, and then a long, tough slog to the summit.

Chumash single-track starts on the right side of the Rocky Peak fire road a ways after the summit, around 6.2 miles from your start. There isn’t a sign, but there is a large charred post with some illegible blue paint on it, and the trailhead itself is fairly obvious. It’s about 2.5 miles long, and the first half is fairly technical, with some tricky rocks to hop around. You’ll drop down into Chumash Park, and there are lots of different variants on the main trail in there. Most of them take the same basic route, but you should watch out for a small side-trail heading to the right when you’re nearly back to the main Chumash trailhead at the end of Flanagan Drive. Take the side-trail over to Evening Sky Drive and head half a mile back to your car.

 

The strange transformation of Bradford and Bingley


Amongst all the other dire news today, I heard that Bradford and Bingley had failed. This brought the crisis close to home, since I’d had an account there since I was a toddler. My parents had set it up for me when it was still a ‘Building Society’ (a customer-owned organization like a credit union). Back then, the only way you had a chance at a mortgage was if you had been banking with someone for decades, so starting an account early was essential.

By the time I was a teenager, things had changed. The loan process had started to become automated, and the loan officers were given less and less say in the decision. A decade ago, the mortgage business had become so profitable that B&B gave up its mutual structure, distributed shares to its customers and jumped into high-risk, high-profit areas like liar’s loans and buy-to-let mortgages.

I don’t have the expertise to enlighten anyone on what’s happening in the markets, and as Megan McArdle says, it’s amazing how all the things you were against before the crisis caused it. What does astonish me is how far we came in three decades. When I was born, you had to have a massive deposit, glowing references, a long history with the firm and the personal approval of your local bank manager to get a mortgage from B&B. A year ago all you needed was to turn up and pass the ‘fog a mirror with your breath’ test.

What your mother didn’t tell you about game programming

Photo by Cobalt123

Out of nostalgia I was looking through an article I wrote for Gamasutra back in 2001. I still stand by my focus on "actually getting the damn game shipped", and it was fun to see how my thinking was moving towards agile programming methods, though I’ve never worked in a team that embraced it 100%. My favorite part was rediscovering the classic essay How to write unmaintainable code by Roedy Green.

"In the interests of creating employment opportunities in the Java programming
field, I am passing on these tips from the masters on how to write code that is
so difficult to maintain, that the people who come after you will take years to
make even the simplest changes. Further, if you follow all these rules
religiously, you will even guarantee yourself a lifetime of employment,
since no one but you has a hope in hell of maintaining the code."

Everyone writing code should read it if they want to avoid killing their projects with toxic code.

How to POST an HTTPRequest in C#

Photo by Daddyoh

I’ve been learning C# over the last few weeks, and I’m very impressed. A lot like the Objective-C/Cocoa combination on the Mac, it’s focused on making GUI development very fast and easy. The Visual Studio integration is a lot slicker than Apple’s Interface Builder, without the complex, obscure and manual wiring together of components and code that IB requires. I still found myself hunting around in endless property windows for the right member or event, but touches like being able to double-click on an event name and have VS insert an empty handler function in your class really sped up my development. Combining that ease-of-use with Add-In Express’s environment for building Office and IE plugins has helped me make great progress.

One of the things I need to do a lot is communicate with a remote web server. The XMLHttpRequest JavaScript interface has become the standard for web APIs, and C# has its own version built on WebRequest/HttpWebRequest. I’ve included an example class below that implements a synchronous POST request on top of this. To use it in your own project you’ll need to add System.Web as an external reference if you don’t already have it. It’s also possible to use HttpWebRequest asynchronously, but I’ve left that out of this code. The interface takes a dictionary of variable names and values to pass as the POST variables, and the current error logging is through a MessageBox alert, which you’ll want to change in production!

Download PeteXMLHttpRequest.cs

using System;
using System.Collections.Generic;
using System.Text;
using System.Net;
using System.IO;
using System.Windows.Forms;
using System.Web;

namespace MailanaOutlook
{   
    class PeteXMLHttpRequest
    {
        public static string dictionaryToPostString(Dictionary<string, string> postVariables)
        {
            string postString = "";
            foreach (KeyValuePair<string, string> pair in postVariables)
            {
                postString += HttpUtility.UrlEncode(pair.Key) + "=" +
                    HttpUtility.UrlEncode(pair.Value) + "&";
            }

            // Strip the trailing '&' left over from the loop
            return postString.TrimEnd('&');
        }

        public static Dictionary<string, string> postStringToDictionary(string postString)
        {
            char[] delimiters = { '&' };
            string[] postPairs = postString.Split(delimiters);

            Dictionary<string, string> postVariables = new Dictionary<string, string>();
            foreach (string pair in postPairs)
            {
                char[] keyDelimiters = { '=' };
                string[] keyAndValue = pair.Split(keyDelimiters);
                if (keyAndValue.Length > 1)
                {
                    postVariables.Add(HttpUtility.UrlDecode(keyAndValue[0]),
                        HttpUtility.UrlDecode(keyAndValue[1]));
                }
            }

            return postVariables;
        }

        public static string postSynchronous(string url, Dictionary<string, string> postVariables)
        {
            string result = null;
            try
            {
                string postString = dictionaryToPostString(postVariables);
                byte[] postBytes = Encoding.ASCII.GetBytes(postString);

                HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
                webRequest.Method = "POST";
                webRequest.ContentType = "application/x-www-form-urlencoded";
                webRequest.ContentLength = postBytes.Length;

                Stream postStream = webRequest.GetRequestStream();
                postStream.Write(postBytes, 0, postBytes.Length);
                postStream.Close();

                HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();

                Console.WriteLine(webResponse.StatusCode);
                Console.WriteLine(webResponse.Server);

                Stream responseStream = webResponse.GetResponseStream();
                StreamReader responseStreamReader = new StreamReader(responseStream);
                result = responseStreamReader.ReadToEnd();

                // Clean up the response objects
                responseStreamReader.Close();
                webResponse.Close();
            }
            catch (Exception ex)
            {
                MessageBox.Show(ex.Message);
            }
            return result;
        }

    }
}
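To show how the class fits together, here’s a minimal usage sketch. The endpoint URL and variable names are just placeholders, not a real API, and in a real project you’d replace the Console output with whatever handling your application needs:

```csharp
using System;
using System.Collections.Generic;

namespace MailanaOutlook
{
    class Example
    {
        static void Main()
        {
            Dictionary<string, string> postVariables = new Dictionary<string, string>();
            postVariables.Add("username", "pete");
            postVariables.Add("message", "hello & goodbye");

            // Hypothetical endpoint - substitute your own server's URL.
            // Returns the response body as a string, or null if the request failed.
            string response = PeteXMLHttpRequest.postSynchronous(
                "http://example.com/api/update", postVariables);
            Console.WriteLine(response);

            // The encoding helpers can also be used on their own, and should
            // round-trip cleanly even with awkward characters like '&'
            string asPostString = PeteXMLHttpRequest.dictionaryToPostString(postVariables);
            Dictionary<string, string> backAgain =
                PeteXMLHttpRequest.postStringToDictionary(asPostString);
            Console.WriteLine(backAgain["message"]); // hello & goodbye
        }
    }
}
```

Because both helpers go through HttpUtility’s UrlEncode/UrlDecode, anything you put into the dictionary survives the trip through the POST string intact.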

Is Google stuck in the mud?

Photo by misfitgirl

There’s been a round of blog sparring about the state of Google, with ReadWriteWeb’s Bernard Lunn claiming that they’re spreading themselves too thin, and Tim O’Reilly firing back a defense that they’re making strong strategic moves. I think they’re both wrong.

Microsoft terrified people in the ’90s because once they moved into a market they would carpet-bomb their way to dominance, using their deep pockets and distribution with the OS to crush competitors with feature-rich applications. Once they’d won, the products would often fester, but while there was still a race they would keep improving with every release. Look at Internet Explorer vs Netscape for the classic pattern. IE 3 was awful, IE 5 was pretty darn good for its time and killed Navigator. Then nothing much happened for years until Firefox came along and goaded MS into doing good things with versions 6 and 7.

Google’s moved into several major markets, come out with an excellent initial product, and then left it mostly unchanged. Look at Gmail, which I track obsessively because I think they did such a good job at launch, and I keep expecting them to follow up with some mind-blowing innovations. Instead the recent Labs section mostly contains UI tweaks. Google Documents is the same, a great launch but years later there are still the same bugs and limitations I expected to get fixed quickly. Blogger is stagnating too; I could go on. In almost every major area Google has expanded into outside search, they’ve implemented or bought a great initial product, and then neglected it.

Unlike Bernard I don’t think it’s inherently a problem that they’re spread so widely. Microsoft in its prime showed that it’s possible to win a lot of independent markets simultaneously. Google just can’t seem to execute on their strategy. And unlike Tim, I think you can’t say they’ve got a strong strategy and ignore their operational track record. Strategy is useless without execution. Aiming to catalog the world’s information is a great strategy. Building an open-source smart phone OS is a clever move as part of that. The trouble is that the implementation isn’t looking great. Android had already alienated a lot of developers before a single phone was released, and they won’t be supporting ActiveSync, amongst other things.

Google have produced amazing innovations for years. I’m just hoping they can keep their incredible momentum going, and capitalize on their initial releases by pushing the products forward. If they don’t, Zoho may not have millions of customers today, but there are far fewer barriers to switching in the web world than there ever were on the desktop.

How I got things done at Apple

Photo by luckyfish

One of my obsessions is finding experts. Before Apple, I’d never worked at a company larger than 30 people. Once there, I rapidly realized that almost any engineering problem our team was looking at had already been solved by somebody else in the company. For example, we’d developed a fantastic image comparison command line application to use in our automated testing. Over the 5 years I was there, I ran across several other internal groups who’d written tools to solve the exact same problem, each taking between 3 and 6 engineer-months.

This drove me crazy; we were all being paid by Steve, so there was no reason to waste that effort. Since I’m very curious (some might say nosey!) and I love chatting to people about what they’re working on, I made it my mission to get to know folks across the company and keep up to date with what their groups were doing. When I heard they were hitting an issue our group had already tackled, I could hook them up with the right people on our team, and do the same for my immediate colleagues with developers in the rest of Apple.

I felt like this was helping me get things done within the company, moving the projects I was involved in to completion, but it was also a frustrating process.

Totally manual

It was completely informal, relying on water-cooler chats and building personal relationships. There was no way of finding the experts in an area without relying on word-of-mouth.

No credit

When people did help me, it was very hard to give them credit with their management. They took time out of their day to help my projects meet their deadlines, but all they were assessed on was their own department’s success. There was no mechanism to capture how much they’d done for the overall company by helping across organizational boundaries. I ended up trying to finagle ad-hoc rewards like taking other teams out to dinner or mailing them cookies from Harry & David.

Part of the reason I left Apple is that I think there are massive productivity gains to be made by giving people within large companies better tools for this. There’s already been some work by companies like Tacit, or even Microsoft’s Knowledge Network prototype, but I’m convinced that an effective expertise location service can save massive amounts of time and money for large companies. There are two main characteristics it will need that haven’t been achieved so far:

Automatic

It must have broad coverage without much effort. In practice this is likely to mean basing expertise tags on automated analysis of emails, with employees then tweaking the generated profile. Interestingly, McKinsey actively uses manual versions of profiles like these, but in most companies they won’t get created and updated without McKinsey’s strong emphasis on collaboration and expertise.

Rewarding

There has to be a way to reward participants in a meaningful way. There must be some tracking of ‘assists’ that somebody offers to the rest of the company through the system, in a form that can come up at annual reviews.

If you want to hear more about how I’m solving this with Mailana, join me at Defrag in a few weeks and I’ll give you a demo!

How to get a U720 EVDO wireless broadband USB modem working with OS X

Photo by Infra-Ken

I’m extremely conservative when it comes to changes to my development machines. As a geek my natural tendency is to install all the whizzy new software and hardware I come across, but that’s a massive time-suck away from actual development, since inevitably there’s driver and compatibility issues I end up debugging. I try to stay firmly on the well-beaten path, so others before me will have stepped on the land-mines.

Unfortunately I had to break my policy yesterday, and I paid for it. I’ve been contemplating getting a wireless EVDO device for my laptop, primarily so I have a backup to Wifi for my product demos. As a last resort I’ve got a canned movie I can show to customers and investors, but the online version is much more effective. The large databases (e.g. 5GB) I’m dealing with also make it tough to set up a local server on the same machine. I’d also like to stop paying the extortionate airport Wifi fees when I’m travelling.

I did my research and chose EVDOInfo as my vendor, since they have a good reputation for Mac support; a widely used USB device from Novatel, the U720; and Sprint as my carrier, since I’m stuck with AT&T on my iPhone and wanted a different network for this. The ordering process was painless, I actually did it online through my iPhone whilst waiting to get seated for breakfast. It turned out I’d mis-keyed the expiry date on my credit card, but a very helpful salesman phoned me up and sorted out the mixup. They also preactivated the device for my account which was very handy.

Once the modem arrived, I unpacked the box and looked through the documentation. There wasn’t an obvious quick-start guide, so I inserted the provided CD and looked through the manual. It sounded like I should have the modem connected to a USB port when I ran the installer, so I plugged it into the side of my MacBook Pro. That brought up a prompt mentioning that a new network device had been found, and asking if I wanted to install the drivers. This automatic discovery sounded perfect, so I clicked through the OS’s native installation process (this wasn’t from the CD). That’s when the nightmare began.

After that process, large parts of the OS stopped working. I could no longer open most preference items, it would just hang indefinitely when I did. I also couldn’t run Software Update, or even su from the terminal. My first reaction was to do full backups of everything important on my machine, which ate up an hour or two. Then I booted from DVD and ran a full cycle of disk and permission repairs, which didn’t solve the problems. At that point I cut my losses and did a clean OS install on a different partition, taking about 2 hours including copying over all my backups, running all the software updates and reinstalling applications.

I tried the installation process again, this time running the CD SmartView package from Sprint. This ran successfully, but bringing up the new application and trying to connect failed. Checking in the system console I saw this message:

9/23/08 5:51:27 PM Sprint SmartView[192] SERIOUS WARNING : All 3 Connection Attempts Have Failed
9/23/08 5:51:57 PM [0x0-0x17017].com.roamingclient.cell.mac.roamingclient[283] /Users/hms/Projects/pctel.1.4/Modules/MoreSCF/MoreSCF.c:1640: failed assertion `(err != noErr) || (servicesDict == NULL) || (*serviceOrder == NULL) || (CFDictionaryGetCount(*servicesDict) == CFArrayGetCount(*serviceOrder))'

A lot of Google research later, I finally found a workaround here. Ignore the initial steps for removing and reinstalling the drivers he describes, the important part for me was:

– Go to Network in the Preferences
– Select the Novatel CDMA network device in the left pane
– Click on the Advanced button
– Go to the WWAN tab
– Choose Novatel as the vendor and CDMA as the model
– Click OK, then Apply
– Click on the Connect button

That was enough to give me a wireless broadband connection. I still can’t use the SmartView software which means I can’t see my monthly usage totals, but at least I can get online.

The case against transparency

Photo by istargazer

Eric just posted on the advantages of transparency. I’m a fanatical believer in the power of more openness to transform businesses and my whole email startup is based on the idea that there’s hidden information in our emails that’s worth revealing. The problem is, within the tech community ‘open’ is a synonym for ‘good’, and that gets my contrarian antenna twitching. As Fred Wilson says, you don’t make money by doing the same thing as everyone else, so here’s a couple of examples of transparency gone wrong.

Misleading metrics

The mortgage industry moved from a centralized business model to one where different stages were handled by separate firms. Landing clients was handled by mortgage brokers, firms like Countrywide would then write the mortgage, but the money itself was provided by investors through securitization. In the old days a single firm would handle all of this in-house, which meant they had deep and immediate access to all the information about a borrower at every stage. To make the decentralized model work, an open and standardized way of categorizing the quality of the loan was developed. Statistics and measures to cover the credit history, income and collateral offered by the borrower were passed up the chain. These were then used by the agencies to rate the loans and split them into tranches of risk. It looked like a model of transparency, increasing the efficiency of a whole industry.

The problem was the metrics were systematically false. Brokers had massive financial incentives to inflate collateral house values through friendly assessors, and to help borrowers inflate their income. Unlike the old single-firm approach, there was no real accountability for the true quality of the loans; the brokers would still get their commission. The rating agencies relied on the broker data, and had similar incentives to grade the loans favorably.

Part of the reason we’re in this mess is that the appearance of transparency made everyone complacent. A manager of a team of sales people in the single firm model would get fired if her team were fudging originations. Her management would have a strong incentive to prevent lax lending standards because that would lose the firm money. There just wasn’t an accountability mechanism to go along with the new open model, and so the apparent transparency was a dangerous illusion.

Destroying the magic

Walter Bagehot said about royalty "The monarchy’s mystery is its life. We must not let in daylight upon magic." The same could apply to Apple. One of the distinctive elements of its culture is the obsessive secrecy. This isn’t just the usual bureaucratic urge to hide information, it’s a deliberate part of their marketing strategy. The impact of any announcement is so much larger when it’s a surprise. When nobody knows what Apple’s really working on, people’s imagination runs overtime anticipating what could be coming next. Any projects that went south before release were never known to the public, helping us look far better in comparison to more open companies.

There are massive downsides to this too; I always struggled with simple things like getting trusted developers onto beta programs because of the secrecy, but it’s hard to argue with the results.

Tarantula Hawk


Apologies to any arachnophobes, but last night I was lucky enough to run across a really gruesome bit of nature I had to share. We often spot Tarantula Hawks flying around, but I’d never seen how they got their name. They’re enormous wasps, several inches long, and the adults live on nectar. They’ve worked out an ingenious business plan for feeding their larvae:

1- First hunt down a wandering Tarantula.
2- Paralyze it with your venom.
3- Dig a hole, shove the spider into it, and lay an egg inside its body.
4- Cover up the hole.
5- The larva hatches, first sucks all the juices from the still-living spider, and then eats it from the inside, saving the vital organs until last so it stays alive and fresh as long as possible.

The photo is from a wasp we came across that had just paralyzed its victim, and was getting ready to drag it to its lair. How cool is that?!
