Back to the Future: Desktop Applications

One of the best prepared talks I saw at PyCon this year was on Phatch, a cross-platform photo processing application written in Python. Stani Michiels and Nadia Alramli gave a well rehearsed, compelling talk discussing the ins and outs of developing their application for Linux, Mac OS X, and Windows. The video is available from the excellent Python MiroCommunity.

The talk reminded me of a blog post I saw late last year and never got around to commenting on, Ruby for Desktop Applications? Yes we can. Now I’m only a year late in commenting on it. This post caught my eye for two reasons. First, the software they discuss was commissioned by the AGI Goldratt Institute. I had heard about Goldratt from my father, whose employer, Trusted Manufacturing, was working on implementing constraints-based manufacturing as a way to reduce costs and distinguish themselves from the rest of the market. More interesting, though, was their discussion of how they built the application, and how it seemed to resonate with some of the work I did in my early days at CC.

Atomic wrote three blog posts (at least that I saw), and the one with the most text (as determined by my highly unscientific “page down” method) was all about how they “rolled” the JRuby application: how they laid out the source tree, how they compile Ruby source into Java JARs, and how they distribute a single JAR file with their application and its dependencies. I thought this was interesting because even though CC Publisher uses a different language (Python instead of Ruby), GUI framework (wx instead of Swing/Batik), and runtime strategy (bundled interpreter instead of bytecode archive), the thing I spent the most time on when developing it was also deployment.

Like Atomic and Phatch, we had a single code base that we wanted to work across the major platforms (Windows, Linux, and Mac OS X in our case). The presentation about Phatch has some great information about making desktop-specific idioms work in Python, so I’ll let them cover that. Packaging and deployment was the biggest challenge, one we never quite got right.

On Windows, we used py2exe to bundle our Python runtime with the source code and dependencies. This worked most of the time, unless we forgot to specify a sub-package in our manifest, in which case it blew up in amazing and spectacular ways (not really). Like Atomic, we used NSIS for the Windows installer portion. On Mac OS X, we used py2app to do something similar, and distributed a disk image. On Linux… well, on Linux, we punted. We experimented with cx_Freeze and flirted with autopackage. But nothing ever worked quite right [enough], so we wound up shipping tarballs.
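
A py2exe configuration of the sort we used looks roughly like the sketch below; the entry point and package names are illustrative, not CC Publisher’s actual manifest:

    # setup.py -- a minimal py2exe sketch; names and the package list are
    # illustrative, not CC Publisher's actual configuration
    from distutils.core import setup
    import py2exe  # registers the py2exe command with distutils

    setup(
        windows=['ccpublisher.py'],   # GUI entry point; use console=[...] for CLI tools
        options={
            'py2exe': {
                # sub-packages that the import scanner misses must be listed explicitly;
                # forgetting one here is the kind of omission that caused our blow-ups
                'packages': ['ccpublisher', 'wx'],
            },
        },
    )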

The really appealing thing about Atomic’s approach is that by using a single JAR, you get to leverage a much bigger ecosystem of tools: the Java community has either solved, or has well defined idioms for, launching Java applications from JARs. You get launch4j and izpack, which look like great additions to the desktop developer’s toolbox.

For better or for worse, we [Creative Commons] decided CC Publisher wasn’t the best place to put our energy and time. This was probably the right decision, but it was a fun project to work on. (We do have rebooting CC Publisher listed as a suggested project for Google Summer of Code, if someone else is interested in helping out.) Given the maturity of Java’s desktop tool chain, and the vast improvements in Jython over the past year or two, I can imagine considering an approach very much like Atomic’s were I working on it today. Even though it seems like the majority of people’s attention is on web applications these days, I like seeing examples of interesting desktop applications being built with dynamic languages.

date:2010-03-30 09:04:03
wordpress_id:1517
layout:post
slug:back-to-the-future-desktop-applications
comments:
category:development
tags:cc, ccpublisher, python

Perfection is not an option

My friend Andy, a successful organizational development consultant, moved to Ohio last year to pursue an MFA in playwriting. I spoke to Andy yesterday for the first time in about a month. We talked about his first year play, and he told me about how he’s felt blocked for a few weeks. After an initial reading with actors, he received consistent feedback in one area: he had to decide if he was writing a farce with some serious undertones, or a serious piece with some moments of light. Andy told me that this weekend he finally gave up trying to have it both ways, made a decision, found himself unblocked creatively, and spent the entire weekend writing.

“I realized that I don’t have time to fix everything I want to fix this time around. I need to fix the big things, get it in front of actors, and let the little things take care of themselves. I can’t afford to wait around for it to be perfect.”

Perfection is not an option.

How many projects do I have in some sort of blocked state right now because I need to find the time to do them right? How many ideas are only half-executed because I haven’t figured out how to finish them the “right” way? I have friends who agreed to read some of my writing in December, whom I haven’t followed up with because I haven’t “finished” revising. And I never will, at least not as long as I hang onto a belief in perfection.

This should sound familiar to anyone who’s worked on a software project using one of the agile methodologies. You don’t try to perfect — finish — the software, and then show it to the users. You take care of the big stuff, and get it in front of users. And then the big stuff changes, and you can take care of it again. Rinse and repeat.

Perfection is not an option.

Perfection is, of course, quite alluring. I know that I want to be recognized for my work, want to be seen as someone who has intellectual authority and gets things done. At the same time, I’m afraid that I’ll be seen as less than that: as a failure, a poser, someone who just doesn’t get it. So I hold onto this belief that if I do it perfectly, I can somehow control how people see my work. This is a lie. I can not control how people see my work, and can not control their reactions. If someone reacts negatively, it’s possible they’re reacting to the quality of my work (I am not for one moment arguing I shouldn’t care about doing my best), but it could also be something else entirely. The way I present myself. Their own concerns and fears about their work. Things I can not control. So I remind myself of the truth.

Perfection is not an option.

Just like I’ve tried to stop worrying about having the perfect tools, I want to let go of believing the output has to be perfect before I share it with others. [Reading this blog, you can be forgiven for believing that I let go of that belief long ago; I do better here, but there are still drafts from 2006 that I’ve never published because they weren’t “right”.] By definition what I put out, no matter how much time I put into it, is not going to be perfect. If I can accept that, embrace it, I can spend my time and energy actually doing the work, instead of worrying about the output.

Perfection is not an option.

date:2010-03-29 08:04:46
wordpress_id:1610
layout:post
slug:perfection-is-not-an-option
comments:
category:my life
tags:meta

Read: “Invisible”, by Paul Auster

There are a few authors who I’ll follow just about anywhere; Paul Auster is one of them. Over the past couple years as I’ve read his catalog, I’ve enjoyed his description of even the darkest and bleakest situations. Leah described his work as “primarily meta-fiction” when she first introduced me to Auster — and he definitely excels at that — but that’s only part of the appeal. In works like `Invisible <http://en.wikipedia.org/wiki/Invisible_(2009_novel)>`_, Auster creates a fictional world that he then uses to explore how we think about identity, shared experience, and stories.

[Warning, the following may contain spoilers, although I don’t think they would degrade the actual reading experience.]

Invisible begins in 1967, when the protagonist, Adam Walker, meets a visiting college professor, Born, and his girlfriend, Margot, at a party. This chance meeting gives rise to a business deal, the celebration of which is marred by a mugging that turns violent. It isn’t until the second section of the book that we realize the narrator is not Adam, but a college friend, James, now a successful author. James has received the preceding section from Adam much later in their lives, as the first part of a book Adam hopes to write. This book, like Invisible, will have four sections — spring, summer, fall, winter. The sending of pages, the recipient’s admiration for the original author (James believed Adam would go on to greatness), and the eventual responsibility for publication all echo the story of Fanshawe in The Locked Room, part of The New York Trilogy, one of Auster’s earlier works.

Invisible depicts a progression, both mechanically and for its characters. The characters deal with a push-pull of good (intellect) and evil. The book describes an interesting tension between sex and justice, how they interlock and how we distance ourselves from our actions in seeking both. Auster emphasizes this distance by telling each part of the story in a different voice: the first section is told in the first person, the second in the second person, and so on. The fourth and final section is told from the perspective of another person through a diary, with Adam, the protagonist, absent except in reference. As the story progresses, the details fall away in another reflection of this distancing.

Invisible works for me on many levels: as a story, as moralistic exposition, as a demonstration of using the mechanics of writing to further a story. Most importantly, it was enjoyable to read and drew me into a world where the line of what I know and what I think I know is never quite clear.

date:2010-03-28 20:04:26
wordpress_id:1622
layout:post
slug:read-invisible-by-paul-auster
comments:
category:reading
tags:2009, fiction, read

Using pip with buildout

I’ve been asked to add a blog to koucou, and this has turned out to be more of a learning experience than I expected. My first instinct was to use WordPress — I’m familiar with it, like the way it works, and I’m not interested in building my own. The one wrinkle was that we wanted to integrate the blog visually with the rest of the site, which is built on Django. I decided to give Mingus a try. This post isn’t about Mingus — I’ll write about that shortly — but rather about pip, which Mingus uses to manage dependencies. Mingus includes a requirements file with the stable dependencies for the application (one of its goals is application re-use, so there are a lot of them). As I mentioned previously, pip is the Python packaging/installation tool I have the least experience with, so I decided to try converting my existing project to pip as a starting point — to gain experience with pip, and to try and ease integration woes with Mingus.

When I started, the project used the following setup to manage dependencies and the build process:

  • Dependencies which have an egg or setuptools-compatible sdist available are specified in install_requires in setup.py

    setup(
        name = "soursop",

        # ... details omitted

        install_requires = ['setuptools',
                            'zope.interface',
                            'zope.component',
                            'PILwoTK',
                            'flup',
                            ],

        )
    
  • A buildout configuration that uses djangorecipe to install Django, and zc.recipe.egg to install the application egg and its dependencies

    [buildout]
    develop = .
    parts = django scripts
    unzip = true
    eggs = soursop
    
    [django]
    recipe = djangorecipe
    version = 1.1.1
    settings = settings
    eggs = ${buildout:eggs}
    project = soursop

    [scripts]
    recipe = zc.recipe.egg
    eggs = ${buildout:eggs}
    interpreter = python
    dependent-scripts = true
    extra-paths = ${django:location}
    initialization =
        import os
        os.environ['DJANGO_SETTINGS_MODULE'] = '${django:project}.${django:settings}'
    
  • Dependencies that didn’t easily install using setuptools (either they didn’t have a sane source-tree layout or weren’t available from PyPI) are either specified as git submodules or imported into the repository.

All this worked pretty well (although I’ve never really loved git submodules).

gp.recipe.pip is a buildout recipe which allows you to install a set of Python packages using pip. gp.recipe.pip builds on zc.recipe.egg, so it inherits all the functionality of that recipe (installing dependencies declared in setup.py, generating scripts, etc). So in that respect, I could simply replace the recipe line in the scripts part and start using pip requirements to install from source control, create editable checkouts, etc.

Previously, I used the ${buildout:eggs} setting to share a set of packages to install between the django part (which I used to generate a Django management script) and the scripts part (which I used to resolve the dependency list and install scripts defined as entry points). I didn’t spend much time looking into replicating this with gp.recipe.pip; it wasn’t immediately clear to me how to get a working set out of it that’s equivalent to an eggs specification (I’m not even sure it makes sense to expect such a thing).

Ignoring the issue of the management script, I simplified my buildout configuration, removing the django part and using gp.recipe.pip:


[buildout]
develop = .
parts = soursop
unzip = true
eggs = soursop
django-settings = settings
django-project = soursop

[soursop]
recipe = gp.recipe.pip
interpreter = python
eggs = ${buildout:eggs}
sources-directory = vendor
initialization =
    import os
    os.environ['DJANGO_SETTINGS_MODULE'] = '${buildout:django-project}.${buildout:django-settings}'

This allowed me to start specifying the resources I previously included as git submodules as pip requirements:

[soursop]
recipe = gp.recipe.pip
interpreter = python
install =
    -r requirements.txt
eggs = ${buildout:eggs}
sources-directory = vendor

The install parameter specifies a series of pip dependencies that buildout will install when it runs. These can include version control URLs, recursive requirements (in this case, a requirements file, requirements.txt), and editable dependencies. In this case I’ve also specified a directory, vendor, in which editable dependencies will be installed.

That actually works pretty well: I can define my list of dependencies in a text file on its own, and I can move away from git submodules and vendor imports to specifying [D]VCS urls that pip will pull.
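
As an illustration, a requirements file of the kind referenced above might look like the following; the package pins and repository URL are made up, not the project’s actual dependencies:

    # requirements.txt -- hypothetical contents
    Django==1.1.1
    # install straight from version control instead of a vendor import or submodule
    -e git+git://github.com/example/django-someapp.git#egg=django-someapp
    # recursively include another requirements file
    -r more-requirements.txt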

Unfortunately, I’m still missing my manage script. I wound up creating a small function and entry point to cause the script to be generated. In soursop/scripts.py, I created the following function:

def manage():
    """Entry point for Django manage command; assumes
    DJANGO_SETTINGS_MODULE has been set in the environment.

    This is a convenience for getting a ./bin/manage console script
    when using buildout."""

    from django.core import management
    from django.utils import importlib
    import os

    settings = importlib.import_module(os.environ.get('DJANGO_SETTINGS_MODULE'))

    management.execute_manager(settings)

In setup.py, I added an entry point:

entry_points = {
       'console_scripts' : [
           'manage = soursop.scripts:manage',
           ]
       },

Re-run buildout, and a manage script appears in the bin directory. Note that I’m still using the environment variable, DJANGO_SETTINGS_MODULE, to specify which settings module we’re using. I could specify the settings module directly in my manage script wrapper. I chose not to do this because I wanted to emulate the behavior of djangorecipe, which lets you change the settings module in buildout.cfg (i.e., from development to production settings). This is also the reason I have custom initialization code specified in my buildout configuration.
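
Switching the whole project from development to production settings is then a one-line change in buildout.cfg followed by re-running buildout (a sketch; the production settings module name is hypothetical):

    [buildout]
    # ... other options unchanged
    django-settings = production_settings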

Generally I really like the way this works. I’ve been able to eliminate the tracked vendor code in my project, as well as the git submodules. I can easily move my pip requirements into a requirements file and specify it with -r in the install line, separating dependency information from build information.

There are a couple of things that I’m ambivalent about. Primarily, I now have two different places where I’ve declared some of my dependencies, setup.py and a requirements file, and each has advantages (which are correspondingly disadvantages for the other). Specifying the requirements in the pip requirements file gives me more flexibility — I can install from subversion, git, or mercurial without even thinking about it. But if someone installs my package from a source distribution using easy_install or pip, the dependencies won’t necessarily be satisfied [1] [2]. And conversely, specifying the requirements in setup.py allows everyone to introspect them at installation time, but sacrifices the flexibility I’ve gained from pip.

I’m not sure that we’ll end up using Mingus for koucou, but I think we’ll stick with gp.recipe.pip. The disadvantage is a small one (at least in this situation), and it’s not really any worse than the previous situation.


[1] I suppose I could provide a bundle for pip that includes the dependencies, but the documentation doesn’t make that seem very appealing.

[2] Inability to install my Django application from an sdist isn’t really a big deal: the re-use story just isn’t good enough (in my opinion) to have it make sense. Generally, however, I like to be able to install a package and pull in the dependencies as well.

date:2010-03-28 13:05:22
wordpress_id:1586
layout:post
slug:using-pip-with-buildout
comments:
category:development
tags:dependencies, django, koucou, pip, python, scm, zc.buildout

CiviCon Next Month in San Francisco

I’m honored to be asked to kick off the first ever CiviCon next month in San Francisco. CiviCon is a one day conference for users of CiviCRM, a free software constituent relationship management platform. CiviCRM is a key component of Creative Commons’ infrastructure (we use it as our donor management system), and I’m excited to see the community come together and talk about new features, integration techniques, and ideas for future development.

When I was asked to present, I thought about what I could talk about, beyond simply our deployment and customization of CiviCRM (which the other Nathan will do a great job of during his presentation). Creative Commons is not the first non-profit I’ve worked with, and CiviCRM is not the first constituent/donor management system I’ve worked with. As I thought about my past experience and my experience at Creative Commons, I realized that CiviCRM is a key piece of infrastructure that enables Creative Commons to fulfill its mission, and to do so in a responsible way. Using CiviCRM is not just a question of free vs. proprietary software: it’s a question of responsible stewardship. CiviCRM and other free software allows us to fulfill our mission in a responsible, sustainable way. I think this is important to think about, so I’ll be talking about why I think this is the case. I’ll touch on how CiviCRM fits into Creative Commons, how it supports our mission, why I think FLOSS infrastructure ([STRIKEOUT:including]especially Civi) is essential for non-profits and grassroots organizations, and what I think is on the horizon.

I hope you’ll join me next month at CiviCon; you can register now (space is limited). The list of proposed sessions is online, and it looks like a really interesting day.

date:2010-03-22 06:47:53
wordpress_id:1577
layout:post
slug:civicon-next-month-in-san-francisco
comments:
category:talks
tags:cc, civicon, civicrm, san francisco, speaking

Pre-read: Grok 1.0 Web Development

|image0|Late last month I received an email from Packt Publishing (en.wp), asking if I’d be interested in reviewing one of their new titles, `Grok 1.0 Web Development <http://www.packtpub.com/grok-1-0-web-development/book?utm_source=yergler.net&utm_medium=bookrev&utm_content=blog&utm_campaign=mdb_002632>`_, by Carlos de la Guardia. I immediately said yes, with the caveat that I’m traveling a lot over the next 30 days, so the review will be a little delayed (hence this pre-review). I said “yes” because Grok is one of the Python web frameworks that’s most interesting to me these days. It’s interesting because one of its underlying goals is to take concepts from [STRIKEOUT:Zope 3]Zope Toolkit, and make them more accessible and less daunting. These concepts — the component model, pluggable utilities, and graph-based traversal — are some of the most powerful tools I’ve worked with during my career. And of course, they can also be daunting, even to people with lots of experience; making them more accessible is a good thing.

I’ve read the first four chapters of Grok 1.0 Web Development, and so far there’s a lot to like. It’s the sort of documentation I wish I’d had when I ported the Creative Commons license chooser to Grok1. I’m looking forward to reading the rest, and will post a proper review when I return from Nairobi. In the meantime, check out Grok, Zope 3 for cavemen.

You can download a preview from Grok 1.0 Web Development, `Chapter 5: Forms </media/2010/03/7481-grok-1-0-Web-development-sample-chapter-5-forms.pdf>`_.


1 The CC license chooser has evolved a lot over the years; shortly after Grok was launched we adopted many of its features as a way to streamline the code. Grok’s simplified support for custom traversal, in particular, was worth the effort.

date:2010-03-16 09:14:50
wordpress_id:1567
layout:post
slug:pre-read-grok-1-0-web-development
comments:
category:reading
tags:cc, grok, pre-read, python, reading, zope

Meta: What’s up with all the Reading?

So far this year, I’ve published seven posts with the tag “reading”. Of 24 posts this year (already more than all of 2009!), that’s almost a third of my blogging. Put another way, in the first five years of blogging I wrote four book-related posts; I’ve almost doubled that in the first quarter of 2010.

I’ve always loved reading. In middle school, I’d sit with a novel in my lap, trying to read during class without getting caught. Going into this year, I wanted to try things that I hypothesized would make me a better writer. One of these things is reading, specifically reading and thinking about what makes a book or story work or not for me. Another of the things is blogging1, so it made sense to me that I would start to blog what I read. I also wanted to keep track of what I read a little better. Instead of using this as another excuse to build a tool that I’m not sure I’ll actually use, I’m just using tags on the posts: sfpl for books I check out from the San Francisco Public Library, fiction for works of fiction, etc. I’d like to use something more structured for this (probably RDFa), but right now I have enough half-finished software projects, so tags it is.

And that’s why my blog seems like a book report lately.


1 I see blogging as a practice: something that I do with regularity, which has immediate and cumulative benefits.

date:2010-03-10 07:04:26
wordpress_id:1559
layout:post
slug:meta-whats-up-with-all-the-reading
comments:
category:reading, yergler.net
tags:meta, reading

For Some Definition of “Reusable”

I read “Why I switched to Pylons after using Django for six months” yesterday, and it mirrors something I’ve been thinking about off and on for the past year or so: what is the right level of abstraction for reuse in web applications? I’ve worked on two Django-based projects over the past 12-18 months: CC Network and koucou. Neither is what I’d call “huge”, but in both cases I wanted to re-use existing apps, and in both cases it felt… awkward.

Part of this awkwardness is probably the impedance mismatch of the framework and the toolchain: Django applications are Python packages. The Python tools for packaging and installing (distutils, setuptools, distribute, and pip, I think, although I have the least experience with it) work on “module distributions”1: some chunk of code with a setup.py. This is as much a “social” issue as a technology one: the documentation and tools don’t encourage the “right” kind of behavior, so talk of re-usable applications is often just hand waving or, at best, reinvention2.
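
To make the mismatch concrete, packaging a reusable Django app as a module distribution means giving it a setup.py along these lines (the project name and metadata are illustrative, not from either of our projects):

    # setup.py for a hypothetical reusable Django app, packaged as a
    # "module distribution" that setuptools or pip can install
    from setuptools import setup, find_packages

    setup(
        name='django-entries',
        version='0.1',
        packages=find_packages(),
        include_package_data=True,  # ship templates/static files listed in MANIFEST.in
        install_requires=['Django>=1.1'],
    )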

In both cases we consciously chose Django for what I consider its killer app: the admin interface. But there have been re-use headaches. [NB: What follows is based on our experience, which is setuptools and buildout based] The first one you encounter is that not every developer of a reusable app has made it available on PyPI. If they’re using Subversion you can still use it with setuptools, but re-using code hosted in git requires some additional work (a submodule or another buildout recipe). I understand pip just works with the most common [D]VCSs, but I haven’t used it myself. Additionally, apps aren’t all structured as projects, and those that are don’t always declare their dependencies properly3. And finally there are the “real” issues of templates, URL integration, etc.

I’m not exactly sure what the answer is, but it’s probably 80% human (as opposed to technology). Part of it is practicing good hygiene: writing your apps with relocatable URLs, using proper URL reversal when generating intra-application URLs, and making sure your templates are somewhat self-contained. But even that only gets you so far. Right now I have to work if I want to make my app easily consumable by others; work, frankly, sucks.
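
A sketch of what that hygiene looks like in Django; the app and URL names are illustrative, not from either of our projects:

    # urls.py for a hypothetical reusable app "entries"; the patterns are
    # relative, so the consuming project can mount the app at any prefix
    from django.conf.urls.defaults import patterns, url

    urlpatterns = patterns('entries.views',
        url(r'^$', 'index', name='entries_index'),
        url(r'^(?P<slug>[\w-]+)/$', 'detail', name='entries_detail'),
    )

    # elsewhere in the app, reverse by name rather than hard-coding paths
    from django.core.urlresolvers import reverse

    def entry_url(entry):
        return reverse('entries_detail', kwargs={'slug': entry.slug})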

Reuse is one area where I think Zope 3 (and its derived frameworks, Grok and repoze.bfg) have an advantage: if you’re re-using an application that provides a particular type of model, for example, all you need to do is register a view for it to get a customized template. The liberal use of interfaces to determine context also helps smooth over some of the URL issues4. Just as importantly, or more so, they have a strong culture of writing code as small “projects” and using tools like buildout to assemble the final product.
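
A rough sketch of what registering such a view looks like in Grok; the interface and attribute names are hypothetical stand-ins for whatever the reused application actually defines:

    import grok
    # IEntry is a hypothetical interface provided by the application being reused
    from reusedapp.interfaces import IEntry

    class Index(grok.View):
        """A project-specific view registered for any object providing IEntry."""
        grok.context(IEntry)  # attach this view to the reused app's content type
        grok.name('index')

        def render(self):
            # a real project would associate a template; render() keeps the sketch short
            return u'<h1>%s</h1>' % self.context.title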

Code reuse matters, and truth in advertising matters just as much or more. If we want to encourage people to write reusable applications, the tools need to support that, and we need to be explicit about the benefits we expect to reap from reuse.


1 Of course you never actually see these referred to as module distributions; always projects, packages, eggs, or something else.

2 Note that I’m not saying that Pylons gets the re-use story much better; the author admits choosing Django at least in part because of the perceived “vibrant community of people writing apps” but found himself more productive with Pylons. Perhaps he entered into that with different expectations? I think it’s worth noting that we chose Django for a project, in part, for the same reason, but with different expectations: not that the vibrant community writing apps would generate reusable code, but that they would educate developers we could hire when the time came.

3 This is partially due to the current state of Python packaging: setuptools and distribute expect the dependency information to be included in setup.py; pip specifies it in a requirements file.

4 At least when dealing with graph-based traversal; it could be true in other circumstances, I just haven’t thought about it enough.

date:2010-03-09 18:38:54
wordpress_id:1539
layout:post
slug:for-some-definition-of-reusable
comments:
category:development
tags:django, python, web, zope

Read: “Spooner”, by Pete Dexter

Spooner, Pete Dexter’s latest novel, is not as consistent as The Paperboy, but that does not make it inferior. Spooner tells the story of a boy, Spooner, and his step-father, Calmer. Spooner is not smart, is not handsome, and is primarily talented at causing trouble for others. Pissing in the shoes of others, rolling cars down the hill, and throwing eggs at cars: these are the things Spooner is good at. Calmer, a former Navy man, is good at just about everything, and is particularly good at being patient and trying to rescue those in need of redemption. Like Spooner’s mother, Lily, who sees the world first as a personal affront to her.

Dexter uses language in a way that lets you feel the words in your mouth and taste the idiom and “flavor”; in his hands, the language of the South (Spooner begins in Georgia; The Paperboy in Florida) does not feel impersonated or propped up, but real and present. Spooner contains exposition that made me almost giddy with pleasure, re-reading paragraphs out loud on the bus, looking like a crazy person, I’m sure. For example,

There was in every sport Spooner ever played, on every team he ever joined, an outcast. Some kid who had been plucked from the safety of home and homeroom and tossed, often at the insistence of his own father, out into the world. Unprotected. Often this kid was the fattest, dopiest kid in school, someone who had been it every day of his life on the playgrounds, shunned or insulted one day, beaten up the next, and was now introduced to the rest of his life, which was more of the same except better organized, with the degree of abuse he suffered depending mostly on the mercies of the adults in charge.

I don’t know if I was exactly that kid on the team, but I could certainly pick him out a mile away, and knew enough to keep my distance.

Spooner is told in the third person, but Dexter manages to convey the mental confusion and uncertainty the characters express in a way that reminded me of Paul Auster. Characters try to look at themselves and figure out what really happened: Did they really see what they think they saw? Where was the moment things went wrong? Could they have found another way through that situation? That ability to convey the introspection, uncertainty, and inner monologue of a character gives the story a depth: coming to the end of a paragraph is like coming up from under water, and you’re not really sure where you’ve wound up.

Spooner is not perfect; one section, in particular, doesn’t feel like it “fits” with the rest. As a whole it’s a great story about two characters who care a great deal for each other, an original, expansive rendering of the father-son relationship.

date:2010-03-06 10:04:46
wordpress_id:1525
layout:post
slug:read-spooner-by-pete-dexter
comments:
category:reading
tags:2009, fiction, reading, sfpl

Read: “Fordlandia”, by Greg Grandin

Fordlandia chronicles the rise and fall of the eponymous rubber plantation established by Henry Ford in Brazil in 1927. I don’t think it’s giving too much away to say that “attempted to establish” would be more accurate. The book is a chronicle of the money spent, initiatives undertaken, and schemes hatched, all in an effort to wring profit from the Amazon and, at least in some cases, bring better living conditions to its inhabitants.

Fordlandia is really three interwoven stories. The surface story is about Ford’s efforts to push the limits of his autonomous, vertically integrated manufacturing by establishing a stable source of rubber in the Amazon, along the Tapajós River. Rubber was one of the few raw materials whose production Ford did not own or control, and he was concerned that a British-initiated cabal could raise prices in the American market. Grandin gives the reader context in the form of Ford’s previous success with Fordism in the US, which is particularly interesting given the decline of the automobile industry of late.

Below the surface of the main story are two others: the story of the people of the Brazilian Amazon, and their exploitation during Brazil’s rubber boom and bust, and the story of Henry Ford’s personal evolution from industrialist to agriculturalist to paternalistic social engineer. As Fordlândia failed to produce rubber, it increasingly became a social experiment, attempting to export an idealized midwestern social structure to the Amazon. Ford and his managers attempted to impose what they believed to be the optimal structure — both social and corporate — on the workers. The results seem to have been inversely proportional to the amount of control they tried to exert. Ford believed he was saving workers from the exploitative system of indentured servitude pervasive during Brazil’s rubber boom, but failed to understand the social dynamics that would dictate whether his new system was actually a success.

I found Ford’s evolution to be a particularly compelling part of the story. His massively integrated manufacturing system helped move people out of small towns and into urban centers. Despite this and his seeming contempt for the past, he idealized Puritan, small-town America in the extreme. This story of trying to re-establish something he was himself responsible for weakening was, for me, one of the most interesting threads in the book.

Grandin concludes with an epilogue, “Still Waiting for Henry Ford.” In it he sounds a cautionary note about ongoing attempts to “modernize” the Amazon. The engaging, insightful chapters preceding this allow it to avoid any hints of panic or exaggeration. The Amazon is still waiting for the promises of Henry Ford to come true.


While Grandin wisely does not attempt sweeping moral interpretation, it does seem that Ford truly believed he was helping the residents of the Brazilian Amazon. Unfortunately, a complete lack of interest in understanding their social and economic structure led to sub-optimal results.

date:2010-03-05 15:15:41
wordpress_id:1519
layout:post
slug:read-fordlandia-by-greg-grandin
comments:
category:reading
tags:2009, nonfiction, reading, sfpl