In a moment when the dominance of PHP is being challenged by hip upstart platforms like node.js, it seems worth revisiting the commandline and the inherent richness and history of the many tools and flexible ways of working central to the GNU/Linux platform. As the notion of a “server” becomes ever easier to integrate, and inexpensive devices like OpenWrt routers and Raspberry Pis make cheap localized servers possible (the PirateBox being one prominent example), one potentially has the full GNU/Linux stack available via a browser interface.

Web-oriented programming languages like PHP and Javascript have been essential parts of the development and maturation of the web as an innovative platform for publishing. At the same time, that old computer science urge to “minimize” and avoid “bloat” often fuels a view that encourages monolithic (single-platform) use. This impulse excludes the richness and history of the many tools available in the full GNU/Linux stack. While the C language might not be practical as the only tool for developing (and, crucially, maintaining) the full stack of software necessary for a modern web platform, it certainly can play a role in performing certain tasks with the full efficiency of compiled tools. While Javascript is a fantastically flexible language, it is a real pity to exclude the breadth of knowledge and experience encapsulated in the wealth of Python libraries available. If the web’s original design was to span differing computer platforms and network architectures, why shouldn’t the tools and workflows for publishing on the web be similarly network-oriented, flexible and bridging?

Jeff Atwood blogged in 2009 about the web browser’s address bar as the new command line. However, the range and flexibility of the address bar has yet to match that of the commandline. In my own experience of running tools on the commandline, several practical factors problematize a direct use of the address bar as a means of issuing commands:

  • Authentication and permissions
  • Timing (dealing with the potentially long or indeterminate runtime of a program/script)
  • The lack of any notion of standard in/out, or of a pipeline, in the browser
Running localized servers (on a laptop, or as a private disconnected hotspot network like a PirateBox) is an interesting means of addressing the first of these issues. Taking networks offline is a very straightforward and understandable way of allowing “anything to happen” via a webserver when the audience is not an unseen global one. In terms of timing, long-running scripts and demanding processes are also less problematic in a personal/local context. New technologies like web workers offer a modern solution to the problem of (potentially) long-running scripts and bring the possibility of asynchronicity to web communication.
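As a concrete illustration of this localized approach, here is a minimal sketch of my own, using only the Python standard library, of a small server that exposes a few shell tools through the address bar. The whitelist, command names and port are illustrative assumptions, not part of any of the projects mentioned here:

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse

# Hypothetical whitelist: which commandline tools the address bar may run.
ALLOWED = {"hello": ["echo", "hello"], "date": ["date"]}

class CommandHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        name = urlparse(self.path).path.lstrip("/")
        cmd = ALLOWED.get(name)
        if cmd is None:
            self.send_error(404, "unknown command")
            return
        # Run the tool and capture its standard output, as a shell would.
        result = subprocess.run(cmd, capture_output=True, timeout=10)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(result.stdout)

    def log_message(self, *args):
        pass  # keep the terminal quiet

def serve(port=8000):
    # Bind to localhost only: this sketch is for a personal/local context,
    # not an unseen global audience.
    HTTPServer(("127.0.0.1", port), CommandHandler).serve_forever()

# serve()  # uncomment, then visit e.g. http://127.0.0.1:8000/date
```

Visiting `http://127.0.0.1:8000/date` then runs `date` and shows its output, making the address bar a (deliberately constrained) command prompt.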

Web-based “pipeline” projects have been, if not exactly successful in practice, highly influential on other attempts to bring data flow and pipelines to the web; the most prominent example is Yahoo Pipes, which presents a browser-based interface for describing RSS-oriented data flows. RDF Data Pipelines for Semantic Data Federation and DERI Pipes (shown above) are two research-stage projects attempting to merge the descriptive expressiveness of RDF and SPARQL with web publishing workflows.
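The pipelines these projects try to bring to the web are, at root, the shell’s: each tool reads standard input and writes standard output, and a pipe connects one to the next. A small sketch of that underlying mechanism (my own example, Python standard library only), equivalent to `echo … | sort -r`:

```python
import subprocess

# Chain two processes so the output of the first feeds the second,
# exactly as a shell pipe would.
producer = subprocess.Popen(["echo", "b\na\nc"], stdout=subprocess.PIPE)
consumer = subprocess.Popen(["sort", "-r"],
                            stdin=producer.stdout, stdout=subprocess.PIPE)
producer.stdout.close()  # let sort see end-of-file when echo exits
output = consumer.communicate()[0]
print(output.decode(), end="")  # c, b, a, one per line
```

Yahoo Pipes and its RDF-flavoured successors essentially redraw this arrangement as boxes and wires in the browser, with feeds and queries in place of files and tools.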