Sometimes you'd like to vote for a ballot issue, but it contains a fatal flaw, such as using debt to pay for something that ought to be funded out of current revenues, even if that means being patient. Yet voting against it seems like sending the wrong message, because it's the use of debt you're voting against, not the basic proposal itself.
Sunday, October 26, 2014
An extensive review article published on Nature's website, and described on the UC Davis news website, concludes that no-till farming only results in yield increases in dryland areas, and then only when combined with crop rotation and residue retention, and that it results in a yield reduction in moist climates.
While I have no reason to doubt the co-authors' conclusions, as far as they go, I do have some concerns about the scope of the comparisons they've made. However, not having read the full article, I can only pose questions and suggest considerations which may offset or even outweigh the modest yield reductions they've noted in moist climates.
It's hard to know where to start; this is such a complex subject. As practiced in western countries, no-till usually also means weed suppression by use of herbicides. It may or may not include residue retention, but if the residue is retained it is likely to be in rough form rather than finely chopped, or retained as the dung of the animals that grazed on it after harvest, never as well-distributed as the residue was in the first place. It may or may not include crop rotation, but almost certainly does not include polyculture (also called intercropping), which has become an all too rare practice.
Allow me to back up a bit and consider an assumption, as expressed by one of the co-authors: "The big challenge for agriculture is that we need to further increase yields but greatly reduce our environmental impacts." Certainly we need to vastly reduce the environmental damage being done by modern agriculture, but just how much do we really need to increase yields? Population growth estimates that fail to take into account the predictable reduction in fecundity that accompanies prosperity will result in alarmism, but the reality is that the benefits the global economy has to offer the poorest are slowly finding their way to every corner of the planet, and it's reasonable to think that world population will plateau, if not at ten billion, then perhaps at eleven or twelve billion. Of course, there is hunger now, even starvation, much of it in the dryland areas surrounding the Sahara. Yield increases in this region would be particularly helpful, but are complicated by competing uses, as fuel and as animal feed, for the residues which should be left in the fields. Realistically, the bottom line comes down to this: can we afford to sacrifice long-term fertility for short-term gains in yield?
That question raises another: does the article published in Nature include any long-term studies, by which I mean at least twenty years, preferably longer? Not only does tillage gradually burn through (literally oxidize) soil organic matter, eventually affecting water absorption, water retention, and nutrient availability, and increasing the energy required for ongoing tillage as the soil becomes denser, but it also takes time for an ecosystem of animals and microbes to develop that can efficiently incorporate crop residues into the soil, particularly in fields with a long history of routine tillage.
Were any options other than simply leaving residue in the field or grazing considered? Are there any cases of fine-chopping residue during harvest? What about initially removing everything but the stubble and returning it after processing it through animals (as feed), through anaerobic digestion (producing methane gas for fuel), and/or through composting?
Were the costs of production considered? No-till generally involves the cost of herbicide and its application, but tillage is an energy-intensive operation, and over the long term diesel will only become more expensive. If the fuel must be grown, shouldn't the percentage of the overall crop area required to grow it be deducted from the net yields? How does no-till look after performing that calculation?
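The deduction I have in mind is simple enough to sketch. All numbers below are hypothetical, chosen only to make the arithmetic concrete, not drawn from any study:

```python
# Hypothetical illustration: compare tilled vs. no-till yield after
# deducting the crop area needed to grow fuel for tillage itself.
# Every number here is invented for the sake of the arithmetic.

def net_yield(gross_yield_per_acre, fuel_area_fraction):
    """Yield per acre after setting aside land to grow the fuel crop."""
    return gross_yield_per_acre * (1.0 - fuel_area_fraction)

# Suppose tillage yields 100 units/acre but 8% of the land must be
# devoted to growing its fuel, while no-till yields 95 with no set-aside.
tilled = net_yield(gross_yield_per_acre=100.0, fuel_area_fraction=0.08)
no_till = net_yield(gross_yield_per_acre=95.0, fuel_area_fraction=0.0)

# Tillage's apparent yield edge shrinks, or reverses, once the
# fuel-growing land comes out of the total.
print(round(tilled, 2), round(no_till, 2))
```

Under these made-up numbers the ranking flips; the point is only that the comparison changes once the fuel acreage is counted.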
Nor have we yet seen the full benefits of no-till, because we have yet to develop equipment appropriate to it. Western civilization is so accustomed to tillage that we tend to be blind to the assumptions that stem from treating tillage as the foundation of agriculture. We see equipment built to perform tillage at work and don't think twice about it. There have been some adaptations – spraying equipment that is only as heavy as it needs to be for that purpose, and oversized tires for heavier equipment – but nearly all of the equipment in use, even in no-till operations, still deals with land as a bulk commodity, measured in acres per hour, rather than at the level of detail required to, for example, selectively harvest one crop while leaving several others, intermingled with it, undisturbed.
Until recently, this could only be accomplished by hand labor, but with the advent of computing using integrated circuits, and its combination with sensory hardware, sophisticated mechanisms, and software to match the problem space (together comprising the field of robotics), the question of whether such work can be mechanized has been transformed into one of how soon. A significant obstacle to this development is cultural, in that we've all but forgotten how to tend land in this manner, and may have to reinvent the practice in order to program the machines. Certainly many in our agricultural colleges and universities will require remedial education.
Sunday, October 12, 2014
James Gosling, famed software developer who has spent his last several years working at Liquid Robotics, was recently the featured speaker at a CMU Robotics Institute seminar. My purpose here is not to discuss that talk as a whole, but to focus on particular issues he discussed that are more generally applicable.
At 52:10, he begins the discussion of fault management, describing, among other things, how LR relies heavily upon features of Java that support continuous operation in the face of problems that would cause software to stop abruptly in other environments.
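Gosling's point is specific to Java, but the general shape of the pattern, a supervisory loop that catches whatever a task throws, logs it, and keeps going, can be sketched in any language. Here is a minimal Python version; the task name and structure are my own invention, not anything from the talk:

```python
import logging

logging.basicConfig(level=logging.WARNING)

def read_sensor():
    # Hypothetical task: stands in for any periodic job that may fail
    # (a flaky sensor, a dropped link, a parse error).
    raise IOError("sensor timed out")

def supervise(tasks, cycles=3):
    """Run each task once per cycle; a failure is logged, not fatal."""
    for _ in range(cycles):
        for task in tasks:
            try:
                task()
            except Exception:
                # Swallow the fault so one bad task can't halt everything;
                # the traceback still goes to the log for later diagnosis.
                logging.exception("task %s failed; continuing", task.__name__)

supervise([read_sensor])  # logs three failures, never crashes
```

The essential property is that an unanticipated exception in one component degrades the system rather than stopping it, which matters when the hardware is hundreds of miles offshore.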
At 54:30, he discusses communication modes and data prioritization, which is an issue for LR because real-time transmission can cost them as much as $1/kilobyte, for a data rate of ~50 baud.
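At $1 per kilobyte over a ~50 baud link, deciding which messages go out first is worth real engineering effort. A priority queue drained against a byte budget is the obvious shape for that triage; the priority levels and message contents below are invented for illustration, not LR's actual scheme:

```python
import heapq

class Uplink:
    """Queue messages by priority; drain them within a byte budget."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # tiebreaker keeps FIFO order within a priority level

    def enqueue(self, priority, payload):
        # Lower number = more urgent: 0 might be a fault alert,
        # 9 bulk science data that can wait for a cheaper window.
        heapq.heappush(self._queue, (priority, self._seq, payload))
        self._seq += 1

    def drain(self, byte_budget):
        """Send the most urgent messages that fit; leave the rest queued."""
        sent = []
        while self._queue and len(self._queue[0][2]) <= byte_budget:
            _, _, payload = heapq.heappop(self._queue)
            byte_budget -= len(payload)
            sent.append(payload)
        return sent

link = Uplink()
link.enqueue(9, b"wave-height histogram, several hundred bytes of data...")
link.enqueue(0, b"FAULT: rudder")
print(link.drain(byte_budget=40))  # the fault alert goes out; bulk data waits
```

A real scheme would also have to weigh message age and staleness, but even this sketch shows why prioritization is inseparable from a per-byte cost model.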
At 57:46, he briefly discusses security issues, which he says he could have talked about at much greater length.
At 58:43, he mentions Java's write once run anywhere advantage, and how LR makes good use of it in writing and debugging their software.
At 1:05:17, he responds to a comment from the audience regarding inclusion of a basic feature, camera panning, the consequences of various approaches to crafting hardware to support it, and how LR has worked around the problem.
At 1:07:59 he launches into the topic of parts availability, or lack thereof, noting that chips LR would like to acquire are only available as part of circuit boards, or in large lots, which constrains their choices in hardware design.
This last item, the lack of availability of what are, in a volume context, standard parts, is my main motivation for going to the trouble of posting this. It holds back not only the development of robotics, but electronics startups of all sorts, and, to a lesser extent, hobbyists (because in most cases those complete boards are what they need).
Wednesday, October 08, 2014
While casting about for some way of putting the phenomenon of the Islamic State in context, it occurred to me that the history of Christianity provides a rough parallel – the Inquisition.
Sure, the Inquisition was organized more like a court than a military operation, and no one was guaranteed a place in Heaven for participating in it, but the idea of harsh punishment for heresy or apostasy was as much a part of it as it is today a part of the Islamic State.
One huge difference is that the Islamic State is, of necessity, also a civil authority, and that among its ambitions is the elimination of foreign influences from the territories it considers to be its domain; in that respect it is more like the war of reconquest (La Reconquista), which achieved ultimate success in 1492 and paved the way for the Inquisition.
Perhaps the Islamic State is like La Reconquista and the Inquisition rolled into one.
Friday, August 01, 2014
Wikipedia also has a fairly extensive article on UARTs, the electronic components found at both ends of most serial connections and responsible for encapsulating the complexities of making them work reliably, presenting simplified interfaces to the processors to which they are connected.
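To make "encapsulating the complexities" concrete, here is the bit-level framing a UART applies to each byte: a start bit, data bits sent LSB-first, an optional parity bit, and a stop bit. Real UARTs do this in hardware; the sketch below (8 data bits, even parity, 1 stop bit) just shows the logic:

```python
def frame_byte(data, parity="even"):
    """Return the line-level bit sequence a UART would emit for one byte."""
    bits = [0]  # start bit: the line drops from idle-high to low
    data_bits = [(data >> i) & 1 for i in range(8)]  # LSB first
    bits.extend(data_bits)
    if parity == "even":
        bits.append(sum(data_bits) % 2)  # makes the total count of 1s even
    bits.append(1)  # stop bit: the line returns to idle-high
    return bits

# 'A' is 0x41 = 0b01000001: two 1 bits, so the even-parity bit is 0.
print(frame_byte(0x41))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1]
```

The receiving UART watches for the falling edge of the start bit, samples each data bit at the agreed baud rate, checks parity, and hands the processor a clean byte, which is exactly the simplification the article describes.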
Sunday, July 27, 2014
As just about anyone who knows me can tell you, I'm into robots. But what I'm into is way beyond anything I could build myself, given current resources.
Once you get beyond a minimal level of robotic complexity, you start seeing advantages to breaking out parts of the computational load, keeping them relatively local to the sensors and effectors they manage. This means distributed processors, which is fine, until you start trying to get them to talk to each other, at which point you'll discover that you've just become a pioneer, exploring poorly-charted territory.
It's not that there hasn't been any groundwork at all done, but there's nothing close to being a single, standard approach to solving this relatively straightforward problem.
Nor is that so surprising, because until recently there hasn't been much need to solve it, since most devices had only a single CPU, or, if more than one, then they were tightly integrated on the same circuit board, connected via address and data buses, and most of the exceptions have been enterprise servers, with multiple processor boards all plugged into a single backplane.
But the time is coming when, for many devices, the only convenient way to connect distributed computing resources together will be via flexible cables, because they will be mounted on surfaces that move, relative to each other, and separated by anywhere from a few centimeters to tens of meters. But they'll still need fast connection, both low latency and high data rates.
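Whatever the physical layer turns out to be, the software side of the node-to-node problem is largely framing and integrity checking over a byte stream. A minimal sketch, using a length prefix and a CRC; this packet layout is my own invention for illustration, not any standard's:

```python
import struct
import zlib

def encode(payload: bytes) -> bytes:
    """Frame: 2-byte big-endian length, payload, 4-byte CRC32 of payload."""
    return (struct.pack(">H", len(payload))
            + payload
            + struct.pack(">I", zlib.crc32(payload)))

def decode(frame: bytes) -> bytes:
    """Unframe a packet, raising if it was corrupted in transit."""
    (length,) = struct.unpack(">H", frame[:2])
    payload = frame[2:2 + length]
    (crc,) = struct.unpack(">I", frame[2 + length:6 + length])
    if zlib.crc32(payload) != crc:
        raise ValueError("CRC mismatch: frame corrupted in transit")
    return payload

# Round-trip a message such as a joint-angle report between two nodes.
msg = b"joint angle: 1.57"
assert decode(encode(msg)) == msg
```

Length-prefixed CRC framing is the common denominator beneath most link protocols; the interesting differences between contenders like RapidIO lie in latency, flow control, and how much of this is done in silicon rather than software.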
From what I've seen so far, RapidIO is the leading contender for this space.
Tuesday, June 24, 2014
People distrust authority, and for good reason.
There are many examples, both historical and contemporary, of authority being abused for the advantage (whether personal or collective) of those in authority and/or belonging to the power base behind the authority, or for reasons relating to unquestioned dogma. This is true across the board, whether that authority is religious, political, economic, or even scientific in nature.
There are also many examples of upstart movements and theories, deserving of being smacked down, in each of these realms. Aside from the background of nonsense noise, this is a problem in that it can be very hard to differentiate between a quack and the next Einstein, and broad suppression of quackery risks 'throwing out the baby with the bathwater'.
But beyond that, suppression feeds people's suspicion regarding authority, which plays into the hands of the quacks.
To me this appears to be an irresolvable quandary; the best we can do is to ensure that the public is as prepared as realistically possible to evaluate novel ideas for themselves, and to detect the whiff of quackery wherever it might turn up – even when it emanates from the halls of authority.
Friday, June 06, 2014
Tuesday, May 27, 2014
There is a rumor going around that Apple is (again/still) considering switching to its own ARM-based CPUs in at least its lower-end Macs.
First, consider that platform independence was one of the primary touchstones in the development of OSX. From the beginning, Apple maintained parallel PowerPC and Intel builds, for something like five years before finally deciding to take the plunge. That decision was driven, in the end, by IBM's unwillingness to continue investing in energy-efficient consumer versions of its POWER architecture, and by Motorola's disinterest in what it viewed as a niche market, along with its heavy investment (eventually leading to heavy losses) in Iridium.
Driven by the need for reasonable performance in a very low energy package, Apple has developed its own line of processors, based on ARM, which they've made and sold by the millions, packaged in iPhones, iPods, iPads, AppleTVs, and perhaps even Airport Extremes. Because it owns the designs, the marginal cost of each additional unit is very low, and it's likely that they can assemble a circuit board bearing four, six, or even eight of their own A-series chips for what a single Intel processor costs them.
That Apple would maintain a parallel build of OSX on ARM is practically a given. Of course they do, and they would have been doing so from the moment they had ARM-based chips that were up to the task.
Does the existence of such a parallel build mean that a switch to ARM is imminent? No, but Intel had better watch out that they don't try to maintain profitability by hiking the prices of their processors even higher, because it's very possible that they've already passed the point where Apple could get better performance for less money by using several of their own processors in place of one Intel processor.
And, don't forget that Apple has been through such a transition twice before; it would (will?) be as seamless as possible.