Tuesday, June 24, 2014

why a scientific inquisition would fail

People distrust authority, and for good reason.

There are many examples, both historical and contemporary, of authority being abused, whether for the personal or collective advantage of those holding it and of the power base behind them, or in the service of unquestioned dogma. This is true across the board, whether that authority is religious, political, economic, or even scientific in nature.

There are also many examples, in each of these realms, of upstart movements and theories that deserve to be smacked down. Beyond the background noise of nonsense, the problem is that it can be very hard to differentiate between a quack and the next Einstein, and broad suppression of quackery risks throwing out the baby with the bathwater.

But beyond that, suppression feeds people's suspicion regarding authority, which plays into the hands of the quacks.

To me this appears to be an irresolvable quandary; the best we can do is to ensure that the public is as prepared as realistically possible to evaluate novel ideas for themselves, and to detect the whiff of quackery wherever it might turn up – even when it emanates from the halls of authority.

Friday, June 06, 2014

T-Square, 25 years later

Arguably, considering the ongoing confrontation off the coast of Vietnam, China has come a long way in 25 years, but not yet so far that they can allow this video to be viewed by their own people.

Tuesday, May 27, 2014

ARM, economies of scale, and maintaining options

There is a rumor going around that Apple is (again/still) considering switching to its own ARM-based CPUs in at least its lower-end Macs.

First, consider that platform independence was one of the primary touchstones in the development of OSX. From the beginning, Apple maintained parallel PowerPC and Intel builds, for something like five years, before finally deciding to take the plunge. That decision was driven, in the end, by IBM's unwillingness to continue investing in energy-efficient consumer versions of its POWER architecture, and by Motorola's disinterest in what it viewed as a niche market, compounded by its heavy investment (eventually leading to heavy losses) in Iridium.

Driven by the need for reasonable performance in a very low-energy package, Apple has developed its own line of ARM-based processors, which it has made and sold by the millions, packaged in iPhones, iPods, iPads, AppleTVs, and perhaps even AirPort Extremes. Because Apple owns the designs, the marginal cost of each additional unit is very low, and it's likely that a circuit board bearing four, six, or even eight of its own A-series chips could be assembled for what a single Intel processor costs.
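
To make the economics concrete, here's a minimal sketch of the arithmetic in Python, using entirely hypothetical prices; Apple's actual component costs are not public:

```python
# Entirely hypothetical prices, for illustration only.
INTEL_CPU_COST = 300.0          # assumed price Apple pays per Intel processor
A_SERIES_MARGINAL_COST = 30.0   # assumed marginal cost of one in-house A-series chip

def chips_per_intel_budget(intel_cost=INTEL_CPU_COST,
                           a_series_cost=A_SERIES_MARGINAL_COST):
    """How many in-house chips fit in the budget for one Intel processor."""
    return int(intel_cost // a_series_cost)

print(chips_per_intel_budget())  # -> 10, under these made-up numbers
```

Under those assumptions, even an eight-chip board comes in below the cost of the single Intel part, which is the whole point of owning the design.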

That Apple would maintain a parallel build of OSX on ARM is practically a given. Of course they do, and would have been doing so from the moment they had ARM-based chips that were up to the task.

Does the existence of such a parallel build mean that a switch to ARM is imminent? No, but Intel had better not try to maintain profitability by hiking the prices of its processors even higher, because it's very possible that Apple has already passed the point where it could get better performance for less money by using several of its own processors in place of one of Intel's.

And, don't forget that Apple has been through such a transition twice before; it would (will?) be as seamless as possible.

Sunday, May 11, 2014

deeply encoding real-world knowledge and experience

Why code?

That might seem like a strange question for someone like me to be asking, but it's an important one.

It has become clear to many educators that some facility with data structures, algorithms, and user interfaces is now an important aspect of literacy. While this is a welcome development, it is nevertheless important to ask "to what end?"

Is it necessary, or even desirable, for all of today's K-12 students to grow up to be programmers? Clearly not. There are many other positions that will need to be filled, and, beyond relatively trivial examples, programming is a subtle craft requiring a concurrence of aptitude, attitude, and knowledge to achieve useful results. Most people who are not professional programmers, even if they know enough to put together working code, are in most instances better off leaving the coding to the professionals.

Nevertheless, early exposure can tune one's attitude, and improve both one's aptitude and one's chances of accumulating the necessary knowledge. At least as importantly, it will serve to identify those with a particular gift for coding sooner than would otherwise be the case. But there is value in that exposure that has very little to do with preparation for direct involvement in future programming projects, and a great deal to do with learning to think rationally and to communicate with precision.

Those skills are generally applicable, in all manner of vocations, for reasons having nothing to do with computing, but they become particularly important as decisions formerly made and tasks formerly performed by humans become the purview of machines, whether computers or robots.

For each such real-world context into which some degree of automation is to be introduced, it is vital that there be at least one person who is adept in that domain, or able to interpret for those who are, and who possesses the clarity of thought and expression to guide those tasked with developing the cybernetic systems. Without such guidance, in the vast majority of cases, automation also means a sacrifice of competence, since even senior engineers are rarely domain experts outside of their own specialities, which may or may not apply to the project at hand.

By insisting that all students have some exposure to programming, we improve the chances that such a person will be available to guide the next expansion of the domain of automation, and the next, and the next, and thereby that the knowledge and skills of contextual experts will be preserved in the process.

Tuesday, May 06, 2014

passengers demand to be paid for riding first class

Internet backbone provider Level 3 reports that six of the internet service providers it connects to have allowed those connections to remain continuously congested, and that these same ISPs are insisting that Level 3 should be paying them for access to their networks.

[Insert sound of loud, annoying buzzer.]

The problem with this is that it's backwards. If anyone should be paying for access to a network, it ought to be the companies with subscriber income paying the backbone providers, not the other way around.

Wake up and smell the stench of irrational overreaching, people!

Friday, May 02, 2014

if you could only attend one of the two?

The 2014 Apple Worldwide Developers Conference (WWDC) opens June 2nd in San Francisco, where the company is widely expected to have something more to say about its reportedly health-related 'iWatch' product. Arch-competitor Samsung has just announced its own health-related event five days earlier, also in San Francisco.

I have to wonder just who Samsung thinks is going to attend its event. Local tech journalists with nothing better to do, obviously. But suppose you were a tech journalist based somewhere further away than San Jose or Sacramento, weren't already planning to spend the week leading up to WWDC seeing the sights of San Francisco, and had invitations to both events but could only reasonably attend one of them: which would you choose? For most, the choice would be obvious, and it wouldn't be Samsung.

We can presume the coverage of the Samsung event will come from 1) locals, 2) junior staffers sent by their editors, and perhaps 3) a vacationing pundit or two.

So why is Samsung going to the trouble, when the most likely outcome is that its event will serve as a set that merely lofts the ball for Apple's spike?

I see three ways in which Samsung stands to benefit.

If Apple makes no mention of anything resembling an 'iWatch' in the public keynote which opens WWDC, then, for a few weeks or months, Samsung looks like the company that's actually doing something about health, and gains a degree of credibility for being in the market from the beginning, when in fact they are very late entrants.

If, on the other hand, Apple does introduce the 'iWatch', Samsung's event will serve to focus even more attention on it than would have otherwise been the case, drumming up even more hype, and, presumably, expanding the size of the potential market for health-related devices in general, of which Samsung might reasonably expect to eventually inherit a sizable chunk.

The real coup for Samsung, however, would come if the 'iWatch' project is at a stage where Apple would prefer to delay its announcement but, having been thus challenged by Samsung, opts to go ahead with a pre-announcement even though product availability is still months away, providing Samsung with both a clear target and time enough to pull off one of its rapid cloning acts.

Very clever, actually.

Wednesday, April 23, 2014

the departure of Apple's yes-men

Much has been made of the turnover at Apple since the death of Steve Jobs, with more than a few concluding that Apple's time of amazing success is over, to be replaced by either stagnation or decay. Steve was the source of innovation within the company, they argue, and without him Apple is doomed.

There's no doubt that Steve was a genius, in his own way, and that Apple's turnaround and rapid ascension to contend for the title of most valuable company in the world was, in no small part, his doing. On the other hand, as much as he relished being surrounded by brilliant minds who could steal the spotlight from him, there is a strong tendency for such powerful leaders to become encrusted with others for whom the truth is whatever the leader says it is, and who contribute little more than amplification of the leader's insights and predilections.

No more. Those days are gone at Apple, or at least so dramatically altered as to require a wholesale changing of the guard. Tim Cook may not have Steve's charisma, but neither is he as susceptible to flattery, and, as long-time operations chief, he has a great deal of practice in peering through pretense to gauge whether a person, partner firm, or product proposal contributes to the company's health or degrades it.

Anyone who made a career of being a yes-man for Steve would have a very hard time of it in today's Apple, and I would like to suggest that this underlies the departure of at least a few from the company.

Sunday, March 30, 2014

please rebrand Facebook!

Facebook obviously isn't going away, despite having paid far too much for WhatsApp.

So, please, before doing so becomes even more difficult, find a new name for it!

"Facebook" derives directly from the company's origins, but, frankly, it sucks as a name.

My preference, given their recent purchase of Oculus, would be "The Rift", but almost anything would be preferable to "Facebook".

Saturday, March 29, 2014

the myth of unitary authority

"Them" – we've all heard, and probably said it, thousands of times, that vague reference to those who are really in control, whoever they might be.

I no longer believe in "Them", at least not in the sense of a single, mutually aware group occupying the top of the pecking order for all purposes.

Sure, there are people who wield more power than others, particularly in specific contexts, but there are millions of them, and taken together they are so far from being a united force in human affairs that the notion is frankly laughable. Even "Citizens United" only comes close to actually being of one mind on a very narrow range of issues. Outside of that context, its members are all over the map.

My advice? Spend less time worrying over what "They" might be up to, and more time and energy on figuring out what we all need to be doing in this epoch, and how you can contribute to that.

Thursday, March 27, 2014

making more efficient use of real-world data bandwidths

UPDATE: Almost simultaneously with my posting this, Microsoft announced Word, Excel, and PowerPoint for iPad. While editing requires an Office 365 Home subscription, the free apps work as viewers without that, so they have essentially just shipped a free PowerPoint viewer for iPad. My recommendation? Get Keynote instead. It's fully functional for $10, and it also works as a PowerPoint viewer.

While all of us not in the business, or holding stock in one of the major broadband providers, would like them to both drop prices and raise bandwidth above the threshold where it ceases to be a meaningful constraint on internet use, we shouldn't hold our proverbial breath. Prices charged to consumers may come down, and overall bandwidth will surely continue to rise. But unless the FCC sees its way clear to declare data transmission a utility, and those who provide it common carriers, savings to consumers will likely be more than offset by charges to content providers for the full-speed network access they require to remain competitive. Those charges will necessarily be passed along to consumers, except where content providers' price structures already provide enough wiggle room to absorb them.

This mainly affects the delivery of streaming media, streaming video in particular, which needs uninterrupted bandwidth to perform as expected. Buffering can help, but to really be sure that playback won't balk halfway through, the entire program or movie needs to be buffered, at which point it's no longer streaming.
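
To see how much buffering a congested link actually demands, here's a minimal sketch of the arithmetic in Python, with made-up numbers; it assumes a constant delivery rate, which real links don't guarantee:

```python
def required_prebuffer_seconds(duration_s, bitrate_mbps, bandwidth_mbps):
    """Seconds of *video content* that must be buffered before playback
    begins so that, at a constant delivery rate, the stream never stalls."""
    if bandwidth_mbps >= bitrate_mbps:
        return 0.0  # the link keeps up; no prebuffer needed
    # Total shortfall over the whole program, expressed as seconds of
    # video at the program's bitrate.
    deficit_mbps = bitrate_mbps - bandwidth_mbps
    return duration_s * deficit_mbps / bitrate_mbps

# A two-hour movie encoded at 5 Mbps, over a congested link sustaining 4 Mbps:
print(required_prebuffer_seconds(7200, 5.0, 4.0))  # 1440.0 seconds = 24 minutes
```

In the limit, as the sustained bandwidth approaches zero, the required prebuffer approaches the whole program, which is the point at which "streaming" has quietly become a download.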

Part of the problem with both streaming media and play on demand is that each instance of delivery is a separate transmission. Multiple data centers allow a content provider to originate transmissions more locally, but thousands of store-and-forward nodes would be required to make them truly local, and the cost of so many network connections at that level could very well prove exorbitant.

An option as old as programmable VCRs is to record the programs you want from a broadcast stream for later viewing. Digital equivalents exist, but my impression is that they do nothing to enhance quality, such as capturing complete files, with adequate error correction, out of the digital cable stream.

If you could capture a bit-perfect file from a broadcast stream, and I'm certain it's possible, that file could also be encrypted, facilitating paid high-quality content.
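
As a sketch of how that might work, using Python's cryptography package, where the capture step and the file names are hypothetical and the whole file is read into memory purely for brevity:

```python
from cryptography.fernet import Fernet

CAPTURED = "program.ts"       # hypothetical bit-perfect capture from the broadcast stream
ENCRYPTED = "program.ts.enc"

key = Fernet.generate_key()   # in practice, delivered only to paying subscribers
cipher = Fernet(key)

# Encrypt the captured file (whole-file reads are for illustration only).
with open(CAPTURED, "rb") as f:
    token = cipher.encrypt(f.read())
with open(ENCRYPTED, "wb") as f:
    f.write(token)

# A subscriber's player, holding the key, reverses the process.
with open(ENCRYPTED, "rb") as f:
    recovered = cipher.decrypt(f.read())
```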

While I'm on the subject, I'd also like to mention the glaring absence of a standard multimedia format that combines video sequences, still photos, transitions, programmed graphics and animations, audio, and so forth, using no more data than each requires. There's Flash, but if it were easy to use, why do we see narrated slideshows recorded as video? There's PowerPoint and Keynote, but the same objection applies. QuickTime may have come closer to providing a cross-platform solution than anything else, and if it were transformed into a player for Keynote files (including video as a media type) and made available on Android in addition to Apple's platforms and Windows, that might be the best available solution.

With this foundation, Apple would be in a position to challenge YouTube by delivering a better experience per unit of bandwidth consumed, while giving content creators yet another reason to own a Mac.