Friday, April 16, 2010
Update: It appears Apple may be prepared to reconsider Mr. Fiore's application and risk opening a can of worms.
It must have been a tough call for whichever of the people Apple employs to review the iPhone/iPod touch/iPad apps submitted to the App Store was unfortunate enough to draw Mark Fiore's app. As an Apple employee, that reviewer quite likely found a lot to agree with in the editorial content of Fiore's work, and would have liked to approve it, were it not clearly in violation of the section of the iPhone Developer Program License Agreement covering objectionable material.
The point here is the precedent. If they were to allow Mr. Fiore's work into the store, on what basis could they refuse to allow Glenn Beck, or any of a hundred (thousand?) others of a similar stripe, to flaunt their drivel on the App Store?
I can't bring myself to blame Apple for not wanting to open up the App Store to such nonsense, nor for deciding that, in the interest of keeping it out, Mr. Fiore's work also had to be excluded.
(see the Dunning-Kruger effect, via Daring Fireball)
Wednesday, April 14, 2010
Dilger scores
Despite his characteristically over-the-top manner, Daniel Eran Dilger does occasionally contribute a point or two that apparently hadn't occurred to others. Such is the case with his five myths piece on the exclusion of Flash from the iPhone.
Tuesday, April 13, 2010
another perspective on the most recent flap
I haven't yet seen the license agreement for iPhone OS 4, because I'm working, in my own way (which means on my own time and in fits and starts), on something for the iPad, which won't be upgraded to iPhone OS 4 until fall. I have some hope of having something ready to ship before then, so I'm still on 3.2. Besides, my phone is an original iPhone and won't be able to run 4.0, so even if I were working on an iPhone app I'd have to buy something new to have a test device, and buying the iPad has already used up that budget for a while.
But even when the time comes for me to switch to the iPhone OS 4.0 SDK, it won't matter one whit that Apple has precluded the use of middleware platforms, since I've quite enough on my hands learning what I need to know to make use of their own frameworks and have no intention of further complicating the task by bringing in someone else's.
Sure, I can see how someone who has been using Flash all along, and had hoped to avoid the Cocoa Touch learning curve, might feel slapped down. But Apple's position on Flash has been plain as day for three years, or was until Adobe began muddying the waters by promising Flash developers that they would be able to compile to native iPhone OS apps using CS5. It should also be pointed out that Apple has already allowed some apps built with beta versions of CS5 into the App Store, for use on devices running pre-4.0 versions of iPhone OS, which right now means every iPhone OS device not provisioned for iPhone OS 4.0 development, including all iPads. It's also possible that enterprises doing their own app distribution (outside the App Store) will still be able to use this feature of CS5 even under 4.0, at their own risk of course. Apple has drawn a line in the sand, and it begins with iPhone OS 4.0 apps submitted to the App Store.
The wiser pundits seem to agree that it's not so much about Flash as about allowing anything to come between developers and the platform Apple has so carefully constructed, and which they are using to push the state of the art. Any third-party middleware layer that became popular could (likely would) impose constraints on further development of the underlying native OS, if only by being slow to move off of deprecated APIs and to adopt new features.
It seems to me that there are two workable approaches to this: either you take the position Apple has now taken for iPhone OS, beginning with version 4.0, or you allow any such middleware platform, relying on competition among them to force them to be good citizens and using the native OS to set a high standard, which is the position Apple has taken for Mac OS X. Take Java, for example. Java has never been allowed on the iPhone, but on Mac OS X Apple writes its own Java runtime. If Apple had access to the specifications that would allow it to do the same for Flash, it probably would.
This is probably the direction that Android is headed, but without the unitary native platform, since each manufacturer has their own approach, and Google hasn't yet shown much talent for herding cats. Another three years hence Apple will have an even stronger, unified platform, but where will Android be, lost under a pile of middleware platforms?
Sunday, April 11, 2010
recent discovery: the Robots Podcast
You might expect someone who at least imagines himself to have a reputation as a robotics enthusiast to be right on top of the best current sources of open information in the field, but you'd be wrong.
My enthusiasm is primarily for service robots, machines that do tasks people find uncomfortable, boring, demeaning, dirty, dangerous, or insufficiently valuable, such that you cannot find people willing to do them for what you can afford to pay in any but the most starkly depressed economies.
This is a category that hasn't received much attention in recent years, with most press/blog coverage going to robots that physically mimic humans to varying degrees, and most hobbyist activity directed towards battlebots.
So perhaps it's understandable that something as excellent as the Robots Podcast could escape my awareness for nearly two years. In that time this biweekly podcast has accumulated a very impressive collection of interviews with some of the most brilliant people working in robotics and closely related fields.
It's available both via RSS and on iTunes. Do check it out!
Sunday, April 04, 2010
a western understanding of chi/qi/ki
Physical science, as it developed in the west, revolves around what can be compounded out of a few basic units, primarily mass, time, and distance. Velocity, for example, is simply distance divided by time (meters/second), and momentum is velocity multiplied by mass (kilograms*meters/second).
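To make that composition concrete, here's a minimal Python sketch (the numbers are illustrative, not taken from anything above) building velocity and momentum out of the base quantities:

    # Illustrative values only; the point is how the units compose.
    mass = 2.0       # kilograms
    distance = 10.0  # meters
    time = 4.0       # seconds

    velocity = distance / time  # meters/second
    momentum = mass * velocity  # kilograms*meters/second

    print(velocity)  # 2.5  (m/s)
    print(momentum)  # 5.0  (kg*m/s)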
This approach became far more powerful with the advent of calculus, which addressed rates of change (differentiation) and accumulation (integration). Differentiation and integration are complementary concepts.
It was through his invention of calculus that Newton was able to arrive at a theory of gravity (that the attraction between two objects is proportional to the product of their masses and to the inverse of the square of the distance between them, 1/r-squared), and to combine that with momentum to determine that the planets trace out ellipses as they orbit the sun.
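For a sense of how the inverse-square law and momentum combine to produce closed orbits, here's a small numerical sketch (not anything Newton did, obviously; the units are made up so that the sun's gravitational parameter is 1, and the starting speed is below circular speed, which yields an ellipse):

    import math

    GM = 1.0                 # gravitational parameter, in made-up units
    x, y = 1.0, 0.0          # initial position
    vx, vy = 0.0, 0.8        # initial velocity; circular speed here would be 1.0
    dt = 0.001               # time step

    def accel(x, y):
        # Inverse-square acceleration toward the origin: a = -GM * r / |r|^3
        r = math.hypot(x, y)
        return -GM * x / r**3, -GM * y / r**3

    # Leapfrog (velocity Verlet) integration over several orbital periods.
    ax, ay = accel(x, y)
    for step in range(30000):
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay
        x += dt * vx
        y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax
        vy += 0.5 * dt * ay
        if step % 5000 == 0:
            print(f"t={step * dt:6.2f}  x={x:+.3f}  y={y:+.3f}")

Record the (x, y) pairs at every step and plot them, and a closed elliptical path emerges, with the origin (the sun) at one focus.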
If you have a mathematical expression describing the position (in a single dimension) of an object over time, then velocity is the change in that position over time, and acceleration is the change in velocity over time. Another way of saying this is that velocity is the first derivative of position, and acceleration is the second. Actually computing the first derivative of the original expression will give you a new one which describes the instantaneous velocity of the object at any moment, and computing the first derivative of that will produce one that describes the instantaneous rate of change in that velocity at any moment, or the acceleration.
This first derivative of acceleration, which is to say the expression describing the instantaneous rate of change in acceleration (the third derivative of position), is variously referred to as jerk, jolt, surge, and lurch.
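That chain of derivatives is easy to see symbolically. Here's a minimal sketch using Python's sympy library, with a made-up position expression (any differentiable function of t would do):

    import sympy as sp

    t = sp.symbols("t")
    position = 3 * t**3 + 2 * t**2 + t   # hypothetical position over time

    velocity = sp.diff(position, t)      # first derivative of position
    acceleration = sp.diff(velocity, t)  # second derivative of position
    jerk = sp.diff(acceleration, t)      # third derivative of position

    print(velocity)      # 9*t**2 + 4*t + 1
    print(acceleration)  # 18*t + 4
    print(jerk)          # 18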
At this point I'm going to switch from physics to biology, to suggest that jerk is closely related to the effect produced by the firing of neurons to activate skeletal muscles. A single impulse produces a spasm, but little actual movement, whereas an escalating stream of impulses produces a progressive tightening of the muscle, accelerating whatever movement the muscle generates.
But the bodies of higher organisms can't operate through the activation of a single muscle. They make use of cyclical patterns involving many muscles, each of which will alternately contract and relax as it plays its part in the pattern. These patterns are imprinted within and coordinated by the cerebellum, which presents a simpler interface to the rest of the brain.
When you want to raise your arm, you just raise your arm, without having to think about which muscles are pulling on their tendon connections to your skeleton to accomplish this. When you want to run, you may think "left-right-left-right" and/or "faster, faster", but again you're not having to consciously juggle the hundreds or thousands of impulses per second that initiate and sustain the pattern; that's all being done automatically for you by your cerebellum.
The first time you do something new, you're likely to do it very slowly and deliberately, because your cerebellum can only estimate what pattern will work for the new action and your conscious mind is more directly involved to make sure that it stays on track. As the cerebellum gains experience, the conscious mind can safely leave the details to it and simply choose to perform or not perform the action, as well as when, in what direction, and how vigorously.
If you collect a sufficiently complete repertoire of actions that your cerebellum knows how to perform, you may find that you are able to string them together in novel combinations, and even to create new actions on the fly to tie those sequences together. This is partly a matter of the higher brain coming to trust the ‘black box’ of the cerebellum, and to understand how to guide it.
At this point, within the range of the repertoire, the question ceases to be what can you do and becomes what will you do? How will you use it? And your response to that question is your intention.
To get back to the point, I see chi as being the degree of alignment between that intention and the pattern of neural activations the cerebellum produces. Someone whose ‘chi is very strong’ has a high degree of alignment, which is to say that what they intend and what they do are one and the same.
While the concept ‘chi’ is tightly interwoven with physical movement, similar degrees of alignment pertain to the use of speech and the cultivation of emotion. These might be termed truthfulness and respect.
Not the conclusion you were expecting?
Saturday, April 03, 2010
the Day of the iPad
No matter what fate awaits Apple's new product line, you've got to admit that it dominated the moment this morning.
Yes, I was in line too and bought one. Actually, I needn't have waited in line, as I had one reserved and could have picked it up anytime before 3:00 PM, but I wanted to be there, to participate in the launch.
Is it nice? Yes, very nice.
Am I already an expert user? Not hardly. That will take some time, even though I've been using an iPhone almost since the day they first went on sale.
Will it replace either my iPhone or my Mac? No. I can imagine that some future version might replace both, but not this model.
Will I use it in preference to those other devices for some purposes? Most likely, even when I have all three with me, although exactly which purposes remains to be seen. Watching video on a bus strikes me as a slam dunk; the iPad wins that one.
Is there something essentially right about the iPad, which no other device has previously manifested? Potentially. The idea of putting a color touchscreen on a device roughly half the size of a 13" MacBook's screen, split vertically, is brilliant, and the physical design of the iPad is superb, but so far the OS is still a version of iPhone OS, well evolved for the iPhone but not yet ready to take full advantage of the iPad. That's sure to change; iPhone OS 4.0 will undoubtedly advance that process, and what Apple hasn't foreseen will quickly be supplied by developers. It won't be long before we begin to really understand what's so special about such a device.
What I expect we'll discover is that the iPad is better suited than any device before it to bridge the gap between the user and the personal constellation, within the universe of information and connection, which they are drawn to explore.
Thursday, April 01, 2010
25 years and still going
The WELL's home page is a little more entertaining today than usual. I suppose such things are bound to happen given that its birthday falls on April Fools' Day.