I haven't yet seen the license agreement for iPhone OS 4, because, in my own way, which means on my own time and in fits and starts, I'm working on something for the iPad, which won't be upgraded to iPhone OS 4 until fall. I have some hope of having something ready to ship before then, so I'm still on 3.2. Besides, my phone is an original iPhone and won't be able to run 4.0, so even if I were working on an iPhone app I'd have to get something new to use as a test device, and buying the iPad has already used up that budget for a while.
But even when the time comes for me to switch to the iPhone OS 4.0 SDK, it won't matter one whit that Apple has precluded the use of middleware platforms, since I've quite enough on my hands learning what I need to know to make use of their own frameworks and have no intention of further complicating the task by bringing in someone else's.
Sure, I can see where someone who's been using Flash right along, and had been hoping to avoid the Cocoa Touch learning curve, might feel like they've been slapped down, but Apple's position on Flash has been plain as day for three years, or was until Adobe began muddying the waters by promising Flash developers that they would be able to compile to native iPhone OS apps using CS5. And it should be pointed out that Apple has already allowed some apps built with beta versions of CS5 into the App Store, for use on devices running pre-4.0 versions of iPhone OS; right now that is every iPhone OS device not provisioned for iPhone OS 4.0 development, including all iPads. It's also possible that enterprises doing their own app distribution (outside of the App Store) will still be able to make use of this feature of CS5 even with 4.0, at their own risk of course. Apple has drawn a line in the sand, beginning with iPhone OS 4.0 apps submitted to the App Store.
The wiser pundits seem to agree that it's not so much about Flash as about allowing anything to come between developers and the platform Apple has so carefully constructed, and which they are using to push the state of the art. Any third-party middleware layer that became popular could (likely would) impose constraints on further development of the underlying native OS, if only by being slow to move off deprecated APIs and to adopt new features.
It seems to me that there are two workable approaches to this: either you take the position that Apple has now taken for iPhone OS, beginning with version 4.0, or else you allow any such middleware platform, relying on competition among them to force them to be good citizens, and using the native OS to set a high standard. The latter is the position Apple has taken for Mac OS X. Take Java, for example: Java has never been allowed on the iPhone, but on Mac OS X Apple writes their own Java runtime. If they had access to the specifications that would allow them to do the same for Flash, they probably would.
This is probably the direction that Android is headed, but without the unitary native platform, since each manufacturer has their own approach, and Google hasn't yet shown much talent for herding cats. Another three years hence, Apple will have an even stronger, unified platform, but where will Android be? Lost under a pile of middleware platforms?