Marc Hoffman has confirmed that “Nougat” is to Mac/iOS as “Cooper” was to Java. Some have speculated that this will be based on Mono, but Oxygene has had Mono covered for some time already, so I strongly doubt that this is the case.
Far more likely is that, just as “Cooper” was – among other things – an Oxygene compiler with a back end that emitted Java bytecode, plus language extensions and bindings that made it “play nice” with and in the JRE (and, therefore, Android), “Nougat” will be Oxygene with… what? An LLVM back end? Plus language extensions and bindings to play nice with the Objective-C runtime and Cocoa/Cocoa Touch?
Details such as these are speculation on my part at this point, extrapolating from what was delivered in Oxygene for Java, but I am excited and interested to see what RemObjects have in mind and how they are going to deliver on this.
Some of this appears to be confirmed – or at least strongly hinted at – in the responses and additional detail emerging in a forum thread discussing the new product.
With Lazarus 1.0 being released recently and now this news, these are exciting times for Pascal developers.
I think I already know what I will be spending the money I saved by not renewing my Delphi SA this year on. :)
This is such a powerful kick to borcadero’s head. It could be a KO.
Just look at that evolution: http://www.remobjects.com/oxygene/language/evolution.aspx
Have I been on Mars since those things happened?
And the DXperience suite should run under Oxygene, as confirmed by http://www.devexpress.com/Support/Center/p/Q262593.aspx
So I am definitely considering a “Bye-bye Delphi, DevExpress – Hello Oxygene, DXperience!” situation.
Yep. Lucky for us, RemObjects have all that revenue from their Operating Systems and Applications divisions, which enables them not only to be so innovative but also so open about their current and future plans.
Oh, but wait… :)
It must be all that revenue they get from selling C/S capability to Delphi Pro licensees. :)
Yes. Because Oxygene – or rather Chrome, as it was then – had not been intended as a Delphi competitor. Chrome was an act of self-defence against the foreseeable result of the VCL.NET approach and Borland’s .NET experiments in general.
We are in the IT world, and if we think of Delphi’s age, the Delphi ecosystem can be compared to a club of happy mummies that still have fun on the beach – some more, some less, in a positive sense.
EMB cannot simply move Delphi too fast in a different direction. Delphi grew up in the era of 4GLs. Today you achieve more with OO and the systematic application of design patterns, and with concepts that go beyond them. Since this shift is settled, programming languages will have to reflect it, and solution architectures will change.
Not considering the Zeitgeist can be harmful, as the business you don’t make is business you don’t see, and vice versa.
The RemObjects approach is simple: stay native on the client OS and abstract the communication to the server via an optimized protocol. We benefit from a certain affinity for, and commitment to, Pascal. Honestly, it is better to have your own compiler…
They have lots of revenue from things that are not Oxygene. What’s more, Oxygene has a much smaller scope than RAD Studio:
1. No IDE to develop/maintain.
2. No RTL to develop/maintain.
3. No VCL to develop/maintain.
etc. etc.
Now, I don’t want to take anything away from Oxygene, but you aren’t telling the whole story.
How far along is their ShineOn effort?
The lack of a common RTL and class library may not be an issue for new projects or short-lived projects targeting only one platform, but having pure-Oxygene collections and class libraries sounds preferable to me to using the .NET/Java/Objective-C-specific stuff.
ShineOn looks to be extremely limited to me.
Looks like Rick Burgstaler is pretty much a one-man army on that project (and even he appears to have lost interest almost 3 months ago). :)
To be honest, I hadn’t thought about the lack (?) of common RTL until you mentioned it. Up until now I suppose ShineOn has been primarily for people migrating from Delphi to Prism/Oxygene and generally not looking back once they’ve got over the .NET fence, so I suspect that people making that move would be perfectly happy to look forward, and embrace the facilities in .NET.
But with more platforms being supported by Oxygene for which a common code-base makes more obvious sense, the need for a common RTL may become important – that might even explain why ShineOn has gone quiet, if there is perhaps now a more concerted, official effort under way, for example.
Pure speculation on my part, obviously.
> if there is perhaps now a more concerted, official effort under way, for example.
I guess the Mapped Types are about that
http://wiki.oxygenelanguage.com/en/Mapped_Types
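For anyone who hasn’t met the feature: roughly speaking, a mapped type lets you declare a Pascal-style class that compiles straight down to an existing platform class, with no wrapper object at runtime. A very rough sketch along the lines of that wiki page – the name, the .NET target and the member-mapping details here are purely illustrative, not taken from any real library:

    type
      // On .NET this "StringList" is just List<String> under the hood; a Java or
      // Cocoa build would map the same declared surface onto that platform's list class.
      StringList = public class mapped to System.Collections.Generic.List<String>
      public
        method Add(aValue: String); mapped to Add(aValue);
        property Count: Integer read mapped.Count;
      end;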
However, even if that is good enough for the simpler containers, behaviors specific to the classes of each platform are going to leak through for the more advanced classes.
So while it’s a nice kick-starter, I don’t think it’s a replacement for a behaves-the-exact-same-way-everywhere class library.
> a behaves-the-exact-same-way-everywhere class library
Ah, the Holy Grail of cross-platform development. Some say it has yet to be found – others maintain that it is a dream, existing only in myth and legend… :)
Not possible for everything, but it should certainly be achievable for all the collections, algorithmic, or collation stuff. :)
This is the flip side of standing on top of mature platforms. Yes you can get a very powerful product to market quickly because you don’t need to provide the RTL. But for cross platform, all the RTLs are different. Square that circle!
‘C’ managed it. :)
w.r.t. Oxygene: unlike ‘C’, it wasn’t originally designed for cross-platform use (technically I think ‘C’ was “platform neutral” rather than “cross-platform” as we think of it today), so it’s no real surprise that some of the scaffolding you would expect of a cross-platform language isn’t in place in Oxygene, but that sort of thing can be retrofitted really very easily.
Yep, as I suspected – RO are “squaring the circle”, with some Sugar frosting. :)
The C standard library was the result of an immense standardisation effort. Likewise the C++ standard library.
There is nothing else I know of remotely close to the portability of C and C++. Certainly Oxygene can’t touch C or C++ for portability.
Not sure where sugar comes into it, but wake me up when you find anything else as portable as C or C++.
Google is your friend.
“Sugar” is the provision of a standard, portable RTL for Oxygene to address the very concerns you raise. But Oxygene doesn’t have to be as portable as C/C++, it only has to be portable across the platforms it chooses to support.
That will just result in a weak and compromised RTL, in my view.
So if they don’t provide a common RTL then it’s a non-starter, but if they do then it won’t be good enough ?
And people say I’m negative! :)
No, I didn’t say either of those things. My point is simply that native libraries (e.g. C++ standard library), if done well, are better than cross-platform libraries that wrap other libraries.
There are all sorts of trade offs in all sorts of different dimensions.
The C RTL was created at a time when there were no platform libraries to rest upon. The RTL rested directly upon the machine. With modern architectures there is an abstraction between most applications and the machine.
Even with Delphi/Win32 the RTL relies – in some places – on the Win32 abstraction.
In any event, you seem to be rushing to dismiss “Sugar” based on what you think it might or might not eventually become, and about which currently, by your own admission, you know little. Seems a bit premature, that’s all.
> With modern architectures there is an abstraction between most applications and the machine.
In the case of C++, that abstraction is C++.
> In any event, you seem to be rushing to dismiss “Sugar” based on what you think it might or might not eventually become, and about which currently, by your own admission, you know little. Seems a bit premature, that’s all.
Maybe. I’d be very happy to be proved wrong. But my instincts are that such an endeavour will have to make compromises and they will limit the utility of the end product.
And by choosing their battles they are far more effective as a result. But that does not alter the simple fact that being open and communicative is a choice and an approach, not a cost centre.
La vieille garde meurt, mais elle ne se rend pas (“The Old Guard dies, but it does not surrender”). :)
Enjoy Oxygene. I have used it for years now and it has served me well.
Spring.NET might offer some inspiration here. They have attempted to port over to .NET some of the enterprise code and design patterns found in the open-source Java framework.
Rather than come up with a new design, perhaps port this popular framework to Oxygene.
Great! Looking forward to some CodeRage sessions on Oxygene!
> With Lazarus 1.0 being released recently and now this news, these are exciting times for Pascal developers.
I’d add Smart Mobile Studio to the mix. But the $399 cost seems rather steep for a v1.0 product, vs. $499 for Oxygene for .NET, Java & “Nougat”.
If they were using an LLVM back end that could do native codegen to Intel architectures, wouldn’t you do Windows before Mac? Or is Cocoa on iOS the big thing here?
Again, they already have the Windows desktop covered with .NET. Heck, the only meaningful WinRT support in RAD Studio XE3 is courtesy of the inclusion of Prism (i.e. Oxygene for .NET).
I think Cocoa + iOS is The Big Thing, in the same way that The Big Thing (or at least the headline event) w.r.t. the Java support was Android. The ability to target OS X is probably a bit of a sideshow, but obviously that’s my interpretation, not necessarily RO’s actual thinking, and apologies to anyone at RO if this is hopelessly wide of the marc [sic]. :)
That depends on your viewpoint. I can’t see .NET being viable for my floating-point-heavy, performance-critical app any time soon. So targeting the native hardware still matters to some people. But no doubt that matters for only a small minority of devs.
Me neither – but it is specifically not your viewpoint or mine that I was referring to, but RO’s. :)
Of course benchmarking is a complete quagmire at the best of times, but that said, you might want to try some.
Chances are .NET will give your floating-point-heavy, performance-critical code a good speed INcrease on modern processors.
.NET has a JIT load-time performance penalty.
However, on modern processors, runtime performance is generally much better than Delphi’s.
see
http://webandlife.blogspot.co.uk/2011/12/c-performance-vs-delphi-performance.html
and
http://stackoverflow.com/questions/145110/c-performance-vs-java-c
(which, although not Delphi-specific, is about native vs .NET)
The bigger problem with .NET isn’t the best performance that can be achieved, it’s the predictability/reliability of that performance.
The problem for a high-performance application is usually not that it can’t get the performance it needs, but that this performance might suddenly deteriorate as the optimal runtime conditions for the GC become sub-optimal and the GC is forced to step in and clean things up, putting the real work of the app on hold while it does so. For very many applications this situation might never arise, but for some that predictable, reliable performance is crucial.
You can make a lot of Delphi code run a lot faster by not freeing memory when you’re done with it too. :)
I’ve never seen any strong evidence that my app would run as fast managed as unmanaged. I only need it to be 10% slower on managed for it to become a problem. My clients don’t care what language I have to code in. They just care about the bits that they can see.
Have you seen any strong evidence that it would NOT run faster with managed code?
Of course I don’t know the internals of your app, but typically well-written managed code runs faster than well-written unmanaged code (on modern processors).
> well written managed code runs faster than well written unmanaged code
This is of course impossible unless the managed code is doing less than the equivalent unmanaged code.
Managed code has two areas where it can make gains over “unmanaged” (although imho there is no such thing as “unmanaged” code – this is a falsely negative distinction, made with the intention of favouring the more attractive idea of code being “managed” vs not. The two real distinctions are “runtime managed” and “developer managed”).
1) Specific CPU optimisations made on/for the host machine. These are – as far as I know – largely theoretical, not actual. That is, managed runtimes have the potential to use JIT/NGEN pre-compilation to produce hardware-specific optimisations on the actual deployment hardware, but the impression I get is that this potential has never actually been realised. This isn’t surprising or controversial. Since “managed code” is – even by the creators of the management frameworks (Microsoft) – not intended for high-performance critical applications, why would they go to the lengths of creating high-performance optimisations?
But beyond this, neither JIT nor NGEN are or can be perfect: JIT cannot fully optimise as the time taken to do the analysis and optimisations would offset any gains, so JIT has to be a compromise. NGEN cannot fully optimise as it does not have the awareness of the runtime environment to be able to make certain optimisations that might otherwise be possible (NGEN can, as a result, produce code that is sub-optimal!)
2) Deferred runtime costs. Specifically, in operations involving memory allocations/deallocations, runtime managed code benefits in a direct comparison with developer managed code from the fact that some operations the developer managed code performs during those operations are deferred until later by the runtime managed code. The cost is still there in the application, but it occurs later. The problem here is that this benefit relies on and assumes that there will be a suitable “later” period of time when this cost can be incurred without being noticed, i.e. in an “idle” period. For some applications these “idle” periods don’t occur often or reliably enough, and so sometimes the runtime manager is forced to halt your application while it does some necessary housekeeping that it can no longer defer.
For the vast (VAST!) majority of applications these are edge case considerations that very often don’t apply, but there are certain types of applications (always running, high workload – mostly service-type data processing etc) where the considerations are very real and the impact of runtime management is unacceptable, in particular the unpredictable nature of that impact.
> > well written managed code runs faster than well written unmanaged code
> This is of course impossible unless the managed code is doing less than the equivalent unmanaged code.
If the JIT optimises better than the Delphi compiler for the processor being used, then the resultant code will be faster.
On modern processors this appears to be the case in some circumstances e.g.
http://webandlife.blogspot.co.uk/2011/12/c-performance-vs-delphi-performance.html
I’m sure there are plenty of cases showing the reverse.
It does however cast doubt on the widely held belief that .Net code is slooooow.
The usual caveats about benchmarks & the meaning of the words “managed” and “unmanaged” apply. :)
> If the JIT optimises better than the Delphi compiler for the Processor being used
That’s a big ‘if’, and the arguments in this area are typically founded on the theoretical capabilities of an optimal JIT implementation, not based on benchmarks of actual JIT performance.
The benchmarking article you linked to is particularly useless except in very niche areas. The author dismisses the impact of memory consumption as insignificant and focusses on FPU operations. This is bewildering, as in the overwhelming majority of cases FPU performance is going to be far less significant in terms of impact on the performance – if not the simple viability – of an application than memory consumption. The only real conclusion you can draw from that article is that the author had a goal in mind and constructed their methodology to reach that goal.
And he offsets developer productivity savings against application performance. Seriously?!
That’s like comparing the running costs of cars based not on fuel efficiency or service interval but on how quickly it was built in the factory!
No, I’m afraid the only doubt cast by that article is on the methodology. :)
The real-world anecdote with empirical observations that the article begins with and sets out to “debunk” is not in fact disproven by the exercise. Far from it. The fact that the real-world experience was at odds with the benchmark results merely serves to establish that the benchmark itself is flawed.
Apart from anything else, if FPU performance is critical in your niche Delphi application then you can of course use any number of techniques to get better performance than that offered by the raw compiler/RTL implementation. You can even call upon techniques to leverage GPUs to get performance that I am sure will leave even the most optimised JIT performance dead in the water.
Which isn’t to say that there are no benchmarks of realistic, representative .NET code vs Delphi, C/C++ etc. that do show an advantage to .NET, if not parity, but that article isn’t one of them. At least, not a very convincing one. :)
You put forward some very good points there.
It’s definitely a complex subject.
It’s the widespread, casual dismissal of .NET code “because it’s soooo sloooow” that annoys me.
(not accusing you of that, of course :) )
If you really need heavy calculation stuff like that, there is no reason why you can’t do all the “dirty” stuff in an unmanaged way, mark it as unsafe, and wrap it in a class that “guards” the managed code from it on input and output.
http://wiki.oxygenelanguage.com/en/Unsafe_Code
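To make that pattern concrete, here is a minimal sketch (invented names, .NET target assumed, and the pointer-based inner loop elided and stood in for by a plain loop – the unsafe marker and the corresponding project setting are as described on that wiki page): only one small, private routine is ever allowed near the unmanaged/unsafe territory, while everything the rest of the code sees is an ordinary managed method.

    namespace CalcDemo;

    interface

    type
      Cruncher = public class
      public
        // Safe facade: callers pass and receive ordinary managed types only.
        class method SumOfSquares(aValues: array of Double): Double;
      private
        // The "dirty" part, kept in one place; in a real implementation this is
        // where the unsafe, pointer-based work over a pinned buffer would live.
        class method SumOfSquaresCore(aValues: array of Double): Double;
      end;

    implementation

    class method Cruncher.SumOfSquares(aValues: array of Double): Double;
    begin
      if (aValues = nil) or (aValues.Length = 0) then exit 0.0;
      exit SumOfSquaresCore(aValues);
    end;

    class method Cruncher.SumOfSquaresCore(aValues: array of Double): Double;
    begin
      var total: Double := 0.0;
      // plain loop standing in for the pointer-based version
      for i: Integer := 0 to aValues.Length - 1 do
        total := total + aValues[i] * aValues[i];
      exit total;
    end;

    end.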
David,
we’ve got Windows covered with what – personal tastes and fear of the unknown aside – is IMHO the best approach to developing for that platform: .NET.
Cocoa opens Oxygene to new possibilities; I don’t see what a CPU-native Oxygene compiler for Windows would bring to the table, except to satisfy the “native code: check” checkmark that many people, mostly irrationally, seem to want.
The funny thing is that doing one would probably be a piece of cake once Nougat is done, but the question would remain, what would you do with it.
Any possibility that you might allow such a project to occur in the open source/community space? Is the architecture conducive to such an approach?
Not saying you should, just asking if it might even be possible. Then you could let other people who wanted it (or felt they needed it, or, heck, would just enjoy doing it) check that “native” box without yourselves having to lift a finger. :)
I’d say that’s something we can let Carlo have a play with once Nougat is out and he’s getting bored again ;).
I think your strategy is good. I’d love to be able to stand on .net rather than Win32. But for me it comes down to performance.
Why do you think, in your case, managed code will be slower?
In my experience, Oxygene run-time performance is mostly superior to Delphi.
Because I’m doing floating point and memory intensive calculations. And all evidence I have seen points to them being slower on .net.
http://webandlife.blogspot.co.uk/2011/12/c-performance-vs-delphi-performance.html
Just read here: http://www.remobjects.com/oxygene/nougat.aspx
Nougat sounds nifty. Can’t wait to try it.
W
Nougat sounds really interesting. I’ve been using Delphi since v1 and have been waiting *very* patiently for it to support cross platform development. Now that they have removed the limited support they had in XE2, I think I am going to seriously consider alternatives. I played with “Chrome/Oxygene” in the past, but now see that it has evolved considerably.
Have downloaded demo version for Java/Android and hope that it will inspire me.
I too may be (reluctantly) saying bye bye to Delphi. Certainly not going to renew SA next time it comes up for renewal. Also not prepared to pay extra for the mobile platform support, which I’m sure will be flaky in the first few versions even if it does materialize next year.
What? ShineOn? Come on… That was a toy project and NEVER got finished – just download it and take a look… There is no RO interest in keeping a minimum of compatibility with existing Delphi and Pascal code (yes, Pascal is not only Delphi). The RO guys have a different idea of what Pascal should be, and that’s the first thing that kept me away from Chrome when I first saw it.
I imagine that if RO had used, say, Java instead of Pascal as the language… there would be an RO Java flavor out there completely incompatible with anything else out there today. I’m not saying that they should not evolve and create new things… but an RO-Pascal-style project IS NOT PASCAL at all.
That’s nonsense. Delphi is just as much “NOT PASCAL” as Oxygene.
I think what he is trying to say is that the RO version of Pascal has deviated so far from the standard Delphi version as to make most of the existing base of Delphi code not usable at all in Oxygene.
Maybe things have changed since I last looked at it, but you couldn’t even compile a simple Delphi class without making major modifications. They may as well have called their language Z++. The only thing in common was begin..end.
And yes, Delphi may not conform to 1960 ISO NW Pascal, but it is generally considered to be the de facto standard of Pascal in the modern era.
Um, there was never any such thing as “ANSI 1960 Pascal”. The ANSI standard when Delphi was released was 1993 Extended Pascal. The de facto standard was Borland Pascal 7 with OWL. I don’t think Delphi was compatible with either actual or de facto standard. But it was embraced because those deviations made the language a better fit for developing Windows applications – the purpose for which its particular flavour of Pascal was intended.
And that’s exactly what RO did with Chrome/Oxygene. And lucky they did too, so that there was at least some variant of Pascal for .NET when Delphi.NET proved too much of a disaster – being neither fully compatible with Delphi/Win32 nor a sufficiently capable .NET language – to survive.
If the Delphi language hadn’t stagnated so much in recent years there might be more compatibility between Delphi and Oxygene, but in recent years in particular the approach has been to leave the stagnating syntax alone and add “new features” by applying generics and RTTI to problems that could be more elegantly (and in some cases more efficiently) solved with a deeper, richer core language.
imho. ymmv.
Well Jolyon, AFAIK Delphi and Free Pascal (yes, that compiler you are so interested in these days) are at least 99% compatible (of course I’m not considering VCL/LCL and other framework stuff, only plain Pascal syntax). Oxygene has in common begin..end and := for assignment, and that’s all… Please explain why in the hell “procedure” and “function” had to be replaced by “method”? There is no technical reason for it, just the preference of the Oxygene creator for that word instead of the accepted Pascal standard. Can I compile procedures and functions using Oxygene? Yes, sure, by changing some project option, but this is just stupid IMO. Maybe it is a great compiler and product, but changing Pascal syntax out of pure vanity is just stupid for their company.
The only pressure I have today to leave Pascal/Delphi is that we can’t find any good new programmers who already have some knowledge of Pascal/Delphi. Can you imagine WHERE in the world I will find an Oxygene developer? If someday I have to abandon Delphi/Pascal to go for, say, .NET, of course I will use the mainstream language, not a niche language that can’t even please a die-hard Pascal fan like myself.
Which just goes to show – damned if you do, damned if you don’t. :)
On the one hand some people complain that Pascal is an old language containing idiosyncrasies that are nothing but the vestiges of its origins as a teaching language, and then when someone dares to modernise the language to remove those things, other people complain that it was just for vanity.
In original Pascal, the distinction between procedure and function was more than just that one had a return value and the other didn’t. In original Pascal you had no choice but to use, or at least store, the result of a function – it was illegal to call a function without making use of its return value. An attempt to do so would not even compile. This was – presumably – to teach the importance of handling return values when expected/required to do so.
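A tiny illustration of that rule, for anyone who never used pre-Delphi Pascal (a sketch only: in classical Pascal the commented-out call would be rejected outright, whereas Delphi and its descendants will happily compile it and discard the result):

    program ResultRule(output);

    function Square(x: Integer): Integer;
    begin
      Square := x * x
    end;

    begin
      { Square(3);   illegal in classical Pascal: the result is not used }
      writeln(Square(3))   { fine: the result is consumed }
    end.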
Since RO created the compiler and their language primarily for their own use it is perfectly understandable that they would take a step back and design it for modern use, rather than simply repeating history, for the sake of it.
ime, knowledge of frameworks is more important than knowledge of language syntax. The latter can be learned in a matter of hours. The former can take years to accumulate. So the decision to not create an entire Oxygene “VCL” but to create something close to the platform makes the job of hiring effective Oxygene (.NET) developers very easy: find someone who knows .NET. Whatever language they currently know, learning Oxygene will be easy and all their existing .NET knowledge is simply and immediately usable.
The problem with hiring Delphi developers is not finding people with knowledge of the language but knowledge of the VCL and RTL.
The guys who say “Pascal is an old language that contains idiosyncrasies that are nothing but the vestiges of its origins as a teaching language” simply don’t use Delphi/FPC/Oxygene nowadays and will never use them, no matter what! This is their way of confessing that they prefer C-like languages when they don’t have a single real argument for doing so…
Can you please give only ONE decent argument that I can use with company board to justify using Oxygene instead of C#?
Let’s see:
– “Language syntax is not as important as framework knowledge” (your words) -> So there is no point in keeping Pascal there, right? After all, I can pick up C# in a week.
– “There is no migration path from Delphi to Oxygene” (my words) -> Maybe because the RO guys just have better things to do than creating one, or maybe because they don’t want to contaminate the sacred language with those Delphi things, like that ugly T before the class name…
– “Oxygene is not Delphi” (RO guys’ words) -> So, if there is no minimum of code-base compatibility and all my 10,000,000 LOC of Delphi code goes straight to the garbage can, the BEST thing for the company to do is use the mainstream language. Using anything else would be an act of vanity (like changing some Pascal reserved words).
One decent argument? OK (although of course whether it’s “decent” or not is your call – there are others to choose from, though :) ): Oxygene has historically supported .NET better and sooner even than C#. Oxygene (then ‘Chrome’) was the first shipping development language for .NET to support Generics, and later it was the first to support Sequences and Queries.
As a bonus I’ll give you another: Oxygene is Object Pascal based, not C based, so you intrinsically have code that is better organised and easier to read and understand by default. :)
On a couple of your other points:
There is a migration path from Delphi to Oxygene. It may not be a “wormhole” path that gets you straight from A to B, but it’s a darn sight shorter path than the one from Delphi to C#.
There is some code compatibility, so not all your 10m LOC is garbage. That statement is only true when switching to C#.
On the specific point of procedure/function vs method, they didn’t change any keywords (in this area), they simply introduced one which could be used in place of two redundantly separate ones – if you wish.
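To illustrate (a made-up declaration, not from any real codebase; the project option mentioned earlier in the thread is what also lets the commented-out Delphi-style spellings compile):

    type
      Widget = public class
      public
        method Refresh;           // no return value – where Delphi would say "procedure"
        method Count: Integer;    // returns a value – where Delphi would say "function"
        // with the compatibility option enabled, the traditional keywords still work:
        // procedure Refresh;
        // function Count: Integer;
      end;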
“Maybe things have changed since I last looked at it, but you couldn’t even compile a simple Delphi class without making major modifications.”
sorry, but that is just silly and wrong.
I’m sorry, maybe I’ve been asleep and missed something important, or maybe I’m just dreaming. Someone please correct me if I have this wrong, but when I went to the RemObjects site to look at Oxygene “Nougat”, I also noticed that RemObjects now seems to be selling Oxygene .NET directly (i.e. with NO reference at all to Prism from Embarcadero).
When did this happen? Did I miss an announcement, or has this been kept a bit quiet? Or am I just wrong? I am sure that when I looked a week or two ago, the RemObjects sites still said Oxygene .NET could only be supplied by Embarcadero as “Prism”.
Now it appears that not only can you buy Oxygene .NET for $499 (instead of spending megabucks on RAD Studio) but in that price you get the “Nougat” and Java versions too. If that’s right, the offer looks very enticing, especially with the questionable directions that Delphi seems to be going in lately.
You’re not dreaming :). AFAIK this all happened in the last few days, coinciding with the latest release of Oxygene 5.2.
As I understand it, you can now only buy the standalone Oxygene from RO. But you can still get it as part of Rad Studio from EMBT.
This is BRILLIANT, as licensing is handled by RO, no need for EMBT at all.
I bought mine today.
It gets better: any Delphi user can buy it as an upgrade for $399.
Any Delphi Prism XE2 user can renew for $349.
And you get all three targets: .NET, Java & Nougat (when it’s ready).
See
http://blogs.remobjects.com/blogs/mh/2012/09/06/p4717
for the official skinny.
I’ve been using Lazarus for a good long time now to make my games; however, Oxygene has got me really pumped, as I wouldn’t have to think about how to get my games onto all platforms by rigging Lazarus for each. As much as that is doable, Oxygene is fully supported.
My only question is what do I do about an IDE as I run a Mac and NOT a Windows machine for my development. I wouldn’t mind the Visual Studio IDE and could deal with the change except that Microsoft hasn’t made it available to Mac. (yet?)
Barring that, I’d jump on it right this second for all my future game projects. Essentially it would allow me to release a game, if I choose my API right, for Windows, Xbox Live Arcade, Mac OS X, iOS, Android, OUYA and Linux. All from one single code-base, with no dialect or major porting issues.
I believe you can use MonoDevelop with Oxygene…
Interesting. However, after just going to Microsoft TechEd, the niggles in the back of my head about .NET seem to be coming home to roost.
I’m not saying Microsoft are about to drop .NET, but the days of .NET being THE one true way on Windows are over. JavaScript, .NET and C++ are the 3 supported development environments now, and when you dig a bit deeper you find that the one true way underlying all three is actually enhanced COM, with the most direct mapping naturally coming from C++.
I’m talking WinRT here of course, and who knows how long that will last. It’s a fairly blatant attempt to go after Apple and Android’s market share and Microsoft may give up after a year or two. On the other hand there are very, very big dollars at stake, including the future of Nokia and quite possibly the future of Microsoft, so my money would be on them putting everything they can into this for quite a few years. And while WinRT is the focus you can expect to see Microsoft doing everything they can to push their legions of Windows desktop developers into developing for WinRT.
Again, .NET isn’t going anywhere, but nor is it Microsoft’s holy grail any more, and performance isn’t going to be helped by all the wrappers WinRT applies to it in order to get it to behave like C++ COM software…