The Myth of WORA

For years, businesses have searched for the holy grail of the development world – the notion of writing a program, application, app, or whatever they’re now calling it, just once, and deploying it on any number of systems. Hence the term “Write Once, Run Anywhere,” a.k.a. WORA.

This notion really started to gain traction, or at least publicity, in the mid-’90s when Java took the business world by storm. Through the Java Virtual Machine (JVM), companies could be assured that their valuable software assets could be reused on Windows, Solaris, Unix, Linux, or Mac operating systems. The problem was that the implementation of the JVM varied across operating systems. In addition, the apps created with the technologies of the day, like applets and Swing, produced clunky user interfaces.

Today, we have a whole new set of technologies with the same old set of promises. I don’t know how many times I’ve heard business folks say they want to learn about this HTML5 thing and how it can bring them to the promised land of WORA. After rolling my eyes (once I’ve turned away from them, of course), I proceed to explain that this is something that is more promise than reality. Besides, says I, who would want that anyway? Let me explain.

Be Smart

Central to the idea of WORA is the notion that you can build a user interface once and not have to worry about it again. This may have been fine in the past, when all you had to concern yourself with were desktop machines with roughly the same visual specs (although even that is debatable – remember VGA vs. SVGA vs. XGA?). But now, with the proliferation of mobile and tablet devices, we have a real mess on our hands.

For example, I can go to any Web site on my iPhone 4, even something like http://www.bbc.co.uk/. But why would I want to pinch, zoom, and scroll my way around a site designed for a much larger screen? Wouldn’t it be a better user experience if the iPhone user were presented with something targeted to their particular device, or at least class of device? Of course it would. And the BBC thinks so too, since by default, it presents cell phone users with a mobile version of their site.

So the moral of this story is, just because you can do something, doesn’t always mean it’s the right thing to do. Let me repeat that, just because you can… ah, anyway, you get the point.

Targeting Devices

Remember two paragraphs ago when I mentioned the term “class of device”? Of course you do, it was two paragraphs ago… what’s that? ok I’ll wait for you. Anyway, this is a very important term, so let me explain what I mean by it.

There are two key factors in grouping devices together into a “class”. It is not operating system (you’ll hear this a lot — target iOS devices! target Android devices!), nor is it form factor (you’re starting to hear this a lot too — target cell phones! target tablets!). What we should really be concerned about are two things: input and output.

By input, at this point to keep it simple, I mean that a mouse is a much different input device than a finger is, and you need to account for that. Now, it is possible to make visual elements that accept both forms of input, but that may end up restricting the extent to which you can target the user experience.
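To make the input distinction concrete, here’s a minimal sketch in plain JavaScript (my illustration, not from any particular framework) of classifying a device by input capability rather than by OS or form factor. The `windowLike` parameter is a hypothetical stand-in for the browser’s `window` object, so the function can be exercised outside a browser:

```javascript
// Sketch: classify a device's input by capability, not by OS or form factor.
// `windowLike` stands in for the browser's window object.
function inputClass(windowLike) {
  // Touch-capable browsers expose an `ontouchstart` property on window.
  return ('ontouchstart' in windowLike) ? 'touch' : 'pointer';
}
```

A “touch” UI would then get larger hit targets and no hover states, while a “pointer” UI could rely on hover and finer click targets.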

By output, I mean the combination of three things: screen size, screen resolution, and pixel density. Screen size obviously gives you an idea of how much “stuff” you can fit on a page. But hold on, pixel density comes into play here as well. Pixel density is measured in pixels per inch (PPI), which you get by dividing the number of pixels along the screen’s diagonal by the diagonal screen size in inches.

So, for instance, the iPhone 4’s retina display will give you a whopping 326 PPI, whereas a super sharp 720p 42″ HDTV will give you a surprisingly paltry PPI of 35. A better comparison might be the iPhone 4 to the iPhone 3GS. The earlier model has the same sized screen as its successor, but its linear resolution, and therefore its PPI, is half of what a retina display offers (163 PPI).
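The arithmetic behind those numbers can be sketched in a few lines of JavaScript. (The diagonals below are the commonly quoted rounded values, so the results land within a few PPI of the official specs.)

```javascript
// Sketch: pixels-per-inch from a screen's resolution and diagonal size.
function ppi(widthPx, heightPx, diagonalInches) {
  const diagonalPx = Math.sqrt(widthPx * widthPx + heightPx * heightPx);
  return diagonalPx / diagonalInches;
}

console.log(Math.round(ppi(960, 640, 3.5)));   // iPhone 4: ~330 (Apple quotes 326)
console.log(Math.round(ppi(480, 320, 3.5)));   // iPhone 3GS: ~165 (quoted as 163)
console.log(Math.round(ppi(1280, 720, 42)));   // 42" 720p HDTV: ~35
```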

What this means is that for two devices with identically sized screens, the app that you create might look substantially different on both. This may show itself in a number of ways — extra whitespace, minuscule text, buttons that are hard to tap, and so on.

So what I’m saying in a nutshell is forget about this notion of writing a user interface just once and forgetting about it. Sure it will work, but you’ll have an awful lot of complaints once you release your product (or worse, no customers at all).

What You Can Do

So now that we know what you can’t, or rather, shouldn’t do, you may be asking whether there is anything that can be written once. Fortunately, the answer is yes.

First off, the whole point of the previous section was to show how classes of devices can be targeted together. So, while this doesn’t create a “write once” scenario, it does provide for a “write a few times” situation.

Secondly, what we’ve been talking about is the user interface and the overall user experience. What we haven’t mentioned is all the other “stuff” that constitutes a complete application. Things like events, functions, object models, business logic, and the like can all be reused.

This is where the promise of HTML5 (or HTML, as they now call it) comes into play. Remember, “HTML5” is a misnomer; what we’re really talking about is a set of Web technologies, namely HTML, JavaScript, and CSS. Using these technologies, you can absolutely reuse a good portion of your code.

For example, I am currently working on a project where we’re utilizing backbone.js as the MVC framework. The MVC pattern certainly lends itself nicely to segregating what is reused from what needs to be targeted. Any backbone.js Models, Collections, Events, or Views (which are, oddly enough, really controllers in the traditional sense) can be reused across UIs. Then, through the use of templates, CSS, or both, one can target the UI to specific classes of devices, as mentioned above.
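The shape of that split can be sketched in framework-agnostic JavaScript. All names below are illustrative, not from the actual project: the data and rendering logic are written once, while the template is chosen per device class.

```javascript
// Sketch: shared logic, with a template selected per device class.
// All names here are hypothetical, for illustration only.
const templates = {
  desktop: item => `<div class="row">${item.name}: ${item.detail}</div>`,
  phone:   item => `<div class="card">${item.name}</div>`
};

// Written once: the same render logic serves every device class,
// falling back to the desktop template for unknown classes.
function render(item, deviceClass) {
  const template = templates[deviceClass] || templates.desktop;
  return template(item);
}
```

In a Backbone app the same idea maps onto a single Model/View pair whose `template` property is swapped out per device class.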

New Term

In the end, although WORA is pie-in-the-sky stuff, it’s still nice to know that there are some things that can be reused. And of course, some reuse, as realized when deploying Web apps, is preferable to no reuse, as is typically realized when deploying native apps.

How about a term to sum this all up: WUFTEEORA (Write the UI a Few Times and Everything Else Once, Run Anywhere). Catchy, ain’t it?


63 Responses to The Myth of WORA

  1. Vidyasagar Venkatachalam says:

    Thoughtful article! It brings out many interesting points regarding WORA from a UI dev perspective, especially display compatibilities. Regarding the alternative acronym to WORA – WUFTEEORA – however, I’m in disagreement with you for the following reasons:

    1. I don’t think “Everything Else Once” is correct. E.g., consider a JEE-based web app which requires application-level security (authentication and authorization). If we opt for container-managed security, then obviously a security feature which works in WebLogic may not work in Tomcat and vice versa. So a WAR is not completely portable across servers. To solve this, either we need to have different deployment descriptors for each server type or use third-party frameworks like Spring to take care of the changes transparently. This is similar to using different CSS/templates to target the UI to specific classes of devices. In essence, the configuration-changes problem pertains not only to the UI but to other types of applications as well.

    However, one thing I’d like to emphasize is that an application which is coded (written) in Java obeys the WORA principle at least at the code level, if not at the configuration level.

    2. (Kind of a moot point) Sorry to say, I’m not impressed with the acronym WUFTEEORA :-) especially when compared to WORA, which is short and easy to remember.

    • Gene says:

      Thanks for your feedback!

      To respond, first off the acronym (or really, initialism) was just a bad joke. I certainly would never advocate such a monstrosity!

      Regarding your excellent points – I probably should have made it more clear that I was talking about WORA from a mobile standpoint. As such, I am partial to an HTML/CSS/JS solution that utilizes web services, rather than a Java/JSP solution. The run everywhere aspect that I referred to was client side rather than server side.

      Again, thanks for your thoughtful comments.

  2. How about Write Once, Compile Individually – kinda like how you build in Unity 3D.

    You take the core set of code and modify it for each output.

    • Gene says:

      I suppose that’s an option, depending on which language you choose. One consideration though – the nice thing about JavaScript, being an interpreted language, is that you can easily deploy new code at runtime, if need be.

  3. JonO says:

    WORA was a marketing ploy by Sun. It’s not too surprising that a number of bean counters fell for it, but the only developers who ever espoused the concept were the anybody but Microsoft boys.

    • Gene says:

      Actually, bean counters and developers alike, are still “falling for it” today, primarily due to a lack of understanding. I agree, though, that it was heavily pushed by Sun.

    • Darin says:

      First, I loved this article and completely agree with the aspect of “classes” of devices and building UI for those devices.

      But I do have to comment about the “marketing ploy” comment. While any big company can be accused of focusing on marketing instead of practicality… oh wait. Companies are in the business of making money, so they ALL market what they are selling. But to accuse Sun of selling WORA as ONLY a marketing ploy that didn’t actually result in value is absurd.

      I’ve been a developer for more than 25 years and have been working with Java for a little less than 10. We have developed a multi-million line business application for a specific vertical market that has a client and a server component. Either piece will run on Linux, Windows, Solaris or Mac (and probably others) with minor code differences to handle printing and some other OS specific things. This gives our company the option of choosing deployment configurations and features that are not driven by developers.

      Granted, we are looking at developing a separate client for phones and have no intention of attempting to deploy our current client in that environment. With that being said, I think we have an application that is truly WORA. I believe that is what Sun originally meant when it began pushing WORA.

      I get really tired of the “fanboy” attitudes of taking sides. The idea that the only people who “espoused the concept were the anybody but Microsoft boys” is just another way of saying “I’m a fanboy, just on the opposite side”. I just wish I could read an article like this, one that contains excellent content, and not have the following discussion littered with fanboy hatred.

      Come on people. Let’s be professional and look at the reality of the tools we use. If you think a tool you use is the only option and everyone else is wrong, you are probably wrong. By the same token, if you think a tool is totally useless and has no value, you are also probably wrong.

      • Gene says:

        Thanks Darin, you make excellent points. I agree that you can have different degrees of WORA, so that maybe what Sun was pushing was not technically a myth, if looked at from a server-side standpoint. But then again, I seem to remember them pushing client-side reuse as well, which does not make a whole lot of sense in this day and age (or maybe not even back then).

        Agreed too on the fanboy comments. You pick a technology and you’ll have a slew of fanboys, that’s just the way it is.

    • Pete says:

      WORA was a marketing ploy by Sun. It’s not too surprising that … the only developers who ever espoused the concept were the anybody but Microsoft boys.

      This is oh so true. I started my software career in the early 1990s, and I must say how interesting it has been (if “interesting” is the right word) to watch the fads and trendiness of the industry over two decades.

      • Gene says:

        Pete, I understand your point of fads and trends, but I’m not sure that I agree that the only developers who ever espoused the concept were the ‘anybody but Microsoft boys’. There have been plenty of other companies who have pushed the same thing, but just called it something different, including Microsoft.

  4. David Boccabella says:

    Well the concept of WORA is somewhat correct. The term for modifying an application to run in a new environment is ‘Porting’

    So it should be WOPORA, or Write Once, Port Often, Run Anywhere.

    Take Care
    Dave

    • Gene says:

      I guess it’s a matter of semantics, but whatever you call it, it’s still a myth! ;-)

      I cringe when I think about the UI/UX of ported code. In the desktop world, Eclipse is probably the nicest I’ve seen, and even that is far from ideal.

  5. Hi, I have been looking for websites where I can find stuff to read on programming, and this one is just fine. I’m in a situation where I need to code small modules or big programs for both Linux and Windows, and I’m unable to find the best tool/language. The problem is that the C# CLR targets the Microsoft environment, and for Java a JVM is necessary. I tried my hand at Python, which is cool; even with Python or Ruby you can write apps that support WORA.

    However, if we talk about Objective-C or C++, that’s the best choice, as there is no need for a JVM or CLR sandbox.

  6. NCurrie says:

    Unfortunately for your argument, many of us do develop single Java user interfaces that get deployed to multiple operating systems. Our application is written once and deployed by our customers without change on their Macs, PCs and Linux machines. And magically it looks like a Mac application on a Mac, and a Windows application on a PC. The magic comes from the JVM: each operating system has a different JVM, and it’s the JVM that is responsible for making sure our single Java application works in an OS-specific way.

    As Vidyasagar says, the problem in the Java world is with the Everything Else, specifically deploying J2EE applications in different application servers. But even here, the problem isn’t with the core Java WORA philosophy, but with the configuration of services that these application server containers provide.

    • Gene says:

      But don’t you think that the varying screen sizes and pixel densities of mobile devices change things? I can’t imagine that your desktop apps that today run on Macs, PCs, and Linux machines are easily used on mobile devices, setting aside for a minute that some major OSs (most notably iOS) don’t even have a JVM.

      • NCurrie says:

        Of course. But we didn’t design the application to run on mobile devices. The fact that we only have to maintain one code stream for three operating systems (compared to our C++ colleagues who have to maintain two codestreams just for PC and Mac) is such a huge advantage that we’re happy anyway.

        We’re now designing the next generation, which will have to run on mobile devices, and surprise, surprise, we’re using HTML, JavaScript and CSS :-) .

        • Gene says:

          Ah, this hits on a good point that you and others have made – there’s such a thing as WORA for targeted platforms. I still think that’s going a bit against the initial spirit of the acronym, but it’s a valid use case.

  7. Christian Sciberras says:

    Though I’m not easily an advocate for Java (I dislike Java in general), one must admit its merits.
    Although the article in general describes several of my own thoughts about this matter, it simply is not fully true.

    Speaking of Java, have you ever used NetBeans IDE? I use it all the time, be it Windows 7, XP, Linux or a Mac. How better could it get?

    I’ve myself started a project which builds on the ideals of WORA, but it is instead a framework. So far it’s doing a great job with this concept. Sure, there are issues, but the article seems to over-exaggerate on most of the issues.

    Mind you, I rarely use Java, and I’m planning to stay away from HTML5 (and its well known hype) as much as a web developer can.

    Point is, WORA isn’t a myth, it works. Period.

    • Gene says:

      Not sure I fully understand your comment. How does WORA work in your case? See my response to the last comment regarding Java and what happens when you need to also deploy onto a mobile platform. That changes the equation considerably.

      • Christian Sciberras says:

        Gene,

        In my case WORA takes the form of an abstraction interface. Example: if the web developer wants to change the title, the interface ensures this is done, no matter what unearthly hack it took to get to it. Of course, the interface is intelligent enough not to set a title if the target platform doesn’t support or need titles.

        What I said above works for mobiles as well. If the mobile screen can’t get any larger than 500×200 pixels, it’s a platform issue, and the framework degrades gracefully. That doesn’t necessarily make the end application usable, though. If there’s no way an IDE like NetBeans can be thrown onto a mobile, it’s not a WORA shortcoming, but the device’s.

        Remember Java’s début in embedded systems? Such as powering a microwave oven’s interface? Again, it’s running an app that can run on computers as well as such ovens, but it’s not possible the other way round – not because WORA fails, but that the device makes the application unusable.

        • Gene says:

          Not sure that I agree with your assumption Christian. Why would it not be a failure on the promise of WORA to not run on all devices? Isn’t that what the ‘A’ stands for?

          Also, the term “degrades gracefully” makes me envision a green screen on my iPhone. Hey, now there’s an idea for an app! ;-)

  8. Knut says:

    The WORA concept was pushed by Sun, and we don’t remember why.

    In the late ’80s, most enterprise software was for a specific brand, if not a specific machine. Even recompiling for a different machine was not that easy. Taking a module, fiddling a bit, and integrating the component into your software was just a dream.

    It would be nice to have it this way nowadays. All software producers would have a lot of work, with a big part of it repeated tasks, which are easier to plan. Even if you are fifty and don’t have a bright idea every day, you can have a nice outcome under these circumstances.

    • Gene says:

      Wait a minute, are you saying that you’d like to go back to the stone ages of software development, or are you just pulling my leg? Maybe we should bring back floppy disks and punch cards while we’re at it!

      By the way, if you liked ’80s coding so much, you just have to check out Objective-C! :-)

      • Christian Sciberras says:

        Actually, Objective-C is pretty modern. I’d suggest trying out C/C++.
        (and in case someone hasn’t noticed, C++ is still heavily in use…)

      • Knut says:

        What’s wrong with the stone ages of software development ?

        The relative payment was better and you could blame the hardware for any kind of delay.

        Working today is a hassle from one task to the other. If you were paid well for it, it would be stressful but OK. But most software engineers get just enough to make their living. There are no riches to earn. I don’t know anyone in this job who would be able to stop working tomorrow and live on capital income.

  9. technogeist says:

    Having suffered using Windows CE and, more recently, Windows Mobile, I can say that the lack of a freely available JVM made WORA a complete joke. Ditto for Apple Mac users.
    Out of sync versions are another barrier to WORA. (and it’s just as bad with .Net vs. Mono)

    • Gene says:

      Finally, a commenter who just gets it! Mobile does change the equation quite a bit, doesn’t it…

      • Christian Sciberras says:

        Gene / technogeist,

        Unless the article was a way to attract disgruntled customers, I don’t see the point of bashing a concept (which, FYI, works well) when the problem isn’t the concept but the end-user devices.

        Oh, and another FYI: if the devices fail, that’s also the customers’ fault. Why? Because the customer focus is on “social apps” – i.e., reimplementing specific PC functions on a mobile – instead of focusing on real issues such as stimulating research into next-gen interfaces.

        Think about it, how much effort do you need to get a 100 character text message and a “like” button on a screen? Sounds like another case of the industry being built from the wrong side up, till some big company gets an “aha” moment and sees what should have been done instead.
        Firefox vs MSIE, Chrome VS Firefox, HTML5 VS plugins and vendor-specific-code are just some of the many examples.

        • Gene says:

          Christian, I’m sure you’re making good points, I just don’t seem to be getting any of them.

          Regarding the industry’s lack of focus on “stimulating research into next-gen interfaces”: if the interface of an iPhone (or even Android devices) isn’t next-gen, I don’t know what is. Throw in some of the research and products in the pipeline from Microsoft (e.g., Kinect, Surface, etc) and others, and I’m not sure you have a leg to stand on!

  10. Alberto says:

    Acronym aside, the idea of multiplatform development is always there. There are various options, but I totally agree that the holy grail of developing only once is a myth.

    Basically, that’s because the user experience depends on the platform you target. Talking about desktop applications: if you develop for Mac OS X you need to take into account Apple’s Human Interface Guidelines, but if you target Windows, Linux, or any other platform, there are other things to take into account that cannot be applied across the board.

    There are application frameworks that try to help with these tasks, like Qt, wxWidgets, etc., but you will always have to develop special code for each platform to give the user the experience needed in each case.

    Lately it’s true that the trend is to develop with web technologies (HTML5, CSS3, and JS). And it is a good approach, because you are working with standards that should behave the same way on any platform/browser. I guess this is the important point: using a standard. But the user interface should always be adapted properly. Anyway, it is pretty easy to adapt with just CSS rules. It is close to the WUFTEEORA acronym :)

    • Gene says:

      That’s right Alberto, the real limiting factor is the user experience. And as much as people would like to use standard HTML technologies to make this go away, there are simply too many variables to allow that to happen. Thanks for your thoughts.

  11. Nasir Abbas says:

    Good article.

  12. Ricardo Santos says:

    This does not only apply to applications but to games as well. On the PC the mouse is king and you have a whole keyboard at your disposal. Yet I keep seeing developers not taking advantage of it when porting a console game. Adding shortcuts is a one-day job that adds to the user experience. Why force the user to fiddle with a circular menu to find the inventory, instead of just pressing the ‘i’ key?

    • Gene says:

      That’s an excellent point Ricardo. There are other things besides the visuals that need consideration when targeting platforms. Touch vs mouse is one, but you just hit on an example that pertains to platforms that have been around for years now.

  13. Bill Wade says:

    With RM/COBOL-85 from Ryan-McFarland (later Liant Software, and now MicroFocus), and our own UI engine, we ran an accounting system on DOS, Windows, and virtually every flavor of UNIX, including AIX running as a guest OS on an IBM 360. Granted, it was a character-based UI…

    • Gene says:

      A second vote for a green screen iPhone app! But seriously, in your case, I’m guessing the UI engine had to be tweaked for each new targeted platform. If so, that’s an approach that is analogous to what I suggested in the blog post, and one that is definitely not WORA. Thanks for your comment Bill!

  14. BrainiacV says:

    I thought it was WOTE (Write Once, Test Everywhere) :-)

    • Gene says:

      Let’s not even go there. :-)

      Good point though. Testing is certainly a consideration and mobile brings its own set of complications to the table. However, some of this can be mitigated with proper architecture and modularization of the code.

  15. Erik says:

    I can’t believe some comments are blaming the new devices.
    They clearly do not understand the point of the argument.

    They need a new acronym – WORAOALTICATRINI
    (Write Once, Run Anywhere, Or At Least Anywhere That I Care About The Rest Is Not Important)

    Your point about input/output is spot on. The multiple device types and resolution types should force developers to think more about how they design these apps, but it also forces some companies to make decisions about how many they can or need to support from a resource perspective.

    • Gene says:

      Erik, that is a fantastic point. Companies are already being faced with such decisions. iOS and Android are currently the dominant mobile platforms. So some are wondering if they need to target webOS or Windows Phone, and if they do, whether or not they can leverage HTML technologies to do so.

      The point they need education on is that even within a single operating system, such as Android, devices vary so widely, that the decision of what to target needs to happen at the device class level.

      Thanks for your feedback.

  16. Brian High says:

    Wonderful article! Having been in the software field for more years than I care to acknowledge, I absolutely agree with your statement that “written once UI components” are so weak they’re typically unusable and therefore, a myth.

    Obviously, this was understood in the concept of tiering applications, or separating out the UI components from the other elements you mentioned such as business logic and data access functionality. Your point seems to be that HTML5 and its related technologies offer this separation in another generic or industry standard “framework” when desired.

    Personally, I’d like to point out where we all seem to limit ourselves in the classification of I/O. As an industry, we’ve been married not only to visual output but also point-and-click input methodology. Implementing more technologies and applications that rely more on audio I/O would remove many of these UI problems because a microphone and speaker(s) operate basically the same across all devices, with the exception of multi-speaker playback and audiophile quality.

    The technology has been available for some time, but the problem is most users aren’t willing to calibrate a device. Without this calibration there’s so much variation across speech patterns that the accuracy rates plummet for input, but output is much easier in that category. If a user could calibrate their speech pattern once and then have that recognized across all their devices (even a watch!) they’d potentially be more willing to invest their time in it. This is where we need something standardized (like HTML5) to provide this functionality and make it available to any application that wants to use it. (Is anyone from Apple, Microsoft or Google reading this?)

    Granted, there are a number of limitations to using audio I/O, however I feel it’s a major component to reducing UI complexity and the extra overhead you acknowledge that’s required for “multi-device class deployment” or MDCD. Only slightly catchier than WUFTEEORA. :-)

    • Gene says:

      Brian, I can’t believe you don’t consider WUFTEEORA to be catchy – what’s the matter with you? Maybe if you heard me speak it, you’d change your mind. :-)

      That’s an interesting solution you’re proposing. But I wonder if the limiting factor of speech recognition is less about training (even though that can be an issue, as you point out), and more about privacy. In the comfort of your own home, it may certainly fit the bill, but in a public place, or even in the workplace, there are issues around being heard. Throw in the fact that you could potentially have a much noisier and distracting workplace, and it may be a non-starter for many corporations. Although, come to think of it, things are pretty noisy as they stand, what with hallway/cubicle conversations and cell phone chatter.

      • Brian High says:

        I didn’t say WUFTEEORA isn’t catchy… just not as catchy as something half as long. :-)

        I agree there are some privacy and environmental issues. Headphones eliminate the output problem (this does seem to be the iPod/MP3-player era) and there are some cases where an audio interaction is preferable, such as while driving. There are several barriers, as you note; I just don’t feel that technology should be one of them, because it isn’t in this particular case.

        It is an interesting angle on the MDCD scenario. I’ll keep working that acronym. :-)

        Thanks again for a great article!

        • g bruno says:

          in 1995 I worked alongside a speech-input guy who-spoke-loudly-and -distinctly with-a-pause-between-words felt-like-I-was-trapped-in-a-psycho-ward

          • g bruno says:

            in 2004 it did feel great to walk across the room and run my Java application on RedHat, having built it on wind2000 (2K leaky on t alpha blend !) jpegs, charts, rtf, sql. Clunky? no video but.

  17. Bhaskar Jha says:

    The myth of WORA is still believable if we could define what we refer to as “Write Once”. Historically, the concept of WORA was to write application-specific code (like business logic, events, etc.) that would be common to every platform only once, and let the VM address porting issues specific to each platform, saving development effort, time, and cost. The concept was used first by UCSD Pascal and later, in the ’90s, by Sun to promote Java as a one-for-all language.

    In my opinion, the UI experience (device) or server configuration (infrastructure) should not be mixed into the WORA concept, and should be considered a different category – let’s say “Device Personalization” – a part of the tweak/debug/deployment process.

    However, I thank you for writing this practical guide on UI design decision. Anyway what about this acronym- WOTUDARA (Write Once, Tweak UI Device Accordingly, Run Anywhere)

    • Gene says:

      Another great acronym, but thinking about it now, it seems a bit short. ;-)

      Bhaskar, another aspect however, is that on the client side, you can also structure your architecture in such a way that pieces of it, as outlined in my post, can be reused (the M and the C of MVC, if you will). Agreed though that the infrastructure most definitely is a separate concern.

  18. Ernani Gaspar Santos says:

    Congratulations! Excellent point you have brought to attention.
    It made me remember an article I wrote many years ago (the ’80s, to be specific) about “the myth of reuse” and the subroutine libraries for the many things I needed at the time in math and scientific programming and in image storage and retrieval. The point was that when you have a tough problem, you probably have to write some routines from scratch – sometimes in assembly language to save memory and/or make it faster – or adapt the code where the operating system gave poor support. I readdressed the issue at the end of the ’90s, when I thought about a framework for mobile robots which would abstract the robot’s hardware itself, using a kind of configuration or metadata file describing signals, pins, and various data about the processor, the board itself, sensors, motors, and so on.
    WORA is reuse taken to the extreme! In a way, your thoughts address “the myth of reuse” at its extreme. Good engineering is always excellent and makes our lives a lot easier, but every now and then there is the need to do things that are specific – more so in the times you address, when the applications of computing are much more complex than the ones I have talked about. Although I am still programming, and believe that reuse is good for some kinds of standard-behavior apps, I gave up trying it like WORA, or WUFTEEORA, or any other name or acronym. My compliments.

    • Gene says:

      Thanks Ernani, sounds like you’ve been saying similar things for years! I think the key is to educate those ‘not in the know’ who are being fed a completely different story by vendors. They are most definitely still propagating the myth.

  19. Ken says:

    I laughed in the 90′s when Sun was touting WORA, and again later when Microsoft was doing the same thing with .NET.

    You’re not actually running on different environments, you’re writing code that works in the JVM or .NET runtime and those have been ported to different OS’s.

    Your code is running in the JVM, not Windows. It’s running in .NET, not OS X. So in reality you’re just writing once and running in one place; you’ve just been abstracted one level. Just try to uninstall one of those runtimes and see how far you get…

    • Gene says:

      That is true Ken, although I’m not sure the average business person cares about how it’s being accomplished. They’d be happy just knowing it can be done. I guess what I’m saying is that JVM or no JVM, it’s still not a good idea.

      Thanks for your feedback!

  22. Great article Gene. In the context of targeting multiple classes of output devices while developing web-based user interfaces, I think the CSS3 “media queries” feature would be a great fit. Media queries allow you to develop custom stylesheets designed to target output media with specific feature sets. HTML 4 and CSS 2 already allow you to create different stylesheets for the screen and for print; CSS 3 simply extends this idea to include a greater set of filter criteria – including pixel density! Here’s a rule, for example, that becomes applicable only on displays with a pixel density of at least 300 dpi.

    @media screen and (min-resolution: 300dpi) { body { font-size: large; } }
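    To extend that idea, here is a sketch of how several such queries might sit together in one stylesheet, targeting classes of devices rather than individual models. The breakpoints and class names below are purely illustrative, not taken from any particular site:

```css
/* Default styles, written for desktop-class screens. */
body { font-size: medium; }

/* Narrow viewports, e.g. phones (the 480px breakpoint is an assumption). */
@media screen and (max-width: 480px) {
  body { font-size: large; }   /* larger, touch-friendly text */
  .sidebar { display: none; }  /* hide secondary content on small screens */
}

/* High pixel density displays, as in the 300dpi example above. */
@media screen and (min-resolution: 300dpi) {
  body { font-size: large; }
}
```

    The browser evaluates each query at load time (and on resize), applying only the rules whose conditions match the current device.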

    • Gene says:

      That’s a great tip Raj, thanks for the suggestion.

      I didn’t mention it in the post, but it’s also possible to construct a hybrid approach, using native code for the views, while utilizing javascript for the models, controllers, messaging, etc. If you do stick with the HTML stack, however, your recommended technique is certainly a viable approach.
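      As a rough sketch of what that shared, UI-free portion might look like (the names and structure here are purely illustrative, not from any particular framework):

```javascript
// A platform-neutral model: no DOM, no native widgets, just state and logic.
// Each platform supplies its own view and subscribes to change notifications.
function CartModel() {
  this.items = [];
  this.listeners = [];
}

// Register a callback to be invoked whenever the model changes.
CartModel.prototype.onChange = function (fn) {
  this.listeners.push(fn);
};

// Add an item and notify whichever view (HTML, iOS, Android...) is listening.
CartModel.prototype.add = function (name, price) {
  this.items.push({ name: name, price: price });
  this.listeners.forEach(function (fn) { fn(); });
};

// Pure computation over model state; no UI code involved.
CartModel.prototype.total = function () {
  return this.items.reduce(function (sum, it) { return sum + it.price; }, 0);
};

// Usage: any view layer can drive the same model.
var cart = new CartModel();
cart.onChange(function () { console.log('items:', cart.items.length); });
cart.add('book', 12.5);
cart.add('pen', 2.5);
console.log(cart.total()); // 15
```

      Only the rendering code at the edges would need to be rewritten per platform; the model and its event wiring stay shared.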

  21. The biggest problem with WORA is working with too low a level of abstraction. As systems get more complex, they allow more micromanagement of the user interface, and you end up with code that depends on the details of the UI.

    In the case of HTML, for example, the less layout you do in HTML and CSS, and the more you just let the browser lay things out, the less the UI depends on the details of any particular browser or screen.

    The flipside of this, of course, is that the less effort you spend on the UI, the more primitive and simple the UI you build. A simple UI is more likely to “run anywhere” than a complex one.

  22. Just.Another.Developer says:

    The concept of WORA will work if, one day, everyone creating all the browsers out there adheres to the W3C standards. With that being said, it’s up to this consortium to write standards that are stricter and leave no room for other interpretations. When this happens, companies can have WORA apps. Until then, I’ll just laugh when someone mentions that things need to be written cross-platform, even for HTML5 and onwards.

    • Gene says:

      That’s true, but as you say, the W3C would need to get its act together, and that’s not likely to happen. Even then, there’s nothing stopping Microsoft or anyone else from thumbing their nose at the world and doing their own thing.

      All the more reason for targeted UIs. Good point, thanks for your comment.

  23. Jason P Sage says:

    Excellent article. I used a similar approach for a Symbol-brand handheld unit (a suite of them, all with different displays). I had to be creative in a similar manner to utilize each model’s particular resolution to its advantage without making the code murderous to manage. (Simpler times, though – character-cell displays.)

    I’ve always been intrigued by the write-once-run-anywhere stuff, and I think it’s still scripted and slower – whereas FreePascal does the same thing in pure binary, and I declare you hardly have to change anything (maybe slashes for Windows when building directory paths, stuff like that) – and the whole thing comes with one “library”, and it’s solid. Then there is the Lazarus “Delphi-ish” project that uses the core FreePascal (more solid), but Lazarus is pretty slamming too, and of course adds a portable GUI… and it runs on ARM, RISC, Intel, AMD, and quite a few chipsets really… same code.

    I know, major soapbox – but sometimes the new stuff works better when you don’t force it to run bloatware all the time.

    Case in point – take ANY machine running Windows anything, install Lucid Puppy (a live-CD thing – it won’t hurt your PC), and watch what true speed can really be! LOL. It applies to these smart devices too, because less is more.

    Great article. And categorizing the “outputs” you’ll support is good, I think, too… however, it doesn’t mean anything if your core system doesn’t have a nice separation of business logic and actual presentation. It’s like Church and State… (which is why scripted things like ASP and PHP don’t make me jump up and down for joy)… it’s not even the languages – it’s how many developers implement them.

    –Jason P Sage

    • Gene says:

      Wow Jason, you just mentioned a slew of technologies that I’ve never even heard of! I didn’t think that was possible.

      Thanks for the kind words on the article.

  24. Petia Petrova says:

    “CENTRAL to the idea of WORA is the notion that you can build a USER INTERFACE once” – Very heavy lie!!! Which makes the whole article nonsense.

    • Gene says:

      Petia, while I agree with your statement on WORA and UIs, I’m not sure I follow your logic as to why that renders my blog post nonsense. Wasn’t that my central point?
