Adobe Flash or HTML5?

More and more sites are moving away from Adobe Flash, for good reason. Announced today, SlideShare, an online tool for sharing presentations, has completely rewritten its site to use HTML5 instead of Adobe Flash.

This means the site is now functional not only without the Flash plugin, but also on iPhone, iPad, Android and any other mobile platform that supports HTML5. They’ve also claimed a 30% speed increase, and the move should help with search engine optimisation, since the content is now readable markup rather than an opaque plugin.

So why ditch Adobe Flash? Since its beginnings as Macromedia Flash, it has been used to irritate and annoy web users through advertising and animation, and to frustrate programmers, while also delighting gamers and helping designers.

However, it has a few flaws: notably, Flash content isn’t readable by a spider or search engine, and it is only available on platforms that support the Adobe Flash plugin.

Adobe themselves claim a 99% installation rate on PCs, much higher than, say, Java or QuickTime – we’d love to question those figures – and in any case the plugin is simply not available on most mobile platforms.

What are the issues?

  • Flash is a proprietary product: the code for how it works is not available, and Adobe frequently change and update it. It has also become notorious for causing browser memory and stability issues, so much so that Apple declined to support it on the iPhone and iPad.
  • Not only that, but Flash has been a primary vehicle for delivering malware and viruses, especially via social networks such as Facebook.
  • Using Flash tends to break the conventions of normal HTML pages: selecting text, scrolling, forms and right-clicking are handled by the plugin rather than the browser.
  • Flash Player has to be able to animate on top of video, which makes hardware-accelerated video rendering more complex than in a purpose-built media player.
  • Flash content is also blocked by default in some browsers, and will only play after being clicked.

So what are the alternatives? Naturally HTML5 can solve a great number of these issues, while also preserving your site for web spiders and search engines. However, HTML5 does have issues with video.

Flash seems to be retreating from the ‘experience’ part of the web, and is now almost always used only for video streaming and games. The reasons are simple: gaming in HTML5 can be complex, and Flash has been refined almost solely for that purpose.

As for video, there are a few options. Due to various disagreements, there is no single standard codec for HTML5 video – unlike Flash streaming, which uses FLV.

The current HTML5 draft specification does not say which video formats browsers should support; web browsers are free to support any formats they feel are appropriate. Originally Ogg Theora was recommended, since it is not patented and wouldn’t require licensing.

Most browsers will support Ogg Theora, but Safari and Internet Explorer require manual installation of the codec to support playback. Other alternatives such as H.264 are supported in these browsers, but aren’t available at all in others.

A quick guide, taken from Wikipedia, is below; it’s easy to see why so many sites still use Adobe Flash for streaming.

Formats supported by different web browsers (version numbers indicate the first release with support):

Browser           | Ogg Theora     | H.264              | VP8 (WebM)     | Others
Internet Explorer | Manual install | 9.0                | Manual install | No
Mozilla Firefox   | 3.5            | No                 | 4.0            | No
Google Chrome     | 3.0            | No (removed at 11) | 6.0            | No
Chromium          | r18297         | No                 | r47759         | No
Safari            | Manual install | 3.1                | Manual install | Manual install
Opera             | 10.50          | No                 | 10.60          | No
Konqueror         | 4.4            | Manual install     | Yes            | Manual install
Epiphany          | 2.28           | Manual install     | Yes            | Manual install
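
In practice, sites work around this patchwork of support by offering the same clip in several containers and letting the browser pick one it can actually decode, falling back to Flash only when nothing matches. A minimal sketch of that feature detection is below; the file names and codec strings are illustrative placeholders, not taken from any particular site:

```typescript
// Pick the first source the browser reports it can play.
// canPlayType() returns "probably", "maybe" or "" (no support).
function pickPlayableSource(video: HTMLVideoElement): string | null {
  const candidates = [
    { src: "clip.webm", type: 'video/webm; codecs="vp8, vorbis"' },           // VP8 (WebM)
    { src: "clip.mp4",  type: 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"' }, // H.264
    { src: "clip.ogv",  type: 'video/ogg; codecs="theora, vorbis"' },         // Ogg Theora
  ];

  for (const candidate of candidates) {
    if (video.canPlayType(candidate.type) !== "") {
      return candidate.src;
    }
  }
  return null; // No HTML5 codec available – this is where a Flash fallback would go.
}

const player = document.createElement("video");
const source = pickPlayableSource(player);
if (source !== null) {
  player.src = source;
  player.controls = true;
  document.body.appendChild(player);
}
```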

DigiNotar CA hack, and serious weaknesses in security

DigiNotar, the Dutch certificate authority, was recently at the centre of a significant hacking case. On 19th July, the CA discovered that at least 531 rogue certificates had been issued; however, it was only in August that the attack became public knowledge.

Security firm Fox-IT was hired to investigate the breach, and the compromise has incurred financial losses that have caused DigiNotar to declare bankruptcy.

So what happened? Fox-IT reported to net-security.org:

“The most critical servers contain malicious software that can normally be detected by anti-virus software,” it says. “The separation of critical components was not functioning or was not in place. We have strong indications that the CA-servers, although physically very securely placed in a tempest proof environment, were accessible over the network from the management LAN.”

All CA servers were members of one Windows domain, and all were accessible with one user/password combination. Moreover, the password used was simple and susceptible to brute-force attacks.

The software installed on public-facing web servers was outdated and unpatched, and no antivirus solution was installed on them. There was no secure central network logging in place, and even though the IPS was operational, it is unknown why it didn’t block at least some of the attacks.

The DigiNotar-controlled intermediates have been blacklisted in Mozilla Firefox and Google Chrome, and also by other browser manufacturers. The Dutch government announced on September 3, 2011, that they will switch to a different firm as certificate authority.

Some of the certificates signed by DigiNotar include *.google.com, and the attackers even attempted to sign double-wildcard certificates for *.*.com and *.*.org. There seems to be some confusion over whether these “double wildcard” certificates are valid, but if they are, then any .com or .org site could be impersonated to a browser that still trusted DigiNotar.
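
To see why a double wildcard is so alarming, it helps to recall how wildcard matching conventionally works: a single * is only supposed to stand in for one DNS label, so *.google.com matches mail.google.com but not a.b.google.com. The helper below is a rough, illustrative sketch of that rule, not code taken from any real TLS library:

```typescript
// Match a hostname against a certificate wildcard pattern, where each "*"
// stands in for exactly one DNS label (the conventional rule).
function matchesWildcard(pattern: string, hostname: string): boolean {
  const patternLabels = pattern.toLowerCase().split(".");
  const hostLabels = hostname.toLowerCase().split(".");

  // Label counts must agree: "*" never spans more than one label.
  if (patternLabels.length !== hostLabels.length) {
    return false;
  }
  return patternLabels.every((label, i) => label === "*" || label === hostLabels[i]);
}

console.log(matchesWildcard("*.google.com", "mail.google.com")); // true
console.log(matchesWildcard("*.google.com", "www.example.com")); // false

// A client that honoured *.*.com would trust an attacker posing as
// almost any site under .com:
console.log(matchesWildcard("*.*.com", "www.yourbank.com")); // true
```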

In the last week, The Register has reported that researchers have discovered a serious weakness affecting virtually all websites protected by the Secure Sockets Layer protocol, one which allows attackers to silently decrypt data passing between a web server and an end user’s browser.

If true, the underlying vulnerability is present in virtually all applications that use TLS 1.0, making it possible to apply the technique to instant messaging and virtual private networking programs.

Although TLS 1.1 has been available since 2006 and isn’t susceptible, virtually all SSL connections rely on the vulnerable TLS 1.0, as The Register reports:
“While both Mozilla and the volunteers maintaining OpenSSL have yet to implement TLS 1.2 at all, Microsoft has performed only slightly better. Secure TLS versions are available in its Internet Explorer browser and IIS webserver, but not by default. Opera remains the only browser that deploys TLS 1.2 by default.”

As also reported by The Register, this handy chart explains all:

Support for TLS 1.1 and 1.2 is virtually non-existent, Qualys Director of Engineering Ivan Ristic says via The Register

As reported: “What prevents people is that there are too many websites and browsers out there that support only SSL 3.0 and TLS 1.0. If somebody switches his websites completely over to 1.1 or 1.2, he loses a significant part of his customers and vice versa.”
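
If you want to know where your own servers stand, the negotiated protocol version is straightforward to inspect. Here’s a small sketch, assuming a reasonably recent Node.js runtime and its built-in tls module; the hostname is a placeholder:

```typescript
import * as tls from "tls";

// Connect to a server and report which TLS protocol version was negotiated,
// e.g. "TLSv1" or "TLSv1.2". Servers still answering with TLS 1.0 are the
// ones exposed to the attack described above.
function reportNegotiatedProtocol(host: string, port = 443): void {
  const socket = tls.connect({ host, port, servername: host }, () => {
    console.log(`${host} negotiated ${socket.getProtocol()}`);
    socket.end();
  });
  socket.on("error", (err) => console.error(`${host}: ${err.message}`));
}

reportNegotiatedProtocol("www.example.com");
```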

Schools to teach coding at GCSE level

As an aside from the usual discussions, there is an interesting article on The Register discussing the trial of teaching programming to GCSE students.

Having worked in a number of schools, I’ve seen a drift away from the early days of teaching computing. When I was at primary school, only a handful of BBC Micro computers were available – along with the Logo ‘turtle’, which we had to take turns using, often waiting months.

Obviously we’ve moved forward in terms of processing power and availability – it seems almost ridiculous that one computer was often shared between an entire school, and only offered a 2MHz CPU with a maximum of 64KB of RAM. Yet even though the technology has improved, we have radically drifted away from an understanding of how computers work.

As an aside, some things don’t change: apparently I was often helping staff use these ‘microcomputers’ even in early school. I’d been lucky enough to be taught by my family, from a ridiculously early age, not only how to use computers but also the importance of logic, and I’m sure this early start is why I was programming before my teens and have such a keen interest now.

Programming is generally not taught at GCSE level any more; instead there is “Information Technology”. This essentially means Microsoft-based products and office suites, and rarely stretches beyond the worst of the worst – mail merging and databases. The first chance many students will have to program will be at college, should they opt to take it.

…and, according to The Register, “take-up of IT qualifications has fallen in the past five years: a staggering 57 per cent decline between 2005 and 2010.”

This leaves a generation who find it more difficult to fully understand how computers function, and also why software can be confusing. Naturally all developers vary in their strengths and talents, and this affects the resulting software – but the teaching of logic isn’t beneficial only to programming.

Thankfully, as The Register reports, a new two-term trial involving 100 students is under way, and may be rolled out further:

Launching the “Behind the Screen” scheme, science minister David Willetts told the British Science Festival in Bradford yesterday that the idea has been in development since 2010.

Willetts said: “[It] will transform the IT curriculum away from computer literacy, which we believe many young people can do earlier, towards instead how they develop software and computational principles; how they can create their own programs.”

The schools chosen are Manchester Grammar, Bradfield College, Reading, Park House School, Newbury, and Townley Grammar in Bexleyheath, Kent.

Google’s Eric Schmidt has been one of the most notable recent critics of the current curriculum, attacking it for teaching students how to use software rather than teaching logic:

“I was flabbergasted to learn that today computer science isn’t even taught as a standard in UK schools. Your IT curriculum focuses on teaching how to use software, but gives no insight into how it’s made. That is just throwing away your great computer heritage.”

“Over the past century the UK has stopped nurturing its polymaths. There’s been a drift to the humanities … engineering and science aren’t championed.”

“In the 1980s the BBC not only broadcast programming for kids about coding, but (in partnership with Acorn) shipped over a million BBC Micro computers into homes and schools. That was a fabulous initiative, but it’s long gone.”

We welcome these developments, and await the results of the trial with nostalgia and interest.

Microsoft Windows 8 Developer Edition

Without wanting to turn into a review site, we’ve taken a look at Microsoft’s new operating system, Windows 8, and we’re genuinely impressed. The developer preview is available to all and is valid until March 2012; you can download it directly from Microsoft.

It’s not recommended that this is installed on, or upgraded over, a live machine (and always make backups); it’s likely to be almost impossible to downgrade. We swapped to a new, blank hard drive on an AM3 quad-core with 3GB RAM. The minimum system requirements are a 1GHz CPU, 1GB RAM and 16GB of hard drive space.

After a very familiar installation screen, almost identical to Windows 7’s, the user is presented with a simple black screen with white text showing the current action and percentage – no progress bars here. The new font is clean and crisp; although some have criticised its simplicity, it’s fast and easy to read.

The Metro interface is generally smooth and fast, with animations reminiscent of jQuery – and it may well use something similar under the hood. Under high load it can stutter, but not so badly as to make it unusable. There are plenty of Metro tiles included, and the larger download bundles Visual Studio Express so that developers can make their own. We haven’t had time to play with this yet, but hope to soon.

Microsoft have taken heed of the problems with Vista, and stuck to improving on the Windows 7 experience for the home user. Behind the new glossy Metro screen is the familiar Windows desktop, with a few subtle changes in Explorer.

Windows 8 Metro interface

The Explorer interface has had a face-lift, as previously discussed – and thankfully Microsoft have taken notice of the complaints and allowed the Office-style Ribbon to be minimised, as shown here:

Yes, the new Explorer 'Ribbon' is collapsible

The new Metro interface presents an unusual learning curve – many of its screens, particularly those for changing settings, are slightly different, counter-intuitive or cause outright problems. One issue we’ve found is with Remote Desktop: it’s difficult to use the Start button, since the new ‘Metro’ menu pops up whenever the mouse moves to the bottom-left corner.

Metro also offers a web browser in the form of Internet Explorer 9; however, since it runs as a Metro tile, it will be interesting to see whether other web browsers can integrate as well into the new interface. Whether they can or not, it may be one of Microsoft’s ways of trying to bring confidence back to IE.

There seem to be few other changes, at least in this release. The new ‘Metro’ interface is undoubtedly going to be controversial, and we’re concerned about the reliance on Internet Explorer, but Windows 8 certainly marks a new direction for Microsoft.

Five Nines availability – you have 5 minutes 16 seconds.

The Register has today reported that the ASA are pursuing Microsoft on their claim of 99.9% reliability:

The Advertising Standards Agency (ASA) is checking out a complaint about claims from Microsoft that it can guarantee 99.9 per cent uptime on its cloud services.

The Business Productivity Online Suite (BPOS) has been prone to outages. And even its successor, Office 365, has gone down twice since its launch in late June, leading some customers to dub it “Office 364”.

While Microsoft is not the first company to claim ridiculous reliability, it’s frustrating to see yet another company guaranteeing uptime that it cannot deliver.

Of course, your website needs the best possible uptime and reliability for your customers; however, there are some important factors to consider:

  • 100% uptime is impossible to guarantee; there are simply too many factors
  • scheduled downtime can be essential to ensure the security and speed of your site
  • most companies that claim 99.9% or better uptime have probably never achieved it, and will instead pay compensation as a proportion of hosting fees for the time lost

Often, if your site is unavailable under these ‘guarantees’, you must notice the outage yourself and claim the compensation, sometimes even providing proof. Usually the amount refunded in hosting fees for a few hours’ downtime will not compare to the sales and customer confidence you may lose, especially at peak times.

We’ve worked with companies before that have claimed ‘five nines’ availability – one notably used it as part of its identity, and ironically suffered almost predictable downtime every week. So what is five nines? It’s commonly taken to mean 99.999% availability, i.e. less than 5.26 minutes of downtime per year. Here’s a short chart, shamelessly taken from Wikipedia’s entry on High Availability:

Availability %          | Downtime per year | Downtime per 30 days | Downtime per week
99% (“two nines”)       | 3.65 days         | 7.20 hours           | 1.68 hours
99.5%                   | 1.83 days         | 3.60 hours           | 50.4 minutes
99.8%                   | 17.52 hours       | 86.23 minutes        | 20.16 minutes
99.9% (“three nines”)   | 8.76 hours        | 43.2 minutes         | 10.1 minutes
99.95%                  | 4.38 hours        | 21.56 minutes        | 5.04 minutes
99.99% (“four nines”)   | 52.56 minutes     | 4.32 minutes         | 1.01 minutes
99.999% (“five nines”)  | 5.26 minutes      | 25.9 seconds         | 6.05 seconds
99.9999% (“six nines”)  | 31.5 seconds      | 2.59 seconds         | 0.605 seconds
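
The arithmetic behind these figures is straightforward: the permitted downtime is simply the unavailability fraction multiplied by the length of the period. A quick sketch:

```typescript
// Allowed downtime, in seconds, for a given availability percentage over a
// period measured in days.
function allowedDowntimeSeconds(availabilityPercent: number, periodDays: number): number {
  const unavailability = 1 - availabilityPercent / 100;
  return unavailability * periodDays * 24 * 60 * 60;
}

// "Three nines" over a year: roughly 8.76 hours.
console.log((allowedDowntimeSeconds(99.9, 365) / 3600).toFixed(2));

// "Five nines" over a year: roughly 5.26 minutes.
console.log((allowedDowntimeSeconds(99.999, 365) / 60).toFixed(2));
```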


However, like Microsoft, many of these sites can’t achieve anything like this uptime – often quietly ignoring their ‘goal’, or occasionally opting to pay compensation. A few hours’ downtime is usually worth only a few quid in refunded hosting fees, when in reality it could mean hundreds or thousands of pounds in lost sales and confidence.

It’s clear that even with cloud computing, some downtime does happen. Both Google Docs and Apple’s iTunes have suffered unexpected downtime, and despite Facebook’s geographically distributed servers, they’ve struggled at times too. Even the financial industry, arguably the most experienced when it comes to reliability, has suffered downtime in the past.

Naturally, new techniques have helped massively with reliability, but the general (mis)use of the term is a very poor indicator. If you see a company advertising or guaranteeing uptime, remember: the compensation paid out by a poor service for downtime does not compare to the sales that would have been made on reliable hosting.

While it is possible to achieve incredibly high reliability, doing so requires expertise – not compensation payouts and outdated software.

P.S. Our uptime for the previous 12 months has been ~99.972%, with almost all of our downtime due to scheduled maintenance and upgrades.

eMail: web@frag.co.uk
Phone: 07739 315821
