Web Accessibility Toolbar 2.0 is Now Available

The Paciello Group has announced that the Web Accessibility Toolbar Version 2.0 Beta for Internet Explorer is now available. This tool checks some of the accessibility features of your Web site, much like the Web Developer Toolbar for Firefox by Chris Pederick. While I can’t vouch for the tool myself because I’m a Mac OS X and Firefox user, I have heard it’s great. Check it out and report back to me.

The W3C CSS Working Group Joins the Blogosphere

I am very pleased to see that the World Wide Web Consortium‘s Cascading Style Sheet Working Group has decided to join the blogosphere. Hopefully they plan on really using the medium for conversation and transparency, and the blog won’t go dormant after the first post.

I think all W3C Working Groups should have blogs and update them regularly.  What do you think?

WCAG 2.0: Add Captions to Your Online Video

I recently read some obscene statistic about the huge amount of video being uploaded to the Web every day. It’s probably a safe bet that the majority of that online video doesn’t have any captioning. This is a big problem for people who are deaf or hard of hearing and are trying to understand the message of your video. According to Gallaudet University, about 8.6% of the American population, or more than 20 million people, have some form of hearing problem.

Captioning takes time, and it’s not easy. I wish there were a magic button you could press to make captions magically appear on the videos you were making.

Regardless, the W3C’s Web Content Accessibility Guidelines (WCAG) 2.0 include Success Criterion 1.2.1, which says:

1.2.1 Captions (Prerecorded): Captions are provided for prerecorded multimedia, except for multimedia alternatives to text that are clearly labeled as such.

Well, authoring tool vendors and developers have responded to our call for better tools.

The latest version of Adobe Flash CS3 has integrated captioning functionality. According to an Adobe accessibility engineer, delivering captioning in Flash is now “really easy.” While I haven’t seen this at work yet, I’m pretty excited that Adobe has made this a priority.

There is also MAGpie, the free open-source tool from WGBH’s National Center for Accessible Media. If you already have the transcript for your video, you can quickly turn it into the XML file format you need to caption your online video. I have seen it in action. It’s not super seamless, but it gets the job done.
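For reference, the caption files these tools trade in are simple timed-text XML. Here is a rough sketch of a W3C Timed Text (DFXP) caption file, the kind a Flash captioning component can consume; the namespace and the timing values are illustrative from the W3C draft, and what your version of MAGpie actually emits may differ:

```xml
<tt xml:lang="en" xmlns="http://www.w3.org/2006/04/ttaf1">
  <body>
    <div>
      <!-- each <p> pairs a stretch of the transcript with begin/end times -->
      <p begin="00:00:01.00" end="00:00:04.50">Welcome to the video.</p>
      <p begin="00:00:04.50" end="00:00:08.00">Captions make it accessible.</p>
    </div>
  </body>
</tt>
```

This is why having the transcript up front matters so much: the hard part is the timing, not the text.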

The US Library of Congress has started to integrate the use of MAGpie and Flash video to provide captioning for some of their videos.  Check out the videos for the MacDowell Exhibit. (Full Disclosure: With my government contracting job, I work at the Library of Congress full time.)

One of the most interesting tools I have seen is dotSub. You can submit your video to the service, and then you or any of the members of the service can transcribe and caption it. Once the initial captioning is done, the captions can be translated into many languages. This is all done through the wisdom and knowledge of the community.

Lee Lefever did it with his Wikis In Plain English video, and dotSub really worked for him. Not only was he able to get his video transcribed and captioned in English; it was also subtitled into a dozen other languages. His video is now accessible to people with auditory disabilities where it wasn’t before.

Angela’s Thoughts on Facebook Being Unfriendly to a Global Audience

Angela Randall over at AllFacebook.com wrote a good post about her frustrations with Facebook not being designed better for a global audience.

She makes some great points like:

1. Seasons. If I write that I took a course with another Australian in Spring 2006 (ie: Sept, Oct & Nov 2006) that is going to mean an entirely different time of the year to an American Spring. Not to mention that not many countries outside the US use the term “Fall”. Why not just say the months? Then it’s the same for everyone.

I thought her post was appropriate considering my last post.

Is Flickr Just the Start of More Translated and Localized Web 2.0 Apps?

Flickr UI Translated
My Flickr account translated into Chinese (I think?)

If you haven’t already heard, this week Flickr released seven localized versions of their user interface. It is now available in French, German, Italian, Spanish, Korean, Traditional Chinese, and Portuguese. This is exciting news.

While Flickr may have some more issues to work out, I have a feeling this will play a huge role in Flickr more effectively attracting a much bigger global audience.

Yahoo! VP Bradley Horowitz (the man the Flickr team reports to), referring to the recent announcement, wrote on his blog, “Flickr is stupid, and late… but redeems itself.” I think we can all learn a lesson from this.

Isn’t it time that all of the popular Web 2.0 applications start moving in the direction of translating and localizing their interfaces? We should be building our applications from the beginning with the understanding that at some point we will be localizing the UI.

It is a WORLD WIDE Web. It won’t take long for a Web app to get a world-wide audience.

When will Digg or Facebook follow Flickr’s lead? Back in November 2006, there was a post on the Digg blog about how they were internationalizing their databases by moving to UTF-8, but there has been no sign of Digg taking any big next steps since.

Yahoo! Continues Web Accessibility Video Series with Karo Caran Introducing Screen Magnification Software

Last month, Yahoo! posted a great introductory video on Web accessibility and screen readers with Victor Tsaran. Well, they recently posted the next video in what is becoming a series: an introduction to Web accessibility and screen magnification software with Karo Caran. If you have never seen screen magnification software in action before, you really need to check out the video.

Corecomm Web Hosting Services Were Down For 16+ Hours

I’ve aired my dissatisfaction with Corecomm Web Hosting in the past. Well, from early last night to early this afternoon, one of the Web hosting accounts I have with them was down, roughly 16+ hours of downtime. Is it just me, or is that ridiculous?

Short of the Web server room burning down and them having to re-key all the HTML by hand, what would make an outage of that length acceptable? We live in an age where people’s livelihoods are derived from their Web presences. What would happen to them if their sites were down for over 16 hours?

Well, as it happens, we are already in the process of transferring that hosting account to Great Lakes Comnet. Goodbye, Corecomm!

WCAG 2.0: Well-Formed (X)HTML is a Criterion for Success

Every Web standardista should be happy to hear that well-formed (X)HTML is a requirement at Level A in the W3C‘s latest draft of the Web Content Accessibility Guidelines (WCAG) 2.0. Success Criterion 4.1.1 says:

4.1.1 Parsing: Content implemented using markup languages has elements with complete start and end tags, except as allowed by their specifications, and are nested according to their specifications. (Level A)

This means that you have to follow the rules when writing the markup for your Web site. There is more than one possible technique for fulfilling this success criterion; you don’t have to do them all, just one.

One option, as noted in the first technique listed, is to have valid HTML. You should be able to go to the W3C validator and get the big thumbs-up.

Probably the best option, as noted in the second technique listed, is to write your HTML according to the specification. This is more than just well-formed markup; it means meaningful, semantic markup, as the specification defines it.

The final option, the fallback option, is just having well-formed markup, as noted in the last two techniques. Your HTML tags should nest properly, and every open tag that needs a closing tag should have one.
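To make the “well-formed” idea concrete, here is a small sketch of a nesting check built on Python’s standard html.parser; this is a hypothetical helper of my own, not one of the WCAG techniques, and it is far looser than the W3C validator:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag in HTML.
VOID_ELEMENTS = {"area", "base", "br", "col", "embed", "hr", "img",
                 "input", "link", "meta", "param", "source", "track", "wbr"}

class NestingChecker(HTMLParser):
    """Tracks open tags on a stack and flags end tags that don't match."""

    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_ELEMENTS:
            self.stack.append(tag)

    def handle_startendtag(self, tag, attrs):
        pass  # self-closing tags like <br/> are fine as-is

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def unclosed(self):
        return list(self.stack)

good = NestingChecker()
good.feed("<div><p>Properly nested.</p></div>")
print(good.errors, good.unclosed())  # [] []

bad = NestingChecker()
bad.feed("<div><p>Improperly nested.</div></p>")
print(bad.errors)  # ['unexpected </div>']
```

Even a crude check like this catches the overlapping-tags mistakes that the last two techniques are about.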

I’m guessing this final option is there for those who don’t want to lose their accessibility conformance because an errant, unescaped ampersand shows up somewhere (having worked on a large organization’s Web team, I can say this happens often).

I’m good with these options. In the end, we are requiring people to use well-formed markup, which is a big part of the battle against tag soup.

What do you think?

One accessibility expert wrote in 2006 that…

Even if valid HTML everywhere all the time is unattainable, the fact remains that, in 2006, we have never had more developers who understand the concept and are trying to make it real on their own sites. WCAG 2 undoes a requirement that, were it retained, could be perfectly timed now.

Please share your thoughts.

CNN.com Team, Stop Lurking and Join the Conversation

A referrer from CNN.com internal docs to my blog

(Above is a screen capture from my WordPress.com referrers.)

Yesterday, I wrote a post about how I thought it was unprofessional of the CNN.com Web team to use presentational HTML tables and invalid markup on their beta redesign site.

Today I got a few visitors who were referred from:


The page doesn’t load for me. If I had to guess, it’s an internal wiki where Turner Broadcasting is documenting all of the blogs talking about the beta redesign. I’m tickled that they find it important to read blogs and find out what we think, but…

Instead of the Web development folks at Turner Broadcasting lurking in the background and just reading the post, why don’t you post a comment and join the conversation?

Explain to me and the rest of us Web standardistas the business justification for publishing a site that doesn’t conform to the best practices (valid HTML for content, CSS alone for design) that everyone in the Web industry agrees are in everyone’s best interest.