KARL GROVES: My turn. So this presentation is called The New Hotness: How We Use All These Things to Make an Accessibility Testing API.
[Slide: old woodworking tools, Scotland]

So first, I'm a Viking. So I want everybody to do their best Viking yell -- one, two, three, ah! All right, good.
My slides can be found at new-hotness.karlgroves-sandbox.com. I'm an accessibility consultant for a company called The Paciello Group. The Paciello Group is located in New Hampshire. That's not where I'm located -- I'm located here with you guys. I actually live in Baltimore. We have people who work at The Paciello Group who are in France, Ireland, Scotland, the UK, Australia, and all over the place. Anyway, I'm also the co-founder and managing member of a company called Tenon.io. We'll talk a little bit about that today. You can follow me on Twitter @karlgroves, and you can go to my website and read all sorts of stuff about accessibility at karlgroves.com. Can somebody get the door?

So we're going to talk about the modern tool sets, the modern workflows of web developers these days. I started working on my first website back in the '90s, actually when Mosaic was the dominant browser. And the tools today compared to the stuff then -- it's absolutely incredible. And we're going to be talking about all sorts of things that can make our workflows a lot more efficient and increase the quality.
In the accessibility world, talking about tools is really bad. And it's because a lot of the anti-tool atmosphere came from some stuff I'll talk about a little bit later in the presentation. But for me, I like to think of two things as being contributors to the human race's evolution to the dominant species on the planet: the creation of tools and the discovery of fire, right? With fire, we could then increase our intake of protein. It made the actual protein easier to eat when we cooked it. And the tools that we invented -- the very first tools that anthropologists have discovered were primarily, they think, used for scraping the fur off the animals that they killed so that they could cook them. So tools are really great, right? Everything that surrounds us right now was created by tools. I mean, I don't see any trees. And the very thing that we're talking about, the web, was created by tools. The evolution of tools is why the web even exists. Tools are everywhere. Tools are great, and we're going to use them.

So I'm going to talk about stuff like project boilerplates and scaffolding. I'll talk about templating.
I'll talk about server-side and client-side frameworks, version control, unit testing, task automation, and automated builds and deploys. Obviously, I'm not going to talk about all of these things in the hour that we have with each other. There's not going to be a ton of detail here, but we are going to cover them.

So one of the very first and most easy-to-understand tools that we have available to us now are things like boilerplates. We saw some of this stuff really early on with templates. And now you just go ahead and grab your HTML5 Boilerplate at html5boilerplate.com. You can go there, you can download it, or you can even create a custom build. And if you create your custom build, you'll be able to include stuff like the latest version of jQuery and a standard favicon and all sorts of other stuff. As a matter of fact, there are also lots of additional resources that you can get to when you go there. And this would be relevant to the guy who I was talking to in the front here -- there are some additional items here you can grab. You can actually grab some of the other-- anyway, these guys include stuff like Apache config files that you can download as part of their work, and nginx config files and stuff like that.

And then, of course, there's the even better project scaffolding tool called Yeoman. Yeoman is phenomenal. I'll touch a little bit more on Yeoman later in the context of Grunt and Gulp and stuff. But Yeoman -- you can grab Yeoman generators for all sorts of projects.
So for instance, the guy that was just speaking in this room was talking about Chromecast. You can scaffold out a Chrome dev tools extension using Yeoman, because somebody's already done a lot of the work to create the backbone of the stuff.

We have templating stuff. I mean, if anybody here writes PHP, Smarty's been around forever. We have Handlebars and Mustache, Pure templates -- has anybody used any of these? Pure templates are cool because they're like JSON, but they don't have any of the logic built into them from stuff like Handlebars. And then, for the Ruby folks here, there's Haml. And we have libraries like jQuery and Lodash. We make a ton of use of Lodash at Tenon. Lodash is great because it's got a lot of utility functionality that's already built for you. Then we have frameworks like Backbone and Knockout. These give us a great framework for starting an MVC- or MVVM-type project. Everybody's writing notes -- I've got the slides up, but that's OK.

And then version control. I didn't link these two.
If you've used them, you'll know why.

[LAUGHTER]

And then, of course, there's Git. And that can help you do version control. And we didn't have this stuff when I-- when I started working on web stuff, we had CVS, which-- I'm happy to have survived that era in my life without jumping out a window.

Another thing that the web now has that we didn't have when I first started, that regular software devs had, was unit testing. And so now we have all sorts of awesome stuff like QUnit, and Jasmine and Mocha and Chai and PHPUnit and JUnit and Behat and nodeunit. And these are all really awesome unit testing frameworks that you're an idiot not to use. I'm going to demonstrate some of these later, so don't worry about writing everything down.
And then there are also all sorts of products out there that can let us do automated build and deploy of our projects. So we can run something like Travis CI, Bamboo, or Wercker to automatically build and deploy our stuff. So if you're using FTP for, well, anything, you really shouldn't be, because these tools exist out here to do all sorts of great stuff to make sure that before you launch, you're not launching broken stuff.

And then there's task automation like Grunt and Gulp. And, fortunately, these things all exist and are great for front-end developers. And the accessibility community is totally and completely left behind. They've left themselves behind, in my opinion, because they don't take advantage of some of this stuff. And I've been talking about this for a while, mainly in terms of the fact that we aren't taking advantage of the increased level of efficiency that we get. So for instance, with Gulp and Grunt, I can do all sorts of great stuff like automatically minifying my JavaScript and my CSS files. I can compile my Sass files. I can lint my code. I can even do my unit testing directly from a task in Grunt.
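[Editor's note: here's a minimal sketch of what that kind of Grunt setup might look like. The plug-in names are the usual grunt-contrib packages, and the file paths are just placeholders; this is an illustration, not a configuration shown in the talk.]

```javascript
// Gruntfile.js -- a minimal sketch of the workflow described above.
// Plug-in names are the standard grunt-contrib packages; paths are placeholders.
module.exports = function (grunt) {
  grunt.initConfig({
    jshint: {
      all: ['src/js/**/*.js']            // lint my code
    },
    uglify: {
      dist: {
        files: { 'dist/app.min.js': ['src/js/**/*.js'] }     // minify my JavaScript
      }
    },
    sass: {
      dist: {
        files: { 'dist/styles.css': 'src/sass/styles.scss' } // compile my Sass
      }
    },
    mochaTest: {
      test: { src: ['test/**/*.js'] }    // run unit tests as a task
    }
  });

  grunt.loadNpmTasks('grunt-contrib-jshint');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-sass');
  grunt.loadNpmTasks('grunt-mocha-test');

  grunt.registerTask('default', ['jshint', 'sass', 'uglify', 'mochaTest']);
};
```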
And so the accessibility people, they just want to come along behind and try to make it look like they're doing something great and efficient, when they're not. This, for those of you who can't see, is a picture of a cardboard cutout of a Ferrari Testarossa sitting on top of something else. So it's separate and unequal. The methodologies behind accessibility testing these days are really nowhere near on par with regular software unit testing and acceptance testing methodologies.

I want to give you some history as to why this is. So when we talk about accessibility, and we talk about some of the challenges people face, the challenges that people face from an accessibility standpoint are based upon the UI and the objects that are presented on screen in the UI and the relationships between those things. Back in the day, a long time ago, when people were just interacting with the terminal, creating a tool that would read out the content on the screen was really trivial. You just take the information that's going to standard output, and you shoot it through the speakers instead of the screen. But when Windows came around -- and especially with Windows 95 and the real huge proliferation of personal computers that was spawned by Windows 95 -- now we had a GUI to interact with. And now people who couldn't see it, who couldn't see where the stuff was on the screen, had a huge problem. What the hell am I supposed to click on? What is this thing that I'm clicking on? And how am I supposed to interact with it?
It's a completely different paradigm. And that was made even worse for users -- those users were left even further behind when we started to see the web get really, really popular. Now everything was on the web, and now we really had a problem. And so accessibility advocates were there in the early days of the web. People like my boss, Mike Paciello, were there to try to say, OK, wait, we've got to figure this out. We've got to figure out how to make this stuff accessible. And so we did start seeing automated testing tools out there -- there's one called Bobby.

And Bobby actually prevented me from getting hired, so I hate Bobby. Because I was the genius who decided I wanted to be a web developer back in the late '90s in the DC area. And I was like, I'm going to change from being a recreational web developer to a professional, and I'm going to apply to all these jobs. And Section 508 came out, when? Does anybody know?

AUDIENCE: 1998.

KARL GROVES: When was I trying to get a job in the web? 1998. And where was I? I was in the DC area. So I'm applying for jobs at government agencies, at what we'd call Beltway bandits, which are Raytheon and Lockheed Martin and stuff. And I'd be like, here's-- everybody wanted to see a sample of your work. So I'd send over a sample of my work, and they'd be like, sorry, this didn't pass the Bobby check. I'm like, what the hell is Bobby? And I started hearing this so often, I figured out what Bobby was. And it was this fricking tool that analyzed my code to see whether it was accessible or not. And that's actually how I got into accessibility -- because I got obsessed with this. I was pissed because this stupid tool wasn't helping me get a job.
So for those who can't see the screen, this is a picture -- I'm dating myself. I'm dating a bunch of you guys, too. It's a screen capture from History of the World, Part I, and it's the cavemen when they're discovering fire. Doesn't this guy like light himself on fire or something? Yeah?

AUDIENCE: Your picture actually depicts prehistory.

KARL GROVES: Prehistory.

AUDIENCE: And you probably are talking about prehistory, because very few of the artifacts from that era can be used online [INAUDIBLE].

KARL GROVES: Yeah. Yeah, and so Bobby was created by-- I forget who created it. But they were bought by a company called Watchfire, who was then bought by IBM. And so it became part of an IBM Rational tool set and all this sort of stuff.
But Bobby was an automated testing tool. And accessibility people, like the real accessibility people, hated Bobby for a couple of reasons. One was that it sent forth a lot of these things called false positives. Accessibility tool vendors tend to have this kitchen-sink mentality where they want to try to find every fricking thing that they possibly can, even stuff that may not actually be a problem. And so you get a lot of vitriol on mailing lists. If you go back to the early 2000s on the WebAIM list, you'll see people -- you mention Bobby, like a new person to accessibility mentions Bobby, and then it's like a fricking rugby team just jumps on top of him, going, automated tools are bad. And then the other person runs away.

After tools like that, we saw browser plug-ins and toolbars, like WAVE, or the functional accessibility tools, or the Firefox accessibility toolbar. And then we saw online checkers like Cynthia Says, FAE, and WAVE, which also joined the Bobby thing. And then we started seeing what I call the monoliths. So not long after the web started really taking off did we start seeing companies that started creating web-based applications. I mean, that's why companies like Oracle even exist.
Because they were making web-based applications. And accessibility tool vendors are no different. Some of the early accessibility tool vendors like Deque and SSB BART Group created desktop applications. HiSoftware did, too. So AccVerify was a desktop application -- AccMonitor. Deque had Ramp Ascend. SSB BART Group had InFocus. And then, of course, the sensibility of making a web-based application struck everybody: now I can just create one version and deploy it on the web and all that stuff. And they decided that they wanted to be the be-all and end-all of everything accessibility. So they would feature a spider that could crawl your site, and each page would get tested along the way. And they had their own issue tracking system. And SSB, HiSoftware, and Deque, all three of them, tried to create features that would automatically fix your code. And like I said, they wanted to be everything.

But the thing that they ended up doing was twofold. First, they created this separation between accessibility and everything else. So accessibility was now done on a completely different tool set. So when we started seeing mature issue tracking systems, like HP Quality Center and all these others -- all the Atlassian products, JIRA, and all this sort of stuff -- you'd have your issue tracking for everything else in QC and JIRA, and then you'd have your extra, separate system out here for accessibility. And I would argue that made it easy to marginalize
accessibility. And there's another huge, massive problem that happened with these tools that I haven't yet discussed. And that is that the false positive problem is exacerbated by the fact that they're not testing for accessibility -- they're doing linting. All these tools did static analysis on the source that you would see if you viewed source on a page, not the DOM.

So Section 508 happened to come out in 1998, but Section 508 and the work on WCAG 1.0 were started much earlier in the '90s. And there was an admonition in 508 that basically says, you can't use JavaScript. And that's not really what it says, but that's what it means. And this was before the days of DOM rendering in assistive technologies. And so they were just like, fuck it, don't use JavaScript. And that was the admonition. But when we started seeing DOM rendering in assistive technologies, that was no longer a problem, really. I mean, obviously, it's a problem if you write shitty JavaScript. But it's not now -- you can't just say, don't use JavaScript. Because the fact is, everybody uses JavaScript. JavaScript is awesome, and you should be allowed to use it.

But now how do you test for it? So before, we could say, oh, well, you're not supposed to be using JavaScript anyway, so we'll test you -- do this static analysis. But what if you're trying to now test the JavaScript to make sure the JavaScript is accessible, or at least the results of the JavaScript are accessible? So this is why we started to need
headless browsers and their tools. And so this is my picture of the Headless Horseman. Some of the earliest headless browsers that existed were Rhino -- and I experimented a little bit with accessibility testing in Rhino, and my brain couldn't understand [? zeuhl ?] at the time. But then I came across this project from MIT called SIMILE, and SIMILE was basically a headless browser that had an API to it. And SIMILE was -- I think it was actually Rhino. But it was an MIT project that is now abandonware. But I think somebody else is using it to create web scrapers and do data mining. And then not too long after that -- well, probably, I don't know, three or four years ago -- we saw Zombie. Anybody ever use Zombie.js? Zombie's really cool.
I tried using Zombie, and I found it to be really, really, really brittle -- even worse than Phantom. But Zombie was really brittle, and it just didn't do what I wanted. And then we saw PhantomJS. And so PhantomJS and Node and Grunt are the trifecta of awesome. Because everything that runs, like, unit testing these days -- if you're using unit testing, or especially if you're doing BDD-style acceptance testing, it's probably going to use PhantomJS. And Phantom is a headless version of WebKit. And as a matter of fact, Tenon uses PhantomJS. PhantomJS is a really huge pain in the ass for a couple of reasons, though, because it does fuck all for exception handling. And so it's one thing when you're testing your own code and Phantom craps all over itself -- no problem. But if you're trying to test the websites of everybody else, then you have a huge problem. But anyway, Phantom's really awesome. SlimerJS uses the same exact API structure as PhantomJS, but it runs a headless version of Firefox.
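[Editor's note: as a rough illustration of why headless browsers matter here -- a tiny PhantomJS script can load a page, let the JavaScript run, and then inspect the rendered DOM rather than the raw source. This is only a sketch; the URL and the check are made up, and it is not how Tenon itself is implemented.]

```javascript
// phantom-check.js -- run with: phantomjs phantom-check.js
// A sketch only: the URL and the "images without alt" check are illustrative.
var page = require('webpage').create();

page.open('http://example.com/', function (status) {
  if (status !== 'success') {
    console.log('Failed to load page');
    phantom.exit(1);
    return;
  }

  // Evaluate against the rendered DOM, after scripts have run,
  // instead of doing static analysis on the raw source.
  var missingAlts = page.evaluate(function () {
    return document.querySelectorAll('img:not([alt])').length;
  });

  console.log('Images missing alt attributes: ' + missingAlts);
  phantom.exit(0);
});
```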
So ultimately, we had these monolithic tools -- these be-all, end-all tools that are out there that want to be our issue tracker and our accessibility testing tool and all that sort of stuff. And that creates this, as I was referring to, this separation between accessibility and everything else. So accessibility is a thing that we do after the fact. Going back to my picture of the cardboard Ferrari Testarossa, that's where accessibility comes in. They're like, OK, all your development is done. Now we're going to poke it with WorldSpace or AMP. Great -- like, what are you going to do? The train on that inaccessible code's already left the station. You're doing accessibility after the fact, and you're not going to get that train to turn around. The track doesn't go in two directions. The train has left, and you're not pulling it back.

So this is why I wanted to talk about doing integration and not separation. When I talk about integration, what I'm really talking about is that I don't want accessibility, or the accessibility tool, to be a completely separate tool. So if you've chosen your unit test framework to be QUnit, then I want you to be able to use QUnit for all your testing. Or if it's Mocha or whatever -- or as a matter of fact, I like to use Karma, which lets me use Mocha and Chai and all that. But I want it to be, whatever you're doing,
I want to be part of that. Whatever you're using for any of that stuff, I also want to get into the development cycle early. If you read about accessibility and project management, you'll hear people talk about, well, we need accessibility touchpoints. And we want our touchpoints in the procurement phase, and we want our touchpoints in the requirements phase, and we want our touchpoints in the design phase and all that other stuff. And that's great, because in effective project management, that's what you want, especially if you're waterfall. But the problem is that, especially in that waterfall model, you have this touchpoint at the end of each milestone. So great -- so now what? We're at the end of this milestone, we want to get to our next milestone. Now what do we do if we've discovered everything's inaccessible?

Or even in an agile workshop, what you're talking about is you're going to have the QA testers do the accessibility testing after the code's checked in. The last part of our sprint is having the QA people poke at it, or something like that. No! In agile, we want people testing their own work. So our definition of done is all tests passed and all code checked in. So where do we put the accessibility? When the developer's doing the testing. So we want to become part of the existing processes, whatever that looks like. And the thing is, with an external accessibility testing tool, what ends up happening is that it's, oh, shit, we don't have time for that. We're going to do that later, or something like that. And so it becomes a separate process, or an additional step in the process, and not part of the process. So we want to work within any existing automation, any existing test methodologies, any unit testing frameworks. If we're doing BDD, if we're doing Selenium testing on nightly builds, whatever that looks like, we want to be able to do that. So that's why I came up with Tenon.
I wanted a woodworking image, but then I saw this. Does anybody know what this is?

AUDIENCE: [INAUDIBLE]

[INTERPOSING VOICES]

KARL GROVES: The Wankel rotary engine. Right.

AUDIENCE: And Mazda [INAUDIBLE]

KARL GROVES: Huh?

AUDIENCE: Like the Mazda [INAUDIBLE].

KARL GROVES: Probably before my time, Cliff, sorry. But I chose the Wankel image because of the fact that it does the exact same thing. We have our four-cycle ignition process, only no pistons. We have this big rotor-crazy thing. And as it turns, it compresses, and then the compressed gas is fired, and then it's ejected. And I forgot about the intake cycle, but whatever.
And so that's what I like about the image. It's talking about doing the exact same thing, just in a different way. So here's an example of what Tenon does. And as a matter of fact, I'm going to skip the slides now and just go into what we do.

So Tenon is an API. At its root, it's actually got a command-line interface. And I can go tenon--

AUDIENCE: Bigger font?

KARL GROVES: Bigger font. Tenon, HTTP, Amazon. And then the Tenon service grabs it -- I pass a bunch of options, I inject a bunch of descriptions into the page, and I get a JSON response, which is really terribly useless when you're looking at the command line, because that's like, great, now what do I do? It tells you how many errors and all that stuff that you have. But the possibilities here are limitless now, because all I'm doing is sending over a request for the page I want to test, and I get the response. Now what do I do with that? Well, I can make a web application out of it. So here's the Tenon public testing page. I can either enter source or I can enter a website URL. By the way, the code for this demo is on Bitbucket in an open repository. It's built using React.js, and so you can download it and play with it. So if you ever wanted to see a React.js application, that's it.
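[Editor's note: to make the "it's just an HTTP request" point concrete, here's a minimal sketch of calling an API like Tenon's from Node. The endpoint, the parameter names (key, url, viewPortWidth, viewPortHeight), and the shape of the response summary are assumptions based on what's described in the talk, not a copy of Tenon's documented interface.]

```javascript
// tenon-request.js -- a sketch of POSTing a page to a Tenon-like API.
// Endpoint, parameter names, and response fields are assumptions for illustration.
var request = require('request'); // npm install request

request.post({
  url: 'https://tenon.io/api/',        // assumed endpoint
  form: {
    key: 'YOUR_API_KEY',               // placeholder
    url: 'http://www.example.com/',    // page to test (or send `src` with raw markup)
    viewPortWidth: 1024,               // assumed viewport parameters,
    viewPortHeight: 768                // as used in the breakpoint demo below
  },
  json: true
}, function (err, res, body) {
  if (err) { return console.error(err); }
  // The JSON response includes per-issue results plus summary counts.
  console.log('Total errors:', body.resultSummary.issues.totalErrors);
  console.log('Total warnings:', body.resultSummary.issues.totalWarnings);
});
```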
and the json responsetells us all sorts of interesting things,like screen size and so on and so forth. has anyone ever used postman? postman is really cool. so postman allows youto test restful stuff or like api stuff. so here's an example oftesting with postman. now what's coolabout this, though,
is i'm going to show you--first off, that was really fast. i'm going to show you someof the other parameters. so let's say you wanted totest the responsive design. anybody have aresponsive website that they want to share? audience: codinghouse.gov karl groves: what's that? audience: codinghouse.gov. karl groves: codinghouse.
And what are your breakpoints set at? So let's do 300 by 600. So we'll send it over at 300 by 600.

AUDIENCE: [INAUDIBLE] site down.

KARL GROVES: So many problems on the site that we're just waiting and waiting. There we go. Nope. So let's see here. So here's the viewport -- it gets returned to us. And we see how many issues we have. So let's see how many we have. The bottom of the response is going to have our issue counts. Let's see -- status, density, errors -- so we have total errors, 29; total issues, 47; total warnings, 18. So let's go back and test it at 1024 by 768. So now we have our response -- and 29, 47, 18. Let's do -- so this is one, flysfo.com. So this one has 31 and 38. Let me change them to 600 by 300 -- 19 and 18. So something happened in their responsive design -- they basically hid stuff that had accessibility problems on the desktop view. So it's more accessible in mobile. Now some of the other things that we can do is--
AUDIENCE: Now, Karl, that's not necessarily more accessible. It's just fewer problems in the automated testing.

KARL GROVES: Yes, that's true. And that's an important distinction. What Cliff is referring to is that just because you have fewer automatically testable errors does not always correlate to a more accessible experience.

But now there are lots of other opportunities for something like this. So Dirk is going to give a presentation on extending, making stuff in your browser. So I'm going to corner him later and ask him to work with me on a Chrome dev tools extension. But this is a picture of Chris Pederick's Web Developer toolbar. So you can add Tenon to that, and then you click on that, and it'll post right over to the Tenon web service.
So it's integrated into the Web Developer toolbar. Or you can do -- so this is something that I worked on last night. This is called Tenon As You Browse. And let's see. So what I created -- I actually worked on this last night -- is a jQuery plug-in that takes all the API parameters and runs them as a jQuery plug-in. You just put it at the bottom of your page, where your other JavaScript is. And every page that gets accessed by users, as they browse, they're actually testing your site for you, because it submits that POST over to the Tenon API, and it tests it. Now what's really cool is it takes a hash of the page content. It takes the page content, hashes it, and stores that. So let's say you had a really high-traffic website and your home page gets accessed 1,000 times before lunch. It's not going to test that page 1,000 times, because I don't want that on my server, and you don't want that traffic lag. So it hashes it, so it only tests that once. If your page changes, because you fixed it, then it retests it.
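[Editor's note: a rough sketch of how a plug-in like that could work, assuming a hypothetical endpoint and option names. This is not the actual Tenon As You Browse code, and in this sketch the hashing and de-duplication are assumed to happen on the service side.]

```javascript
// tenon-as-you-browse.js -- a sketch only, not the actual plug-in.
// Endpoint, parameter names, and options are hypothetical.
(function ($) {
  $.fn.tenonAsYouBrowse = function (options) {
    var settings = $.extend({
      endpoint: 'https://tenon.io/api/',   // assumed endpoint
      key: '',                             // your API key
      selector: null                       // e.g. '.post' to scope what gets tested
    }, options);

    // Grab either the whole document or just the scoped region.
    var src = settings.selector
      ? $(settings.selector).prop('outerHTML')
      : document.documentElement.outerHTML;

    // Fire-and-forget POST; the service can hash the submitted source
    // so the same content isn't re-tested on every page view.
    $.post(settings.endpoint, {
      key: settings.key,
      src: src
    });

    return this;
  };
}(jQuery));

// Usage, at the bottom of the page where your other JavaScript is:
// $(document).tenonAsYouBrowse({ key: 'YOUR_API_KEY', selector: '.post' });
```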
so simple as that-- asimple jquery plug-in can be used to justto post that request. audience: so howsophisticated is your hash? because if you havedynamic content-- karl groves: sure,yeah, exactly. but what's cool about thatnow is that you can explicitly set the parameter ofsay what parts to test. so you can say, ok, don'ttest the entire page. you can test-- so one ofthings that you can do,
like let's say it'sa wordpress site. all of the posts in a wordpresssite have a class of post. so you put that inyour jquery selector and that's what gets tested. so here's another thing. we can do a grunt plug-in. so here's a gruntplug-in that we created. so here's the task here. the big differencebetween grunt and gulp
is that in Grunt, you basically create this plug-in and then you configure the task. Gulp is a different story altogether, because Gulp can work on streams, and it's really fricking awesome. I'll show you guys that in a bit. But so here's the grunt-tenon. So in my Gruntfile -- and I apologize, because PhpStorm doesn't increase the font size well -- what we have here is, in our tenon task, I'm going to have it test all of the items in the static directory. So all my static pages are in that directory. I'm also going to have it pass a URL here -- I'm going to test amazon.com. And I have a timeout for an absurdly long amount of time.
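[Editor's note: a sketch of what a Gruntfile entry like that might look like. The task name follows the talk, but the option names (src, urls, timeout) and the plug-in package name are hypothetical, since the actual grunt-tenon configuration isn't reproduced in the transcript.]

```javascript
// Gruntfile.js (excerpt) -- a sketch of a tenon task configuration.
// Option names and the plug-in name are hypothetical, for illustration only.
module.exports = function (grunt) {
  grunt.initConfig({
    tenon: {
      options: {
        key: 'YOUR_API_KEY',        // placeholder
        timeout: 120000             // absurdly long timeout, in milliseconds
      },
      static: {
        src: ['static/**/*.html']   // test all the static pages in that directory
      },
      live: {
        urls: ['http://www.amazon.com/']  // or point it at a live URL
      }
    }
  });

  grunt.loadNpmTasks('grunt-tenon'); // hypothetical plug-in name
  grunt.registerTask('a11y', ['tenon']);
};
```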
So imagine this: we're in a standard developer workflow. We have a Grunt task set up to lint, to run JSLint, JSCS, whatever you have. You can even set it up as a watch task. And now I can just run grunt tenon, and it's going to go out against the server. What's that? In PhpStorm? Yeah. Check you out. Where's Terminal at?

KARL GROVES: Good job there. So while we were doing that, we found our issues. We found a bunch of -- whoa, that's weirdness. Thanks for--

Let's do this again. So we're testing amazon.com. I like testing Amazon. They probably hate me, because they're seeing the Tenon UA string over and over. But anyway, so here's just like a -- what the hell? Just like running JSLint as a Grunt task, you're going to get your errors in a nice blurry presentation. And it's going to tell you whether you've got errors or not. Now what you can do is you can configure the Grunt task to give you a pass or fail. So then what you do here -- so in this scenario, you're a developer.
You're working in your IDE or whatever. You're going to run the task, and then you're going to get the results. But now let's say you're in an automated build scenario. So now you have a Grunt task running as part of your Bamboo build tasks, and you want your grunt tenon to just pass a true or false, a pass or fail. And then if it fails, because you have accessibility issues, it turns your build red. So we don't go. Or you have a configurable threshold of how many errors -- whatever the case may be. The idea here is that in the developer workflow, you're just testing your work as you go. You want the errors back. You want the information back. In an automated build scenario, you don't want something getting shipped that has bad accessibility problems.

Another thing you could do, though: because it's JSON, you can do whatever you want with it. So if you're running stuff where you need to be able to use a JUnit report format, you just take the JSON and you pipe it into the JUnit XML format, or something stupid simple like a CSV file.
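[Editor's note: for example, a few lines of Node can turn the JSON summary into a CSV and into an exit code that fails a build. This is a sketch; the response field names are the same assumptions as in the earlier example, and the error threshold is made up.]

```javascript
// report.js -- a sketch: turn an API JSON response into CSV and a pass/fail exit code.
// Field names (resultSet, resultSummary) and the threshold are assumptions.
var fs = require('fs');

var results = JSON.parse(fs.readFileSync('tenon-results.json', 'utf8'));

// One CSV line per reported issue.
var csv = results.resultSet.map(function (issue) {
  return [issue.tID, issue.certainty, '"' + issue.errorTitle + '"'].join(',');
}).join('\n');

fs.writeFileSync('a11y-report.csv', 'tID,certainty,title\n' + csv);

// Fail the build if we exceed a configurable error threshold.
var THRESHOLD = 0;
var errors = results.resultSummary.issues.totalErrors;
process.exit(errors > THRESHOLD ? 1 : 0);
```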
So now let's talk about some of the quality stuff, because this is what the topic is about: how we use all these tools on Tenon itself. So this is -- any PHP people here? So this is Codeception. This is a GIF I stole off the Codeception site. Because the UI on Tenon is PHP, we use Codeception for all of our tests. And the cool thing about Codeception is it allows us to do both TDD and BDD. So I can do testing on whether or not the API is actually returning the information that I want, and I can also do BDD to make sure that the actual workflows work. So there's a GIF that shows all those things happening.

And then we unit test the API itself to make sure that it's running everything. So a basic example here -- we're running nodeunit. We're testing the command-line bit here, and we're actually doing all the unit tests here. And we run 67 tests to make sure that the API is actually returning the stuff that we need. It calculates all those things, like the number of issues.

We use Wercker for our builds. Anybody seen Wercker before? Anybody use Travis? Bamboo? Same sort of principle. You set up your tasks and it goes through them. And so each of our projects has its own Wercker build script in place. And it runs all our unit tests and all that sort of stuff. So whenever we commit to master, it runs through these tasks and pushes it, if it's good.
And it dumps it, if it isn't. And then there's the other -- here's the other big deal. Let's see, how many slides have I got? So I'm going to show you guys two more things. In the UI, has anybody ever used BrowserSync? Who said that? Of course -- Dirk. So we switched from -- all of our projects -- at Tenon, we have about seven different repos for some of the different bits and pieces. And we switched from Grunt to Gulp, and we just haven't looked back. The difference that people talk about between Grunt and Gulp is configuration versus actually writing the functionality. Basically, you can create all your functionality using Gulp, and it's right there for you. We switched right to Gulp, and I'm not looking back.

But one of the cool things I really like -- one of the things that we built in -- is BrowserSync. So I'm going to go to -- I'm going to type in gulp watch. So gulp watch is a lot like grunt watch in that it will watch for files that have changed. And it will spit out a PHP error -- that's actually because BrowserSync is lagging somehow. So it's running all our tests. It's pushing our stuff to our dist folder. All right, so now I should be able to go back
to localhost:3000. There we go. So this is where we saw -- it goes to -- localhost:3000 is where BrowserSync wants to launch the site. So I'm going to put this in Safari. I have it in Chrome. And I'm going to put it in Firefox. So what's great about BrowserSync is you can test all your work in different browsers as you're working. As a watch task, it's going to go ahead and do the watching. So I'm going to go down here to my Sass folder, and let's see, I'm going to go to styles.css. And what I'm going to do is I'm going to put a border around body. So I put a red border around it.

KARL GROVES: What? Oh, yeah. So there we saw that it injected the newly compiled CSS file into Chrome and into Safari and into Firefox. I love BrowserSync for that. They also offer the ability for you to not use localhost, but use another IP that you can then test stuff with. So that's really awesome.
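[Editor's note: here's a minimal sketch of a gulp watch task wired up to BrowserSync like the one in the demo. The file paths and the proxied backend address are placeholders; this is not Tenon's actual gulpfile.]

```javascript
// gulpfile.js -- a sketch of a watch task with BrowserSync; paths are placeholders.
var gulp = require('gulp');
var sass = require('gulp-sass');
var browserSync = require('browser-sync').create();

// Compile Sass to the dist folder and stream the compiled CSS
// into every connected browser without a full reload.
gulp.task('styles', function () {
  return gulp.src('src/sass/**/*.scss')
    .pipe(sass())
    .pipe(gulp.dest('dist/css'))
    .pipe(browserSync.stream());
});

gulp.task('watch', ['styles'], function () {
  // Serve on localhost:3000, proxying an existing (e.g. PHP) backend.
  browserSync.init({ proxy: 'localhost:8000' });
  gulp.watch('src/sass/**/*.scss', ['styles']);
});
```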
So then the other part about this that I wanted to show you guys -- first off, I'm going to get rid of that before my partner goes, why is everything red?

So I mentioned unit testing before. And we use Karma, because it allows us to use a couple of different frameworks for our testing. And so we can use Karma for both TDD and BDD. One of the things we do is, before we push a test out live to be testing against other people's stuff, we want to unit test our test. So it's like Inception, because we're testing the tests. We're unit testing the test that tests other people's stuff. And what I love about this, though, what's really great about this approach, is it now lets us do this live debugging. So unit testing is one thing. Unit testing is great especially for test-driven development, where you're going to write your test, it's going to fail automatically, and then you're going to fix your code so that it passes. That's like the big mantra of TDD. But what else is really cool is that now this verifies the test performs at least the way I wrote it to. And then if the time comes that something happens, and you're like, yeah, that's really not meeting user expectations, you just fix your code and retest it.
You adjust your test, you fix the code, and so on and so forth. So we had an issue not too long ago with this one, which is what I call TID 73. And 73 is our accessibility test for stupid link text. And stupid link text is stuff like click here, more, read more, all that sort of stuff. And so we have, let's see: click here, read, read more, learn more, continue, go, continue reading, view, view more, view less, all that sort of stuff. And we had a problem where accessibility people will tell me that -- and something we didn't think of at the time -- is that the accessible name for a link is derived from the text within the link. If the link only consists of an image, the accessible name is calculated by the AT as being the alt attribute. So we came out with our private beta and immediately had people saying, hey, wait a minute, I have an image link that has this -- that's not getting caught. And so we created this new -- it's just a new fixture that tests for the alt attribute. And we had to adjust our test.

So we have a fixture here, and then we have a Chai should assertion. So it says -- and sorry for those of you who can't see it -- elements.length should equal 73. And so we just adjusted the test. And now I can run npm test, and it's going to test all this stuff. What's really cool, too, that I like about using it this way is -- I'll let this show you how it passes. So now it says, 79 of 79.
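[Editor's note: a minimal sketch of what that kind of test-the-test looks like with Mocha and Chai's should style. The fixture name, the selector, and the expected count of 73 matches are illustrative, not Tenon's actual test file, and it's shown here with the old jsdom.env API in plain Mocha rather than the Karma setup from the talk.]

```javascript
// test/tid-73.spec.js -- a sketch of unit testing an accessibility test
// against a known fixture. Fixture name, selector, and counts are illustrative.
var chai = require('chai');
chai.should();
var jsdom = require('jsdom');
var fs = require('fs');

describe('TID 73: unhelpful link text', function () {
  it('flags every stupid link in the fixture, including image-only links', function (done) {
    var fixture = fs.readFileSync('test/fixtures/tid-73.html', 'utf8');
    jsdom.env(fixture, function (err, window) {
      if (err) { return done(err); }
      // Stand-in for the real test logic: select the links the rule should flag.
      var elements = window.document.querySelectorAll('a[data-expected-fail]');
      elements.length.should.equal(73);
      done();
    });
  });
});
```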
The other cool part about this is that it opens an instance of Chrome. And if you click on the debug, and then you go to your console, you'll get actual test-by-test feedback in the console. It logs to the console for each one of these. So TID success, 73, unhelpful link text found. And then we unit test this thing which we had to create, which we call a11y visible -- visibility of content. If it's display: none, versus visibility: hidden, versus off screen, and all this sort of stuff, we have this utility function. We had to add that. And so that's how we go about making sure each of the tests is good.
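[Editor's note: for a sense of what a visibility utility like that has to cover, here's a rough sketch. This is not Tenon's a11y visible implementation, just an illustration of the display: none / visibility: hidden / off-screen cases mentioned above, assuming it runs in a browser context.]

```javascript
// a11y-visible.js -- a sketch of a visibility check, not Tenon's actual utility.
// Covers the cases mentioned in the talk: display:none, visibility:hidden, off-screen.
function isA11yVisible(el) {
  if (!el || el.nodeType !== 1) {
    return false;
  }

  var style = window.getComputedStyle(el);
  if (style.display === 'none' || style.visibility === 'hidden') {
    return false;
  }

  // Common off-screen hiding patterns (e.g. position: absolute; left: -9999px).
  var rect = el.getBoundingClientRect();
  if (rect.right < 0 || rect.bottom < 0) {
    return false;
  }

  // An element is only visible if its ancestors are, too.
  return el.parentElement ? isA11yVisible(el.parentElement) : true;
}
```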
And during this, we also have this build process behind the scenes that creates both the unit test file and the test file that gets tested. And basically, if the unit test doesn't pass -- if this list of fixtures doesn't pass -- then the test file is just tossed aside, and we use the old one, which is like our known good. It's our single source of truth.

So that's it. Thank you very much. I have Tenon stickers up here, if you're the laptop sticker kind of person. I have a bunch of them, so come get them. All right, thank you.

[APPLAUSE]