This project is an HTML-based resume and portfolio for Kinsey Roberts.
- Built on Twitter Bootstrap for responsive UI
- Developed in PHP with a MySQL database for quick and easy changes
- Served via AWS S3 as static HTML pages generated by a script
- Aims for compliance with WCAG 2.0 AA accessibility guidelines
- Aims for 7-9 grade reading level
- Project stories use STAR (Situation, Task, Action, Result) structure
- Acronyms clarified using `abbr` tags
- Google Tag Manager not injected if Do Not Track (DNT) is set
- Local fallbacks used for all CDN assets if DNT is set
- Database is backed up to `/sql/` using `mysqldump` and `sed`
- Assets are built using a Ruby script
- CSS is compiled from Sass and minified
- JavaScript files are concatenated and minified
- PHP configuration and `humans.txt` are updated to reflect build time
- Each page is saved to a static HTML file using a PHP script
- Build time is included as a query string parameter on all links for cache busting
- Each page's contents are minified and saved under `/static/`
- `sitemap.xml` is built as each page is processed
- Critical CSS is inlined into each page using a `gulp` task
- The `critical` npm package analyzes all 4 Bootstrap breakpoints to inline above-the-fold styles
- Pages were split into 6 parallel queues to reduce processing time
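The backup step above might look roughly like this. This is a sketch: the database name, user, and output filename are placeholders, and the exact `sed` cleanup isn't stated in the original; a common use is stripping `mysqldump`'s trailing timestamp comment so successive backups diff cleanly.

```shell
#!/bin/sh
# Hypothetical backup sketch. Database name, user, and output path are
# placeholders. The sed step removes mysqldump's volatile
# "-- Dump completed on ..." line so backups diff cleanly.
mysqldump --user=resume --password="$DB_PASS" resume_db \
  | sed '/^-- Dump completed on/d' \
  > sql/backup.sql
```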
This seems like an absurd number of technologies for one build script. The Ruby Sass/JS pipeline was reused from a previous project. The static pages script needed database access, and doing it in PHP let me reuse the data objects I'd written. I couldn't find a Ruby critical CSS gem that considered more than one breakpoint; enter Node.js and gulp. Add some batch scripting to tie it all together and it became a tidy build process.
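The six parallel page queues could be driven with something as simple as `xargs`. This is only a sketch of the idea; the actual build used PHP and batch scripting, and `build_page.sh` is a hypothetical per-page wrapper.

```shell
#!/bin/sh
# Fan page builds out across 6 workers: -n 1 passes one page per
# invocation, -P 6 keeps up to six builds running at once.
# build_page.sh is a hypothetical wrapper around the PHP page script.
ls pages/*.php | xargs -n 1 -P 6 ./build_page.sh
```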
At first, I just used the AWS CLI's `aws s3 sync` command to push the `/static/` folder to an S3 bucket. Later, I ran the site through PageSpeed Insights and sonarwhal and found plenty of room for optimization.
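That original deploy was essentially a one-liner along these lines (the bucket name is a placeholder):

```shell
# Mirror the generated static site to S3; --delete removes remote files
# that no longer exist locally. Bucket name is a placeholder.
aws s3 sync ./static s3://example-resume-bucket --delete
```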
Thankfully, the `aws s3 cp` command accepts arguments to set HTTP headers. Since I was using cache busting, I set the `cache-control` and `expires` headers to their maximum values. I tweaked the `content-type` header and set the `charset`. S3 doesn't offer server-side compression, so I gzipped text-based files before uploading and set the `content-encoding` header.
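Put together, the upload for a single HTML file might look like this. The bucket, paths, and the specific max-age and expiry values are assumptions; cache busting is what makes a very long max-age safe here.

```shell
#!/bin/sh
# Pre-compress the page, then upload it with explicit headers.
# Bucket and paths are placeholders; header values are illustrative.
gzip -9 -c static/index.html > build/index.html
aws s3 cp build/index.html s3://example-resume-bucket/index.html \
  --content-type "text/html; charset=utf-8" \
  --content-encoding gzip \
  --cache-control "max-age=31536000, public" \
  --expires "2034-01-01T00:00:00Z"
```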
Once all of the files are uploaded, the script pings Google and Bing to notify them that an updated sitemap is available.
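The ping step presumably hits each engine's sitemap notification endpoint, roughly like this (the site URL is a placeholder):

```shell
# Notify search engines that an updated sitemap is available.
# The sitemap URL is a placeholder for the site's real address.
curl -s "https://www.google.com/ping?sitemap=https://example.com/sitemap.xml"
curl -s "https://www.bing.com/ping?sitemap=https://example.com/sitemap.xml"
```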