Friday, August 31, 2012

Find: 119 million Americans lack broadband Internet, FCC reports

As we mentioned in class...

119 million Americans lack broadband Internet, FCC reports

The US is a long way from its goal of making broadband Internet available to all 314 million Americans. In its third annual broadband progress report, the Federal Communications Commission says 19 million Americans have no option to buy fixed broadband Internet service, and an additional 100 million Americans live in areas where broadband is available but are not subscribers.

The FCC defines broadband as 4Mbps download and 1Mbps upload speeds. So, many people have Internet access that isn't counted in the report. But the US is decidedly behind many other countries. A report last year by the International Telecommunication Union showed the US having 27.6 fixed broadband subscriptions per 100 inhabitants, behind 15 other countries including first-place Netherlands, which achieved 38.1 subscriptions per 100 inhabitants.

Exactly how many people in the US have any type of Internet access is not detailed by the FCC. US Census figures from 2010 showed that in 74.2 percent of households, at least one person had Internet access at home, outside of home, or both.

Find: US falling further behind - Ireland calls for minimum Internet speeds of 30Mbps

Ireland calls for minimum Internet speeds of 30Mbps

With the exception of Google Fiber, the United States isn’t exactly breaking records when it comes to high-speed Internet policy. The National Broadband Plan, which was released two years ago, says that there should be a minimum level of service of at least 4Mbps for all Americans. Since then, not much has happened.

But across the pond in Ireland, Communications Minister Pat Rabbitte has recently decided that that's not nearly enough.

On Thursday, he outlined a new broadband plan for Ireland that puts the United States to shame. He says that half the population, largely in the urban and suburban cores, should have speeds of 70Mbps to 100Mbps, with service of at least 40Mbps to the next 20 percent of the country. Finally, he writes, there should be a "minimum of 30Mbps for every remaining home and business in the country—no matter how rural or remote."

Find: Firefox 15 - 3D gaming, JS debugger, no extension memory leaks

Firefox 15 arrives, supports compressed textures for impressive 3D gaming



The BananaBread WebGL demo running in Firefox 15

Mozilla announced today the release of Firefox 15, a new version of the open source Web browser. The update brings a number of noteworthy enhancements, including new built-in development tools and enhanced support for cutting-edge Web standards that enable sophisticated gaming experiences. Under the hood, Firefox 15 introduces a new optimization that can radically reduce the browser’s memory footprint for users who rely on many add-ons.

As we have discussed in much of our recent browser coverage, modern standards-based Web technologies are increasingly capable of supporting the kind of interactive multimedia experiences that used to only be available through plugins or native applications. The major browser vendors, which are all working to further expand the range of capabilities offered by the Web, have recently taken an interest in enabling game development.

Mozilla has been working on a number of relevant features, including an API for displaying content in fullscreen mode, support for mouse-locking, and sophisticated real-time audio mixing functionality. Earlier this year, Mozilla launched its own real-time multiplayer adventure game called BrowserQuest with the aim of showcasing HTML5 gameplay. The open Web is clearly a serious contender for casual gaming.

Find: a nice review of Firefox dev tools

Thursday, August 30, 2012

Event: TechRevolution Seminar focuses on local startup opportunities - 9/25/12

Billy regularly visits our courses.

---------- Forwarded message ----------
From: Billy Houghteling <billy_houghteling@ncsu.edu>
Date: Wed, Aug 29, 2012 at 3:20 PM
Subject: TechRevolution Seminar: September 25, 2012

The Springboard Initiative and the Office of Technology Transfer (OTT) are excited to present our next innovation outreach event.  I hope you will have time to join us on September 25th for our TechRevolution seminar.  Our seminar topic will be "Funding Environment for University Startups" and will feature Mr. Robert J. Creeden, Executive Director of the Blackstone Entrepreneurs Network.


For additional information and to register, please visit:


http://events.r20.constantcontact.com/register/event?oeidk=a07e6b6t9w416f02b11&llr=crh9l7dab

Best regards,

Billy

Wednesday, August 22, 2012

Find: HTML5, Apps and JavaScript Wrap-Up


HTML5, Apps and JavaScript Wrap-Up

The first TimesOpen event of 2012 was a big success and a lot of fun.

Tuesday, August 21, 2012

Find: Space for startups to debut next month on Hillsborough Street

An exciting, good idea. 

Space for startups to debut next month on Hillsborough Street

The idea grew out of an innovation summit in February where 175 young people, government officials and creative types brainstormed ways to build an innovation-friendly brand for Raleigh.

Assignment: the logistics of getting started

Hey folks,

To get things rolling in class, you have three things you need to do, which make up your first assignment:
  1. Go complete the consent and waiver for online course components, if you haven't already. If you do not wish to consent, then please email us.
  2. Go fill out the collecting your online IDs form. If you do not already have them, this will require obtaining personal IDs from Google, VoiceThread, and GitHub. You need not fill this out if you do not wish to consent to part 1 above.
  3. Go introduce yourself on Ning in the Other category of our Forum. Make sure you talk about:
    1. Your professional experience
    2. Your experience working in teams
    3. Your programming experience in general
    4. Your web development experience (JS, markup, etc.)
    5. Your design experience
    6. What you hope to get from this class
    7. (Optional) One interesting factoid about you!
You should complete this assignment by Tuesday, August 28, at class time.

Monday, August 20, 2012

Competition: Mozilla Ignite

Via our lab's collaborator Carol Strohecker:

Easy brainstorm round with a deadline on Thursday; if you succeed, you win money, and more development rounds follow.


Benjamin Watson
Director, Design Graphics Lab | Associate Professor, Computer Science, NC State Univ.
919-513-0325 | designgraphics.ncsu.edu | @dgllab

---------- Forwarded message ----------
Subject: Mozilla Ignite

Calling all developers, network engineers and community catalysts. Mozilla and the National Science Foundation (NSF) invite designers, developers and everyday people to brainstorm and build applications for the faster, smarter Internet of the future. The goal: create apps that take advantage of next-generation networks up to 250 times faster than today, in areas that benefit the public -- like education, healthcare, transportation, manufacturing, public safety and clean energy.

Friday, August 17, 2012

Find: Retina displays ripple on - Safari and Chrome now offer support for high-resolution CSS

And retina laptops are just the beginning. 

Safari and Chrome now offer support for high-resolution CSS code

MacBook Pro with Retina display

Apple's Safari and Google's Chrome web browsers have been updated to offer support for CSS's image-set specification, Webmonkey reports. The update enables support for displays with higher pixel densities — notably, Apple's new Retina MacBook Pro. Chrome was optimized for Apple's Retina display in July, but the new standard allows the browser to detect a device's display and select the highest quality images automatically. The image-set specification also gauges a user's bandwidth to determine whether or not to serve Retina-quality images. Of course, adoption of the standard is largely dependent on developer support, but should Retina-quality displays become more pervasive, that may change sooner rather than later.
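
For context, here is a minimal sketch (not from the article) of the manual pixel-density check that the image-set specification is meant to replace; the data attribute and file names below are hypothetical placeholders.

// A minimal sketch of manually swapping in high-DPI images, the approach
// that CSS image-set() automates. Attribute names and file paths here are
// hypothetical.
document.addEventListener('DOMContentLoaded', function () {
  // Swap in @2x assets only on high-density displays such as Retina screens.
  if (window.devicePixelRatio && window.devicePixelRatio >= 2) {
    var images = document.querySelectorAll('img[data-src-2x]');
    for (var i = 0; i < images.length; i++) {
      images[i].src = images[i].getAttribute('data-src-2x');
    }
  }
});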

Wednesday, August 15, 2012

Find: The Tech Behind the New Twitter.com

The Tech Behind the New Twitter.com

The Twitter.com redesign presented an opportunity to make bold changes to the underlying technology of the website. With this in mind, we began implementing a new architecture almost entirely in JavaScript. We put special emphasis on ease of development, extensibility, and performance. Building the application on the client forced us to come up with unique solutions to bring our product to life, a few of which we'd like to highlight in this overview.

Find: Improving Browser Security with CSP

Improving Browser Security with CSP

If you are using Firefox 4, you now have an extra layer of security when accessing mobile.twitter.com.

Over the past few weeks we've been testing a new security feature for our mobile site. It is called a Content Security Policy, or CSP. This policy is a standard developed by Mozilla that aims to thwart cross-site scripting (XSS) attacks at their point of execution, the browser. The upcoming release of Firefox 4 implements CSP, and while the mobile site may not get a high volume of desktop browser traffic (the desktop users hitting that site typically have low bandwidth connections), it has given us an opportunity to test out a potentially powerful anti-XSS tool in a controlled setting.

CSP IN A NUTSHELL

In a typical XSS attack, the attacker injects arbitrary JavaScript into a page, which is then executed by an end-user. When a website enables CSP, the browser ignores inline JavaScript and only loads external assets from a set of whitelisted sites. Enabling CSP on our site was simply a matter of including the policy in the returned headers under the CSP-defined key, 'X-Content-Security-Policy'.

The policy also contains a 'reporting URI' to which the browser sends JSON reports of any violations. This feature not only assists debugging of the CSP rules, it also has the potential to alert a site’s owner to emerging threats.
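
To make the mechanics concrete, here is a minimal sketch, assuming a Node.js/Express server (the post does not describe Twitter's actual stack), of returning a CSP header with a whitelist and a reporting URI, plus an endpoint that receives the JSON violation reports.

// A minimal sketch (assuming Node.js/Express, not Twitter's actual setup) of
// serving a Content Security Policy with a reporting URI.
var express = require('express');
var app = express();

app.use(function (req, res, next) {
  // Only allow scripts from our own origin and a whitelisted asset host;
  // violations are reported to /csp-report. Host names are placeholders.
  res.setHeader('X-Content-Security-Policy',
    "default-src 'self'; script-src 'self' https://assets.example.com; " +
    "report-uri /csp-report");
  next();
});

// Browsers POST a JSON report here whenever the policy is violated.
app.post('/csp-report', express.json({ type: '*/*' }), function (req, res) {
  console.log('CSP violation:', JSON.stringify(req.body));
  res.sendStatus(204);
});

app.listen(3000);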

Find: Twitter’s mobile web app delivers performance

Twitter’s mobile web app delivers performance

As the number of people using Twitter has grown, we've wanted to make sure that we deliver the best possible experience to users, regardless of platform or device. Since twitter.com is not optimized for smaller screens or the touch interactions familiar to many smartphones, we decided to build a cross-platform web application that felt native in its responsiveness and speed for those who prefer accessing Twitter on their phone's or tablet's browser.

A better mobile user experience

When building mobile.twitter.com as a web client, we used many of the tools offered in HTML5, CSS3, and JavaScript to develop an application that has the same look, feel, and performance of a native mobile application. This post focuses on four primary areas of the mobile app architecture that enabled us to meet our performance and usability goals:


  • event listeners

  • scroll views

  • templates

  • storage

Twitter's mobile app architecture
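
As one illustration of the event-listener piece (a sketch of the general technique, not Twitter's actual code): a single delegated listener on the timeline container can handle taps on every tweet, avoiding the cost of binding a handler to each element on a memory-constrained phone.

// Event delegation sketch: one listener on a (hypothetical) #timeline
// container handles taps for all tweets rendered inside it.
var timeline = document.getElementById('timeline');

timeline.addEventListener('click', function (event) {
  // Walk up from the tapped node to the enclosing tweet element, if any.
  var node = event.target;
  while (node && node !== timeline) {
    if (node.getAttribute && node.getAttribute('data-tweet-id')) {
      openTweetDetails(node.getAttribute('data-tweet-id'));
      return;
    }
    node = node.parentNode;
  }
});

function openTweetDetails(tweetId) {
  // Placeholder for navigating to the tweet's detail view.
  console.log('open tweet', tweetId);
}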

Cassovary: A Big Graph-Processing Library

Cassovary: A Big Graph-Processing Library

We are open sourcing Cassovary, a big graph-processing library for the Java Virtual Machine (JVM) written in Scala. Cassovary is designed from the ground up to efficiently handle graphs with billions of edges. It comes with some common node and graph data structures and traversal algorithms. A typical usage is to do large-scale graph mining and analysis.


At Twitter, Cassovary forms the bottom layer of a stack that we use to power many of our graph-based features, including "Who to Follow" and “Similar to.” We also use it for relevance in Twitter Search and the algorithms that determine which Promoted Products users will see. Over time, we hope to bring more non-proprietary logic from some of those product features into Cassovary.


Please use, fork, and contribute to Cassovary if you can. If you have any questions, ask on the mailing list or file issues on GitHub. Also, follow @cassovary for updates.


-Pankaj Gupta (@pankaj)

Find: MySQL at Twitter

MySQL at Twitter

MySQL is the persistent storage technology behind most Twitter data: the interest graph, timelines, user data and the Tweets themselves. Due to our scale, we push MySQL a lot further than most companies. Of course, MySQL is open source software, so we have the ability to change it to suit our needs. Since we believe in sharing knowledge and that open source software facilitates innovation, we have decided to open source our MySQL work on GitHub under the BSD New license.

The objective of our work thus far has primarily been to improve the predictability of our services and to make our lives easier. Some of the work we've done includes:

  • Add additional status variables, particularly from the internals of InnoDB. This allows us to monitor our systems more effectively and understand their behavior better when handling production workloads.
  • Optimize memory allocation on large NUMA systems: Allocate InnoDB's buffer pool fully on startup, fail fast if memory is not available, ensure performance over time even when server is under memory pressure.
  • Reduce unnecessary work through improved server-side statement timeout support. This allows the server to proactively cancel queries that run longer than a millisecond-granularity timeout.
  • Export and restore the InnoDB buffer pool using a safe and lightweight method. This enables us to build tools to support rolling restarts of our services with minimal pain.
  • Optimize MySQL for SSD-based machines, including page-flushing behavior and reduction in writes to disk to improve lifespan.
We look forward to sharing our work with upstream and other downstream MySQL vendors, with a goal to improve the MySQL community. For a more complete look at our work, please see the change history and documentation.

If you want to learn more about our usage of MySQL, we will be speaking about Gizzard, our sharding and replication framework on top of MySQL, at the Percona Live MySQL Conference and Expo on April 12th. Finally, contact us on GitHub or file an issue if you have questions.

On behalf of the Twitter DBA and DB development teams,

- Jeremy Cole (@jeremycole)

- Davi Arnaut (@darnaut)

Find: TwitterCLDR - Improving Internationalization Support in Ruby

TwitterCLDR: Improving Internationalization Support in Ruby

We recently open sourced TwitterCLDR under the Apache Public License 2.0. TwitterCLDR is an "ICU level" internationalization library for Ruby that supports dates, times, numbers, currencies, world languages, sorting, text normalization, time spans, plurals, and Unicode code point data. By sharing our code with the community, we hope to collaborate and improve internationalization support for websites all over the world. If your company is considering supporting multiple languages, then you can try TwitterCLDR to help your internationalization efforts.

Motivation

Here’s a test. Say this date out loud: 2/1/2012
If you said, “February first, 2012”, you’re probably an American. If you said, “January second, 2012”, you’re probably of European or possibly Asian descent. If you said, “January 12, 1902”, you’re probably a computer. The point is that as humans, we almost never think about formatting dates, plurals, lists, and the like. If you’re creating a platform available around the world, however, these kinds of minutiae make a big difference to users.
The Unicode Consortium publishes and maintains a bunch of data regarding formatting dates, numbers, lists, and more, called the Common Locale Data Repository (CLDR). IBM maintains International Components for Unicode (ICU), a library that uses the Unicode Consortium's data to make it easier for programmers to use. However, this library is targeted at Java and C/C++ developers, not Ruby programmers, and Ruby is one of the programming languages used at Twitter. For example, Ruby and TwitterCLDR help power our Translation Center. TwitterCLDR provides a way to use the same CLDR data that Java uses, but in a Ruby environment. Hence, formatting dates, times, numbers, currencies, and plurals should now be much easier for the typical Rubyist. Let's go over some real world examples.

Find: Leak Finder - a new tool for JavaScript

Leak Finder: a new tool for JavaScript

Leak finder for JavaScript helps web application developers find memory leaks in their JavaScript programs.

In garbage-collected languages, such as JavaScript, you cannot have traditional memory leaks by forgetting to free memory: when all references to an object are dropped, the object is garbage-collected and the memory is freed.

However, JavaScript programs can leak memory by unintentionally retaining references to objects. For example, the references can be pointers to objects stored in a data structure in a JavaScript library (e.g., Closure) rather than in the application code. If an object is unintentionally retained, all objects it points to are kept alive as well. This leads to superfluous memory consumption.
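
As a small, hypothetical illustration of the kind of unintentional retention Leak Finder targets: a callback pushed into a long-lived registry (imagine it belonging to a library) keeps a detached DOM node alive through its closure.

// Hypothetical example of an unintentional retention leak in JavaScript.
// The long-lived registry array keeps a reference to the onDismiss closure,
// and the closure keeps the (now detached) panel element alive.
var registry = [];

function showPanel() {
  var panel = document.createElement('div');
  panel.textContent = new Array(10000).join('bulky content ');
  document.body.appendChild(panel);

  registry.push(function onDismiss() {
    if (panel.parentNode) {
      panel.parentNode.removeChild(panel);
    }
  });
}

function hidePanel() {
  registry[registry.length - 1]();
  // Bug: the handler is never removed from the registry, so the closure and
  // the detached panel can never be garbage-collected. Fix: registry.pop();
}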

Find: visualizing the evolution of the web

Behind the scenes: visualizing the evolution of the web

This guest post is by Sergio Alvarez, Vizzuality, and Deroy Peraza, Hyperakt, in collaboration with Min Li Chan, Chrome Team

At Google I/O this year, we launched a new version of The Evolution of the Web, a project visualizing the history and pace of innovation in web technologies and browsers. The Evolution of the Web traces how web technologies have evolved in the last two decades and highlights the web community’s continuous efforts to improve the web platform and enable developers to create new generations of immersive web experiences. In collaboration with the Google Chrome team, the team at Hyperakt designed the interactive visualization while Vizzuality built it using HTML5, SVG, and CSS3.

The visualization included 43 web technology "strands" across 7 browser timelines to represent major developments on the web platform. On hover or tap, each strand is highlighted to reveal intersections that tell the story of when browser support was implemented for each new web technology. To provide additional context, we developed a secondary visualization to illustrate the growth of Internet users and traffic.

Find: Access Google APIs with Dart

Dart is Google's JS replacement.

Access Google APIs with Dart

By Sam McCall, Software Engineer

Ever since we launched Dart, the structured web programming language, we've heard developers asking for a way to use Dart to connect to Google's most popular APIs and services. Thanks to Google's 20% time policy, I’m working on an open source library that helps Dart developers connect to Google APIs such as Google+, YouTube, URL Shortener, and many more.

My favorite sample showing you how to build web apps with Dart and Google APIs is roulette. This little app will shorten your URL, or if you're lucky, rickroll you.

This library is currently alpha and under active development. Please report all bugs in the issue tracker and ask questions in the discussion forum. Thanks for the feedback, and have fun!

Sam McCall is an engineer in Google’s Corporate Engineering Team in Munich, and is a part-time Dart tinkerer.

Posted by Scott Knaster, Editor

Find: Startup Zenph relaunches as music online learning platform

Startup Zenph relaunches as music online learning platform

Backed by $900,000 in new financing with Intersouth Partners as the lead investor, a new company emerges from the assets of Zenph Sound Innovations.

Monday, August 13, 2012

Find: Nice podcast on The Big Web Show - Tantek Çelik on CSS and the responsive web

Find: Nice podcast on The Web Ahead - Smart Responsive Web Design with Scott Jehl

Saturday, August 11, 2012

Find: Webstate by Forrester - 75% of US browsers are HTML5 ready

Forrester report urges HTML5 adoption, says most browsers can support it

A new report from market research company Forrester says that it’s time for companies to embrace the latest Web standards and start building richer Web experiences that take advantage of the capabilities that are supported by modern Web browsers. The report highlights changing trends in browser adoption and talks about how companies are taking advantage of new functionality.

A key issue addressed in the report is the growing market penetration of HTML5-enabled Web browsers. Citing recent browser market share statistics, Forrester says that nearly 75 percent of users in North America and 83 percent in Europe are running browsers that support a large segment of the HTML5 feature set. Forrester says that the penetration of HTML5-compatible browsers grew from 57 percent to 75 percent between the second quarter of 2011 and the second quarter of 2012.

Alongside that tremendous growth in modern browser adoption among end users, the pace of innovation has also increased. New standards are being drafted, maturing, and gaining adoption much faster than in the past. Browser vendors are adjusting their release management strategies and moving to a more iterative approach to development in order to accommodate these changes.

Friday, August 10, 2012

Job: App Design and Creation Intern

An internship opportunity with SmartOnline, a local mobile tools company.
---------- Forwarded message ----------
From: Bob Dieterle <bob.dieterle@smartonline.com>
Date: Fri, Aug 10, 2012 at 11:52 AM
Subject: FW: App Design and Creation Intern
Cc: Robert Hancock <robert.hancock@smartonline.com>


Hi Ben,

I hope your summer is going well.  I am looking for great creative graphic design interns who want to learn our platform and build apps for our clients using the platform.  Please see the description below.   I was hoping that you could post this within your classes and/or promote it in other ways.  They will be working under my experts in this field, and I think they will learn a lot, as each client app would be different and challenging.

 

Thanks,

Bob Dieterle

Sr Vice President and General Manager 


4505 Emperor Boulevard, Suite 320, Durham, NC 27703

T (919) 237 4182  M (919) 667-7747   F (919) 765 5020

 

Tuesday, August 7, 2012

Spotted: Web use on smartphones -- users revisit sites only 25% of the time

Characterizing web use on smartphones

Chad Tossell, Philip Kortum, Ahmad Rahmati, Clayton Shepard, Lin Zhong

The current paper establishes empirical patterns associated with mobile internet use on smartphones and explores user differences in these behaviors. We apply a naturalistic and longitudinal logs-based approach to collect real usage data from 24 iPhone users in the wild. These data are used to describe smartphone usage and analyze revisitation patterns of web browsers, native applications, and physical locations where phones are used. Among our findings are that web page revisitation through browsers occurred very infrequently (approximately 25% of URLs are revisited by each user), bookmarks were used sparingly, physical traversing patterns mirrored virtual (internet) traversing patterns, and users systematically differed in their web use.

Find: Announcing TimesOpen 2012

The New York Times has embraced the web big time. In fact, they are a leader in the field. They have speakers from all sorts of web development companies speaking at their events, the first of which happens just before our class. 

Announcing TimesOpen 2012

It's that time of the year again! We've just released our schedule for TimesOpen 2012. As always, we'll have four events leading up to an all-day hack day in December.

Find: Mobile Websites vs Responsive Design: What’s the right solution for your business?

Mobile Websites vs Responsive Design: What’s the right solution for your business?

The following post originally appeared on the Google Mobile Ads Blog.

As more of your competitors Go Mo, building a mobile-friendly site becomes more of a priority for your business. Over the past two years alone, mobile search traffic has increased five-fold. Customers are searching for your business from their mobile phones, and you need to engage them with a mobile experience designed for completing on-the-go tasks from their small screens. Recently many businesses have been asking us about an emerging trend among web developers—responsive design—and if they should use it. While we believe that building a separate mobile website is an appropriate solution for certain businesses, it’s also important to understand how responsive design might fit into your plans to Go Mo.

What is responsive design? It is a website design technique that allows you to create a single website that will adapt to the device on which it’s being viewed, whether it’s a laptop, smartphone or tablet. A site built with responsive design will automatically resize for different devices, but it is up to you to prioritize the content that matters most to the mobile user. For example, a mobile user might need to quickly find your phone number or directions, whereas a tablet user might want a simpler way to make couch-surfing purchases. A site built using responsive design could prioritize click-to-call and click-to-map buttons, while the tablet site would focus on simplifying the shopping cart. For the technical details on how responsive design works for building mobile-friendly sites, read this blog post from the Google webmaster team.
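
As a small, hypothetical sketch of the prioritization described above, the JavaScript matchMedia API (the script-side counterpart of the CSS media queries responsive design relies on) can be used to surface a click-to-call button on narrow screens; the element ids below are placeholders.

// Hypothetical sketch: prioritize mobile-oriented content on small screens
// using window.matchMedia. Element ids are placeholders.
var phoneQuery = window.matchMedia('(max-width: 600px)');

function prioritizeForViewport(query) {
  var callButton = document.getElementById('click-to-call');
  var fullCart = document.getElementById('full-shopping-cart');
  // On phone-sized screens, surface the call button and hide the heavy cart.
  callButton.style.display = query.matches ? 'block' : 'none';
  fullCart.style.display = query.matches ? 'none' : 'block';
}

prioritizeForViewport(phoneQuery);
phoneQuery.addListener(prioritizeForViewport); // re-evaluate on viewport changes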

Find: Introducing new Fusion Tables API

Introducing new Fusion Tables API

By Warren Shen, Google Fusion Tables team

Amidst all the excitement of I/O followed by the July 4th holiday in the U.S., many developers missed the announcement of the new Fusion Tables API. The new API includes all of the functionality of the existing SQL API, plus the ability to read and modify table and column metadata as well as the definitions of styles and templates for data visualization. This API is also integrated with the Google APIs console which lets developers manage all their Google APIs in one place and take advantage of built-in reporting and authentication features.

With this launch, we are also announcing a six month deprecation period for the existing SQL API. Since the new API includes all of the functionality of the existing SQL API, developers can easily migrate their applications using our migration guide.

For a detailed description of the features in the new API, please refer to the API documentation.

Posted by Ashleigh Rentz, Editor Emerita

Find: Measure and optimize with mod_pagespeed experiments

Measure and optimize with mod_pagespeed experiments

By Jeff Kaufman, Software Engineer, PageSpeed Team


Making your site fast shouldn't require lots of manual optimization. With mod_pagespeed, an open-source Apache module, you can automatically apply web performance optimization best practices like cache extension, image optimization, and CSS inlining to speed up your site without a lot of hassle. As of version 0.10.22.4, mod_pagespeed now supports A/B tests integrated with Google Analytics, allowing you to measure how much it speeds up your site on live traffic and experimentally determine the best settings.

When running an experiment, mod_pagespeed randomly assigns visitors to experimental configurations based on percentages you choose. You can run an experiment on 1% of your traffic, 100%, or anywhere in between without affecting other visitors. It also injects JavaScript to report experiment assignments back to your Google Analytics account in a custom variable. Within Analytics you can track the impact of experimental configurations on page load times, bounce rates, conversions, or any other Analytics metric.

We ran an example experiment, comparing mod_pagespeed running with default settings to mod_pagespeed in pass-through mode, on a small blog. This required adding the following lines to our pagespeed.conf:

ModPagespeedRunExperiment on
ModPagespeedAnalyticsID "UA-XXXXXXXX-Y"
# half the users get the pagespeed optimizations
ModPagespeedExperimentSpec id=3;percent=50;default
# half get an unoptimized site
ModPagespeedExperimentSpec id=4;percent=50
While this site was static and contained mostly text, it did use some JavaScript and images and had not been manually optimized. We ran the experiment for a month, over which Analytics observed 11K page views, and we saw a 20% improvement in average page load time.

Competition: Google Places API Developer Challenge 2012

Deadline 10/31

The power of places and big data for good: Google Places API Developer Challenge 2012

By Carlos Cuesta, Google Maps API Product Marketing Manager

Cross-posted with the Google Geo Developers Blog

How would you make your community or local government run better? In our first Google Places API Developer Challenge, we’re inviting developers around the world to make something that improves their communities or governments by using the Google Places API and its database of places and tools. The developers of the winning applications will receive a VIP experience at Google I/O 2013.

You might create an app or site that solves health problems, understands crime patterns, or improves commerce. You can use any platform as long as you build with the Google Places API and it benefits your community or government. We’re looking for your best and most innovative ideas.




Built on the comprehensive global database of more than 95 million places that powers Google Maps, the Google Places API enables you to search for information about a variety of nearby places such as establishments, geographic locations and prominent points of interest. You can re-rank place results based on user check-ins, and create new places specific to your app.
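
As a rough sketch of what a challenge entry might do with the Google Maps JavaScript API's Places library (parameter names vary by API version, and the coordinates, radius, and types below are placeholders), a nearby search can pull establishments around a point and feed them into an app:

// Rough sketch of a nearby-places lookup with the Google Maps JavaScript API
// Places library (loaded with libraries=places). Coordinates, radius, and
// types are placeholder values; parameter names vary by API version.
var map = new google.maps.Map(document.getElementById('map'), {
  center: new google.maps.LatLng(35.78, -78.64),
  zoom: 15
});

var service = new google.maps.places.PlacesService(map);
service.nearbySearch({
  location: map.getCenter(),
  radius: 500,
  types: ['restaurant']
}, function (results, status) {
  if (status === google.maps.places.PlacesServiceStatus.OK) {
    for (var i = 0; i < results.length; i++) {
      console.log(results[i].name, results[i].vicinity);
    }
  }
});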

To help you develop your ideas and build better apps, we’ve been working with local government officials in Austin, Boston, Chicago, London, Louisville, New York City, Philadelphia, Portland, San Francisco, and Seattle along with the White House to surface a wide variety of data sets for your apps. You can find these data sets and more on the Google Places API Challenge site at http://developers.google.com/challenge and hear more about what cities have to say about the challenge here. You can also follow updates and hangouts about the challenge on +Google Maps API.

The submission window opens on August 15th and closes on October 31st, 2012.

We look forward to seeing what can happen when your imagination and the Google Places API come together!

Monday, August 6, 2012

Find: Turbocharging web sites with new PageSpeed Service optimizations

Turbocharging web sites with new PageSpeed Service optimizations


By Rahul Bansal and Kishore Simbili, PageSpeed Team

We spend a lot of time working to make the web faster. Last year, we introduced PageSpeed Service, an online service that automatically speeds up loading of web pages.

We are constantly working on new optimizations (rewriters) that can make pages load even faster. Along these lines, we are introducing a new rewriter called "Cache and Prioritize Visible Content". This rewriter enables users to start interacting with the web page and consuming the content much sooner. It accomplishes this by optimizing the page as a whole using the following web page-aware techniques and with minimal configuration needed:
  • Make HTML cacheable. Typically, most web pages are not cached because they contain small amounts of personalized information or other non-cacheable data. This rewriter separates the non-cacheable portions from the HTML and enables caching for the rest of the content on PageSpeed servers. When the page is loaded, PageSpeed servers send the cacheable parts immediately while non-cacheable parts are fetched from the origin server and patched into the browser later.

  • Prioritize visible content rendering. Rendering of a modern web page requires several network resources, but not all of them are needed right away. This rewriter automatically determines and prioritizes the content that is above the fold of the browser, so that it doesn’t have to compete with the rest of the page.

  • Defer JavaScript. JavaScript execution is deferred until page load so that it doesn't block rendering of visible content (a generic sketch of the idea follows this list).
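
Below is a generic sketch of the deferral idea (not PageSpeed's actual rewriter): the script is injected only after the window load event, so it never blocks rendering of the visible content. The script URL is a placeholder.

// Generic sketch of deferring non-critical JavaScript until after page load
// (not PageSpeed's implementation). The script URL is a placeholder.
window.addEventListener('load', function () {
  var script = document.createElement('script');
  script.src = '/js/non-critical-features.js';
  script.async = true;
  document.body.appendChild(script);
});
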
Find: US plans to block any plan to give UN more control of the internet

US blocks running the Internet by UN committee. A good thing.

US plans to block any plan to give UN more control of the internet

Another round in the decades-old battle over which entities are able to control and regulate key parts of the internet is gearing up for this December. While the World Conference on International Telecommunications (WCIT) is likely to feature a debate about the UN's role in these regulations, the US has signaled its intention to oppose any such measure, as noted by Wired.

Terry Kramer, the US head of the WCIT delegation, has expressed some of the US' reservations on the issue. Firstly, he cites "greater regulatory burdens" that could slow online innovation. More importantly, he echoes the same concerns voiced by Vint Cerf earlier this month: that handing control of parts of the internet to states instead of the independent ICANN could lead to...

Find: Visit the very first web page from more than 20 years ago

Only 20 years!

Visit the very first web page from more than 20 years ago

Tim Berners-Lee's NeXT computer

It was more than two decades ago that the first web page launched, and if you're curious what the web looked like back in 1991, CERN has preserved that original site for your perusing pleasure. Created by Tim Berners-Lee on a NeXT computer, the site was a place to find information about the new and exciting World Wide Web — "a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents," as it was described.

The preserved version has been around for some time, but it isn't an exact replica. CERN says that it's a copy from 1992, and that "changes were made daily to the information available on the page as the WWW project developed." Sadly, no screenshots exist of the...

Find: Amazon launches rental service for paper textbooks

Amazon launches rental service for paper textbooks

Juggernaut Amazon is striking another blow at the traditional textbook industry by offering semester-long rentals of physical books. The listings for a number of textbooks now include a "rent" option that's usually around $50 for a title that sells for $170. According to Amazon's FAQ, the books are rented out by the semester (counted as 130 days), with one 15-day extension allowed. Textbooks are shipped at standard prices, and the cost of returning them is paid by Amazon. Depending on their luck, renters might receive a book that's new or one that's gently used.

eBay's Half.com and other sites rent textbooks at roughly similar prices, but Amazon's ubiquity means it's likely to make the practice more mainstream. While Amazon also offers a...

Find: ARPANET programmer and internet luminary Steve Crocker on how the internet was created

On the recent Wall Street Journal op-ed claiming the Internet was built without government help. Short answer: it was built with that help.

ARPANET programmer and internet luminary Steve Crocker on how the internet was created

ICANN Board Chairman and Internet Hall of Fame inductee Steve Crocker has been working with computer networks since before the internet existed, first as a UCLA graduate student and eventually as CEO of tech startup Shinkuro. Drawing on his early experience working with military network and internet precursor ARPANET in the 1960s and 1970s, he's written a long and thoughtful response to L. Gordon Crovitz's claim that government research begetting the internet was a "myth." While Crocker is clear that he doesn't think the internet could have been created without government funding and help, his piece is also a fascinating look into how early systems get built, regardless of who's paying for them. You can also read "father of the internet"...

Wednesday, August 1, 2012

Guest: Robert Buhler of MMI PR (updated: 10/25)

Folks,

On September 25 (updated: October 25) Rob Buhler, President of MMI Public Relations, will visit with us. He'll be sitting in on critique.

Best,

Ben