Filed under: Analytics, Data, Datarati, Research, Statistics | Tags: Datarati, Google Chief Economist, Hal Varian, McKinsey Quarterly, Statisticians
Hal Varian, professor of information sciences, business, and economics at the University of California at Berkeley and Google’s Chief Economist…
“I keep saying the sexy job in the next ten years will be statisticians.
People think I’m joking, but who would’ve guessed that computer engineers would’ve been the sexy job of the 1990s?”
Varian then goes on to say:
The ability to take data—to be able to understand it, to process it, to extract value from it, to visualise it, to communicate it—that’s going to be a hugely important skill in the next decade, not only at the professional level but even at the educational level for elementary school kids, for high school kids, for college kids.
Because now we really do have essentially free and ubiquitous data. So the complementary scarce factor is the ability to understand that data and extract value from it.
I think statisticians are part of it, but it’s just a part. You also want to be able to visualise the data, communicate the data, and utilise it effectively.
But I do think those skills—of being able to access, understand, and communicate the insights you get from data analysis—are going to be extremely important.
Managers need to be able to access and understand the data themselves.
Filed under: Data, Research | Tags: Amazon Web Services, AWS, Cloud Computing
When will Australian data be available in the cloud on Amazon Web Services (AWS)?
Public Data Sets on AWS provides a centralised repository of public data sets that can be seamlessly integrated into AWS cloud-based applications. AWS is hosting the public data sets at no charge for the community, and like all AWS services, users pay only for the compute and storage they use for their own applications.
An initial list of data sets is already available, and more will be added soon.
Previously, large data sets such as the mapping of the Human Genome and the US Census data required hours or days to locate, download, customize, and analyze. Now, anyone can access these data sets from their Amazon Elastic Compute Cloud (Amazon EC2) instances and start computing on the data within minutes.
Users can also leverage the entire AWS ecosystem and easily collaborate with other AWS users. For example, users can produce or use prebuilt server images with tools and applications to analyze the data sets.
By hosting this important and useful data with cost-efficient services such as Amazon EC2, AWS hopes to provide researchers across a variety of disciplines and industries with tools to enable more innovation, more quickly.
Filed under: Search | Tags: Deep Peep, Deep Web, Google, Prof. Juliana Freire
Prof. Juliana Freire at the University of Utah is working on an ambitious project called DeepPeep (www.deeppeep.org) that eventually aims to crawl and index every database on the public Web. Extracting the contents of so many far-flung data sets requires a sophisticated kind of computational guessing game.
“The naïve way would be to query all the words in the dictionary,” Ms. Freire said. Instead, DeepPeep starts by posing a small number of sample queries, “so we can then use that to build up our understanding of the databases and choose which words to search.”
Based on that analysis, the program then fires off automated search terms in an effort to dislodge as much data as possible. Ms. Freire claims that her approach retrieves better than 90 percent of the content stored in any given database. Ms. Freire’s work has recently attracted overtures from one of the major search engine companies.
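The article only gives the intuition behind DeepPeep's probing strategy, but it can be sketched in a few lines. Here is a toy reconstruction in Python; the mock "database", seed terms, and frequency-based term-ranking heuristic are all my assumptions for illustration, and the real DeepPeep is far more sophisticated:

```python
from collections import Counter

# Toy stand-in for a deep-web database: documents reachable only via
# a keyword search form (contents entirely made up for illustration).
DOCS = [
    "census data for utah counties",
    "population statistics and census tables",
    "genome sequence data archive",
    "utah climate statistics archive",
    "public records and population data",
]

def query(term):
    """Simulate the search form: return the documents containing the term."""
    return {d for d in DOCS if term in d.split()}

def probe_and_crawl(seeds, budget):
    """Issue a few seed queries, then pick follow-up terms that occur
    frequently in the results retrieved so far, on the theory that a
    term common in the sample is common in the whole database."""
    used, retrieved = set(), set()
    for term in seeds[:budget]:
        used.add(term)
        retrieved |= query(term)
    while len(used) < budget:
        counts = Counter(
            w for d in sorted(retrieved) for w in d.split() if w not in used
        )
        if not counts:
            break
        term = counts.most_common(1)[0][0]  # most promising unused term
        used.add(term)
        retrieved |= query(term)
    return retrieved
```

With a budget of four queries seeded only with "census", this toy crawler recovers four of the five documents; the naive alternative Ms. Freire mentions would have to query every word in the dictionary.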
As the major search engines start to experiment with incorporating Deep Web content into their search results, they must figure out how to present different kinds of data without overcomplicating their pages. This poses a particular quandary for Google, which has long resisted the temptation to make significant changes to its tried-and-true search results format.
Filed under: Behavioural Targeting | Tags: Bluekai, Marketing as a Service
I came across this company today. They essentially provide you with Marketing as a Service.
Much like with commercials on television, online consumers like you are familiar with receiving messages from marketers in exchange for free or subsidised content across the Internet. While BlueKai has not come up with a solution to eliminate advertising altogether, they’ve created an anonymous registry of online preferences that helps you manage and control what marketers know about you.
In return, you, the consumer, are rewarded with the 3C’s: control, charity, and content.
Control—With the BlueKai registry, you can control and manage your online preferences by selecting or de-selecting topics of interest. Your preferences may be used anonymously to influence which types of marketing messages you receive across partner sites that we work with. Or you can choose to not participate at all. (But we encourage you to read on before you decide!)
Charity—It gets better! When marketers pay to access anonymous data from BlueKai, you will be rewarded with a credit to donate to the charity of your choice.
Content—By voluntarily sharing your online preferences, you’re helping marketers provide polite and relevant marketing to you, while they continue to pay the publishers who manage the Web sites you frequent. In return, you will continue to reap the benefits of free content that is available across the Internet.
Check it out: http://www.bluekai.com
Filed under: Datarati, Twitter, Web Analytics | Tags: Data Insertion API, Omniture API, Twitter API
This is a very, very cool integration. Although still a prototype at this stage, if you are using Omniture in your organisation, you can track consumer chatter online within SiteCatalyst.
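The prototype's code isn't public, but the shape of the idea is simple: pull recent tweets from the Twitter API, then post each one to Omniture's Data Insertion API as an XML hit so it appears in SiteCatalyst. A minimal sketch of building such a payload follows; the element names (`reportSuiteID`, `prop1`, and so on) and the collection endpoint are my assumptions from the Data Insertion documentation of the era, so verify them against your own report suite before relying on this:

```python
import xml.etree.ElementTree as ET

def tweet_to_insertion_xml(tweet_text, author, report_suite="myrsid"):
    """Build an XML hit for Omniture's Data Insertion API.

    Element names here are assumptions based on the public Data
    Insertion documentation; check your own suite's configuration.
    """
    req = ET.Element("request")
    ET.SubElement(req, "sc_xml_ver").text = "1.0"
    ET.SubElement(req, "reportSuiteID").text = report_suite
    ET.SubElement(req, "pageName").text = "twitter:chatter"
    ET.SubElement(req, "prop1").text = author      # tweet author in a traffic prop
    ET.SubElement(req, "prop2").text = tweet_text  # tweet body in another prop
    return ET.tostring(req, encoding="unicode")

# The resulting XML would be POSTed to your data collection server
# (the exact endpoint depends on your Omniture namespace).
```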
Filed under: Live Chat, Web Analytics | Tags: Google Analytics, Live Chat, Live Person
LivePerson allows you to interact with your customers live via chat, offering real-time assistance and advice while a user is engaged with your website. It is a highly valuable service for helping customers, and also for learning about their intent.
They created a robust integration with Google Analytics that is definitely worth a look. The integration shows you conversions after chat, and populates chat analytics data into a number of other reports, including:
- Map Overlay Report: Displays volume and quality metrics of live chat interactions by geographic region
- Search Engines Report: Reveals the sources and keywords that drive interactive chats (and resulting conversions)
- Reverse Goal Path Report: Lists the navigation paths that lead to the most chats
Take a look at this screenshot, which shows a “Live Chat” column:
Filed under: Call Centre Data, Web Analytics | Tags: Google Analytics, Mongoose Metrics
Mongoose Metrics supplies toll-free numbers in bulk at very low cost, in case you’d like to use unique phone numbers to track campaigns or even individual product orders.
Even better though, Mongoose has created a way to track these offline phone calls within Google Analytics.
You can see the phone number called, as well as the duration and date of the call. You can also integrate this with an email alerts system for sales management.
The service works by provisioning a trackable number and assigning it to a hidden page on your site that contains your Google Analytics tracking code.
When a call to the tracking number connects, the technology places an automated browser visit to that hidden page, inserting the phone call as an event in your Google Analytics account.
Each phone call generates a unique visit which is clearly labeled inside of Analytics, as you can see in this screenshot:
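To make the call-to-pageview mechanism concrete, here is a minimal sketch in Python; the hidden page path and the query parameter names are hypothetical, since Mongoose's actual implementation is not public:

```python
from urllib.parse import urlencode

def hidden_page_url(base_url, call):
    """Turn a completed call's metadata into the URL of the hidden
    tracking page. Parameter names are hypothetical; the idea is that
    the hidden page's Google Analytics tag records them with the visit."""
    params = {
        "number": call["number"],      # tracking number that was dialled
        "duration": call["duration"],  # call length in seconds
        "date": call["date"],
    }
    return base_url + "?" + urlencode(params)

# A real deployment would load this URL in a (headless) browser so the
# page's Google Analytics JavaScript actually fires; a bare HTTP GET
# would fetch the page without executing the tag.
call = {"number": "1800-555-0100", "duration": 95, "date": "2009-06-01"}
print(hidden_page_url("http://example.com/ga-call-track.html", call))
```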