Frequent thinker, occasional writer, constant smart-arse


Google: the ultimate ontology

A big issue with the semantic web is ontologies – the use of consistent definitions for concepts. For those that don’t understand what I’m talking about – essentially, the next evolution of the web is about making content readable not just by humans but also by machines. However, for a machine to understand something it reads, it needs consistent definitions. Humans, for example, are intelligent – they understand that the word “friend” is related to the word “acquaintance”, but a computer would treat them as two different things. Or would it?

Just casually looking at some of my web analytics, I noticed some people landed on my site by doing a Google search for “how many acquaintances do people have”, which took them to a popular posting of mine about how many friends people have on Facebook. I’ve had a lot of visitors because of this posting, and it’s been an interesting case study for me on how search engines work. But today was something different: I found the word acquaintance odd. I knew I hadn’t used that word in my posting – and when I went to the Google cache I realised something interesting: because someone had linked to me using that word, the search engine had effectively swapped the word ‘friend’ for ‘acquaintances’.
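To make the mechanism concrete, here is a toy sketch of the idea – indexing inbound anchor text alongside a page’s own words so a page can rank for a term it never uses. The data and structure are made up for illustration; this is obviously not Google’s actual algorithm.

```python
# Toy illustration: index inbound anchor text alongside a page's own words,
# so a page can rank for a word it never used. Not Google's actual algorithm.

from collections import defaultdict

# Hypothetical data: what the page says, and what anchor text others link with.
page_text = {
    "my-blog/facebook-friends": "how many friends do people have on facebook",
}
inbound_anchors = {
    "my-blog/facebook-friends": ["how many acquaintances do people have"],
}

index = defaultdict(set)  # word -> pages associated with it
for url, text in page_text.items():
    for word in text.split():
        index[word].add(url)
for url, anchors in inbound_anchors.items():
    for anchor in anchors:
        for word in anchor.split():
            index[word].add(url)  # the page now "means" words it never used

print(index["acquaintances"])  # {'my-blog/facebook-friends'}
```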


Google’s linking mechanism is one powerful ontology generator.

Half the problem has been solved with time spent

On Thursday, I attended the internal launch of the Australian Entertainment & Media Outlook for 2007-2011. It was an hour packed with interesting analysis, trends, and statistics across a dozen industry segments. You can leave a comment on my blog if you are interested in purchasing the report and I’ll see if I can arrange it for you.

One valuable thing briefly mentioned was the irony of online advertising.

Some things will never change: how to create credibility

This weekend in my office with half a dozen colleagues, we toiled away on an (academic) assignment due tonight. When you spend 11 hours in one day around one table, on something that drives you mad – conversation was aplenty on things not related to what we were doing. And when there was no conversation, procrastination was aplenty, with Facebook being the prime culprit amongst all of us.

An interesting scenario happened, which made me revisit something I have long wondered about. One of the girls asked how Facebook makes money, and I went on a rant about their $200 million Microsoft deal, how they are heading towards an IPO, and other random facts I just happen to know. They all looked at me stunned, as in how could I possibly know such things, and I replied that I read a lot – I read a lot of blogs.

“…but how do you know that stuff you are reading is accurate?” – with reference to that $200 million figure, which I can’t even remember where I read. The funny thing about the question is that it’s smart and stupid at the same time. The answer seems too obvious – but it isn’t: how DO I know those facts I stated were true?

I bring this up because it is an issue I have long tried to come to grips with – what makes information credible? How do you know, when you read something on the internet, that it is reliable? The answer is we don’t. Sort of.

This “new media” world isn’t the reason we have this apparent problem: information credibility has long been an issue, first realised by the citizens of western democracies after the Great War, when they recognised newspapers could no longer be taken as fact (due to the propaganda efforts). So it’s been a problem since long before computers and hypertext were even invented – it’s just that, in an Information Age, the quality of information comes under higher scrutiny because of its abundance.

How do we know what makes something reliable? Is it some gee-whiz Google algorithm? Perhaps it’s the wisdom of the crowds? Maybe – but there is something else even more powerful, which I have to thank Scott Karp for making me realise back in the days when he was starting out as a blogger: it’s all about branding.

What makes an article in the New York Times more credible than one written in a random student newspaper rag? What makes a high-profile author more credible in what they say than a random nobody who puts their hand up in a town hall meeting? And going back to the question my colleague asked earlier – how do I know the blogs I am reading have any credibility over, say, something I read in an established newspaper such as The Economist?

Simple: branding establishes information credibility. And a brand – for any type of entity be it an individual journalist or a news organisation – is dependent on recognition by others. There could be absolutely no credibility in your information (like Wikipedia) and yet you could have a brand that by default establishes credibility – just like how people regularly cite Wikipedia as a source now, despite knowing it’s inherently uncredible.

The power of branding is that no matter how uncredible you are – your brand will be enough to make anything you say, incredible.

Thoughts on attention, advertising, and a metric to measure both: keep it simple

Advertising on the Internet is exploding. Assuming you accept my premise that the Internet will be the backbone of the world’s attention economy, I am sure you can see the urgency of developing an effective metric for measuring audiences that consume content online. Advertisers are expecting more accountability online, and there is increasing demand for an independent third party to verify results. But you can’t have accountability, and there is no value in audits, if one place measures in apples and another in bananas.

The Attention Economy is seriously lacking an effective measurement system

Ajax broke the pageview model of impressions, the billion-dollar practice of click fraud is the dirty big secret of pay-for-performance advertising, and the other major metric of unique visitors (through cookies) is proving inaccurate.

It sounds crazy, doesn’t it? The Internet has the best potential for targeted advertising, and advertisers are moving onto it in stampedes – and yet, we still can’t work out how to measure audiences effectively. Measurement is broken on the Net.

(Although I am focusing on advertising, this can be applied in other contexts. An advertising metric is simply putting a monetary value on what is really an attention metric.)

Yet when we look at the traditional media, are we being a little harsh on this new media? Is the problem with the web’s measurement systems just that the web is more accountable for its errors? After all, radio, television, and print determine their audiences through inference based on sampling methods, not by directly measuring an audience. Sampling is about making educated guesses – but a guess is still a guess.

Maybe another way of looking at it is that the old way of doing advertising is no longer effective. Although we can say pageviews are broken due to AJAX, the truth is it was always an ineffective measurement system, as it was based on the traditional media’s premise of how many viewers/subscribers theoretically and potentially could see that ad. As an example of why this is not how it should be: when people visit my blog via Google Images, they hang around for 30 seconds. People that search for the business issues I write about – like the stuff you are reading right now – spend 5+ minutes. If both are equal in terms of pageviews, but the latter actually reads the pages and the former only scans the content for an image, why are we treating them equally? My blog is half about travel and half about the business of the internet, which is why I have two very different audiences. Just because I get high pageviews from my travel content doesn’t mean I can justify higher CPMs for people that want to advertise on internet issues. Not all pageviews are the same – especially when I know the people giving me high pageviews aren’t really consuming my content.
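As a rough sketch of what weighting visits by time spent might look like – the threshold and visit data below are purely illustrative, not a proposed standard:

```python
# Rough sketch: valuing visits by time spent rather than raw pageviews.
# The cut-off and the visit data are hypothetical, for illustration only.

ENGAGED_SECONDS = 120  # assumed threshold for an "engaged" visit

visits = [
    {"source": "google-images search", "seconds": 30},
    {"source": "web-business search", "seconds": 320},
    {"source": "google-images search", "seconds": 25},
]

pageviews = len(visits)
engaged = sum(1 for v in visits if v["seconds"] >= ENGAGED_SECONDS)

print(f"{pageviews} pageviews, but only {engaged} engaged visit(s)")
```

Under a raw pageview metric all three visits count the same; weighted by attention, only one of them is worth charging a premium CPM for.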

Another issue is that advertisers are so caught up on who can create the most entertaining 30-second ad that the creativity to keep people entertained has overtaken the reason advertising happens in the first place: to make sales. The way you do that is by communicating your product to the people who would want to buy it. If I placed advertising on this blog from people who want to do web-business related stuff, they should only pay for the people that read my blog postings on the Attention Economy for 5+ minutes, not for the Google Images searchers who are looking for porn (my top keywords, and how people find my blog, make me laugh out loud sometimes!).

When we create a metric that measures attention, let’s be sure of one thing: the old way is broken, and the new ways will continue to be broken if we simply copy and paste the old ways. New ways like click-through ads that appear on search results – which account for 40% of internet advertising – are not how advertising should be measured. The reason is that they put the burden of an effective advertising campaign on the publisher. Why should a publisher not get paid – bearing the opportunity cost of not running another ad that would have paid – because of the ineffectiveness of the advertiser’s targeting strategy?

When measuring audience attention, let’s not overcomplicate it. It should purely measure whether someone saw it. As an advertiser, I should be able to determine which people from which demographic can see my ad – and yes, I will pay a premium for that targeting. Whether it turns into a sale, or whether they enjoyed the content, is where your complex web analytics packages come in. But for a simple global measurement system, let’s keep it simple.

Concluding thought

If I stood at the toll booths of the Sydney Harbour Bridge naked, some people would honk at me and others wouldn’t. If I can guarantee that they can see me naked, that’s all I need to do as a publisher. It’s the advertiser’s problem whether people honk at me or not. (Not enough honks means that, as a model, I should still get my wage. They just need to hire a better looking model next time!)

Privacy – just like inflation

Privacy is a massive business issue. I’ve commented on the lack of interest in privacy from entrepreneurs in the web space; I’ve tried to define privacy; and I joined the APML workgroup for this reason.

Need to know why I think it matters? Well here are three facts:

1) Targeted advertising is the future of advertising. Why? Because it’s the most effective type of advertising.

2) Web services, and arguably the entire attention economy, rely on advertising as a revenue model.

3) There is a natural friction between targeted advertising and privacy. You can’t target without knowing who you are targeting – which implies some type of implicit collection of data.

Google, on the strength of its brand, has been able to manage the privacy issue. But no longer. Privacy International has ranked Google as the worst privacy offender on the internet. As 99% of Google’s revenue relies on advertising, and it openly acknowledges it is trying to find ways of better targeting that advertising, we can expect to hear more and more that Google’s evil is in the data it collects and the way it controls it.

Economic growth is one of the key concepts of how our world works – it’s what companies and countries, for example, constantly aim for. But as we have seen repeatedly, if an economy grows too quickly, problems appear – inflation, infrastructure issues, and fatigue. Greed has a price. In the context of an economy, inflation is the speed-hump – the faster you drive over it, the bigger the hit.

So would it be too far to extend the metaphor and say that privacy is the advertising equivalent of inflation? If you are relying on advertising as a revenue model, remember that privacy will matter more and more in an interconnected world.

The attention economy needs a consistent base

Okay, enough navel gazing. The journalist in me (by experience), the accountant in me (by education), and the businessman in me (by occupation) are going to synthesise my understanding of the world and propose a new metric for the attention economy. I don’t know the answer yet, but I am going to use this blog to develop my thinking. I can’t promise a solution; however, I am sure that breaking the issue down into the key requirements, assumptions, and needs of this magical metric will add value somewhere for someone.

So let’s start with the most important assumption of all: what are we measuring? As Herbert Simon coined it, and smart guys like Umair, Scott and Chris have extended (at least for my conceptual understanding), it is called the attention economy. It is important to note, however, that the attention economy is an aspect of the Information Sector (see below). And as I described in a previous posting, the attention economy needs a metric for two reasons: monetisation and feedback.


What constitutes the attention economy?
Well, this is a bit like a related problem I had when I first came to grips with what new media was. A few years back, I did some active research trying to understand how a book, a television, a newspaper, and a search engine could all somehow be classed as “media”. I found my question answered by Vin Crosbie’s manifesto (read this for a recent summary). Take note of what he considers the key element of new media (the technology aspect).

I am going to propose one of my key assumptions of the future, which will answer this question. It might not happen for another 5, 10 or even 20 years – but I am convinced this is the future. The Internet will act as infrastructure.

I believe the unifying aspect, and the backbone of the attention economy, will be the Internet. All enterprise software, all consumer software, all (distributed) entertainment, all (distributed) communications and all information will be delivered digitally over the Internet. I think the people at the US Census Bureau have conceptually already worked this out by defining the information sector of the economy, which classes all of the above and more into this one diverse category. The Internet is the enabler of the Information Age, just like the production line was for the Industrial Age.

I’m not saying we are going to live, sleep, and eat on computers in the future. However, just think – anything that runs on electricity can connect to the Internet. And look at the technologies being developed that let the Internet live beyond the computer screen, like electronic paper and dynamic interfaces. Even more powerful is the fact that the Internet has brought entire industries to their knees – like the newspaper and music industries – because it provides a more efficient way of delivering content. If it’s information, communication or entertainment related, then it probably works better in digital format, over the Internet. (Excluding, of course, things like theme parks, which are about physical entertainment and not distributed entertainment like a television programme.)

I think it is important to recognise that the Internet will be the backbone of the attention economy. By being the core back-end, it means that no matter the output device – whether it is a mobile phone, a computer, or a television – it will provide a consistent delivery mechanism for digital information. For a measurement system to work, it needs to be consistent. The Internet infrastructure will be that consistency. If you can recognise that, then that is a big step towards solving the issue.

Tangler

This is the second post in a series – wizards of oz – which is to highlight the innovation we have down under, and how the business community needs to wake up and realise the opportunities. I review Tangler, a Sydney-based start-up that has recently released their application to the world as a public beta.

Tangler is a web-service that enables discussions over a network. Think of discussions with the immediacy of Instant Messaging (it’s easy), but with the persistency of a forum (messages are permanently stored). Discussions are arranged into communities of interest (groups), which are further broken down into topic areas. Click here to see a video overview.
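A minimal sketch of the structure just described – groups containing topics containing persistently stored messages, appended to in real time. The names and fields below are hypothetical; this is not Tangler’s actual data model, just an illustration of the shape of it.

```python
# Minimal sketch: groups -> topics -> messages, persisted like a forum
# but appended to in real time like IM. Hypothetical names and fields only.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Message:
    author: str
    body: str
    posted_at: datetime = field(default_factory=datetime.utcnow)

@dataclass
class Topic:
    title: str
    messages: list = field(default_factory=list)  # permanently stored

    def post(self, author: str, body: str) -> Message:
        msg = Message(author, body)
        self.messages.append(msg)  # appears immediately, persists forever
        return msg

@dataclass
class Group:
    name: str  # a community of interest
    topics: list = field(default_factory=list)

web_business = Group("Web business", [Topic("What do you think of Elias Bizannes?")])
web_business.topics[0].post("elias", "Be nice.")
```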

Value

1) It’s a network application. Although it’s got a great design and looks like a funky website, the real power of this web service is what it’s working towards: discussions over a network. Imagine a little widget with the topic “What do you think of Elias Bizannes?” placed on my (external) personal blog, my internal work blog, my myspace/facebook/social networking page, as well as its own dedicated forum on the Tangler site. A centralised discussion, in a decentralised manner. That’s big.

2) Its community has great DNA. Communities are not easy things to build – my own experience on a getting-bigger-by-the-day internal project has shown that it is a complex science, touching everything from understanding motivational theory to encouraging the right kind of behaviours (policing without policing). My usage of the site has shown me that the active community building currently occurring is on the right track. Anyone can hire a code monkey, whack on some flashy front-end, and say they have a great product. But not anyone can build a strong community – even Google struggles with this (the acquisition of YouTube happened largely because the YouTube community beat Google’s own service). Tangler’s community is already turning into a powerful asset – the DNA is there – now it just needs exposure, and the law of cumulative advantage will kick in.

3) The founder and staff are responsive to their community. I posted a question on the feedback forum to prove this point: I got a response in an hour, on a Saturday. The staff at Tangler are super responsive – partly due to the real-time discussion ability of the software, but also because of their commitment. As I state above, the value of Tangler is the community of users it builds – this type of responsiveness is crucial to keep users coming back, because it makes them feel valued. Additionally, the community is driving the evolution of the application, and that’s the most powerful way to create something (adapting to where there is a need by the people that use it).

4) It’s a platform. What makes Tangler powerful is that it encourages discussions around niche content areas. Make that niche content created for free. Low cost to produce + highly targeted content = an advertiser’s dream. Link it with a distributed network across the entire Internet (see 1 above), and you’ve got something special.

Conclusion

Social networks, which is what Tangler is, are characterised by:
1) the existence of a repository of user-generated content and
2) the need of members to communicate.

Tangler’s user-generated content and communications web make it an interesting fit for both media conglomerates and telecommunication companies (for different reasons). I see a Tangler acquisition as a no-brainer for the big Telcos. Integrating a social network like Tangler into Telstra builds on the synergy between the communication needs of social network users and the communications expertise and service infrastructure of the telecommunication companies. With voice calls now a commodity, the Telcos need to take advantage of their network infrastructure and accommodate text-based discussions, which can be monetised (with advertising) for as long as the content exists.

The challenge for Tangler, however – as with any other Internet property – is that the scale of a social network’s audience determines the nature of its relationship with a communications company. Micro-sized social networks are not interesting to communication companies. Massive social networks are, but history has shown they would rather be partners than be acquired. To be attractive to the big end of town, Tangler needs to show a scale large enough to grow as a business, but not so large that it dictates the terms of the deal.

My observations lead me to think that they will be a hit once they open up their application to external developers, which will relieve the development bottleneck faced by their resource- and time-constrained team. However, they shouldn’t rush this, as I still think their performance issues are not completely ironed out yet. An open API would be taken up by their enthusiastic, technologically orientated community. Not to mention the strong relationships the CEO and CMO have forged with the local web entrepreneurial and development community in Australia.

My boss is currently doing a secondment as acting Finance Director at Sensis, Telstra’s media arm. Maybe I need to organise a catch-up with him, before these guys get snatched up by some US conglomerate!

Faraday Media – Particls

This series of blog posts – wizards of oz – is to highlight the innovation we have down under. So I begin with Faraday Media, a Brisbane-based start-up that launched their keynote product today.

Particls is an engine that learns what you are interested in, and alerts you when content on the internet becomes available – through a desktop ‘ticker’ or pop-up alerts.

Value
1) It’s targeted. Particls is an attention engine – it learns what you want to read, and then goes and finds relevant information. That’s a powerful tool for those of us drowning in information overload, who don’t have time to read.

2) It catches your attention. Particls is based on the concept of ‘alerts’ – information trickles across your screen seamlessly as you do your work, like a news ticker. For the things that matter, an alert will pop up. The way you deal with information overload is not by shutting yourself out – it’s by adjusting the volume on the things you value more than others.

3) The founders understand privacy. They started the APML standard – a workgroup I joined because it’s the best attempt I have seen yet at tackling the issue of privacy on the internet. For example, I can see what the Particls attention engine uses to determine my preferences – lists of people and subjects with “relevance scores” (see the sketch after this list). And better yet – it’s stored on my hard disk.

4) It’s simple. RSS is a huge innovation on the web that only a minority of internet users understand. The problem with RSS (Really Simple Syndication) is that it’s not simple. Particls makes it dead simple to add RSS feeds and track that content.
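Regarding point 3 above: here is a purely hypothetical sketch of what a locally stored attention profile could look like – subjects and sources with relevance scores kept on your own disk. It illustrates the idea only; it is not the actual APML schema or Particls’ implementation.

```python
# Hypothetical sketch of a locally stored attention profile: subjects and
# sources with relevance scores. Not the real APML format or Particls' code.

import json
from pathlib import Path

profile = {
    "concepts": [
        {"key": "attention economy", "relevance": 0.92},
        {"key": "web 2.0", "relevance": 0.71},
        {"key": "cricket", "relevance": 0.10},
    ],
    "sources": [
        {"key": "example-blog.com", "relevance": 0.65},
    ],
}

# Stored on the user's own disk rather than on a remote server.
Path("attention_profile.json").write_text(json.dumps(profile, indent=2))

def score(item_keywords):
    """Rank an incoming item against the profile's relevance scores."""
    scores = {c["key"]: c["relevance"] for c in profile["concepts"]}
    return sum(scores.get(k, 0.0) for k in item_keywords)

print(score(["attention economy", "cricket"]))  # roughly 1.02
```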

Conclusion

Why the hell doesn’t Fairfax acquire the start-up, rather than wasting time creating yet another publication (incidentally in the same city) that we don’t have time to read? In my usage of the product, I have been introduced to content that I am interested in and that I never would have realised existed on the web. In my trials, I have mainly used it to keep track of my research interests, and despite my skepticism about how ‘good’ the attention engine is, it has absolutely blown me away.

And it’s not just in the consumer space – a colleague (who happens to hold a lot of influence in enterprise architecture of our 140,000 person firm) was blasting RSS one day on an internal blog – saying how we don’t yet have the technology to ‘filter’ information. I told him about Particls – he’s now in love. If a guy like him, who shapes IT strategy for a $20 billion consulting firm, can get that excited – that’s got to tell you something.

New measurement systems need a purpose

Chris recently proposed a new measurement system for attention, after yet another call to arms for a new way of measuring metrics. This is a hard issue to gnaw at, because it attempts to grapple with the emerging business models of a new economy, which we are still at the crossroads of. Chris asked us on the APML workgroup what we thought of his proposal, which is interesting, but I thought it might be better to take a step back on this one and look at the bigger picture. Issues this big need to be conceptually clear before you can break into the details.

Television, radio, and newspapers are the cornerstone of what we regard as the mainstream media. For decades, they have ruled the media business – with their 30-second advertising spots and their “pageviews” (circulation). Before the Information Age, they were what the ‘attention economy’ was. None of those flamin’ blogs stealing our attention: content and advertising flowed through to us from one place.

The internet is enabling an entirely new Age of humanity. A lot of the age-old business models have been replicated online, because we don’t know any better, but people are abandoning them because they are realising they can now do so much more. So the key here is not to get too excited about what you can do – rather, we need to think about why we need to do what we do.

Let me explain – advertisers sold their product through a TV/radio commercial or a newspaper page because it guaranteed them that a certain number of people would see it. Advertisers advertise because they want to do one thing: make money. It’s just how capitalism works – profit is god – so do what you can to make a higher profit.

But back then, the traditional mainstream media was the only way they could reach audiences on an effective scale. Yet advertising on the Sunday night movie is the equivalent of dropping a million pamphlets out of a plane, hoping that the five customers you know would buy your product end up catching one. Back then, no one complained – it was the best we could do. It sucked, but we didn’t know any better.

The internet changed that.

Advertisers can now target their advertising to a specific individual. They don’t care anymore about advertising on a mass scale; what they would rather do is advertise on a micro scale. Spending $20,000 on 10,000 people you know want to buy your product has a much better return on investment than $2,000,000 on 1,000,000 people – of which 10% don’t speak the language of your ad, 20% aren’t the target group for your ad, and 30% are probably offended by your ad and will ruin it for the 40% you were targeting in the first place.
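Using the hypothetical numbers above, the back-of-envelope arithmetic looks like this:

```python
# Back-of-envelope comparison using the hypothetical numbers above.

targeted_cost, targeted_reach = 20_000, 10_000    # everyone reached is a prospect
mass_cost, mass_reach = 2_000_000, 1_000_000
relevant_share = 0.40                             # only 40% were the target group

cost_per_prospect_targeted = targeted_cost / targeted_reach
cost_per_prospect_mass = mass_cost / (mass_reach * relevant_share)

print(cost_per_prospect_targeted)  # $2 per genuine prospect
print(cost_per_prospect_mass)      # $5 per genuine prospect, before any offence caused
```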

Sound crazy? Well, Google making $10 billion doing just that is crazy.

So now that we have cleared that up, let’s get back to the issue. We now know one of the reasons we need measurement: advertisers want to target their advertising better. Are there any other reasons? Sure – sometimes people want to measure what their audience reads for non-monetary reasons; they could just be trying to find out what their readers are interested in, so they can focus on that content. Statistics like that are not narcissism – it’s just being responsive to an audience. Or then again, it could be pure ego.

So when it comes to measuring content, there are two reasons anyone cares: to make money, or to see how people react to your content. It’s the first type that is causing us problems here. And that’s because how long someone spends on your content, or how many people view your content, is no longer as relevant as it was in the mass media days. What is relevant is WHO is reading your content.

I don’t think you can have a discussion about new ways of measuring how content is consumed without separating those two different motives for measurement. I like Chris’s proposal – knowing how long someone spends reading my blog posting is something I would find interesting as a blogger. But that’s pure ego – I just want to know if I have a readership of deep thinkers or random Google visitors who were looking for a picture of short skirts. (As an aside – one of my pictures is the number one Google image result for “women in short skirts” – thank God it goes to my Flickr account now; the bandwidth that used to eat up was crazy!)

So before we come up with new measurement systems, let’s spend more time determining why we are measuring. Simply saying we should better measure what consumers are giving their attention to is only part of the problem. We first need to determine what value we obtain from measuring that attention in the first place.
