Source: http://www.getelastic.com/responsive-design-vs-native-monetization/
Source: http://www.getelastic.com/13-ways-to-use-vine-for-ecommerce-or-instagram-video/
Having worked in the SEO industry for quite a few years now, I've heard myself uttering the same words over and over again: "SEO is a team game." Clients of mine understand that SEO is truly something that requires effort on both sides, and potential clients hear it all the time. Unlike many other industries, [...]
The post Four Schools of SEO Success Thought appeared first on Search Engine Journal.
Source: http://feedproxy.google.com/~r/SearchEngineJournal/~3/cX2CJgHNwPk/
In our infographic about the sausage factory that is online journalism, we had a throwaway line about how companies were partnering with FindTheBest to auto-generate subdomains full of recycled content. Apparently, a person named Brandon who claims to work for FindTheBest didn't think our information was accurate:
Hi Aaron,
My name is Brandon. I have been with FindTheBest since 2010 (right after our launch), and I am really bummed you posted this Infographic without reaching out to our team. We don't scrape data. We have a 40 person+ product team that works very closely with manufacturers, companies, and professionals to create useful information in a free and fair playing field. We some times use whole government databases, but it takes hundreds-of-thousands of hours to produce this content. We have a product manager that owns up to all the content in their vertical and takes the creation and maintenance very seriously. If you have any questions for them about how a piece of content was created, you should go to our team page and shoot them a email. Users can edit almost any listing, and we spend a ton of time approving or rejecting those edits. We do work with large publishers (something I am really proud of), but we certainly do not publish the same exact content. We allow the publishers to customize and edit the data presentation (look, style, feel) but since the majority of the content we produce is the factual data, it probably does look a little similar. Should we change the data? Should we not share our awesome content with as many users as possible? Not sure I can trust the rest of your "facts", but great graphics!
I thought it was only fair that we aired his view on the main blog.
...but then that got me into doing a bit of research about FindTheBest...
In the past, when searching for an issue related to our TV, I saw a SERP that looked like this:
Those mashed sites were subdomains on trusted sites like VentureBeat & TechCrunch.
Graphically the comparison pages appear appealing, but how strong is the editorial?
How does FindTheBest describe their offering?
In a VentureBeat post (a FindTheBest content syndication partner), FTB's CEO Kevin O'Connor was quoted as saying: "'Human' is dirty - it's not scalable."
Hmm. Is that a counter view to the above claimed 40 person editorial research team? Let's dig in.
Looking at the top listed categories on the homepage of FindTheBest, I counted 497 different verticals. So at 40 people on the editorial team, that would mean each person manages a dozen different verticals (if one doesn't count all the outreach and partnership building as part of editorial & one ignores the parallel sites for death records, grave locations, find the coupons, find the company & find the listing).
Google shows that they have indexed 35,000,000 pages from FindTheBest.com, so this would mean each employee has "curated" about 800,000 pages (which is at least 200,000 pages a year over the past 4 years). Assuming they work 200 days a year that means they ensure curation of at least 1,000 "high quality" pages per day (and this is just the stuff in Google's index on the main site...not including the stuff that is yet to be indexed, stuff indexed on 3rd party websites, or stuff indexed on FindTheCompanies.com, FindTheCoupons.com, FindTheListing, FindTheBest.es, FindTheBest.or.kr, or the death records or grave location sites).
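The back-of-envelope math above can be checked in a few lines of Python. All figures are the post's own estimates (indexed page count, team size, workdays), not verified numbers:

```python
# Back-of-envelope curation math using the figures cited above.
indexed_pages = 35_000_000      # pages Google reports indexed on FindTheBest.com
team_size = 40                  # claimed editorial/product team size
years_live = 4                  # roughly, since the 2010 launch
workdays_per_year = 200         # generous assumption

pages_per_person = indexed_pages / team_size
pages_per_person_per_day = pages_per_person / (years_live * workdays_per_year)

print(f"{pages_per_person:,.0f} pages per person overall")
print(f"{pages_per_person_per_day:,.0f} 'curated' pages per person per day")
```

That works out to 875,000 pages per person, or roughly 1,100 pages per person per day.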
Maybe I am still wrong to consider it a bulk scrape job. After all, it is not unreasonable to expect that a single person can edit 5,000 pages of high quality content daily.
Errr....then again...how many pages can you edit in a day?
Where they lost me, though, was with the "facts" angle. Speaking of not trusting the rest of the "facts"... how crappy is the business information for SEO Book on FindTheBest, which claims that our site launched in 2011, we have $58,000 in sales, and we are a book wholesaler?
I realize I am afforded the opportunity to work for free to fix the errors of the scrape job, but if a page is full of automated incorrect trash then maybe it shouldn't exist in the first place.
I am not saying that all pages on these sites are trash (some may be genuinely helpful), but I know if I automated content to the extent FTB does & then mass email other sites for syndication partnerships on the duplicate content (often full of incorrect information) that Google would have burned it to the ground already. They likely benefit from their CEO having sold DoubleClick to Google in the past & are exempt from the guidelines & editorial discrimination that the independent webmaster must deal with.
One of the ways you can tell if a company really cares about their product is by seeing if they dogfood it themselves.
Out of curiosity, I looked up FindTheBest on their FindTheCompany site.
They double-list themselves and neither profile is filled out.
That is like having 2 sentences of text on your "about us" page surrounded by 3 AdSense blocks. :D
I think they should worry about fixing the grotesque errors before worrying about "sharing with as many people as possible" but maybe I am just old fashioned.
Certainly they took a different approach ... one that I am sure would get me burned if I tried it. An example sampling of some partner sites...
we have seen search results where a search engine didn't robots.txt something out, or somebody takes a cookie cutter affiliate feed, they just warm it up and slap it out, there is no value add, there is no original content there and they say search results or some comparison shopping sites don't put a lot of work into making it a useful site. They don't add value. - Matt Cutts
That syndication partnership network also explains part of how FTB is able to get so many pages indexed by Google, as each of those syndication sources is linking back at FTB on (what I believe to be) every single page of the subdomains, and many of these subdomains are linked to from sitewide sidebar or footer links on the PR7 & PR8 tech blogs.
And so the PageRank shall flow ;)
Hundreds of thousands of hours (eg 200,000+) for 40 people is 5,000 hours per person. Considering that there are an average of 2,000 hours per work year, this would imply each employee spent 2.5 full years of work on this single aspect of the job. And that is if one ignores the (hundreds of?) millions of content pages on other sites.
How does TechCrunch describe the FTB partnership?
Here?s one reason to be excited: In its own small way, it combats the recent flood of crappy infographics. Most TechCrunch writers hate the infographics that show up in our inboxes? not because infographics have to be terrible, but because they?re often created by firms that are biased, have little expertise in the subject of the infographic, or both, so they pull random data from random sources to make their point.
Get that, folks? TechCrunch hosting automated subdomains of syndicated content means fewer bad infographics. And more cat lives saved. Or something like that.
How does FTB describe this opportunity for publishers?
The gadget comparisons we built for TechCrunch are sticky and interactive resources comprised of thousands of SEO optimized pages. They help over 1 million visitors per month make informed decisions by providing accurate, clear and useful data.
SEO optimized pages? Hmm.
Your comparisons will include thousands of long-tail keywords and question/answer pages to ensure traffic is driven by a number of different search queries. Our proprietary Data Content Platform uses a mesh linking structure that maximizes the amount of pages indexed by search engines. Each month?mainly through organic search?our comparisons add millions of unique visitors to our partner?s websites.
Thousands of long-tail keyword & QnA pages? Mesh linking structure? Hmm.
If we expand the "view more" section at the footer of the page, what do we find?
Holy Batman.
Sorry that font is so small; the text needed to be reduced multiple times to fit on my extra large monitor, and then reduced again to fit the width of our blog.
Each listing in a comparison has a number of associated questions created around the data we collect.
For example, we collect data on the battery life of the Apple iPad.
An algorithm creates the question "How long does the Apple iPad tablet battery last?" and answers it.
So now we have bots asking themselves questions that they answer themselves & then stuffing that in the index as content?
Yeah, sounds like human-driven editorial.
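To be clear about what's being described: generating Q&A pages like this needs little more than string templates applied over structured records. A minimal illustrative sketch (the template and field names below are invented for illustration, not FTB's actual system):

```python
# Template-driven question/answer generation over a structured data record.
def generate_qna(record, templates):
    """Expand one product record into (question, answer) pairs."""
    pairs = []
    for field, template in templates.items():
        if field in record:
            question = template.format(**record)
            answer = (f"The {record['maker']} {record['product']} "
                      f"{field.replace('_', ' ')} is {record[field]}.")
            pairs.append((question, answer))
    return pairs

# Hypothetical template keyed on a data field.
templates = {
    "battery_life": "How long does the {maker} {product} {category} battery last?",
}
record = {"maker": "Apple", "product": "iPad", "category": "tablet",
          "battery_life": "10 hours"}
print(generate_qna(record, templates))
```

If a record is missing a field the template expects, the raw placeholder token leaks straight into the page, which is consistent with what shows up next.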
After all, it's not like there are placeholder tokens on the auto-generated stuff
{parent_field}
Ooops.
Looks like I was wrong on that.
And automated "popular searches" pages? Nice!
As outrageous as the above is, they include undisclosed affiliate links in the content, and provided badge-based "awards" for things like the best casual dating sites, to help build links into their site.
That in turn led to them getting a bunch of porn backlinks.
If you submit an article to an article directory and someone else picks it up & posts it to a sketchy site you are a link spammer responsible for the actions of a third party.
But if you rate the best casual dating sites and get spammy porn links you are wonderful.
Content farming never really goes away. It only becomes more corporate.
Source: http://www.seobook.com/scalable-seo
Posted by Dr-Pete
If you follow our MozCast Google "weather" tracker, you may have noticed something unusual this morning – a record algorithm flux temperature of 113.3°F (the previous high was 102.2°, set on December 13, 2012). While the weather has been a bit stormy off and on since Penguin 2.0 and the announcement of 10-day rolling Panda updates, this one was still off the charts:
I’m usually cautious about over-interpreting any single day's data – measuring algorithm change is a very difficult and noisy task. Given the unprecedented scope, though, and reports coming in of major ranking shake-ups in some verticals, I've decided to post an early analysis. Please understand that the Google algorithm is incredibly dynamic, and we’ll know more over the next few days.
Some industry verticals are naturally more volatile than others, but here's a breakdown of the major categories we track, ordered by largest percentage change over the 7-day average. The temperature for June 25th, along with the 7-day average for each category, is shown in parentheses:
Every vertical we track showed a solid temperature spike, but “Home & Garden” led the way with a massive 51° difference between the single-day temperature and its 7-day average.
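The comparison being made here is simple: one day's flux temperature against the trailing 7-day average for the same category. A sketch of that calculation (the readings below are invented for illustration, not MozCast's actual numbers):

```python
# Single-day flux temperature vs. trailing 7-day average, per category.
def flux_spike(today, last_seven):
    """Return (absolute difference, percentage change vs. the 7-day average)."""
    avg = sum(last_seven) / len(last_seven)
    return today - avg, (today - avg) / avg * 100.0

# Hypothetical "Home & Garden" readings producing a 51-degree spike.
diff, pct = flux_spike(121.0, [70.0] * 7)
print(f"{diff:.0f} degrees above the 7-day average ({pct:.0f}%)")
```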
There are so many reasons that a query can change that looking at individual cases is often a one-way ticket to insanity, but that doesn’t seem to stop me from riding the train. Just to illustrate the point, the query “gay rights” showed a massive temperature of 250°F. Of course, if you know about the Supreme Court rulings announced the morning of June 26th, then this is hardly surprising. News results were being churned out fast and furious by very high-authority sites, and the SERP landscape for that topic was changing by the hour.
Sometimes, though, we can spot an example that seems to tell a compelling story, especially when that example hasn’t historically been a high-temperature query. It’s not Capital-S Science, but it can help us look for clues in the broader data. Here are a couple of interesting examples…
On the morning of June 25th, a de-localized and de-personalized query for “limousine service” returned the following results:
One possible pattern is that there are no domains in the new Top 10 with either the phrase “limousine service” or “limo service” in them, which could indicate a crack-down on partial-match domains (PMDs). Interestingly, the term “limousine” disappeared altogether in the post-update domain list, although “limo” still fares well. This could also indicate some sort of tweak in how Google treats similar words ("limo" vs. "limousine").
Here’s another query that shows a similar PMD pattern, clocking in at a MozCast temperature of 239°. The morning of June 25th, “auto auction” showed the following Top 10:
In the first SERP, eight of the top ten had “auto auction(s)” in the URL; in the second, only two remained, and one of those was an official US government sub-domain (even that site lost a ranking spot).
Ultimately, these are anecdotes. The question is: do we see any pattern across the broader set? As luck would have it, we do track the influence of partial-match domains (PMDs) in the MozCast metrics. Our PMD Influence metric looks at the percentage of total Top 10 URLs where the root or sub-domain contains either “keywordstring” or “keyword-string”, but is not an exact-match. Here’s a graph of PMD influence over the past 90 days:
Please note that the vertical axis is scaled to more clearly show rises and falls over time. Across our data set, there’s been a trend toward steady decline of PMD influence in 2013, but today showed a fairly dramatic drop-off and a record low across our historical data (back to April 2012). This data comes from our smaller (1K) query set, but the pattern is also showing up in our 10K data set.
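Based on that description, a rough re-implementation of the PMD check is straightforward. This is our own sketch of the stated rule, not MozCast's actual code, and the sample SERP is invented:

```python
# Partial-match domain (PMD): the root or sub-domain label contains
# "keywordstring" or "keyword-string" but is not an exact match.
def is_pmd(domain, query):
    joined = query.replace(" ", "")
    hyphened = query.replace(" ", "-")
    label = domain.lower().split(".")[0]   # leftmost domain label
    if label in (joined, hyphened):
        return False                        # exact-match domain, excluded
    return joined in label or hyphened in label

def pmd_influence(serps):
    """serps maps query -> list of Top-10 domains; returns PMD share of all URLs."""
    pairs = [(q, d) for q, domains in serps.items() for d in domains]
    return sum(is_pmd(d, q) for q, d in pairs) / len(pairs)

sample = {"auto auction": ["usa-autoauction.com", "example.gov", "autoauction.com"]}
print(pmd_influence(sample))   # one PMD out of three URLs
```

Note that the exact-match domain is deliberately excluded, matching the metric's definition.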
For reference and further investigation, here are a few examples of PMDs that fell out of the Top 10, and the queries they fell out of (including some from the same queries):
Recently, Matt Cutts warned of a multi-week algorithm update ending just after July 4th – could this be that update? The short answer is that we have no good way to tell, since Matt’s tweet didn’t tell us anything about the nature of the update. This single-day spike certainly doesn’t look like a gradual roll-out of anything, but it’s possible that we’ll see large-scale instability during this period.
This is an imperfect exercise at best, and one day of data can be misleading. The situation is also constantly changing – Google claims Panda data is updating 10 days out of every 30 now, or 1/3 of the time, for example. At this early stage, I can only confirm that we’ve tracked this algorithm flux across multiple data centers and there is no evidence of any system errors or obvious data anomalies (we track many metrics, and some of them look relatively normal).
Finally, it’s important to note that, just because a metric drops, it doesn’t mean Google pulled a lever to directly impact that metric. In other words, Google could release a quality adjustment that just happened to hit a lot of PMDs, even though PMDs weren’t specifically the target. I would welcome any evidence people have seen on their own sites, in webmaster chatter, in unofficial Google statements, etc. (even if it’s evidence against something I’m saying in this post).
Source: http://feedproxy.google.com/~r/seomoz/~3/-pKPWiesWlQ/early-look-at-googles-june-25-algo-update
For all sorts of reasons, some people have a problem with updating WordPress installs properly. I will state now that, for both our free and premium plugins, we do not support anything but the latest version and the one prior. At the time of writing, that's WordPress 3.5 and WordPress 3.4. If you're running anything…
Why we don't support old WordPress versions is a post by Joost de Valk on Yoast - Tweaking Websites.
Source: http://feedproxy.google.com/~r/joostdevalk/~3/i9CzOajL6_Q/
Measuring PPC and SEO is relatively straightforward. But how do we go about credibly measuring social media campaigns, and wider public relations and audience awareness campaigns?
As the hype level of social media starts to fall, then more questions are asked about return on investment. During the early days of anything, the hype of the new is enough to sustain an endeavor. People don't want to miss out. If their competitors are doing it, that's often seen as good enough reason to do it, too.
You may be familiar with this graph. It's called the hype cycle and is typically used to demonstrate the maturity, adoption and social application of specific technologies:
Where would social media marketing be on this graph?
I think a reasonable guess, if we're seeing more and more discussion about ROI, is somewhere on the "slope of enlightenment". In this article, we'll look at ways to measure social media performance by grounding it in the only criteria that truly matter - business fundamentals.
We've talked about the Cluetrain Manifesto and how the world changed when corporations could no longer control the message. If the message can no longer be controlled, then measuring the effectiveness of public relations becomes even more problematic.
PR used to be about crafting a message and placing it, and nurturing the relationships that allowed that to happen. With the advent of social media, that's still true, but the scope has expanded exponentially - everyone can now repeat, run with, distort, reconfigure and reinvent the messages. Controlling the message was always difficult, but now it's impossible.
On the plus side, it's now much easier to measure and quantify the effectiveness of public relations activity due to the wealth of web data and tools to track what people are saying, to whom, and when.
The more things change, the more they stay the same. PR and social media are still about relationships. And getting relationships right pays off:
Today, I want to write about something I'd like to call the "Tim Ferriss Effect." It's not exclusive to Tim Ferriss, but he is, I believe, the marquee example of a major shift that has happened in the last 5 years within the world of book promotion. Here's the basic idea: When trying to promote a book, the main place you want coverage is on a popular single-author blog or site related to your topic. ... The post opened with Tim briefly explaining how he knew me, endorsing me as a person, and describing the book (with a link to my book.) It then went directly into my guest post - there was not even an explicit call to action to buy my book or even any positive statements about my book. An hour later, I was #45 on Amazon's best seller list.
Public relations is about more than selling, of course. It's also about managing reputation. It's about getting audiences to maintain a certain point of view. Social media provides the opportunity to talk to customers and the public directly, using technology to disintermediate the traditional gatekeepers.
How do you measure the value of a relationship?
Difficult.
How can you really tell if people feel good enough about your product or service to buy it, and that "feeling good" was the direct result of editorial placement by a well-connected public relations professional?
Debatable, certainly.
Can you imagine another marketing discipline that used dozens of methods for measuring results? Take search engine marketing, for example. The standards are pretty cut and dried: visitors, page views, time on site, cost per click, etc. For email marketing, we have delivery, open rates, click-throughs, unsubscribes, opt-ins, etc.
In previous articles, we've looked at how data-driven marketing can save time and be more effective. The same is true of social media, but given it's not an exact science, it's a question of finding an appropriate framework.
There are a lot of people asking questions about social media's worth.
Does sending out weekly press releases result in more income? How about tweeting 20 times a day? How much are 5,000 followers on Facebook worth? Without a framework to measure performance, there's no way of knowing.
Furthermore, there's no agreed industry standard.
In direct marketing channels, such as SEO and PPC, measurement is fairly straightforward. We count cost per click, number of visitors, conversion rate, time on site, and so on. But how do we measure public relations? How do we measure influence and awareness?
PR firms have often developed their own in-house terms of measurement. The problem is that without industry standards, success criteria can become arbitrary and often used simply to show the agency in a good light and thus validate their fees.
Some agencies use publicity results, such as the number of mentions in the press, or the type of mention, i.e. prestigious placement. Some use advertising value equivalent, i.e. what the editorial coverage would cost if the space were bought as advertising. Some use public opinion measures, such as polls, focus groups and surveys, whilst others compare mentions and placement vs competitors, i.e. whoever has more or better mentions wins. Most use a combination, depending on the nature of the campaign.
Most business people would agree that measurement is a good thing. If we're spending money, we need to know what we're getting for that money. If we provide social media services to clients, we need to demonstrate that what we're doing works, so they'll devote more budget to it in future. If the competition is using this channel, then we need to know whether we're using it better, or worse, than they are.
Perhaps the most significant reason why we measure is to know if we've met a desired outcome. To do that we must ignore gut feelings and focus on whether the outcome was achieved.
Why wouldn't we measure?
Some people don't like the accountability. Some feel more comfortable with an intuitive approach. It can be difficult for some to accept that their pet theories have little substance when put to the test. It seems like more work. It seems like more expense. It's just too hard. When it comes to social media, some question whether it can be measured much at all.
For proof, look no further than The Atlantic, which shook the social media realm recently with its exposé of "dark social" - the idea that the channels we fret over measuring, like Facebook and Twitter, represent only a small fraction of the social activity that's really going on. The article shares evidence that the vast majority of sharing is still done through channels like email and IM that are nearly impossible to measure (and thus, dark).
And it's not like a lot of organizations are falling over themselves to get measurement done:
According to a Hypatia Research report, "Benchmarking Social Community Platform Investments & ROI," only 40% of companies measure social media performance on a quarterly or annual basis, while almost 13% of the organizations surveyed do not measure ROI from social media at all, and another 18% said they do so only on an ad hoc basis. (Hypatia didn't specify what response the other 29% gave.)
If we agree that measurement is a good thing that can lead to greater efficiency and better decision making, then the fact that your competition may not be measuring well, or at all, presents a great opportunity. We should strive to measure social media ROI, as providers or consumers, or it becomes difficult to justify the spend. The argument that we can't measure because we don't know all the effects of our actions isn't a reason not to measure what we can.
Marketing has never been an exact science.
Measurement should be linked back to business objectives.
In "Measure What Matters", Katie Delahaye Paine outlines seven steps to social media measurement. I liked these seven steps because they aren't exclusive to social media. They're the basis for measuring any business strategy, and similar measures have been used in marketing for a long time.
It's all about proving something works, and then using the results to enhance future performance. The book is a great source for those interested in reading further on this topic, which I'll outline here.
Any marketing objective should serve a business objective. For example, "increase sales by X by October 31st".
Having specific, business-driven objectives gets rid of conjecture and focuses campaigns. Someone could claim that spending 30 days tweeting a new message a day is a great thing to do, but if, at the end of it, a business objective wasn't met, then what was the point?
Let's say an objective is "increase sales of shoes compared to last December's figures". What might the social strategy look like? It might consist of time-limited offers, as opposed to more general awareness messages. What if the objective was to "get 5,000 New Yorkers to mention the brand before Christmas"? This would lend itself to viral campaigns, targeted locally. Linking the campaign to specific business objectives will likely change the approach.
If you have multiple objectives, you can always split them up into different campaigns so you can measure the effectiveness of each separately. Objectives typically fall into sales, positioning, or education categories.
Who are you talking to? And how will you know if you've reached them? Once you have reached them, what is it you want them to do? How will this help your business?
Your target audience is likely varied. Different audiences could be industry people, customers, supplier organizations, media outlets, and so on. Whilst the message may be seen by all audiences, you should be clear about which messages are intended for whom, and what you want them to do next. The messages will be different for each group, as each group likely picks up on different things.
Attach a value to each group. Is a media organization picking up on a message more valuable than a non-customer doing so? Again, this should be anchored to a business requirement: "We need media outlets following us so they may run more of our stories in future. Our research shows more stories have led to increased sales volume in the past." Then a measure might be to count the number of media industry followers, and to monitor the number of stories they produce.
What does it cost you to run social media campaigns? How much time will it take? How does this compare to other types of campaigns? What is your opportunity cost? How much does it cost to measure the campaign?
As Delahaye Paine puts it, it's the "I" in ROI.
Testing is comparative, so have something to compare against.
You can compare yourself against competitors, and/or your own past performance. You can compare social media campaigns against other marketing campaigns. What do those campaigns usually achieve? Do social media campaigns work better, or worse, in terms of achieving business goals?
In terms of ROI, what's a social media "page view" worth? You could compare this against the cost of a click in PPC.
Once you've determined objectives, defined the audience, and established benchmarks, you should establish criteria for success.
For example, the objective might be to increase media industry followers. The audience is the media industry and the benchmark is the current number of media industry followers. The KPI would be the number of new media industry followers signed up, as measured by classifying followers into subgroups and conducting a headcount.
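The headcount itself is mechanical once a classification rule is picked. A minimal sketch of that KPI (the keyword rule and sample data here are our own placeholder assumptions, not a recommended classifier):

```python
# Classify followers into subgroups, then count the segment tied to the KPI.
MEDIA_KEYWORDS = ("journalist", "editor", "reporter", "news", "media")

def subgroup(follower):
    """Crude bio-keyword classification into 'media' vs. 'other'."""
    bio = follower.get("bio", "").lower()
    return "media" if any(k in bio for k in MEDIA_KEYWORDS) else "other"

def media_headcount(followers):
    return sum(1 for f in followers if subgroup(f) == "media")

followers = [
    {"name": "a", "bio": "Tech reporter at Example Daily"},
    {"name": "b", "bio": "Sneaker collector"},
]
print(media_headcount(followers))   # 1
```

In practice the classification would be done (or at least reviewed) by hand; the point is that the KPI reduces to a count over labeled subgroups.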
Measuring the KPI will differ depending on the objective, of course. If you're measuring the number of mentions in the press vs your competitor's, that's pretty easy to quantify.
"Raising awareness" is somewhat more difficult; however, once you have a measurement system in place, you can start to break down the concept of "awareness" into measurable components. Awareness of what? By whom? What constitutes awareness? How do people signal they're aware of you? And so on.
How will you collect measurement data?
There are an overwhelming number of tools available, and a survey of them is outside the scope of this article. No tool can measure "reputation" or "awareness" or "credibility" by itself, but tools can produce usable data if we break those areas down into suitable metrics. For example, "awareness" could be quantified by "page views + a survey of a statistically valid sample".
Half the battle is asking the right questions.
A measurement process is about iteration. You do something, get the results back, act on them, make changes, and arrive at a new status quo. You then do something starting from that new point, and so on. It's an ongoing process of optimization.
Were objectives met? What conclusions can you draw?
Those seven steps will be familiar to anyone who has measured marketing campaigns and business performance. They're grounded in the fundamentals. Without relating social media metrics back to the underlying fundamentals, we can never be sure whether what we're doing is making a difference, or is worthwhile. Are 5,000 Twitter followers a good thing?
It depends.
What business problem does it address?
You invested time and money. Did you get a return?
If you've linked your social media campaigns back to business objectives, you should have a much clearer idea. Your return will depend on the nature of your business, of course, but it could be quantified in terms of sales, cost savings, avoided costs or audience building.
In terms of SEO, we've long advocated building a brand. Having people conduct brand searches is a form of insurance against Google demoting your site. If you have brand search volume, and Google doesn't return you for brand searches, then Google looks deficient.
So, one goal of social media that gels with SEO is to increase brand awareness. You establish a benchmark of branded searches based on current activity. You run your social media campaigns, and then see if branded searches increase.
Granted, this is a fuzzy measure, especially if you have other awareness campaigns running, as you can't be certain of cause and effect. However, it's a good start. You could give it a bit more depth by integrating a short poll for visitors, i.e. "did you hear about us on Twitter/Facebook/other?".
Measuring social media isn't that difficult. In fact, we could just as easily use search metrics in many cases. What is the cost per view? What is the cost per click? Did the click from a social media campaign convert to the desired action? What was your business objective for the social media campaign? To get more leads? If so, then count the leads. How much did each lead cost to acquire? How does that cost compare to other channels, like PPC? What is the cost of customer acquisition via social media?
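Cost per lead is just spend divided by attributed leads; the comparison across channels is where it becomes useful. A sketch (all figures invented for illustration):

```python
# Cost per lead, per channel: spend divided by leads attributed to the channel.
def cost_per_lead(spend, leads):
    return spend / leads if leads else float("inf")

channels = {
    "ppc":    {"spend": 2000.0, "leads": 80},   # hypothetical numbers
    "social": {"spend": 1200.0, "leads": 40},
}

# Rank channels cheapest-first by acquisition cost.
for name, c in sorted(channels.items(),
                      key=lambda kv: cost_per_lead(kv[1]["spend"], kv[1]["leads"])):
    print(f"{name}: ${cost_per_lead(c['spend'], c['leads']):.2f} per lead")
```

The same division works for cost per view, cost per click, or cost per customer; only the denominator changes.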
In this way, we could split social media into a customer service side and a marketing side. Engaging with your customers on Facebook may not be all that measurable in terms of direct marketing effects; it's more of a customer service function. As such, budget for the soft side of social media need not come out of marketing budgets, but customer service budgets. This could still be measured, of course, by running customer satisfaction surveys.
Look around the web for definitions of the differences between PR and social media, and you'll find a lot of vague definitions.
Social media is a tool often used for public relations. The purpose is to create awareness and to nurture and guide relationships.
Public relations is sometimes viewed as a bit of a scam. It's an area that sucks money, yet can often struggle to prove its worth, often relying on fuzzy, feel-good proclamations of success and vague metrics. It doesn't help that clients can have unrealistic expectations of PR, and that some PR firms are only too happy to promise the moon:
PR is nothing like the dark, scary world that people make it out to be, but it is a new one for most. And knowing the ropes ahead of time can save you from setting impossibly high expectations or getting overpromised and oversold by the firm you hire. I've seen more than my fair share of clients bringing in a PR firm with the hopes that it'll save their company or propel a small, just-launched start-up into an insta-Facebook. And unfortunately, I've also seen PR firms make these types of promises. Guess what? They're never kept.
Internet marketing, in general, has a credibility problem when it doesn?t link activity back to business objectives.
Part of that perception, in relation to social media, comes from the fact that public relations is difficult to control:
The main conduit to mass publics, particularly with a consumer issue such as rail travel or policing, are the mainstream media. Unlike advertising, which has total control of its message, PR cannot convey information without the influence of opinion, much of it editorial. How does the consumer know what is fact, and what has influenced the presentation of that fact?
But lack of control of the message, as the Cluetrain Manifesto points out, is the nature of the environment in which we exist. Our only choice, if we are to prosper in the digital environment, is to embrace the chaos.
Shouldn't PR just happen? If you're good, people just know? Well, even Google, that well-known, engineering-driven advertising company, has had PR deeply embedded from almost day one:
David Krane was there from day one as Google's first public relations official. He's had a hand in almost every single public launch of a Google product since the debut of Google.com in 1999.
Good PR is nurtured. It's a process. The way to find out whether it's good PR or ineffective PR is to measure it. PR isn't a scam, any more than any other marketing activity is a scam. We can find out if it's worthwhile only by tracking and measuring, and linking that measurement back to a business case. Scams lack transparency.
The way to get transparency is to measure and quantify.
Source: http://www.seobook.com/measuring-social-media
Facebook's Graph Search will change the search marketing game. Search results are increasingly about connections. They are heavily reliant on consumers to share their experiences with various businesses within their networks. Without user data, Graph Search becomes virtually useless. Other search engines will not have the opportunity to evolve using a similar path. Graph Search's influence will be [...]
The post 3 Ways to Leverage Facebook's Graph Search by @MarcPurtell appeared first on Search Engine Journal.
Source: http://feedproxy.google.com/~r/SearchEngineJournal/~3/Y85U5OlbW9U/