Hamlet AU on NwN blog wrote in the comments:
“The AdSense ads for Second Life bring in a ton of new users — trouble is, most of them go away because they can’t be bothered to download a client or figure out how to use SL”
My response is part of a loose theory I am currently mulling over:
Pussycat: “Or they go away because they never arrived.”
I suspect 99% of them are automated spam bots registered for what in the SEO world appears to be an online community with a forum.
- Spam bots target these with tens of thousands of account creations per day, even on small forums, just to register sleeper accounts.
It's like a flu infection – the virus sends millions of copies of itself out, assaulting immune systems, and millions of these germs land on you every day. Most of them never do anything once they get there.
Forum spam bots seem to work like that, creating accounts on open-registration systems by the thousands per day with no human involvement.
They will then cycle back months later and post up a randomly worded post with all kinds of odd phrases compiled together. The purpose is to hide which terms in there apply to their actual client, and to make that client's terms appear to be related to the forum, in order to 'ride the coattails' of the forum's organic search ranking in Google and Bing.
- It only takes a post or two per week to do this. But you need to get a few hundred thousand accounts injected into a target system in order to hope that the system's admins fail to catch and ban or delete all of them.
If you've ever seen spam that seems to talk about some news item, and then randomly in mid-sentence switches to shoes or special medicines or study help, or talks as if from a friend… but with odd grammar and a few weird words… that's the spam bots. If you have a blog, you've probably seen them in your comments filter. I'd wager they're about half the comments I get here, which is why I moderate comments here. And my blog has bad SEO…
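The pattern is crude enough that even a naive filter catches a lot of it. As a toy illustration (the term list and thresholds here are made up; real filters use far richer signals), something like this flags comments that mix commercial keywords with links:

```python
# Toy term list -- purely illustrative, not from any real filter.
SPAM_TERMS = {"shoes", "pills", "essay help", "replica", "cheap"}

def looks_like_spam(comment: str) -> bool:
    """Naive heuristic: off-topic commercial terms plus link density."""
    text = comment.lower()
    term_hits = sum(1 for t in SPAM_TERMS if t in text)
    links = text.count("http")
    # Flag comments that pair a commercial term with a link,
    # or that pile up several unrelated commercial terms.
    return (term_hits >= 1 and links >= 1) or term_hits >= 2

print(looks_like_spam("Great post! Buy cheap shoes http://example.com"))  # True
print(looks_like_spam("Nice article, thanks for writing this!"))          # False
```

A rule this blunt would misfire on a real shoe forum, of course, which is exactly why the bots aim their keyword salad at unrelated communities.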
If the forum or site is run by people who do not know how to look for invalid accounts, or how to recognize the difference, you can easily put in a million or two accounts with no one the wiser…
Where I work in RL, we were up to 250,000 of them before I figured out what they really were, and came up with some patterns among our real customers that let us wipe the database with as low a risk as possible of hitting real ones.
- And that was in a span of about 4 months. Being a 'social scientist' rather than an IT person (by education; I work in graphic/web arts/design), I see these patterns from a different angle – though I muddle through the solution implementation part.
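The kind of pattern matching I mean can be sketched like this. The fields and thresholds below are hypothetical stand-ins, not the actual patterns we used (those were specific to our own customer base), but the shape of the idea is the same: dormant accounts that registered long ago, never posted, and have random-looking usernames get flagged.

```python
from datetime import datetime, timedelta

# Hypothetical account record and thresholds -- illustrative only.
def is_probable_sleeper(account: dict, now: datetime) -> bool:
    """Flag accounts that registered long ago but never did anything."""
    age = now - account["registered"]
    dormant = account["post_count"] == 0 and account["login_count"] <= 1
    # Consonant-heavy random usernames are a common bot signature.
    name = account["username"].lower()
    vowel_ratio = sum(name.count(v) for v in "aeiou") / max(len(name), 1)
    random_looking = len(name) >= 8 and vowel_ratio < 0.2
    return dormant and (age > timedelta(days=90) or random_looking)
```

Anything flagged goes to a review queue rather than straight to deletion – the whole point of working out patterns among your real customers first is to keep the false-positive risk low.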
LL / SL has been going for years, run by people who show all the signs of not knowing how to manage a community in terms of customer relations, security, and awareness of who their customers even are. People who likely do not know how to recognize the difference between a human user and a spam bot. Especially given how quirky some of their real users are (in terms of name and sign-on details) – seeing the difference is not easy.
Consider that SL doesn't even run CAPTCHA software… although with OCR (my guess), some bots can blast past that now. When we put it on our system, it merely reduced bot accounts from about 10,000 per day down to about 100 – significant, but still a good number getting in. It took a change in our SEO to finally shake the bots: making our forum no longer look like a forum, while still looking like a forum…
(Convoluted… we basically made it look like a branded FAQ page, but left the word forum in place in the link and in our marketing material. We're down to about 1-10 spam bots a day now, and they seem to be focused on specific old user posts -before- they arrive. Google Analytics is handy. Automated tools from the forum service we use knock these out so I can focus on my real work: putting colors and letters on web stuffz… :p)
So my new theory is that most people who sign up to SL actually enter the world, and stay. Probably only a few dozen per day (but I have no idea of the actual rate of people who create an account and stay).
But most bots that sign up never enter the world, because they were never even designed to. They are just like 'bacteria' crashing up against the walls of the internet, and in this case getting into the skin… but no further, because what they land on is not what they were designed to target.
What I do see inworld is that people who are day-one newbs quite often become week-one newbs, and many even month-one newbs. Once they hit this point… I almost think it's fair to shift any blame for their loss onto the community's ability to invite them in and socialize with them…
- And I'm suspecting almost half of them reach month one. But I have no data on that. It's just a feeling from what I see looking around in world and the newbs I regularly run across.
The 'humans' that sign up, by far, actually give it a good try to make it stick. Being past the hype days, more often now they're slightly dedicated before even hitting the signup page. It's a 10-year-old platform now, and the humans who arrive are more likely to have looked for it before arriving.
But the spam bots, they just trigger on what web crawling or something leads them to…
So new theory:
Most of the signups are bots. So it makes sense that they never seem to enter the world or 'stick around'.
Only a small handful are real accounts. A hundred a day might seem dangerously small, but for a 10-year-old platform, it's pretty good. Most of these actually go inworld, and I suspect most of them stick past a few days.