Monthly Archives: April 2005

Free Link Submission Directory Index

Search engine traffic is an altogether different beast compared to traditional marketing, and if handled correctly it will improve your ROI (return on investment). The thing about search engines is that the customers are already pre-qualified: they have an idea of what they are searching for. If they are searching for an Xbox, they will go to a page presenting an Xbox. This marketing model and promotional tool has a great ROI and will increase your sales conversions significantly.

This engine was installed some time ago now and is building up fairly well. I may get around to changing some of the colours and style sheets, mind you.

E-learning inspires Euro universities

Information World Review: “Universities are embracing e-learning, with three quarters of the major European universities claiming to have either implemented e-learning systems or having plans to implement one within the next three years.
A survey of 150 European universities commissioned by e-learning specialist WebCT indicates that student access to electronic resources is expanding fast beyond traditional library resources.
The survey found that most universities cited increased quality of education as the main benefit of e-learning, followed by improved access and cost efficiencies. The adoption of e-learning appears to be driving new collaborations that faculty librarians will be forced to track.”

Getting The Opt-In E-Mail Through

This week the big project is finding a solution to the e-mail marketing problem we have found.

For a number of years I/we (there are others involved now) have been using a third-party company for the delivery of company e-mails and auto-replied messages.

There is a significant difference between these two systems, however. Company e-mails, i.e. periodical news e-mails, are simply sent as and when the company decides: the newsletter is constructed and sent out en masse to a defined set of opt-in e-mail subscribers. The second system, the auto-reply service, is entirely different in that when a user signs up, their e-mail address is entered into an auto-responder. An auto-responder can be set to send an e-mail immediately (a thank-you note to the customer), followed by a further "set" of pre-constructed, time-delayed e-mails.
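As a rough illustration of the auto-responder model (not the third-party system we actually use — the sequence, delays, and function name below are all hypothetical), each follow-up is just a pre-written message plus a delay from the sign-up time:

```python
from datetime import datetime, timedelta

# A hypothetical auto-responder sequence: each entry is (delay, subject).
# Delay zero is the immediate "thank you" message; later entries are the
# pre-constructed, time-delayed follow-ups.
SEQUENCE = [
    (timedelta(days=0), "Thanks for signing up!"),
    (timedelta(days=3), "Getting started"),
    (timedelta(days=7), "A week in: tips and tricks"),
]

def schedule(signup_time):
    """Return (send_time, subject) pairs for a subscriber
    who signed up at signup_time."""
    return [(signup_time + delay, subject) for delay, subject in SEQUENCE]

signed_up = datetime(2005, 4, 1, 9, 0)
for when, subject in schedule(signed_up):
    print(when.isoformat(), "-", subject)
```

A real auto-responder would persist these send times and have a mailer process them as they come due; the sketch only shows the scheduling idea.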

Bulk E-Mailing

The problem we have found recently is that using a third-party company for our bulk e-mails means we are completely reliant on their ability to prevent spammers from using their service. I have noticed an increase in e-mails being bounced by receiving mail servers because this third-party company has been blacklisted as a spammer, presumably after a user or two sent spam through their system. Obviously this is not a great relationship for us to have, for two reasons:

  1. We do not want to be associated with spammers.
  2. People actually do want company news, having signed up to receive it.

So I’ve made the decision to bring one of our big mailing lists in-house, as this ensures we have no relationship with blacklisted senders and users will get the mail they want, having opted in to receive it.

Mail Headers

As domains are cheap as chips, I have purchased a new domain (related in wording to the main company site) specifically for the purpose of being the originator of the weekly newsletter. This makes sense on many levels, not least that the domain will be listed as the sender in the outgoing mail headers, keeping the news as a distinct department, separate from the main company business.
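For illustration, if the main site were example.com and the newsletter domain news-example.com (both made-up names), the relevant headers on an outgoing newsletter would look something like:

```
From: newsletter@news-example.com
Reply-To: newsletter@news-example.com
Return-Path: <bounces@news-example.com>
```

Any blacklisting triggered by the newsletter then attaches to the separate domain rather than the main company domain.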

I feel this makes a great deal of sense and can in fact be seen in use by plenty of companies for many different purposes: press domains, marketing domains, image domains, video domains, etc., all essentially "branches" or extended arms of the company.

More soon.

Music Industry Sues Hundreds Of File Sharers At Colleges

Music Industry Sues Hundreds Of File Sharers At Colleges (washingtonpost.com): “In the year and a half since the Recording Industry Association of America, the trade group of major music companies such as Sony BMG Music Entertainment and Universal Music Group, began suing Internet song-swappers, more and more college students have moved off the Web to trade music on Internet2, a separate network used by universities and colleges for sharing research and other academic works. “

e-Learning Expansion

University of Ulster Online – News Release: “The University of Ulster today unveils three innovative postgraduate programmes in social research skills. Delivered by distance learning, they will be offered through the University's Campus One from October 2005.

The part-time programmes have been developed by the School of Policy Studies and, according to Course Director Professor Sile O'Connor from the Magee campus, because they will be delivered through e-learning, the programmes will be particularly suited to people who need maximum flexibility in time and place of access to learning opportunities.”

Ignoring Robots

I’ve recently been playing around with a perl script which sits in the /cgi-bin/ of a site.

The idea is that the script catches robots that are not obeying the robots.txt file found in the domain's root. The robots.txt is a useful file (if obeyed) in that search engines and content spiders read from it what is and is not allowed to be viewed. If a particular folder or file is disallowed in the robots.txt, then the robot should ignore that file. The problem is that the internet is full of leechers and badly behaved robots just out to leech information, so we needed some way to stop that practice.
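For example, a robots.txt asking all well-behaved robots to stay out of a folder (the folder name here is made up) looks like this:

```
User-agent: *
Disallow: /private-data/
```

Well-behaved spiders read this file before crawling and skip anything under /private-data/; badly behaved ones simply ignore it, which is what the trap below exploits.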

After some diving and seeking I found a perl script ready to go. The script is great in that, if triggered, it will add the robot to a ban list in the site’s .htaccess file, thus banning the IP from the site completely.
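The ban list the script maintains is just a block of Apache deny rules in .htaccess (the IP addresses here are made up):

```
order allow,deny
deny from 192.0.2.45
deny from 198.51.100.7
allow from all
```

Each new offender gets another `deny from` line appended, and Apache refuses all further requests from those addresses.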

The way I’ve implemented this is to have a hidden link of the form:

<a href="getout.php" onmouseover="window.status='Burglar Alarm'; return true;" onclick="return false;">
<img src="../images_folder/oddly_named_graphic.gif" alt="" border="0" width="1" height="1"></a></td>

This link is not evident to a human surfer, but a spider will find the file getout.php and try to read it. Little does the spider know, however, that getout.php redirects to my perl script, and thus it is snared!
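The original script is in Perl, but the core of what it does — append the offending IP to the .htaccess ban list — can be sketched in a few lines of Python (the file path, fallback IP, and function name are illustrative assumptions, and the e-mail notification step is left out):

```python
import os

def ban_ip(ip, htaccess_path=".htaccess"):
    """Append an Apache 'deny from' rule for this IP to .htaccess,
    unless it is already listed. Returns True if a new ban was added."""
    rule = "deny from " + ip
    existing = ""
    if os.path.exists(htaccess_path):
        with open(htaccess_path) as f:
            existing = f.read()
    if rule in existing:
        return False  # already banned
    with open(htaccess_path, "a") as f:
        f.write(rule + "\n")
    return True  # newly banned; the real script also e-mails the admin

# In a CGI context the offending IP comes from the environment:
ip = os.environ.get("REMOTE_ADDR", "192.0.2.45")  # made-up fallback IP
ban_ip(ip, "/tmp/demo_htaccess")
```

A production version would also lock the file against concurrent writes and whitelist legitimate search-engine spiders so a misconfigured robots.txt cannot ban Google.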

Within a couple of hours of implementing this across our clients' sites, the script had sent me an e-mail informing me that it had banned the first robot.

A great way to keep your site's bandwidth down and your server running fast.

Latest Blog News

It’s been a while since I last blogged, and a few interesting things have taken place. I recently installed a second Amazon affiliates shop on audiocourses.co.uk so that UK surfers can buy there. I’m still amazed that Amazon doesn’t make geo-targeted shopping particularly easy, so I built it into our end. You can see that shop front here: Audiocourses.co.uk

The current categories are:

The benefits are manifold for that co.uk site, and it provides some useful leads to the .com portal too.