1. What is a spider? What does it do?
It's a type of software agent that collects data from web pages for later indexing. They are often referred to by a number of different names, such as web crawlers, automatic indexers, ants, bots or web robots.
Spiders or crawlers can be used for a number of different purposes, but the main one is for a search engine that wants a copy of the pages it visits so that it can index them and produce faster searches. They can also be used to search through a website to ensure that the contents of each page are formatted correctly and that all hyperlinks are working, or to gather specific types of information from web pages, such as harvesting email addresses for spam.
So in its basic form a spider or web crawler is a computer program that crawls over websites on the internet, collecting certain information about web pages for particular uses.
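As a rough illustration of the idea (this is my own minimal sketch, not how any particular search engine works, and the seed URL is hypothetical), a basic spider can be written in a few lines of Python: fetch a page, pull out its links, and queue them for later visits while recording what it has already seen.

```
# Minimal sketch of a web spider: fetch a page, collect its links,
# and visit them breadth-first up to a fixed page limit.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=10):
    seen, queue, index = set(), deque([seed]), {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue                      # skip pages that fail to load
        index[url] = html                 # a real spider would index the page text here
        parser = LinkCollector()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))   # resolve relative links
    return index

# Example (hypothetical seed URL):
# pages = crawl("http://example.com", max_pages=5)
```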
A very interesting read about how a search engine spider actually works:
http://www.articledashboard.com/Article/How-exactly-do-search-engine-spiders---robots-work/1185146
2. Differentiate the various types of software agents.
Software agents are computer programs capable of flexible, autonomous action. Griss (2001) describes three types of software agents in his paper:
Personal Agents - interact directly with a user, presenting some "personality" or "character", monitoring and adapting to the user's activities, learning the user's style and preferences, and automating or simplifying certain rote tasks.
Mobile Agents - are sent to remote sites to collect information or perform actions and then return with results.
Collaborative Agents - communicate and interact in groups, representing users, organizations and services. Multiple agents exchange messages to negotiate or share information. Examples include online shopping and price-negotiation services such as GetPrice.com.au and priceline.com (a toy sketch follows below).
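To make the collaborative case concrete, here is a toy Python sketch of my own (it is not from Griss's paper, and the seller names and prices are invented): a buyer agent sends quote requests to several seller agents and accepts the cheapest offer.

```
# Toy illustration of collaborative agents: a buyer agent exchanges
# messages with several seller agents to negotiate the best price.
class SellerAgent:
    def __init__(self, name, price):
        self.name, self.price = name, price

    def handle(self, message):
        # Respond to a quote request with this seller's offer.
        if message["type"] == "request_quote":
            return {"type": "quote", "seller": self.name, "price": self.price}

class BuyerAgent:
    def negotiate(self, sellers, item):
        quotes = [s.handle({"type": "request_quote", "item": item}) for s in sellers]
        return min(quotes, key=lambda q: q["price"])   # accept the cheapest offer

sellers = [SellerAgent("ShopA", 120.0), SellerAgent("ShopB", 99.5)]
best = BuyerAgent().negotiate(sellers, "camera")
print(best)   # {'type': 'quote', 'seller': 'ShopB', 'price': 99.5}
```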
3. Identify various activities in e-commerce where software agents are currently in use.
I've picked two sites that I believe use software agents:
webjet.com.au
When you search for flights, it aggregates the fare data from all the airlines onto one page in real time.
getprice.com.au
Very similar to Webjet, but for product searches: it aggregates product information and prices from various e-commerce outlets onto one page.
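Behind the scenes, an aggregator agent like these essentially queries several suppliers and merges the offers onto one page. The Python sketch below is only my own illustration of that idea; the supplier data is invented, and it does not use Webjet's or GetPrice's real APIs.

```
# Sketch of a price-aggregation agent: query several suppliers
# in parallel and merge the offers into one sorted result list.
# The supplier data below is invented purely for illustration.
from concurrent.futures import ThreadPoolExecutor

FAKE_SUPPLIERS = {
    "airline_a": [{"flight": "AA123", "price": 199.0}],
    "airline_b": [{"flight": "BB456", "price": 149.0}],
}

def fetch_offers(supplier):
    # A real agent would call the supplier's website or API here.
    return [dict(offer, supplier=supplier) for offer in FAKE_SUPPLIERS[supplier]]

def aggregate(suppliers):
    with ThreadPoolExecutor() as pool:
        offers = [offer for batch in pool.map(fetch_offers, suppliers) for offer in batch]
    return sorted(offers, key=lambda o: o["price"])   # cheapest first

for offer in aggregate(["airline_a", "airline_b"]):
    print(offer)
```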
4. Computing ethics and bot programming case study: Rocky
a. Get an account username and password from the lecturer to LC_MOO at http://ispg.csu.edu.au:7680 and login to the Welcome Lobby
b. Hold a 5-minute discussion with Rocky on a special topic
c. Rocky is an ELIZA-like bot. Report your findings.
Discussion with the ELIZA-like bot Rocky:

train4 activates Rocky.
Rocky says, "Gee thanks for waking me up! I must have dozed off.."
You say, "how was your sleep?"
Rocky says, "I would rather not discuss my 'sleep' if it's allright with you..."
You say, "what is the time?"
Rocky [to train4]: the time? I'm not sure i understand..
You say, "what is the date?"
Rocky says, "Don't you know?"
You say, "no"
Rocky says, "Why are you so negative?"
You say, "im not negative, just that i don't know the date?"
Rocky says, "Hmm, not sure I know.."
You say, "how old are you?"
Rocky says, "Do you come here often?"
You say, "first time?"
Rocky says, "I am sure you should be working on another assignment..."
You say, "i've already finished an assignment"
Rocky says, "Sometimes I wished I stayed asleep..."
train4 hushes Rocky.
I found it quite interesting, and a little fun, but you had to ask the right questions to get a good answer. That was not always possible, because Rocky often responds with another question even when you want it to answer yours.
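Rocky's habit of answering a question with a question is typical of ELIZA-style pattern matching: the bot scans the input for keywords and replies with a canned, often deflecting, response. The Python sketch below is my own simplified illustration of that technique with invented rules, not Rocky's actual code.

```
# Simplified ELIZA-style bot: match keywords/patterns in the input
# and reply with a canned (often deflecting) response. These rules
# are invented for illustration; Rocky's real rules are unknown.
import random
import re

RULES = [
    (r"\bhow (are|was) (you|your)\b", ["I would rather not talk about myself..."]),
    (r"\bwhat is\b",                  ["Don't you know?", "I'm not sure I understand..."]),
    (r"\bno\b",                       ["Why are you so negative?"]),
    (r"\bi (don't|do not) know\b",    ["Hmm, not sure I know either..."]),
]
DEFAULT = ["Do you come here often?", "Sometimes I wish I had stayed asleep..."]

def reply(user_input):
    text = user_input.lower()
    for pattern, responses in RULES:
        if re.search(pattern, text):
            return random.choice(responses)
    return random.choice(DEFAULT)        # deflect when nothing matches

print(reply("How was your sleep?"))      # -> "I would rather not talk about myself..."
print(reply("What is the date?"))        # deflects with a question
```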