Friday, October 24, 2008

Research Using the Internet - Evaluating Search Results and Source Credibility

When preparing a research paper, there are many places you can go to find the information you need. Traditionally, you would go to the library and browse its many books looking for exactly what you want, but times have changed. Thanks to the internet, you no longer need to dig through hundreds of books; you can search for articles, books, magazines, and websites containing the information you need, which may be available virtually and/or physically. Research can now be done much more quickly than in the past, but you still need some know-how to get the results you want, from credible sources, fast. Keywords must be carefully chosen to get the results you are looking for from a search engine, and sources must be carefully examined. According to Bonnie Tensen (2004), there are six aspects to examine when evaluating the credibility of a source: the purpose (why it was written), the source (who wrote it), the intended audience (general or in-depth), the date it was published, the appearance (how it looks), and the reputation. Considering these six aspects when evaluating a source should help you identify its credibility and decide whether or not it should be used in your paper.

While researching the Web 2.0 technology Facebook, I carefully noted which keywords and search engines gave me the best results, and critiqued the validity of each result to make sure it would be suitable for my upcoming research paper. The two search engines I used to look for information on Facebook were Google and Yahoo. I decided it would be a good idea to try a broad search using only the keyword “facebook” in each search engine and see what results were returned. The first ten results from each search engine were very similar, though one or two sites differed. Both Google and Yahoo returned the main Facebook site as the first result. This is the actual site where you can sign up for and log into a Facebook account; if you wanted to include some hands-on use of Facebook in your paper, this would be the only place to really get it. Another result both Google and Yahoo returned was the Wikipedia page on Facebook. This page includes a lot of useful information, such as when Facebook was created, who the founder is, and how Facebook has changed in recent years. It seems like a credible source, but it must be noted that Wikipedia's content is added by members of the general public who want to write on the topic. A better idea is to follow the references at the bottom of the page to find where the information was derived from, and use that source in your paper if it is credible. Other results that Google and Yahoo shared were the iPhone Facebook application and the Facebook login page.

There were also a few results Google and Yahoo returned that were not the same. Google returned the Facebook Company Profile from crunchbase.com, as well as Facebook – The Complete Biography from mashable.com. Both of these websites provide in-depth information about Facebook. The profile from crunchbase.com seemed a lot like Wikipedia: there is a lot of information written with sources provided at the bottom of the page, and the site was updated in 2008, making the information very current. The biography that mashable.com provides is also very informative, and even includes pictures showing people how to use Facebook; however, the article is from 2006, so some of the information may no longer be reliable. Two different results that Yahoo provided were The Facebook Blog and Facebook Blast. The Facebook Blog is a blog that is actually run on the Facebook website. All the posts there seemed to come from people who work for Facebook, and some even came from Mark Zuckerberg, the founder of Facebook. This blog seemed like a very credible source for discussing recent Facebook updates. Facebook Blast, on the other hand, might be useful if you were looking to spruce up your Facebook profile, but included no information on Facebook that would be usable for a research paper.

After going through many search results from Google and Yahoo, I decided to give the LexisNexis database a try. In LexisNexis, you use keywords to search just as you do in search engines; LexisNexis uses these keywords to search within the publications of your choice. I chose the keyword “facebook,” as I did in Google and Yahoo, and searched within major U.S. and world publications. Close to one thousand results were returned, sorted by relevance to the keyword I had used. Each result included the title of the article, where it was published, and when. Many of the results were published recently, which indicated to me that the information would be up to date. They also came from publications that I had heard of and already deemed credible. One article from the New York Times was entitled “The Way We Live Now: Facebook Politics” and was published September 14, 2008. It discussed how Facebook was being used by politicians to reach out to voters. Coming from the New York Times, I trust that this is a very credible source. There may be a bias in how the politicians are discussed, but I believe the information about Facebook itself remains factual. Another article I found using LexisNexis was “Facebook Exposes Users To Search Engines,” published by the web-based publication TechWeb on September 5, 2007. This article briefly discusses the issue of privacy on Facebook, and how anybody can find you by typing your name into Google unless your Facebook profile is set to private. The source seems fairly credible, as TechWeb appears to be a large company with many other websites; however, the many ads on the site may detract from its credibility.

After using search engines like Google and Yahoo, and the LexisNexis database, to research the Web 2.0 technology Facebook, I learned a lot about finding information and evaluating sources on the internet. Google and Yahoo returned similar, but not identical, results. Since I found the differing results that each provided useful, I think I will use both search engines in future searches. For the keyword “facebook,” Google returned 580 million results, while Yahoo returned 1.8 billion. In future searches I will try to narrow the results by using Boolean operators like AND or NOT to find exactly what I am looking for. An example would be using the keywords “facebook AND history” to get only results that include both the word facebook and the word history. It will also be important for me to evaluate each result individually to make sure it is credible and can be used in my paper. I especially liked using the LexisNexis database because there were not too many search results and I had the choice of which publications to view. I believe that a majority of the information available there is credible, and I will definitely be using it again for my paper. By adding more terms, I think I will be able to narrow the results down to exactly what I want, which will provide me with useful information in a timely manner.
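The Boolean narrowing described above can be sketched as set operations: AND is an intersection of the result sets for each keyword, and NOT is a set difference. Here is a minimal toy model of that idea; the index and page titles are invented for illustration and have nothing to do with how Google or Yahoo actually index pages.

```python
# Toy model of Boolean keyword search: each keyword maps to the set of
# documents (here, page titles) that contain it. All data is invented.
index = {
    "facebook": {"Facebook home", "Facebook history", "Facebook privacy"},
    "history":  {"Facebook history", "History of the web"},
}

def search(include, exclude=()):
    """Return documents matching every included keyword (AND)
    and none of the excluded ones (NOT)."""
    results = set.intersection(*(index.get(k, set()) for k in include))
    for k in exclude:
        results -= index.get(k, set())
    return results

print(search(["facebook", "history"]))            # → {'Facebook history'}
print(search(["facebook"], exclude=["history"]))  # facebook NOT history
```

The point of the sketch is simply that every keyword you AND in can only shrink the result set, which is why adding terms narrows a search.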

In conclusion, through my personal use of these search engines and the LexisNexis database, I would say that the internet has definitely provided a new and quicker way to research than in the past, but it demands more knowledge and evaluation to ensure the credibility of sources and to get the results you want when searching. If the use of keywords and Boolean operators is perfected, and sources are fully inspected for their credibility, the internet can be an extremely useful tool for researching your next paper.


Sources:
Tensen, Bonnie L. (2004). Research strategies for a digital age (chapter 5). Boston: Wadsworth.

Thursday, September 25, 2008

Usenet: The Current State and The Future

Internet communication has come a long way from what it originally was. Pre-web internet communication was done primarily through text. Most of the time, this communication was asynchronous: users would write something and have to wait for a reply. Later, internet communication adopted synchronous methods, where users could send a message and immediately receive a response (Adams & Clark, 2001). There are multiple places and programs people can use on the internet to communicate, including IRC (Internet Relay Chat), MUDs (Multi-User Dungeons), and Usenet. There, people can search for groups containing discussions related to specific topics. They can choose where they would like to go, how often they participate, what their username will be, and what they will say. This anonymity can be a good thing or a bad thing. If the actual author of a message is unknown, will they be more apt to talk off-topic, spam, and take information without participating? If so, how is this currently dealt with, and what does the future of internet communication hold?

In hopes of finding answers to these questions, I subscribed to a Usenet group. The reason I chose Usenet is because “the Usenet is one of the largest computer-mediated communications systems in existence” (Kollock & Smith, p. 111). I figured that there were enough groups to choose from, and enough members in those groups, that I would really be able to get a grasp on what goes on in the area of internet communication. I decided to join a NY Giants group, which can be found at alt.sports.football.pro.ny-giants, and analyze the communication within the group for five days. Joining the group was not hard. I just clicked subscribe and was asked for my email address and to create a username for the group. After that, I was asked if I would like to receive emails containing any updates or changes to the group. There were a bunch of options to choose from, but the one that suited me best was to receive one email a day advising me of all the activity for that day. I was now set up and began browsing some of the posts.

Because this was a football group and it was a Friday, I expected to see some talk of the NY Giants' past performances and thoughts on how they would fare in Sunday's game. I noticed a few posts related to this, but a lot more that did not. According to Kollock and Smith, these posters would be called grandstanders: people who post without regard to the topic. Two threads posted by grandstanders caught my eye, titled "I am begging everyone, PLEASE vote for the Palin/McCain ticket!" and "OT - Jessica Alba And Her Boobs Again". This is a football group, so what were these doing here? I was also curious how other members would respond. The first thread got many responses calling the poster an "asshole" and a "jackass" for posting it. The other thread didn't receive any responses at all. Looking at the titles of these threads, I noticed that the Jessica Alba one had OT in front of it. I assumed that OT was an abbreviation for off-topic, and that this is why there were no harsh responses. The second day I saw more off-topic threads, some with OT in the title and some without. I also noticed another type of thread that could be considered spam, but also semi-off-topic: a thread linking to an online gambling site. The only reason I say it could be considered semi-off-topic is that football is a sport and people do bet on sports, but because the posters are trying to make money, it is probably just spam. I wondered how all this off-topic activity would affect the group as a whole, and whether it was simply tolerated.

On the third day I found exactly what I was looking for. A new poster with the username tuck91 posted a thread titled "New Member Question". In it he asked, "...if the spam postings bothered anyone else. Or is it just tolerated?" This was exactly what I had been wondering. The responses all revealed a similar approach: the spam posts were simply ignored. A few people gave specific usernames of posters known to spam or create off-topic threads. After this thread was created, I began to see a decrease in off-topic posting and wondered if the thread had anything to do with it. People were now talking about the Giants, what commentators said about the Giants, where they could watch the game, and how the team was playing. On the last day of observing, however, I was sad to see that the off-topic posting had resumed.

In the article “Managing the Virtual Commons,” Peter Kollock and Marc Smith discuss how there is not enough space on Usenet, so it must be conserved for relevant posts in order to ensure a future for the group. They note that free-riders and grandstanders are a big part of the problem. Free-riders are people who take information from a group without ever contributing to it, and grandstanders, as previously defined, are people who post without regard to the topic. In my five days of observing the NY Giants Usenet group, I noticed that grandstanders definitely take up space by creating all types of off-topic threads and spam. I cannot really say how many people are free-riding, because free-riders never post, but I would assume there is a fair share of them.

In closing, I would like to say that my Usenet experience was very educational. I witnessed the daily activities of a Usenet group firsthand, which helped me understand much more about internet communication, how it is managed, and where it may be going. On the first day, when I saw spam and off-topic threads outnumbering the on-topic threads, I wondered how this was tolerated by other users; by day three, someone had asked just that. It seems that grandstanders may never stop spamming and creating off-topic threads as long as the opportunity is there. The only thing to do is ignore these posts and get to what you are looking for. My only suggestion would be for Usenet to appoint some type of moderator, or adopt a program, that would keep certain subjects from being discussed. Until then, it would be best if grandstanders placed the letters OT in the title of any off-topic post, so that other members do not waste their time opening these threads. I think that by adapting to current problems and inconveniences, as well as those of the future, Usenet can be around for an extensive period of time and continue its growth.


Sources
Adams & Clark. (2001). What is it? Characteristics of the medium.
Kollock, Peter & Smith, Marc. (1996). Managing the virtual commons: Cooperation and conflict in the computer communities. In Susan C. Herring (Ed.), Computer-mediated communication: Linguistic, social and cross-cultural perspectives (pp. 109-128). Philadelphia: John Benjamins.
alt.sports.football.pro.ny-giants

Wednesday, September 24, 2008

Web 2.0: The Difference

Today I read the article “What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software” by Tim O’Reilly, published in 2005 on the website http://www.oreillynet.com/. In the article, O’Reilly begins by talking about how the dot-com bubble had seemed to burst. Many websites had taken similar approaches to how the internet works and how their sites should use it, and with so many sites using the internet in the same manner, people began to believe that the web was overhyped. What happened from there was the development of new websites that used the internet in an entirely different way. A new web was born and given the name “Web 2.0”. O’Reilly goes on to contrast the old ways companies used the internet with the new, “Web 2.0” approaches.

According to O’Reilly, one of the methods for becoming successful in the “Web 2.0” era is “harnessing collective intelligence”. What he means is that companies can use information gathered over time to help new or returning visitors; with more information present than before, each visit can be made easier and more informative than the last. One example O’Reilly gives in this section is Amazon.com, which “harnesses collective intelligence” by keeping track of user reviews and using them to rank search results. This is a great example that I feel many internet consumers can relate to. When you are in the process of buying something, you usually want a product that is not only said to be good, but also tested and verified, and the best source for that information is the consumers who have already tried it. Another way I have seen Amazon.com use previous information to help the consumer is in recommending other products: the site tells you what other people have purchased after buying the product you are looking at. This is helpful to the customer and also to Amazon.com. They may be helping you pick a matching pair of pants for your shirt, but if you make the purchase, they also profit.
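The “customers also bought” behavior described above can be imitated with a simple co-purchase count: for a given product, tally which other items appear in the same orders and rank them by frequency. This is only a sketch of the general idea, not Amazon's actual algorithm, and the order data below is invented.

```python
from collections import Counter

# Invented purchase histories: each inner list is one customer's order.
orders = [
    ["shirt", "pants", "belt"],
    ["shirt", "pants"],
    ["shirt", "hat"],
    ["pants", "belt"],
]

def also_bought(item, orders):
    """Rank other items by how often they appear in orders containing `item`."""
    counts = Counter()
    for order in orders:
        if item in order:
            counts.update(p for p in order if p != item)
    return [p for p, _ in counts.most_common()]

print(also_bought("shirt", orders))  # "pants" ranks first: bought with "shirt" twice
```

Even this tiny version shows why the feature serves both sides: the ranking is derived entirely from what previous customers actually did, so it helps the shopper while steering them toward likely purchases.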

The concept of “Web 2.0” is very complex and can be discussed for hours, and it is a technology that will continue to advance. “Web 2.0” takes different approaches to using the internet than the previous version, and these new approaches are what make it more useful than its predecessor. The internet will continue to evolve as time goes by, and innovations will continue to occur. The important thing is to stay ahead of the game, because history repeats itself: when the “Web 2.0” sites become oversaturated, it is the “Web 3.0” sites that will rise to the top.

Tuesday, September 23, 2008

Final Day of Usenet

It has been five days now. I searched for a Usenet group that interested me, joined, and read it every day. I have seen topics that are relevant, topics that are semi-relevant, and topics that are totally off-topic. Today I am seeing more off-topic postings; people must get bored of talking football after the games are over :). I am seeing postings of more "naked" celebrities that may be real or fake, and more political spam, which I assume will be on the rise with the elections nearing. I don't want to say too much, because I will be posting an essay combining all of my findings with a theory I have come to about Usenet and its future. So if my past posts have kept you entertained or enlightened, then stay tuned for my next post, "Usenet Today and Its Future".

Usenet from Yesterday

Hey. Sorry I didn't have time to post yesterday, but I still have all of yesterday's updates thanks to the daily email I receive. I have seen an increase in useful, relevant conversation. Some people are still posting off-topic and semi-off-topic threads about gambling picks in the NFL, but for the most part relevant conversation is emerging. Yesterday was Monday, so there was talk about Sunday's game. People discussed how individual members of the team played, how the team as a whole played, and how the referees called the game. The Giants did win, but there was still criticism. I also noticed some older posts that received responses and got bumped up. One interesting one was about the game being streamed live online. I did not know that the NY Giants games are streamed online for free! That's about it for yesterday. I will be posting today's observations in a few minutes.

Sunday, September 21, 2008

Usenet Being Questioned?

So today I took a look at the daily email and found another interesting post. A new member, by the name of tuck91, posted a thread titled "New Member Question". In this thread he asks, "...if the spam postings bothered anyone else. Or is it just tolerated?" A bunch of people responded telling him that they usually ignore them. On a better note, I also saw a thread with relevant information. The thread was titled "Eddie George calls Giants a fraud", and discusses how Eddie George said that the current 2-0 team he thought was a fraud was the NY Giants. The responses were a bunch of hating on Eddie George; apparently the college team he played football on was caught cheating, so he has no credibility. I have to say I am starting to like this Usenet. You learn something new every day! Until next time...

The Creation of New Media

Today I read Chapter 3 of the book The Internet: The Basics entitled “New Media and Web Production”. This book was published in 2002 and written by Jason Whitaker. In this chapter, Whitaker does a good job of explaining the new media made available through the creation of the computer, and how they differ from media of the past. The main media that Whitaker focuses on in this chapter are text, hypertext, digital images, audio and video. Later in the chapter he discusses the basics of HTML, how the language has improved, and other languages and scripts that have stemmed from HTML.

According to Whitaker, the internet started out text-based and has evolved over time as technology has improved. Hypertext was originally used to link text-based documents to other text-based documents, but as access to high-speed internet has become more widespread, hypertext can now link images, videos, and audio as well. Another reason these types of media can be linked through hypertext is the ability to compress data. According to Whitaker, because there are certain colors we cannot distinguish, certain sound frequencies we cannot hear, and certain parts of videos we do not need to see, they can be eliminated, compressing the file to reduce its size. As a frequent listener and watcher of internet audio and video, I find this very interesting, and it makes complete sense: if we can't see or hear certain things, why do they need to be there? Plus, eliminating these aspects and reducing file sizes makes downloading a more enjoyable experience due to the decreased wait time.
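The principle Whitaker describes, dropping detail we cannot perceive so the file shrinks, can be illustrated with a toy quantizer. This is only a sketch of the idea behind lossy compression, not any real audio or image codec, and the sample values are invented.

```python
# Toy "lossy compression": snap samples to a coarser grid of levels.
# Tiny differences (too small to notice) are discarded, so the coarse
# samples repeat, and a later encoding stage could store them in less
# space. All sample values are invented.
samples = [100, 101, 103, 102, 220, 221, 219, 218]

def quantize(samples, step):
    """Round each sample to the nearest multiple of `step`."""
    return [round(s / step) * step for s in samples]

coarse = quantize(samples, 10)
print(coarse)                      # → [100, 100, 100, 100, 220, 220, 220, 220]
print(len(set(samples)), "->", len(set(coarse)))  # distinct values: 8 -> 2
```

Going from eight distinct values to two loses information permanently, which is the trade lossy formats make: smaller files in exchange for detail the listener or viewer was never going to notice anyway.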

Another part of the reading I found interesting was the section on digital imaging. Whitaker says, “There was a time, until very recently, a photograph was the touchstone for the truth of an event.” According to Whitaker, before images became digital, editing a photograph was long, hard work, which is why people could usually use pictures to determine the accuracy of an event. Today, however, editing a picture has become a lot easier with access to digital cameras and photo editing software. Take, for instance, the picture of Sarah Palin: she is seen holding a rifle in a bikini, but is that her? When I first saw this picture I was sure it was, but later the original picture was found. Somebody had replaced the other person's head with Palin's, making it seem like it was actually her when it wasn't. This just goes to show that you cannot always believe what you see.