Tuesday, November 30, 2010

Muddiest Point

I believe the websites we were given were the weakest pieces of reading for the week. I was able to figure out what issues both sources were trying to inform me about, but I generally prefer receiving my information as articles. When a website is the source, I have to go to the trouble of wandering all over the place, picking up details from one location after another, and summarizing the content myself. With an article, all the details I need are already in front of me, well-prepared for analysis, which saves me a lot of hassle. The “EPIC Terrorism (Total) Awareness Page” was nothing but links to other articles, and I did not know how far I was supposed to read. As for “No Place to Hide,” I did not know whether I was supposed to be analyzing the content of the website or the book to which it appears to be devoted. Either way, I really wish I had something more straightforward to read instead.

Discussion Topic: Online Privacy

The biggest fear regarding privacy is not really the notion of being watched, but what those at the other end can do with the information they gather about the individuals under surveillance. When it is a matter of national security, how threatened people feel depends largely on how much trust they are willing to place in their government. The only way authorities can track down suspects through advanced technology is by maintaining the anonymity of those they are inspecting while implementing proper strategies to counter possible threats. Yet it is not always a government that wants to gather information on people; businesses do as well. Since big businesses are always competing for higher quality, they naturally want to find employees they believe can deliver that level of performance. By turning social networks to their advantage, employers can conduct background checks to determine the value of the candidates they are considering for hire. Whether it is big government or big business doing the monitoring, these circumstances leave people in a desperate position where they must defend their reputations to whatever extent necessary. The best way to fight back is to fight fire with fire. Rather than submitting to the technologies already in wide use (which only grants more power to those who distribute the products), more innovation needs to be encouraged among the people. As people become more tech-savvy and promote their own products, they can establish greater independence from big business as well as big government. As more people become independent, there is more competition; and as more competition arises, there is less chance for any one party to impose its will on all.

Week 13: Comments

Comment #1: http://lostscribe459.blogspot.com/2010/11/week-13-reading-notes.html

Comment #2: http://christyfic.blogspot.com/2010/11/reading-notes-week-13-dec-6-2010.html

Week 13: IT Issues: Security and Privacy

Because the latest breakthroughs in technology allow just about any piece of information to be created, published, and distributed for the general public to see, there is more to consider than the risk of coming across junk. One of the most frightening aspects of the Internet is the growing risk of our privacy being compromised. Since any piece of information can easily make its way across the Internet, some of the content that becomes visible to a wider range of users may include information about individuals. People have every right to feel afraid about their personal backgrounds being made public, especially when there is the issue of how the authorities may react. To remain one step ahead of the law and carry out their plans with greater success, criminals and terrorists have had to adopt more sophisticated methods. As they further ensured their survival, the government understandably became more paranoid. Because certain individuals with an antisocial personality disorder are capable of hiding in plain view, secretly spying on everyone seems like the most logical approach for detecting suspects and uprooting perpetrators. Yet because the judgment of those carrying out the investigations can always be overtaken by ambient paranoia, a lust for power, or other corrupting factors, just about anyone is prone to being labeled a suspect. With that in mind, it seems justifiable for libraries to refuse to disclose information about their patrons. When the law is pursuing those engaged in antisocial activities and a lead points to a library, it is crucial for the staff to cooperate. Yet if there are suspicions that the investigation is being conducted recklessly (i.e., using the flimsiest of reasons to potentially prosecute an innocent civilian), only then does it seem appropriate for the library to stand its ground. Reassurance must be given that the investigators will abide by proper procedure in carrying out their tasks so that the rights of the patrons are still respected.

Based on what I can remember from one of my Information Technology courses during my junior year as an undergraduate, it is often governments that are the first to utilize the latest technological breakthroughs. As soon as a new innovation arrives, its predecessor becomes available to the general public (with big companies often next in line). The United States government is obviously not exempt from this pattern, especially in times of war. It is important to have the most advanced technology readily available and at one’s disposal in order to out-maneuver the enemy with greater ease. The outstanding technologies we have today were developed as a result of the major world events of the 20th century: World War I, World War II, and the Cold War. Had it not been for their global impact, we probably would not have the technological luxuries we enjoy today. However, all this seems to come at a price. As the times change, so does crime and the threats it poses. With terrorism officially becoming the new military threat to civilization, a new kind of challenge is being faced. Since the participants in such activities can carry out their plans in ways that are increasingly difficult to detect out in the open, our government felt compelled to spy on our own people just to pick up on potential leads. The irony of this entire scenario is that we condemned the Soviet Union for its surveillance of civilians (especially in the case of the Stasi in East Germany) and Nazi Germany for the devastation it inflicted, and yet this nation has had to experience the ordeals of the McCarthy trials, the Vietnam War, the Patriot Act, and the War in Iraq. It is becoming apparent that the power people have garnered through the use of updated technology is gradually turning them into everything they have always hated. It is only a matter of time before they are rendered no different, only to be replaced by a new superpower, which will itself go through a similar process all over again.

Monday, November 22, 2010

Muddiest Point

I thought the weakest of all the readings was the article "Using a wiki to manage a library instruction program: Sharing knowledge to better serve patrons" by Charles Allan. I am not saying it was useless. I found it very informative and, considering the topic the class needed to focus on for the week, I knew it fit right in. The problem was that we were also required to watch a TED video, “How a ragtag band created Wikipedia” by Jimmy Wales. Because the video explained how useful a purpose wikis tend to serve (not to mention that Wikipedia is often the first thing that pops into people’s minds whenever the technology is mentioned), I felt as though there was probably no need to bring up the article by Allan. Then again, I tend to absorb more information when I am listening to content rather than reading it, which could explain why I felt I was able to grasp more about wikis from the Wales video than from the Allan article.

Week 12: Comments

Comment #1: http://lostscribe459.blogspot.com/2010/11/week-12-reading-notes.html

Comment #2: http://rjs2600.blogspot.com/2010/11/readings-for-11-29-12-3.html

Week 12: Social Software

One particular habit that members of the scientific community must constantly practice is keeping journals or logs of their work. By keeping track of all their activities throughout the day, scientists maintain and update a source that can serve as a reference for them. The web-log, or blog, not only gives scientists a more efficient means of recording the events of each period as entries, but also lets them easily publish the information and allow others to observe the details of their studies. The drawback is that the data have been recorded in an off-hand manner, from the perspective of a single individual. This in turn can leave the source looking less than presentable and falling short of being considered professional. That is why there are wikis, which encourage more collaboration within the scientific community. As members bring their observations to the table, each individual takes part in assembling the information and editing it to ensure coherence and accuracy. However, just because these sources are being made available does not necessarily mean they are accessible. Of course, there are always search engines to help locate a source, but unless the user knows the title of an article or its publisher, the information basically remains lost in the shuffle. That is why there is the practice of folksonomy to help narrow the search. By providing the means to label sources with the topics readers tend to associate with them, tags serve as an alternative way of retrieving the kinds of information users seek within the confines of a specific subject. Such a practice could be utilized by the scientific community, but it should be intended more for the general public; it is for their sake that the information needs to be made both available and accessible, which means people should be entitled to label the sources however they see fit. Although these technological innovations have given members of the scientific community more efficient means to gather, assemble, publish, and distribute their research, even those tasks should not be made exclusive to those individuals in particular. Thanks to Wikipedia, the general public has not only the power to organize sources of information their own way, but also to create them. The website permits people to conduct their own research in certain fields and contribute their own articles. As more sources of information from the scientific community become widely available over the Internet, users gain more opportunities to access the content, which in turn enables them to create exactly the kind of material a website like Wikipedia needs to fulfill its purpose.
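To make the folksonomy idea above a little more concrete, here is a minimal sketch in Python of how user-supplied tags might map back to sources. The names (add_tags, find_by_tag) and the sample sources are entirely my own invention, not any particular site's implementation; the point is only that many readers' informal labels can all lead back to the same item.

    from collections import defaultdict

    # Minimal sketch of a folksonomy-style tag index: readers attach whatever
    # labels they find fitting, and a lookup by tag returns the matching sources.
    # All identifiers and sample data here are hypothetical.
    tag_index = defaultdict(set)  # tag -> set of source identifiers

    def add_tags(source_id, tags):
        """Record the labels a reader has applied to a source."""
        for tag in tags:
            tag_index[tag.lower()].add(source_id)

    def find_by_tag(tag):
        """Return every source that readers have labeled with this tag."""
        return sorted(tag_index.get(tag.lower(), set()))

    # Two readers tag the same lab write-up differently; both labels now lead back to it.
    add_tags("blog/2010/enzyme-assay", ["biochemistry", "lab notebook"])
    add_tags("blog/2010/enzyme-assay", ["enzymes"])
    add_tags("wiki/protein-folding", ["biochemistry"])
    print(find_by_tag("biochemistry"))  # ['blog/2010/enzyme-assay', 'wiki/protein-folding']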

I am reminded of the book “The Meaning of Everything: The Story of the Oxford English Dictionary” by Simon Winchester. Some of the earlier English dictionaries were developed single-handedly, as was done by Samuel Johnson and Noah Webster. Although the English-speaking world does owe a debt of gratitude for their devotion and hard work, for obvious reasons their contributions simply were not sufficient. When it came to the development of the Oxford English Dictionary, a completely different approach was used. Instead of just taking up the responsibility themselves, the original staff responsible for establishing the project decided that the general public was also to be involved. People were asked to submit lists of words, along with their definitions, while the staff consulted each other on what to accept, what to reject, and what edits needed to be made. From there, the submissions were compiled, organized, and assembled. The parallel between how this endeavor came about and how Wikipedia made its impact should be pretty noticeable. Just like the developers of the Oxford English Dictionary, the creators of Wikipedia knew that assembling a reliable source of information is no easy task for a small group of individuals, which is why they believed better results could be achieved by turning to the general public for assistance. In comparison to Encyclopedia Britannica, Wikipedia offers a lot more flexibility. Whereas the former collects its information very selectively and has its articles created and assembled by a devoted group of scholars in a very professional manner, the latter allows just about anyone to contribute whatever pieces of information they want and to assemble the sources themselves. Although it may appear that Wikipedia functions chaotically, its staff is smart enough to realize that a certain degree of order always needs to be maintained. Because the staff is constantly checking articles for accuracy and neatness, it is clear that the professional model favored by the older generation (as in the case of Encyclopedia Britannica, of course) has never been abandoned, or at least not entirely. Even though the manner in which most people gather and publish information via Wikipedia may not be as professional as how scholars perform their duties for sources such as Encyclopedia Britannica, as long as Wikipedia gives readers a general idea about every available topic (as any other encyclopedia attempts to do), then it is successfully fulfilling its purpose.

Tuesday, November 16, 2010

Muddiest Point

I believe the article by Sarah L. Shreeves, Thomas G. Habing, Kat Hagedorn, and Jeffrey A. Young, “Current Developments and Future Trends for the OAI Protocol for Metadata Harvesting” (Library Trends), was probably the weakest piece of material for the week. I am not saying it was uninformative or incomprehensible in any way. Indeed, I was able to figure out the core elements of the article and how it related to the other two articles. Then again, I had studied the subject before, which could explain why I was able to catch on. The reason I believe this article in particular seemed the weakest is that, when I compare it to the others, I notice they included visuals to depict how the devices they describe function. The article we were given about metadata provides no such aids, which leaves readers to expend a little more effort to figure out how it works. Taking those factors into consideration, it would only seem fair for an article on this topic to follow the example of the other two by using visuals of its own to better explain how the protocol works.

Z39.50 at Zoom@ Pitt and OAI-PMH at NSDL

I can confirm noticing the quirks that “Z39.50 at Zoom@ Pitt” and “OAI-PMH at NSDL” demonstrate. In the case of the former, I think the reason some of the databases function more slowly than others is probably the amount of content being handled. As a database has to manage more content, it obviously takes more time to maneuver through the growing collection; if the load were lighter, the database would not experience as much of this problem. As for the latter, because the organization was created through a government agency, the space and processing capacity it can afford is practically unlimited, at least compared to what the other can achieve. However, this is not to suggest it is without fault. It is likely the website is using more resources than it actually needs. As a result, many options end up being created for users, only to narrow down to the same sets of documents that are actually available on the website, over and over again. Then again, these observations are all based on my own assumptions.
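For anyone curious about what the OAI-PMH side looks like on the wire, here is a rough sketch of a harvester that pages through a repository's records. The endpoint URL is a placeholder rather than NSDL's real address, and the helper names are my own; it is only meant to show the ListRecords/resumptionToken pattern the protocol defines.

    import requests
    import xml.etree.ElementTree as ET

    # Rough sketch of an OAI-PMH harvest: issue ListRecords requests over HTTP
    # and follow resumptionTokens until the repository reports no more pages.
    # The endpoint below is a placeholder, not NSDL's actual address.
    ENDPOINT = "http://example.org/oai"
    OAI = "{http://www.openarchives.org/OAI/2.0/}"

    def harvest(metadata_prefix="oai_dc"):
        params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
        while True:
            root = ET.fromstring(requests.get(ENDPOINT, params=params).content)
            for record in root.iter(OAI + "record"):
                header_id = record.find(OAI + "header/" + OAI + "identifier")
                if header_id is not None:
                    yield header_id.text
            token = root.find(OAI + "ListRecords/" + OAI + "resumptionToken")
            if token is None or not (token.text or "").strip():
                break  # final page reached
            params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

    # Usage: print the first five harvested identifiers.
    # for _, oai_id in zip(range(5), harvest()):
    #     print(oai_id)

If the slowness I noticed really does come from the amount of content being handled, this paging behavior is where it would show up: a bigger collection simply means more round trips before the harvest finishes.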

Week 11: Comments

Comment #1: http://adamdblog.blogspot.com/2010/11/unit-11-reading-notes-11-22-2010.html

Comment #2: http://acovel.blogspot.com/2010/11/unit-11-reading-notes.html

Week 11: Web Search and OAI Protocol

See this link: http://att16.blogspot.com/2010/11/week-11-web-search-and-oai-protocol.html

Tuesday, November 9, 2010

Muddiest Point

I believe the article by Clifford A. Lynch was probably the weakest piece of material. What led me to this opinion is that, compared to the other two, it seems rather wordy. The first two articles from “D-Lib” are a lot more straightforward in conveying what the topic of the week is all about. Because the third article did not demonstrate that sort of brevity, I had a bit more difficulty absorbing its core points. Of course, I am not claiming the article was bad, let alone useless. It was still informative, and I managed to figure out how it related to the other readings. All I am saying is that if the first two articles present their information in a rather straightforward manner, then it would seem logical for the third one to be presented in a similar way. Other than that one issue, I did not really have much trouble with any of the readings.

DiLight System/NYPL DL

The websites we were given clearly demonstrate how the search for items within libraries has been simplified. Whereas working through finding aids in their physical form can be a much more tedious process, the digital versions prove to be far more efficient. Through the use of digital finding aids, the searches show that libraries hold sources of information in all kinds of formats, ranging from books to audio recordings to DVDs. What enables this solid yet flexible structure is the database within the computer system. The database keeps track of all the records indicating which items are the property of the library, and each record provides all the information regarding the item it represents. Since that information also includes where the actual item is located, the records can double as finding aids, at least under the assumption that the represented items happen to be in their proper places.
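As a loose illustration of how a record can double as a finding aid, here is a small sketch of a catalog record that carries both descriptive fields and a shelf location. The field names and sample data are purely illustrative and do not reflect any particular system's schema.

    from dataclasses import dataclass

    # Sketch of a catalog record that doubles as a finding aid: besides the
    # descriptive fields, each record carries the location of the item it represents.
    # Field names and sample data are invented for illustration.
    @dataclass
    class CatalogRecord:
        record_id: str
        title: str
        fmt: str          # "book", "audio", "DVD", ...
        call_number: str
        location: str     # branch / floor / shelf where the item should be

    catalog = {
        "b1001": CatalogRecord("b1001", "The Meaning of Everything", "book",
                               "423.09 W759", "Main - 3rd floor"),
        "d2002": CatalogRecord("d2002", "Orientation DVD", "DVD",
                               "DVD 027.4", "Media desk"),
    }

    def locate(record_id):
        """Use the record as a finding aid: report where the item should be shelved."""
        rec = catalog.get(record_id)
        if rec is None:
            return "No record found."
        return f"{rec.title} ({rec.fmt}) -> {rec.location}, call number {rec.call_number}"

    print(locate("b1001"))  # assumes the item really is in its proper place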

Week 10: Comments

Comment #1: http://rjs2600.blogspot.com/2010/11/readings-for-11-15-11-19.html

Comment #2: http://pittlis2600.blogspot.com/2010/11/week-ten-reading-notes.html

Week 10: Digital Library, Institutional Repositories

Because the introduction of digitization, along with wider use of the Internet, changed the way sources of information can be formatted and organized, it was only a matter of time before libraries, a haven for sources of information, had to adapt to these technological breakthroughs. However, for the incorporation to work, the existing library staff simply could not carry out the task alone, especially with their increasingly outdated methods. The situation called on computer scientists and their field of knowledge to collaborate with librarians. As a result of this cooperation, the transition became a success, which in turn established the digital library. As the digital library became well recognized alongside the increasing popularity of the Internet, more libraries had to keep up with the times by adopting these technologies. By maintaining collections of sources in digital format and using the Internet to keep them available, libraries were able to continue satisfying the needs of the general public, who readily engage in different methods of obtaining information. Yet libraries alone should not be burdened with the task of making digital and digitized sources available to the general public. Universities also hold a treasury of information sources within their archives. Because the academic system needs to keep up with the times as much as the public library system does, it seems logical that those institutions also make use of digitization and the Internet. Through their institutional repositories, the items kept within can be digitized and published over the World Wide Web. As more institutions devoted to collecting, organizing, and maintaining sources of information adopt digitization and the Internet (as well as other technological breakthroughs and trends), and so long as people remain engaged with their gadgets, information can become ever more available and easier to access for the general public.

As Christine L. Borgman notes in her book “Scholarship in the Digital Age: Information, Infrastructure, and the Internet,” the notion of a “digital library” was dismissed at first. According to the skeptics, the concepts of the library and digitization were incompatible: “if a library is a library, it is not digital; if a library is digital, it is not a library.” As time passed, technology once again proved the skeptics wrong. Modern-day technology has clearly demonstrated that the library did not have to remain within the confines of what has traditionally been defined as such. As long as an entity takes on the responsibility of collecting, organizing, and maintaining different sources of information (conventionally from various fields), it can still technically be considered a library, whether in a physical or a digital format. The flexibility of this concept should also apply to the archival and academic communities, given that the items comprising their collections can also exist in physical and digital formats. Because universities often maintain libraries and archives within their institutions, the incorporation of digitization and the Internet makes them some of the most generous contributors to the general public. However, their generosity does not have to stop there, or at least not within those specific areas. Since universities also preserve the research of the scholars who contributed to their institutions, these technologies also enable them to quickly publish those works and make the materials readily available via the Internet, thus allowing even more sources of information to become accessible to the general public.

Wednesday, November 3, 2010

Week 11: Web Search and OAI Protocol

Because people have the means to create their own websites, just about anyone can publish material over the Internet. As more people gain the ability to do so, sources of information can easily get lost in the shuffle. That is why there are search engines to help establish order. A search engine uses an algorithm that calculates how strongly a website is associated with keywords, based in part on the number of visitors it receives. As a result, the most popular websites in every category end up at the top of the list when a search is run. However, just because a website happens to be more popular does not necessarily mean its content is accurate. This is where metadata was introduced, so that websites could be harvested in a more meticulous manner, concentrating more on the quality of the content than on the quantity of visitors. However, that approach is bound to be met with disagreement, or even hostility, because it could encourage a form of elitism: technically speaking, metadata can give a select few the power to determine which websites should be considered superior to others, while the opinions of the many are ignored. With or without metadata, the search results being presented ultimately scratch the surface at most. This is why there was a need for another technology that looks much deeper within the content of websites, providing greater accuracy and efficiency in the search results. The use of the Deep Web achieves this goal through a compromise between the core attributes of the popular approach of the former and the selective approach of the latter.
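To show what the popularity-weighted keyword matching described above might look like in the simplest possible terms, here is a toy sketch. Real search engines use far richer signals (link structure, freshness, metadata, and more), and every site, count, and name below is made up for illustration.

    # Toy sketch of popularity-weighted keyword ranking: score each site by how
    # often the query terms appear in its text, amplified by a visitor count.
    # All sites, counts, and names are invented for illustration only.
    sites = [
        {"url": "http://example.org/a", "text": "open archives metadata harvesting", "visits": 5000},
        {"url": "http://example.org/b", "text": "metadata metadata quality deep web", "visits": 800},
        {"url": "http://example.org/c", "text": "cooking recipes", "visits": 90000},
    ]

    def score(site, terms):
        words = site["text"].lower().split()
        hits = sum(words.count(term) for term in terms)
        return hits * site["visits"]  # popularity amplifies keyword matches

    def search(query):
        terms = query.lower().split()
        ranked = sorted(sites, key=lambda s: score(s, terms), reverse=True)
        return [s["url"] for s in ranked if score(s, terms) > 0]

    print(search("metadata"))  # the heavily visited but irrelevant site never appears

A metadata-driven approach would, in effect, replace the visitor count with someone's judgment about quality, which is exactly where the worry about elitism comes from.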

Regardless of what technology becomes available, what will frequently happen is that a minority ends up overpowering the majority. The only significant difference is whether the shared opinion of the populace or an agreed-upon decision by the elite becomes the determining factor; whichever prevails gets to choose the minority that will overshadow the majority. There are always possibilities that tensions could arise between the two sides, but not all the time. Sometimes the popular choice and the right choice are in fact one and the same, and it is instances of these mutual agreements and understandings that Michael K. Bergman’s proposition attempts to exploit. Even if only the most genuine websites manage to become that minority, the situation remains difficult for the majority. Although there is the relief of knowing that websites providing nothing but junk are more likely to be thrown deeper into the depths, websites of better quality are still going to be potentially ignored. A website can provide information just as professionally as any scholarly source, yet there is still no guarantee it will achieve higher recognition. One factor that the innovation Bergman describes would ignore, and which might also apply to its future successors, is human nature. So long as people want quick results and get bored easily, whatever an individual chooses to look into, chances are the person will only glance at the top ten results at most and will only bother looking thoroughly through one of them (if at all). Unless a website has made an impact on both the populace and the elite, it would be lucky to reach even the top twenty.