- DLs are expected to remain stable, yet the computer science field must keep them stable despite rapid advancements in Internet technology
- Systems are expected to be interoperable with other DLs
-Existing systems classified as DLs have resulted from custom-built software development projects. They are built in isolation to suit the needs of a specific community; most DLs are quick responses to urgent needs of a community of users. As DL systems get more complex, extensibility becomes more difficult and maintainability is compromised. There are few software toolkits available to build DLs.
The solution to this problem is the creation of software toolkits. The existing toolkits have two main problems:
1. The range of possible workflows is restricted by the design of the system
2. The software is built either as a monolithic system or as components that communicate using non-standard protocols
In 1999 the OAI was launched in an attempt to address issues of interoperability among DLs. The resulting protocol is simple and popular. In the OAI model, DLs are networks of extended open archives, with each extended OA being a source of data and/or a provider of services. Componentization and standardization are built into the system, which closely resembles the way physical libraries work.
How OAI can provide higher-level DL services:
1. All DL services should be encapsulated within components that are extensions of open archives (I am not sure what this means)
2. All access to the DL services should be through their OAI interfaces
3. The semantics of the OAI protocol should be extended or overloaded as allowed by the protocol, but without contradicting its essential meaning
4. All DL services should access data from other sources using the extended OAI protocol
5. DLs should be constructed as networks of extended open archives
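In practice, OAI requests are plain HTTP GETs whose verb and arguments travel as query parameters. A minimal sketch in Python of building such a request (the base URL archive.example.org is hypothetical; the ListRecords verb and oai_dc metadata prefix come from the OAI-PMH specification):

```python
from urllib.parse import urlencode

def build_oai_request(base_url, verb, **kwargs):
    """Build an OAI-PMH request URL.

    OAI-PMH requests are ordinary HTTP GETs; the protocol verb and
    its arguments are sent as query-string parameters.
    """
    params = {"verb": verb}
    params.update(kwargs)
    return base_url + "?" + urlencode(params)

# Ask a (hypothetical) archive for its Dublin Core records:
url = build_oai_request(
    "http://archive.example.org/oai",   # hypothetical base URL
    "ListRecords",
    metadataPrefix="oai_dc",
)
print(url)
# http://archive.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
```

Because every service speaks this one request format, a harvester only needs the archive's base URL to start pulling metadata from it.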
Digital Libraries and the Problem of Purpose:
-Problem facing public libraries: How will the Internet affect the accepted library purpose? Will they fashion themselves into portals for using the Internet?
-Problem facing academic libraries: How can they perform their traditional functions when faced with rising prices for materials? Perhaps by working with scholarly societies to create their own journals.
Purpose issues with DLs:
1. The idea of an all-digital world will probably not come to pass. By subscribing to this idea, creators of DLs are missing out on opportunities to integrate heterogeneous collections into DLs.
2. More information is not always better. The push to continually put more content into DLs does not necessarily improve them and in some cases makes them less functional.
3. The DL agenda has been largely set by the computer science community. DLs need input from social scientists and the traditional library community.
The Internet and the World Wide Web
-The Internet is an interconnected group of independently managed networks. Each network supports the technology for interconnection.
Local Area Networks - created to link computers within a department or organization
Wide Area Networks - National Networks
IP - Internet Protocol. Joins together the separate network segments that constitute the Internet and assigns a unique (IP) address to every computer on the Internet.
TCP - Transmission Control Protocol. Takes a message, divides it into packets labeled with a destination IP address and a sequence number, and sends them out on the network. The receiving computer reassembles the message, passes it to the application program, and acknowledges that the message has been received.
-Not all packets are received successfully: overloaded routers drop/ignore some packets, so the sending computer never gets an acknowledgement and, after a time-out, resends the packet.
Dropping a packet - what an overloaded router does
Time-out - what triggers resending a packet
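The divide/acknowledge/reassemble cycle is invisible to applications, which just see a reliable byte stream. A minimal loopback sketch (the message text is made up; the OS's TCP stack does the segmenting, acknowledging, and any retransmitting underneath):

```python
import socket
import threading

def run_server(server_sock, result):
    # Accept one connection and read until the peer closes;
    # TCP delivers the bytes reliably and in order.
    conn, _ = server_sock.accept()
    chunks = []
    while True:
        data = conn.recv(4096)
        if not data:
            break
        chunks.append(data)
    result.append(b"".join(chunks))   # the reassembled message
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))         # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=run_server, args=(server, received))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello, reliable world")  # TCP handles packets and acks
client.close()
t.join()

print(received[0])   # b'hello, reliable world'
```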
UDP - User Datagram Protocol. The sending computer sends out a sequence of packets hoping they all arrive; UDP is best-effort and makes no guarantee that packets arrive, arrive in order, or arrive only once. Think streaming audio, where a retransmitted packet would arrive too late to be useful anyway.
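For contrast with TCP, a UDP sketch over loopback (the datagram text is made up; on the loopback interface delivery happens to be reliable, but over a real network the same sendto carries no acknowledgement and a lost datagram is simply gone):

```python
import socket

# Receiver: bind a datagram socket to an OS-chosen loopback port.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
port = recv_sock.getsockname()[1]

# Sender: fire off one self-contained datagram -- no connection,
# no sequence numbers, no acknowledgement, no retransmission.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"audio frame #1", ("127.0.0.1", port))

data, addr = recv_sock.recvfrom(4096)
print(data)   # b'audio frame #1'
```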
Domain Names - human-readable names that map to IP addresses; a single domain name can link to multiple IP addresses.
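The name-to-address mapping can be observed with the standard-library resolver; a minimal sketch assuming only that "localhost" resolves to the loopback address, as it does on virtually every system:

```python
import socket

# DNS (or the local hosts file) maps a name to an IP address.
# "localhost" is the reserved name for the loopback address.
ip = socket.gethostbyname("localhost")
print(ip)   # typically '127.0.0.1'
```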
TCP/IP suite - the family of protocols, and the programs implementing them, built on TCP and IP
-Terminal Emulation - telnet is a program that allows a personal computer to emulate a terminal that relies on a remote computer for processing. Typically used for system administration.
-File Transfer - FTP is the basic protocol for moving files from one computer to another across the Internet. Email uses the Simple Mail Transfer Protocol (SMTP).
World Wide Web - a linked collection of information on many computers around the world. "It provides a convenient way to distribute information over the Internet." Individuals can publish information and users can access that information by themselves without training.
URL - Uniform Resource Locator. Provides a simple, flexible addressing mechanism that allows the web to link information on computers all over the world. Three parts:
1. http is the name of the protocol
2. www.blah.com is the domain name
3. andrew.html is the file on that computer.
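The three-part split can be checked with Python's standard urllib.parse module; the example URL below is assembled from the fragments in the notes:

```python
from urllib.parse import urlparse

# Split a URL into the three parts described above.
parts = urlparse("http://www.blah.com/andrew.html")
print(parts.scheme)   # 'http'          -> the protocol
print(parts.netloc)   # 'www.blah.com'  -> the domain name
print(parts.path)     # '/andrew.html'  -> the file on that computer
```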
HTTP is the protocol used to send messages between web browsers and web servers
MIME types - specify the data type of a file being sent across the Internet
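Python's standard mimetypes module exposes the same extension-to-type table a web server consults when filling in the Content-Type header that tells the browser how to render the body:

```python
import mimetypes

# Guess the MIME type of a file from its extension.
print(mimetypes.guess_type("andrew.html")[0])  # 'text/html'
print(mimetypes.guess_type("photo.jpeg")[0])   # 'image/jpeg'
```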
Reading question
I was hoping we could spend some time discussing OAI. I understand the need for standardization as a means of increasing interoperability among different DLs, but the specifics of the OAI, such as the components, are confusing to me.