A server is both a running instance of software capable of accepting requests from clients and the computer that executes such software.
Servers operate within a client–server architecture, in which servers are computer programs that serve the requests of other programs, the clients. The purpose may be to share data, information, or hardware and software resources. Typical computing servers include database servers, file servers, mail servers, print servers, web servers, game servers, and application servers.
The clients may run on the same computer, but typically connect to the server through a network.
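The request–response relationship described above can be sketched with Python's standard `socket` module. In this minimal example, a server thread accepts one connection and echoes back whatever the client sends; because the client connects over the loopback interface, client and server run on the same computer, as noted above. The function and variable names are illustrative, not part of any standard API.

```python
import socket
import threading

def run_echo_server(host="127.0.0.1"):
    """Start a one-shot echo server on an ephemeral port; return the port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))            # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _addr = srv.accept()  # wait for a client to connect
        data = conn.recv(1024)      # read the client's request
        conn.sendall(data)          # serve it back (echo)
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def echo_client(port, message):
    """Connect to the server as a client and return its response."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(message)
        return c.recv(1024)

port = run_echo_server()
print(echo_client(port, b"hello server"))  # b'hello server'
```

The same client code would work unchanged against a remote host; only the address passed to `create_connection` distinguishes a local server from one across a network.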
In the hardware sense, a computer primarily designed as a server is generally specialised in some way for its task. Such machines are sometimes more powerful and reliable than standard desktop computers, but may conversely be simpler and more disposable when clustered in large numbers.
The term server is used quite broadly in information technology. In theory, any computerised process that shares a resource with one or more client processes is a server. So, while the mere existence of files on a machine does not make it a server, a machine that uses some mechanism to share those files can be a file server. Similarly, web server software can be run on any capable computer, so even a laptop or personal computer can fulfil the role of a web server.
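To illustrate the point above, the Python standard library can turn any directory on an ordinary computer into a file server in a few lines. This is a minimal sketch: the directory, file name, and contents are made up for the example, and a real deployment would bind to a public address rather than loopback.

```python
import http.server
import tempfile
import threading
import urllib.request
from functools import partial

# Create a scratch directory containing one file to share.
docroot = tempfile.mkdtemp()
with open(f"{docroot}/notes.txt", "w") as f:
    f.write("shared by a file server")

# SimpleHTTPRequestHandler serves the files in a directory over HTTP,
# turning this machine into a (very small) file server.
handler = partial(http.server.SimpleHTTPRequestHandler, directory=docroot)
httpd = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()

# A client on the same machine fetches the shared file.
port = httpd.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/notes.txt") as resp:
    content = resp.read().decode()
print(content)  # shared by a file server
httpd.shutdown()
```

Nothing about the hardware changed; installing and running sharing software is what made the machine a server.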
In the hardware sense, the word server typically designates computer models specialised for the server role, performing it better than a generic personal computer would.
Hardware requirements for servers vary widely, depending on the server's application.
Since servers of all these classes are usually accessed over a network, many run in "headless" mode (without a monitor or input device), and audio and USB interfaces may be omitted. Processes not needed for the server's function are not run, and many servers lack a graphical user interface (GUI), being administered remotely via SSH or a web browser.
Large traditional single servers need to run for long periods without interruption. Availability must be very high, making hardware reliability and durability extremely important. Mission-critical enterprise servers are highly fault-tolerant and use specialized hardware with low failure rates in order to maximize uptime. Uninterruptible power supplies may be incorporated to guard against power failure, alongside hardware redundancy such as dual power supplies, RAID disk systems, and ECC memory with extensive pre-boot memory testing and verification. Critical components may be hot-swappable, allowing technicians to replace them on the running server without shutting it down, and to guard against overheating, servers may have more powerful fans or use water cooling. They can often be configured, powered up and down, or rebooted remotely using out-of-band management, typically based on IPMI. Server cases are usually flat and wide, designed to be rack-mounted.
These types of servers are often housed in dedicated server centers, which normally provide very stable power, reliable Internet connectivity, and increased security. Noise is less of a concern there, but power consumption and heat output can be a serious issue, so server rooms are equipped with air conditioning.
Modern datacenters are now often built of very large clusters of much simpler servers, and a collaborative effort, the Open Compute Project, has formed around this concept.
A class of small specialist servers called network appliances sits at the low end of the scale, often smaller than common desktop computers.
On the Internet the dominant operating systems among servers are UNIX-like open source distributions, such as those based on Linux and FreeBSD, with Windows Server also having a very significant share. Proprietary operating systems such as z/OS and Mac OS X are also deployed, but in much smaller numbers.
Specialist server-oriented operating systems have traditionally had features such as:
- GUI not available or optional
- Ability to reconfigure and update both hardware and software to some extent without restart
- Advanced backup facilities to permit regular and frequent online backups of critical data
- Transparent data transfer between different volumes or devices
- Flexible and advanced networking capabilities
- Automation capabilities such as daemons in UNIX and services in Windows
- Tight system security, with advanced user, resource, data, and memory protection
- Advanced detection and alerting on conditions such as overheating and processor or disk failure
In practice, today many desktop and server operating systems share similar code bases, differing mostly in configuration.
In a general network environment the following types of servers may be found.
- Application server, a server dedicated to running certain software applications
- Catalog server, a central search point for information across a distributed network
- Communications server, carrier-grade computing platform for communications networks
- Compute server, a server intended for intensive (esp. scientific) computations
- Database server, provides database services to other computer programs or computers
- Fax server, provides fax services for clients
- File server, provides remote access to files
- Game server, a server that video game clients connect to in order to play online together
- Home server, a server for the home
- Mail server, handles transport of and access to email
- Media server, a specialized application server, usually an enterprise-class machine, that provides video on demand
- Name server, provides DNS services
- Print server, provides printer services
- Proxy server, acts as an intermediary for requests from clients seeking resources from other servers
- Sound server, provides multimedia broadcasting and streaming
- Stand-alone server, a server on a Windows network that does not belong to or govern a Windows domain
- Web server, a server that HTTP clients connect to in order to send requests and receive responses along with data content
Almost the entire structure of the Internet is based upon a client–server model. High-level root nameservers, DNS, and routers direct the traffic on the Internet. Millions of servers connected to the Internet run continuously throughout the world, and virtually every action taken by an ordinary Internet user requires one or more interactions with one or more servers. There are exceptions that do not use dedicated servers, for example peer-to-peer file sharing and some implementations of telephony (e.g. pre-Microsoft Skype).
In 2010, data centers (servers, cooling, and other electrical infrastructure) were responsible for 1.1-1.5% of electrical energy consumption worldwide and 1.7-2.2% in the United States. One estimate is that, by enabling efficiency, information and communications technology saves more than five times its own carbon footprint in the rest of the economy.