



Improving Latency Using MPLS

I. INTRODUCTION

Ethernet latency is the time it takes for a network packet to reach its destination, or the time it takes to return from that destination. It directly determines how long an application must wait for data to arrive [1]. Latency matters as much as download speed: a network with high latency (a slow network) takes longer to transmit information, so web pages load more slowly as each successive image, script, or block of text is requested with a noticeable delay in between [2].

Latency in a packet-switched network is reported as either one-way latency or round-trip time (RTT). One-way latency is the time it takes to send a packet from source to destination; on a symmetric path it can be approximated as RTT/2, that is, (the one-way latency from source to destination plus the one-way latency from destination to source) divided by two [1]. Latency also refers to any of several types of delay typically incurred when processing network data. Low-latency systems must not only get a single message from A to B as quickly as possible, but also do so for millions of messages per second.

End-to-end latency is the cumulative effect of the individual latencies along the end-to-end network path. Of all the devices on that path, routers create the most latency, and packet queuing due to link congestion is most often the cause of significant delay through a router. Because latency is cumulative, the more links and router hops between sender and receiver, the greater the end-to-end latency.

[...] the data must arrive at its destination, and is normally expressed in milliseconds (ms).
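The two relationships above, one-way latency approximated as RTT/2 and end-to-end latency as the sum of per-hop delays, can be sketched in a few lines. This is a minimal illustration; the function names and the sample per-hop delays are assumptions, not measurements from the essay.

```python
def one_way_latency_ms(rtt_ms: float) -> float:
    """Approximate one-way latency as RTT/2.

    This assumes a symmetric path, i.e. the forward and return
    directions contribute equally to the round-trip time.
    """
    return rtt_ms / 2.0


def end_to_end_latency_ms(hop_delays_ms: list[float]) -> float:
    """Latency is cumulative: sum the delay of every link/router hop."""
    return sum(hop_delays_ms)


# Hypothetical example: a measured RTT of 48 ms suggests ~24 ms one way.
print(one_way_latency_ms(48.0))  # 24.0

# Hypothetical per-hop delays (ms) across four hops; more hops means
# a larger total, which is why extra router hops raise latency.
print(end_to_end_latency_ms([2.0, 5.5, 1.5, 3.0]))  # 12.0
```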
Although latency and bandwidth together define the speed and capacity of a network, having a 25 Mbps (megabits per second) connection does not make any single bit of data travel the distance faster. A high-bandwidth connection only lets you send or receive more data in parallel, not faster, because the data still has to cover the same distance and incur the normal delay [8].

IV. THE IMPACT OF LATENCY

Applications whose programming models are likely to suffer performance degradation due to latency include:
• Applications that rely on frequent delivery of transactions one at a time, as opposed to transfers of large amounts of data.
• Applications that track or process data in real time, i.e., "low latency" applications [2].
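The bandwidth-versus-latency point can be made concrete with a simple delivery-time model: total time is propagation delay (set by distance) plus serialization time (set by bandwidth). This is a rough sketch under the stated model only; the function name and the 40 ms / 1 Mb figures are illustrative assumptions.

```python
def transfer_time_ms(payload_bits: float,
                     bandwidth_bps: float,
                     one_way_ms: float) -> float:
    """Delivery time = propagation delay + serialization time.

    Raising bandwidth shrinks only the serialization term; the
    distance-driven propagation delay is a fixed floor.
    """
    serialization_ms = payload_bits / bandwidth_bps * 1000.0
    return one_way_ms + serialization_ms


# Same 40 ms path, 1 Mb payload: quadrupling bandwidth from 25 Mbps
# to 100 Mbps does not quarter the delivery time, because the 40 ms
# propagation floor is unchanged.
print(transfer_time_ms(1_000_000, 25e6, 40.0))   # ~80 ms
print(transfer_time_ms(1_000_000, 100e6, 40.0))  # ~50 ms
```

This is why latency-sensitive applications (the one-transaction-at-a-time workloads listed above) gain little from extra bandwidth: their delivery time is dominated by the fixed propagation term, not the serialization term.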