Categories: General Technical

How to define latency in computer science?

Latency is the delay between a user’s action and a web application’s response to that action. In networking terms, it is often measured as the round-trip time: how long a data packet takes to travel to its destination and back.
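To make the round trip concrete, here is a minimal sketch that times how long it takes to open a TCP connection to a server. The host name and port are placeholders, not part of the original article; substitute whatever server you want to measure.

```python
import socket
import time

def measure_connect_latency(host: str, port: int = 443) -> float:
    """Return the time, in milliseconds, to establish a TCP connection."""
    start = time.perf_counter()
    # create_connection resolves the host and completes the TCP handshake,
    # so the elapsed time covers one full network round trip (plus DNS).
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    # "example.com" is a placeholder host for illustration only.
    print(f"Latency: {measure_connect_latency('example.com'):.1f} ms")
```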

How to reduce latency

Content Delivery Network (CDN) providers offer customers private networks that bypass the public internet. These private networks reduce latency by giving data packets more efficient paths to travel on.

Caching also reduces latency, because frequently requested content is served from a nearby store instead of being fetched from the origin server on every request.
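As a rough illustration of why caching helps, the sketch below fakes a slow 200 ms lookup and shows that a repeated request is answered from an in-memory cache almost instantly. The function name, delay, and product data are made up for this example.

```python
import time
from functools import lru_cache

@lru_cache(maxsize=256)
def fetch_product(product_id: int) -> dict:
    # Pretend this is a 200 ms round trip to the origin server.
    time.sleep(0.2)
    return {"id": product_id, "name": f"Product {product_id}"}

start = time.perf_counter()
fetch_product(42)            # first call pays the full simulated latency
first_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
fetch_product(42)            # repeated call is served from the cache
second_ms = (time.perf_counter() - start) * 1000

print(f"uncached: {first_ms:.0f} ms, cached: {second_ms:.3f} ms")
```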

How Latency Works

Suppose you are shopping at an online store and you press the “Add to Cart” button for a product you like.

The chain of events that occurs when you press that button is as follows:

  1. You press the “Add to Cart” button.
  2. The browser registers this as an event and initiates a request to the website’s servers. The latency clock starts now.
  3. The request travels to the site’s server with all the relevant information.
  4. The site’s server acknowledges the request. The first half of the latency is complete.
  5. The site’s server processes the request.
  6. The site’s server replies to the request with the relevant information.
  7. The response reaches your browser and the product is added to your cart. With this, the latency cycle is complete.

The latency is the time that it takes for all these events to complete.
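The same cycle can be measured from the client side by starting a timer when the request is sent and stopping it when the response arrives. The sketch below times a hypothetical add-to-cart POST; the URL and payload are illustrative, not a real shop API.

```python
import json
import time
import urllib.request

def add_to_cart(product_id: int) -> float:
    """Time one request/response cycle and return it in milliseconds."""
    payload = json.dumps({"product_id": product_id}).encode()
    req = urllib.request.Request(
        "https://shop.example/cart",           # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    start = time.perf_counter()                # step 2: the latency clock starts
    with urllib.request.urlopen(req, timeout=5) as response:
        response.read()                        # step 7: the reply has arrived
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    print(f"Add-to-cart latency: {add_to_cart(42):.0f} ms")
```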


Source: https://blog.stackpath.com/latency/