Get All Links on a Webpage using Selenium

Hello friends! At times during automation, we need to fetch all the links present on a webpage. This is also one of the most frequent requirements in web scraping. In this tutorial, we will learn how to fetch all the links present on a webpage using the tag name locator.

If you have a basic understanding of HTML, you will be aware that every hyperlink is an anchor element, i.e. an 'a' tag.
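For instance, a typical hyperlink in a page's source looks like this (a hypothetical example; the URL and text are placeholders):

```html
<!-- getText() would return the visible text "Selenium Tutorials";
     getAttribute("href") would return "https://example.com/selenium" -->
<a href="https://example.com/selenium">Selenium Tutorials</a>
```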

How to fetch all the links on a webpage?

  • Navigate to the desired webpage
  • Get a list of WebElements with tag name 'a' using driver.findElements() -
    List<WebElement> allLinks = driver.findElements(By.tagName("a"));
  • Traverse through the list using for-each loop
  • Print the link text using getText() along with its address using getAttribute("href")
    System.out.println(link.getText() + " - " + link.getAttribute("href"));

Sample Code to Scrape Links

package seleniumTutorials;
import java.util.List;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;
public class GetAllLinks {

public static void main(String[] args) {
    WebDriver driver = new FirefoxDriver();

    //Launching sample website (placeholder URL - replace with the site you want to scrape)
    driver.get("https://www.example.com");

    //Get list of web elements with tag name - a
    List<WebElement> allLinks = driver.findElements(By.tagName("a"));

    //Traversing through the list and printing each link's text along with its address
    for (WebElement link : allLinks) {
        System.out.println(link.getText() + " - " + link.getAttribute("href"));
    }

    //Commenting driver.quit() so the user can easily verify the links
    //driver.quit();
}
}
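The sample above assumes Selenium's Java bindings are on the classpath and that geckodriver is set up for Firefox. With Maven, the dependency is typically declared as follows (the version shown is illustrative; use a current release):

```xml
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <!-- illustrative version; check for the latest release -->
    <version>4.21.0</version>
</dependency>
```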


From QA genius