Scrape LinkedIn Sales Navigator search results with pagination and export to markdown
A powerful userscript to automate the scraping of lead search results from LinkedIn Sales Navigator. It handles pagination, extracts detailed profile information, and exports the collected data into a clean, organized Markdown file.
To use this script, you need a userscript manager browser extension. Tampermonkey is recommended.

1. Install a Userscript Manager: add Tampermonkey (or a similar manager) to your browser.
2. Install the Userscript: open the `LinkedIn-Sales-Navigator-Scraper.user.js` file and confirm the installation in your userscript manager.
3. Navigate to https://www.linkedin.com/sales/search/people...
The script includes a `CONFIG` object at the top of the file that allows you to customize its behavior:

- `DELAY_BETWEEN_PAGES`: Time (in ms) to wait after scraping a page before navigating to the next one.
- `MAX_RETRIES`: Number of times to retry a failed operation.
- `BATCH_SIZE`: The expected number of results per page (typically 25).
- `SCROLL_STEP_PX`: The distance (in pixels) to scroll in each step to trigger lazy-loading (see the sketch below).
- `SCROLL_PAUSE_MS`: The pause (in ms) between each scroll step.
- `LOAD_TIMEOUT_MS`: The maximum time (in ms) to wait for all profiles on a single page to load.

This script is intended for personal use to automate data collection. LinkedIn's terms of service prohibit scraping. Use this tool responsibly and at your own risk. Abusing this script may lead to restrictions on your LinkedIn account.
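To show how the scroll-related settings could work together, here is a minimal sketch of a lazy-loading scroll loop. The helper name `scrollUntilLoaded` and the exact loop structure are assumptions for illustration, not the script's actual implementation; the selectors come from the "targeted elements" section further down.

```javascript
// Sketch only (not the script's code): scroll the results container in
// SCROLL_STEP_PX increments, pausing SCROLL_PAUSE_MS between steps, until
// BATCH_SIZE lead cards are present or LOAD_TIMEOUT_MS has elapsed.
async function scrollUntilLoaded(config) {
  const container = document.querySelector('#search-results-container');
  if (!container) return;

  const deadline = Date.now() + config.LOAD_TIMEOUT_MS;
  while (Date.now() < deadline) {
    const loaded = container.querySelectorAll('[data-x-search-result="LEAD"]').length;
    if (loaded >= config.BATCH_SIZE) break;          // all expected results rendered
    container.scrollBy(0, config.SCROLL_STEP_PX);    // nudge lazy-loading
    await new Promise(resolve => setTimeout(resolve, config.SCROLL_PAUSE_MS));
  }
}
```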
1. Install Tampermonkey/Greasemonkey.
2. Install the Userscript: `LinkedIn-Sales-Navigator-Scraper.user.js`
3. Verify Installation.
You can modify the scraping behavior by editing these constants in the script:

```javascript
const CONFIG = {
    DELAY_BETWEEN_PAGES: 3000,  // 3 seconds between pages
    WAIT_FOR_RESULTS: 2000,     // 2 seconds to wait for results to load
    MAX_RETRIES: 3,             // Maximum retries for failed operations
    BATCH_SIZE: 25              // Expected results per page
};
```
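`MAX_RETRIES` and `DELAY_BETWEEN_PAGES` are the kind of values a retry-and-pacing loop would consume. The helper below is a hypothetical sketch of that pattern (the `withRetries` and `scrapeCurrentPage` names are made up for illustration), not code copied from the script.

```javascript
// Hypothetical sketch: retry an operation up to MAX_RETRIES times,
// then pause DELAY_BETWEEN_PAGES before moving to the next page.
async function withRetries(operation, config) {
  for (let attempt = 1; attempt <= config.MAX_RETRIES; attempt++) {
    try {
      return await operation();
    } catch (err) {
      console.warn(`Attempt ${attempt} failed:`, err);
      if (attempt === config.MAX_RETRIES) throw err;
    }
  }
}

// Illustrative usage: scrape the current page, then wait before paginating.
// await withRetries(() => scrapeCurrentPage(), CONFIG);
// await new Promise(r => setTimeout(r, CONFIG.DELAY_BETWEEN_PAGES));
```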
For each profile, the scraper captures:
| Field | Description | Example |
|---|---|---|
| Name | Full name | "Daniel Gindi" |
| Profile URL | Direct link to LinkedIn profile | https://linkedin.com/in/... |
| Title | Current job title | "Data Analyst" |
| Company | Current company | "Bentex" |
| Company URL | Direct link to company page | https://linkedin.com/company/... |
| Location | Geographic location | "New York City Metropolitan Area" |
| Connection Degree | Relationship level | "2nd", "3rd" |
| Mutual Connections | Shared connections count | "6 mutual connections" |
| Shared Education | Educational connections | "Shared education" |
| Experience | Current role duration | "6 months in role" |
| Previous Experience | Past work history | "2023 – 2024 (1 yr 5 mos) Shamco..." |
| Profile Image | Profile photo URL | https://media.licdn.com/... |
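In JavaScript terms, each scraped row could be represented as a plain object along these lines. The property names here are illustrative assumptions and may differ from the script's internal naming.

```javascript
// Illustrative shape of one scraped profile record (property names assumed).
const exampleProfile = {
  name: 'Daniel Gindi',
  profileUrl: 'https://linkedin.com/in/...',
  title: 'Data Analyst',
  company: 'Bentex',
  companyUrl: 'https://linkedin.com/company/...',
  location: 'New York City Metropolitan Area',
  connectionDegree: '2nd',
  mutualConnections: '6 mutual connections',
  sharedEducation: 'Shared education',
  experience: '6 months in role',
  previousExperience: '2023 – 2024 (1 yr 5 mos) Shamco...',
  profileImage: 'https://media.licdn.com/...'
};
```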
The scraper generates a beautiful markdown file with the following structure:
```markdown
# LinkedIn Sales Navigator Search Results

**Date**: 2024-01-15
**Search URL**: https://www.linkedin.com/sales/search/people?...
**Total Profiles**: 47

---

## 1. Daniel Gindi

**Title**: Data Analyst
**Company**: Bentex
**Location**: New York City Metropolitan Area
**Connection**: 2nd
**Mutual Connections**: 6 mutual connections
**Shared Education**: Shared education
**Experience**: 6 months in role
**Previous Experience**: 2023 – 2024 (1 yr 5 mos) Shamco Management Corp. Software Coordinator
**Profile URL**: [View Profile](https://linkedin.com/...)
**Company URL**: [View Company](https://linkedin.com/...)

---
```
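A rough sketch of how such a file could be assembled and downloaded from the browser is shown below. The `buildMarkdown` and `downloadMarkdown` helpers, and the exact field formatting, are assumptions for illustration rather than the script's actual export routine.

```javascript
// Hypothetical sketch: turn scraped profile objects into markdown text and
// trigger a browser download. Not the script's actual export code.
function buildMarkdown(profiles, searchUrl) {
  const header = [
    '# LinkedIn Sales Navigator Search Results',
    `**Date**: ${new Date().toISOString().slice(0, 10)}`,
    `**Search URL**: ${searchUrl}`,
    `**Total Profiles**: ${profiles.length}`,
    '---'
  ].join('\n');

  const body = profiles.map((p, i) =>
    `## ${i + 1}. ${p.name}\n**Title**: ${p.title}\n**Company**: ${p.company}\n---`
  ).join('\n\n');

  return `${header}\n\n${body}\n`;
}

function downloadMarkdown(text, filename) {
  const blob = new Blob([text], { type: 'text/markdown' });
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = filename;   // e.g. linkedin-sales-navigator-results-2024-01-15.md
  a.click();
  URL.revokeObjectURL(a.href);
}
```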
The scraper targets these specific LinkedIn Sales Navigator elements:
- `#search-results-container`
- `[data-x-search-result="LEAD"]`
- `[data-sn-view-name="search-pagination"]`
- `[data-view-name="search-results-lead-name"]`
- `[data-view-name="search-results-lead-company-name"]`
The script activates on URLs matching:

`https://www.linkedin.com/sales/search/people*`

The scraper detects pagination through the URL's page query parameter (`?page=N`). Results are exported to a file named `linkedin-sales-navigator-results-YYYY-MM-DD.md`.
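As a concrete example of both points, reading the current page number and building the dated filename could look roughly like this (a sketch under those assumptions, not the script's exact logic):

```javascript
// Sketch: read the current page from the URL's ?page=N parameter
// (defaulting to 1) and build the dated export filename.
const page = Number(new URLSearchParams(location.search).get('page')) || 1;

const date = new Date().toISOString().slice(0, 10);   // YYYY-MM-DD
const filename = `linkedin-sales-navigator-results-${date}.md`;
```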
Common troubleshooting areas:

- Scraper UI not appearing
- No profiles detected
- Export button disabled
- Navigation issues
To enable debug logging, open the browser console and run:

```javascript
localStorage.setItem('sn-scraper-debug', 'true');
```
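How the script consumes that flag isn't shown here; a typical pattern (an assumption, not confirmed from the source) would be:

```javascript
// Assumed pattern: gate verbose logging on the stored flag.
const DEBUG = localStorage.getItem('sn-scraper-debug') === 'true';
if (DEBUG) console.log('[sn-scraper] debug logging enabled');
```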
Feel free to submit issues, feature requests, or improvements!
This project is provided as-is for educational and personal use. Please respect LinkedIn's Terms of Service and use responsibly.