
 5 practical Python scripts | Web Scraping Tool | ScrapeStorm

2025-03-27 17:21:34

Abstract: Here are 5 practical Python scripts covering file operations, automation, and web requests, with clear explanations and ready-to-use code.



1. Batch Rename Files

Use Case: Rename all files in a folder (add prefix/suffix/change extension).

import os

def batch_rename(path, prefix="", suffix="", new_ext=None):
    # Note: os.listdir() also returns subdirectories, which will be renamed too
    for filename in os.listdir(path):
        name, ext = os.path.splitext(filename)
        # Keep the original extension unless new_ext is given
        new_name = f"{prefix}{name}{suffix}{new_ext if new_ext else ext}"
        os.rename(
            os.path.join(path, filename),
            os.path.join(path, new_name))
        
# Example: add the prefix "backup_" and force a ".txt" extension on every file
batch_rename("/path/to/folder", prefix="backup_", new_ext=".txt")
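The function above renames every entry in the folder; if the goal is instead to touch only regular files that already have a given extension, a small filter helps. A minimal sketch (the name `batch_rename_filtered` is illustrative, not part of the original script):

```python
import os

def batch_rename_filtered(path, prefix="", suffix="", only_ext=None):
    # Rename only regular files, optionally restricted to one extension
    for filename in os.listdir(path):
        full = os.path.join(path, filename)
        if not os.path.isfile(full):
            continue  # skip subdirectories
        name, ext = os.path.splitext(filename)
        if only_ext and ext.lower() != only_ext.lower():
            continue  # leave files with other extensions untouched
        new_name = f"{prefix}{name}{suffix}{ext}"
        os.rename(full, os.path.join(path, new_name))

# Example: add "backup_" only to .txt files, keeping their extension
# batch_rename_filtered("/path/to/folder", prefix="backup_", only_ext=".txt")
```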

2. Download Images from URLs

Use Case: Download multiple images from a list of URLs.

import os

import requests

def download_images(urls, save_dir="images"):
    os.makedirs(save_dir, exist_ok=True)
    for i, url in enumerate(urls):
        try:
            response = requests.get(url, stream=True, timeout=10)
            response.raise_for_status()  # fail fast on HTTP errors such as 404
            with open(f"{save_dir}/image_{i+1}.jpg", "wb") as f:
                for chunk in response.iter_content(1024):
                    f.write(chunk)
        except Exception as e:
            print(f"Failed to download {url}: {e}")

# Example
urls = ["https://example.com/1.jpg", "https://example.com/2.jpg"]
download_images(urls)
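The function above saves everything as image_1.jpg, image_2.jpg, …, discarding the original filenames. A variant that derives the name from the URL path, with a numbered fallback, might look like this (names such as `filename_from_url` are illustrative):

```python
import os
from urllib.parse import urlparse

import requests

def filename_from_url(url, index):
    # Use the last path segment if the URL has one, else a numbered fallback
    name = os.path.basename(urlparse(url).path)
    return name if name else f"image_{index}.jpg"

def download_images_named(urls, save_dir="images", timeout=10):
    os.makedirs(save_dir, exist_ok=True)
    for i, url in enumerate(urls, start=1):
        try:
            response = requests.get(url, stream=True, timeout=timeout)
            response.raise_for_status()  # treat 4xx/5xx as failures
            with open(os.path.join(save_dir, filename_from_url(url, i)), "wb") as f:
                for chunk in response.iter_content(1024):
                    f.write(chunk)
        except requests.RequestException as e:
            print(f"Failed to download {url}: {e}")
```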

3. Convert CSV to Excel

Use Case: Automate data format conversion (pandas writes .xlsx files through an engine such as openpyxl, which must be installed).

import pandas as pd

def csv_to_excel(csv_path, excel_path):
    df = pd.read_csv(csv_path)
    df.to_excel(excel_path, index=False)

# Example
csv_to_excel("data.csv", "data.xlsx")
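When several CSV files need to land in one workbook, pandas' `ExcelWriter` can put each on its own sheet. A sketch, assuming openpyxl (pandas' default .xlsx engine) is installed; the function name is illustrative:

```python
import os

import pandas as pd

def csvs_to_workbook(csv_paths, excel_path):
    # Write each CSV to its own sheet, named after the source file
    with pd.ExcelWriter(excel_path) as writer:
        for path in csv_paths:
            # Excel caps sheet names at 31 characters
            sheet = os.path.splitext(os.path.basename(path))[0][:31]
            pd.read_csv(path).to_excel(writer, sheet_name=sheet, index=False)

# Example
# csvs_to_workbook(["sales.csv", "costs.csv"], "report.xlsx")
```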

4. Schedule Computer Shutdown

Use Case: Shut down the system after a delay (Windows only).

import os
import time

def schedule_shutdown(minutes):
    seconds = minutes * 60
    print(f"Shutting down in {minutes} minutes...")
    time.sleep(seconds)
    os.system("shutdown /s /t 1")

# Example: Shutdown after 30 minutes
schedule_shutdown(30)
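The script above keeps Python running for the whole delay via `time.sleep`. The shutdown commands themselves can take the delay directly, and Linux/macOS spell the command differently; a hedged cross-platform sketch (the helper names are illustrative):

```python
import os
import platform

def shutdown_command(minutes):
    # Build (but do not run) the platform-appropriate shutdown command
    if platform.system() == "Windows":
        return f"shutdown /s /t {minutes * 60}"   # Windows takes the delay in seconds
    return f"shutdown -h +{minutes}"              # Linux/macOS take the delay in minutes

def schedule_shutdown_cross(minutes):
    # Hands the delay to the OS, so Python exits immediately
    os.system(shutdown_command(minutes))
```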

5. Extract All Links from a Webpage

Use Case: Scrape hyperlinks from a URL.

from bs4 import BeautifulSoup
import requests

def extract_links(url):
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")
        return [a["href"] for a in soup.find_all("a", href=True)]
    except Exception as e:
        print(f"Error: {e}")
        return []

# Example
links = extract_links("https://example.com")
print(links)
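The `href` values returned this way are often relative (e.g. `/about` or `post.html`). The standard library's `urljoin` resolves them against the page URL; a minimal sketch (the function name is illustrative):

```python
from urllib.parse import urljoin

def resolve_links(base_url, hrefs):
    # Turn relative hrefs into absolute URLs; absolute ones pass through unchanged
    return [urljoin(base_url, href) for href in hrefs]

# Example
# resolve_links("https://example.com/blog/", ["/about", "post.html"])
# → ["https://example.com/about", "https://example.com/blog/post.html"]
```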

Disclaimer: This article was contributed by one of our users. Please contact us for immediate removal if it causes any infringement.
