WE CODE NOW

Automating Tasks with Python

Posted on June 1, 2024  (Last modified on June 8, 2024) • 2 min read • 319 words
Python · Automation · Scheduling · Web Scraping

Discover how to automate repetitive tasks with Python, including file operations, web scraping, and using scheduling libraries.

On this page
  • Automating File Operations
    • Renaming Files
    • Moving Files
  • Automating Web Scraping
    • Using BeautifulSoup
  • Scheduling Tasks
    • Using Schedule Library
    • Using Cron Jobs
  • Conclusion

Python is an excellent language for automating repetitive tasks. This guide covers automating file operations, web scraping, and using scheduling libraries to run tasks at specific intervals.

Automating File Operations  

Renaming Files  

import os

directory = "/path/to/directory"
# Replace "old" with "new" in every filename in the directory
for filename in os.listdir(directory):
    new_filename = filename.replace("old", "new")
    os.rename(os.path.join(directory, filename), os.path.join(directory, new_filename))

Renaming can also be driven by a filename pattern, for example changing every .txt extension to .md:

import os

directory = "/path/to/directory"
for filename in os.listdir(directory):
    if filename.endswith(".txt"):
        new_filename = filename.replace(".txt", ".md")
        os.rename(os.path.join(directory, filename), os.path.join(directory, new_filename))
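The same renaming pattern can also be written with the standard-library pathlib module. A minimal sketch (the function name rename_suffix is introduced here for illustration, not part of any library):

```python
from pathlib import Path

def rename_suffix(directory, old_suffix, new_suffix):
    """Rename every regular file ending in old_suffix to use new_suffix instead."""
    for path in Path(directory).iterdir():
        if path.is_file() and path.suffix == old_suffix:
            path.rename(path.with_suffix(new_suffix))
```

Path.with_suffix replaces only the final extension, which avoids the str.replace pitfall of matching ".txt" anywhere else in the filename (for example "notes.txt.bak").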

Moving Files  

import os
import shutil

source = "/path/to/source"
destination = "/path/to/destination"
for filename in os.listdir(source):
    shutil.move(os.path.join(source, filename), os.path.join(destination, filename))

Files can also be moved conditionally, for example only those whose names start with log_:

import shutil
import os

source = "/path/to/source"
destination = "/path/to/destination"
for filename in os.listdir(source):
    if filename.startswith("log_"):
        shutil.move(os.path.join(source, filename), os.path.join(destination, filename))
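If the destination folder may not exist yet, the move can be wrapped in a small helper that creates it first. A sketch (move_matching is a name invented here for illustration):

```python
import os
import shutil

def move_matching(source, destination, prefix):
    """Move files whose names start with prefix into destination, creating it if needed."""
    os.makedirs(destination, exist_ok=True)
    moved = []
    for filename in os.listdir(source):
        if filename.startswith(prefix):
            shutil.move(os.path.join(source, filename), os.path.join(destination, filename))
            moved.append(filename)
    return moved
```

Returning the list of moved names makes the helper easy to log or test.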

Automating Web Scraping  

Using BeautifulSoup  

import requests
from bs4 import BeautifulSoup

url = "http://example.com"
response = requests.get(url)
soup = BeautifulSoup(response.text, "html.parser")

# Extracting all links
for link in soup.find_all('a'):
    print(link.get('href'))

A more advanced example extracts every HTML table into a list of dictionaries keyed by the table's header cells:

import requests
from bs4 import BeautifulSoup

url = "http://example.com"
response = requests.get(url)
soup = BeautifulSoup(response.text, "html.parser")

data = []
for table in soup.find_all('table'):
    headers = [header.text for header in table.find_all('th')]
    for row in table.find_all('tr')[1:]:
        values = [value.text for value in row.find_all('td')]
        data.append(dict(zip(headers, values)))

print(data)
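The data list above (one dict per table row) maps directly onto the standard csv module. A minimal sketch, assuming all rows share the same keys:

```python
import csv

def save_rows(rows, path):
    """Write a list of dicts, one per table row, to a CSV file with a header line."""
    if not rows:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```

Passing newline="" is the documented way to open CSV files for writing, so the csv module controls line endings itself.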

Scheduling Tasks  

Using Schedule Library  

First, install the schedule library:

pip install schedule

Then define a job and run a loop that executes pending tasks:

import schedule
import time

def job():
    print("Task running...")

# Schedule job every minute
schedule.every(1).minutes.do(job)

while True:
    schedule.run_pending()
    time.sleep(1)

The library can also run tasks at specific times of day:

import schedule
import time

def morning_task():
    print("Morning task running...")

def evening_task():
    print("Evening task running...")

# Schedule tasks
schedule.every().day.at("08:00").do(morning_task)
schedule.every().day.at("18:00").do(evening_task)

while True:
    schedule.run_pending()
    time.sleep(1)
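One caveat: an unhandled exception inside a job will terminate the while loop above. A plain-Python decorator can keep the scheduler alive by printing the error instead (catch_exceptions is a sketch introduced here, not part of the schedule library):

```python
import functools
import traceback

def catch_exceptions(job):
    """Wrap a job so its exceptions are printed rather than crashing the scheduler loop."""
    @functools.wraps(job)
    def wrapper(*args, **kwargs):
        try:
            return job(*args, **kwargs)
        except Exception:
            traceback.print_exc()
    return wrapper
```

Register the wrapped function as usual, for example schedule.every(1).minutes.do(catch_exceptions(job)).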

Using Cron Jobs  

For more advanced scheduling, use system cron jobs (Linux/macOS) or Task Scheduler (Windows).

# Edit crontab
crontab -e

# Add a job that runs every day at midnight
0 0 * * * /usr/bin/python3 /path/to/your_script.py
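Since cron jobs run without a terminal, it is common to redirect their output to a log file; the log path below is a placeholder:

```shell
# Run daily at midnight, appending stdout and stderr to a log file
0 0 * * * /usr/bin/python3 /path/to/your_script.py >> /path/to/cron.log 2>&1
```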

Conclusion  

Automating tasks with Python can save time and improve productivity. Practice automating file operations, web scraping, and scheduling tasks to streamline your workflows.

Copyright © 2025 WE CODE NOW All rights reserved.