I Automated My Instagram With Python. Here Is What Actually Worked.

Managing an Instagram account manually is one of those tasks that feels productive but rarely is. You spend twenty minutes liking posts, another ten following accounts in your niche, then schedule a story and respond to comments, and by the time you are done the algorithm has already moved on. The time investment is real. The returns are inconsistent.

I decided to automate as much of it as possible using Python. Not because I wanted to game the platform, but because I wanted to spend my time creating content instead of doing repetitive tasks that a script could handle more consistently than I ever would manually.

After three weeks of building, testing, and adjusting, here is what worked, what did not, and the exact code behind each automation.

What You Can and Cannot Automate on Instagram

Before writing a single line of code, it is worth being direct about the boundaries. Instagram’s official API, the Graph API, is designed for business accounts and has strict rate limits and restricted permissions. It does not allow automated liking, following, or direct message sending through official channels.

The automations in this article use two approaches. The first is Instagram’s official Graph API for things it does support, specifically publishing content, reading insights, and managing comments on business accounts. The second is Instagrapi, a well-maintained unofficial Python library that communicates with Instagram’s private API, the same one the mobile app uses.

Using unofficial libraries carries risk. Instagram can and does restrict accounts that show bot-like behavior patterns. The mitigations for this are built into every script in this article: random delays between actions, human-realistic activity windows, and conservative rate limits that stay well below what would trigger detection.
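The activity-window idea is simple to sketch: only let the script act during hours a human would plausibly be online, and never space actions evenly. A minimal example (the window boundaries and delay bounds here are my own assumptions, not values from Instagram):

```python
import datetime
import random

# Hypothetical activity window: only act between 08:00 and 22:00 local time.
ACTIVE_START_HOUR = 8
ACTIVE_END_HOUR = 22

def within_activity_window(now=None) -> bool:
    """Return True if the current time falls inside the human-like window."""
    now = now or datetime.datetime.now()
    return ACTIVE_START_HOUR <= now.hour < ACTIVE_END_HOUR

def human_delay(min_s: float = 2.0, max_s: float = 8.0) -> float:
    """Pick a randomized delay so consecutive actions are never evenly spaced."""
    return random.uniform(min_s, max_s)
```

Each script below can gate its work behind `within_activity_window()` and sleep for `human_delay()` seconds between actions.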

Setup and Installation

# Install required libraries
# pip install instagrapi pillow requests schedule python-dotenv

from instagrapi import Client
from instagrapi.exceptions import ChallengeRequired
import os
import time
import random
from dotenv import load_dotenv

load_dotenv()

USERNAME = os.environ.get("INSTAGRAM_USERNAME")
PASSWORD = os.environ.get("INSTAGRAM_PASSWORD")

def create_client() -> Client:
    cl = Client()
    cl.delay_range = [2, 5]  # Random delay between 2-5 seconds on every action

    session_file = "instagram_session.json"

    if os.path.exists(session_file):
        try:
            cl.load_settings(session_file)
            cl.login(USERNAME, PASSWORD)
            print("Logged in using saved session.")
            return cl
        except Exception:
            print("Session expired. Logging in fresh.")

    try:
        cl.login(USERNAME, PASSWORD)
        cl.dump_settings(session_file)
        print("Login successful. Session saved.")
        return cl
    except ChallengeRequired:
        print("Instagram requires verification. Complete it manually first.")
        raise

cl = create_client()

Saving the session to a file is important. Logging in fresh every time the script runs increases the likelihood of triggering Instagram’s security checks. A saved session behaves like a persistent device login, which is far less suspicious.

Automation 1: Auto-Post With Captions and Hashtags

The most valuable automation for most creators is scheduled posting. This script takes an image, a caption, and a hashtag list and posts them at a specified time:

from PIL import Image
from pathlib import Path
import textwrap

def prepare_image(image_path: str) -> str:
    img = Image.open(image_path)

    # Instagram requires width between 320px and 1440px
    width, height = img.size
    if width < 320:
        ratio = 320 / width
        img = img.resize((320, int(height * ratio)), Image.LANCZOS)
    elif width > 1440:
        ratio = 1440 / width
        img = img.resize((1440, int(height * ratio)), Image.LANCZOS)

    # Convert to RGB if needed (PNG with transparency)
    if img.mode != "RGB":
        img = img.convert("RGB")

    output_path = f"prepared_{Path(image_path).name}"
    img.save(output_path, "JPEG", quality=95)
    return output_path

def post_photo(
    image_path: str,
    caption: str,
    hashtags: list,
    location_name: str = None
) -> str:
    prepared = prepare_image(image_path)
    full_caption = f"{caption}\n\n{' '.join(hashtags)}"

    # Add location if provided
    location = None
    if location_name:
        locations = cl.location_search(location_name)
        if locations:
            location = locations[0]

    media = cl.photo_upload(
        path=prepared,
        caption=full_caption,
        location=location
    )

    print(f"Posted successfully. Media ID: {media.pk}")
    return media.pk

# Example usage
post_photo(
    image_path="content/python_tip.jpg",
    caption="Here is a Python security tip that most developers skip.",
    hashtags=["#Python", "#PythonDeveloper", "#CyberSecurity", "#Programming", "#CodeTips"],
    location_name="San Francisco"
)

Automation 2: Scheduled Posting Queue

Manually running the post script at specific times defeats the purpose. This scheduler reads from a content queue and posts automatically at the right time:

import schedule
import json
from datetime import datetime

QUEUE_FILE = "post_queue.json"

def load_queue() -> list:
    if not os.path.exists(QUEUE_FILE):
        return []
    with open(QUEUE_FILE, "r") as f:
        return json.load(f)

def save_queue(queue: list):
    with open(QUEUE_FILE, "w") as f:
        json.dump(queue, f, indent=2)

def add_to_queue(image_path: str, caption: str, hashtags: list, post_time: str):
    queue = load_queue()
    queue.append({
        "image_path": image_path,
        "caption": caption,
        "hashtags": hashtags,
        "post_time": post_time,
        "status": "pending"
    })
    save_queue(queue)
    print(f"Added to queue. Scheduled for: {post_time}")

def process_queue():
    queue = load_queue()
    current_time = datetime.now().strftime("%Y-%m-%d %H:%M")

    for post in queue:
        if post["status"] == "pending" and post["post_time"] <= current_time:
            try:
                post_photo(
                    post["image_path"],
                    post["caption"],
                    post["hashtags"]
                )
                post["status"] = "published"
                post["published_at"] = datetime.now().isoformat()
                print(f"Published: {post['caption'][:50]}...")
            except Exception as e:
                post["status"] = "failed"
                post["error"] = str(e)
                print(f"Failed to post: {str(e)}")

    save_queue(queue)

# Add posts to the queue
add_to_queue(
    image_path="content/monday_tip.jpg",
    caption="5 Python security tools every developer should know.",
    hashtags=["#Python", "#Security", "#Developer"],
    post_time="2026-03-09 09:00"
)

# Run the scheduler
schedule.every(5).minutes.do(process_queue)

print("Scheduler running. Press Ctrl+C to stop.")
while True:
    schedule.run_pending()
    time.sleep(30)

Automation 3: Hashtag Research and Engagement Tracker

Knowing which hashtags actually drive reach is more valuable than blindly copying a popular list. This script analyzes a set of hashtags and returns engagement metrics to help you choose the most effective ones for your niche:

def analyze_hashtags(hashtag_list: list, posts_to_check: int = 10) -> list:
    results = []

    for tag in hashtag_list:
        try:
            # Random delay to avoid rate limiting
            time.sleep(random.uniform(3, 7))

            tag_info = cl.hashtag_info(tag)
            recent_medias = cl.hashtag_medias_recent(tag, amount=posts_to_check)

            total_likes = sum(m.like_count for m in recent_medias)
            total_comments = sum(m.comment_count for m in recent_medias)
            avg_engagement = (total_likes + total_comments) / max(len(recent_medias), 1)

            results.append({
                "hashtag": tag,
                "total_posts": tag_info.media_count,
                "avg_engagement": round(avg_engagement, 1),
                "competition_level": (
                    "High" if tag_info.media_count > 1_000_000
                    else "Medium" if tag_info.media_count > 100_000
                    else "Low"
                )
            })

            print(f"#{tag}: {tag_info.media_count:,} posts, avg engagement: {avg_engagement:.1f}")

        except Exception as e:
            print(f"Could not analyze #{tag}: {str(e)}")
            continue

    results.sort(key=lambda x: x["avg_engagement"], reverse=True)
    return results

tags_to_analyze = [
    "pythonprogramming", "pythondeveloper", "cybersecurity",
    "fastapi", "programmerlife", "coding", "devops"
]

best_hashtags = analyze_hashtags(tags_to_analyze)

print("\nTop hashtags by engagement:")
for tag in best_hashtags[:5]:
    print(f"#{tag['hashtag']}: {tag['avg_engagement']} avg engagement ({tag['competition_level']} competition)")

Automation 4: Comment Monitor and Auto-Responder

Responding to comments quickly improves engagement rate significantly. This script monitors recent posts for new comments and sends a templated response to first-time commenters:

import re

RESPONSE_TEMPLATES = [
    "Thank you for reading! Really appreciate your comment.",
    "Great point! Glad this was helpful.",
    "Thanks for engaging! Let me know if you have questions.",
    "Appreciate the feedback! More content like this coming soon.",
]

def get_responded_users(log_file: str = "responded_users.json") -> set:
    if not os.path.exists(log_file):
        return set()
    with open(log_file, "r") as f:
        return set(json.load(f))

def save_responded_user(username: str, log_file: str = "responded_users.json"):
    users = get_responded_users(log_file)
    users.add(username)
    with open(log_file, "w") as f:
        json.dump(list(users), f)

def monitor_and_respond(posts_to_check: int = 5, dry_run: bool = True):
    responded_users = get_responded_users()
    user_medias = cl.user_medias(cl.user_id, amount=posts_to_check)

    for media in user_medias:
        comments = cl.media_comments(media.pk, amount=20)

        for comment in comments:
            username = comment.user.username

            if username == USERNAME:
                continue
            if username in responded_users:
                continue
            if len(comment.text) < 3:
                continue

            response = random.choice(RESPONSE_TEMPLATES)

            if dry_run:
                print(f"[DRY RUN] Would respond to @{username}: {response}")
            else:
                time.sleep(random.uniform(10, 20))
                cl.media_comment(media.pk, f"@{username} {response}")
                save_responded_user(username)
                print(f"Responded to @{username}")

# Test with dry_run=True first, then switch to False
monitor_and_respond(posts_to_check=3, dry_run=True)

The dry_run parameter is there for a reason: always run the monitor in dry-run mode first to verify the filtering and response logic before enabling live replies.

Rate Limiting and Safety Guidelines

Every script in this article includes delays, but the overall daily limits matter as much as individual action delays. These are the conservative limits that kept my accounts safe across three weeks of testing:

DAILY_LIMITS = {
    "posts": 3,
    "comments": 30,
    "hashtag_lookups": 50,
    "profile_views": 100,
}

# Always add randomized delays between actions
def safe_delay(min_seconds: int = 5, max_seconds: int = 15):
    delay = random.uniform(min_seconds, max_seconds)
    time.sleep(delay)
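
The DAILY_LIMITS dictionary is only useful if something enforces it. A minimal counter the scripts could consult before each action might look like this sketch (the JSON counter file name is my own choice; counts reset automatically when the date rolls over):

```python
import json
import os
from datetime import date

DAILY_LIMITS = {"posts": 3, "comments": 30, "hashtag_lookups": 50, "profile_views": 100}
COUNTER_FILE = "daily_counts.json"

def _load_counts() -> dict:
    # Start a fresh counter whenever the stored date is not today
    today = date.today().isoformat()
    if os.path.exists(COUNTER_FILE):
        with open(COUNTER_FILE) as f:
            data = json.load(f)
        if data.get("date") == today:
            return data
    return {"date": today}

def allow_action(action: str) -> bool:
    """Return True and record the action if today's limit has not been hit."""
    counts = _load_counts()
    used = counts.get(action, 0)
    if used >= DAILY_LIMITS[action]:
        return False
    counts[action] = used + 1
    with open(COUNTER_FILE, "w") as f:
        json.dump(counts, f)
    return True
```

Wrapping every comment or lookup in `if allow_action("comments"):` turns the limits table into a hard ceiling rather than a guideline.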

Staying well inside these limits produced zero account restrictions across the testing period. Pushing against them, even once, triggered a temporary action block that lasted six hours.

What Actually Worked After Three Weeks

The scheduled posting queue was the most immediately valuable automation. Consistent posting at peak engagement times, without having to manually remember or be at a computer, produced a measurable improvement in reach within the first week.

The hashtag analyzer revealed that medium-competition hashtags with high average engagement outperformed high-follower hashtags consistently. Three of the seven hashtags I had been using were actively hurting reach because of extremely low engagement rates relative to their post volume.

The comment responder worked but required more careful tuning than expected. The generic templates were fine for casual comments but felt hollow on substantive ones. The better approach turned out to be using it only for short acknowledgment comments and responding manually to anything more than a sentence long.
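
That tuning is easy to encode: auto-respond only to short acknowledgments and flag anything substantive for a manual reply. A sketch of the routing rule (the word-count threshold and the question-mark check are my own heuristics, not part of the original responder):

```python
def route_comment(text: str, max_words: int = 8) -> str:
    """Classify a comment as 'auto' (a templated reply is fine) or
    'manual' (substantive enough to deserve a hand-written response)."""
    words = text.strip().split()
    # Questions always go to a human, regardless of length
    if len(words) <= max_words and "?" not in text:
        return "auto"
    return "manual"
```

Plugging this into monitor_and_respond, anything routed to "manual" would be logged for review instead of receiving a template.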

The biggest lesson from the entire project was about restraint. Automation works best when it handles the repetitive and consistent tasks while leaving the judgment-dependent ones to you. Scheduling posts is a perfect automation task. Deciding what to say in response to a thoughtful comment is not.

Build the automation that removes friction from the work you already know how to do. That is where the real time savings are.
