Core Build: 5 Automation Scripts

Build four new automation scripts, organize all five into a reusable library, and configure secrets management in 50 minutes

Overview

This chapter completes a five-script automation toolkit, adding four purpose-built scripts to the summarizer from the Quick Start. Each script demonstrates a different automation pattern, from simple text processing to complex workflow orchestration.

What you'll build:

  • Script #2: Text analysis automation
  • Script #3: Data transformation pipeline
  • Script #4: API integration script
  • Script #5: Workflow orchestration
  • Library structure: Organized, reusable toolkit
  • Configuration system: Secure secrets management

Time allocation:

  • Scripts 2-5: 10 minutes each (40 minutes total)
  • Library organization: 10 minutes
  • Testing and verification: Built into each script section

The Script Development Pattern

Every script in this toolkit follows a three-part structure: input handling, API interaction with prompts, and output formatting. This pattern creates maintainable scripts that are easy to debug and extend. When a script fails, the structure tells you exactly where to look.

Each script demonstrates this pattern with increasing complexity, teaching transferable skills applicable to any automation task.
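
As a reference for the scripts that follow, here is a minimal sketch of the three-part skeleton in bash. It assumes the API_ENDPOINT and API_KEY variables used throughout this chapter; the .completion response field is a placeholder, so adjust the jq path to your provider's actual response shape.

#!/usr/bin/env bash
# Three-part skeleton: input handling → API interaction → output formatting
set -euo pipefail

# Part 1: input handling
INPUT="${1:?usage: provide text to process}"

# Part 2: API interaction (jq -n builds the JSON body safely)
RESPONSE=$(curl -s -X POST "$API_ENDPOINT" \
  -H "x-api-key: $API_KEY" \
  -H "content-type: application/json" \
  -d "$(jq -n --arg p "Analyze: $INPUT" '{prompt: $p}')")

# Part 3: output formatting (.completion is a placeholder field name)
echo "$RESPONSE" | jq -r '.completion'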

Building the Five Scripts

Script #2: Text Analysis Automation (10 min)

Core concept: Send text to AI API for analysis and receive structured insights.

Real-world applications:

  • Economics: Analyze research paper abstracts for methodology and findings
  • Software: Generate code review summaries from pull request descriptions
  • Business: Extract sentiment and themes from customer feedback

Pattern demonstration:

# Input → AI Processing → Output pattern
curl -s -X POST "$API_ENDPOINT" \
  -H "x-api-key: $API_KEY" \
  -H "content-type: application/json" \
  -d "$(jq -n --arg text "$INPUT" '{prompt: ("Analyze: " + $text)}')"

Key concepts:

  • Command-line argument handling
  • Multi-line prompt engineering
  • JSON response parsing with jq
  • Error handling for API failures

Create scripts/analyze.sh and implement the three-part pattern. Test with a sample research abstract or business email.
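
Error handling for API failures deserves attention from the start. One sketch: capture the HTTP status with curl's -w flag and fail loudly instead of printing empty output (the temp-file path and the error-report format are assumptions):

# Capture the HTTP status so failures are loud, not silent
HTTP_CODE=$(curl -s -o /tmp/response.json -w '%{http_code}' \
  -X POST "$API_ENDPOINT" \
  -H "x-api-key: $API_KEY" \
  -H "content-type: application/json" \
  -d "$(jq -n --arg p "Analyze: $INPUT" '{prompt: $p}')")

if [ "$HTTP_CODE" -ne 200 ]; then
  echo "API call failed (HTTP $HTTP_CODE); response follows:" >&2
  cat /tmp/response.json >&2
  exit 1
fi
jq -r '.completion' /tmp/response.json   # same placeholder field as the skeleton above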

Success criteria:

Run the script with sample text and receive structured analysis output (key points, sentiment, recommendations).

Script #3: Data Transformation Pipeline (10 min)

Core concept: Process data files through AI-guided transformation, validation, and enrichment.

Real-world applications:

  • Data cleaning: Fix inconsistent date formats, validate entries
  • Format conversion: Transform JSON to CSV with AI-guided field mapping
  • Data enrichment: Add categories, tags, or summaries to existing records

Pipeline pattern demonstration:

# File processing pipeline pattern: stream records through the AI step
cat input.csv | \
  while IFS= read -r record; do
    ./scripts/analyze.sh "$record"
  done | \
  tee output.csv

Key concepts:

  • Unix pipe composition (stdin/stdout)
  • File format detection and parsing
  • Batch processing for large datasets
  • Preserving original data during transformation

Create scripts/transform.sh that reads a data file, sends records to AI for processing, and outputs transformed results.
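
For the batch-processing concept, one minimal approach splits large inputs so each API call stays small, and it never touches input.csv, which also covers the data-preservation concept (the 50-line chunk size is arbitrary):

# Split a large file into 50-line chunks, process each, keep the original intact
split -l 50 input.csv chunk_
for chunk in chunk_*; do
  ./scripts/transform.sh "$chunk" >> output.csv
done
rm chunk_*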

Success criteria:

Process a sample CSV file and generate cleaned, enriched output with AI-added insights.

Script #4: Integration Automation (10 min)

Core concept: Connect multiple APIs and services in a single automated workflow.

Real-world applications:

  • Research automation: Pull papers from arXiv, summarize with AI, save to Notion
  • Monitoring: Fetch metrics from analytics API, analyze trends, send Slack alerts
  • Data sync: Read from source API, transform with AI, write to destination database

Integration pattern demonstration:

# Multi-API orchestration pattern
curl -sf "$SOURCE_API" | jq '.data' | \
  ./ai-process.sh | \
  curl -sf -X POST "$DEST_API" \
    -H "content-type: application/json" -d @-

Key concepts:

  • API authentication for multiple services
  • Data format translation between systems
  • Error recovery and retry logic
  • Logging for debugging complex integrations

Create scripts/integrate.sh that demonstrates fetching, processing, and pushing data across service boundaries.
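
For the retry-logic concept, a small wrapper like the following sketch retries any command with exponential backoff (three attempts is an arbitrary default; tune it to your APIs' rate limits):

# Retry a command with exponential backoff
retry() {
  local attempt=1 max_attempts=3
  until "$@"; do
    if [ "$attempt" -ge "$max_attempts" ]; then
      echo "giving up after $max_attempts attempts: $*" >&2
      return 1
    fi
    sleep $(( 2 ** attempt ))
    attempt=$(( attempt + 1 ))
  done
}

retry curl -sf "$SOURCE_API"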

Success criteria:

Successfully retrieve data from one API, process with AI, and deliver to another service with proper error handling.

Script #5: Workflow Orchestration (10 min)

Core concept: Chain multiple scripts together with conditional logic and error handling.

Real-world applications:

  • Daily research digest: Fetch papers, summarize, analyze, email report
  • Content pipeline: Generate draft, review with AI, format, publish
  • Data processing workflow: Extract, validate, transform, load with quality checks

Orchestration pattern demonstration:

# Conditional workflow execution
./fetch.sh && \
  ./process.sh || \
  ./handle-error.sh   # runs if either preceding step fails

Key concepts:

  • Script exit codes and error propagation
  • Conditional execution with && and ||
  • State management across script steps
  • Rollback and recovery mechanisms

Create scripts/orchestrate.sh that coordinates the previous four scripts in a meaningful workflow relevant to your domain.
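
One possible shape for the orchestrator, sketched with set -e and an ERR trap so any failing step halts the workflow and names itself (the step commands and temp-file interfaces are illustrative; match them to how you built scripts #2-#4):

#!/usr/bin/env bash
# orchestrate.sh — coordinate the toolkit scripts as one workflow
set -euo pipefail                               # abort on the first failed step
trap 'echo "workflow failed at: $STEP" >&2' ERR # name the step that broke

STEP="fetch";     ./scripts/integrate.sh > /tmp/raw.json
STEP="transform"; ./scripts/transform.sh /tmp/raw.json > /tmp/clean.csv
STEP="analyze";   ./scripts/analyze.sh "$(cat /tmp/clean.csv)" > report.txt
echo "Workflow complete: report.txt"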

Success criteria:

Execute a multi-step workflow that demonstrates proper error handling, conditional logic, and successful completion across all stages.

Library Organization (10 min)

Create the toolkit structure:

Transform individual scripts into an organized, maintainable library. This structure makes scripts discoverable and reusable across projects.

Directory structure:

automation-toolkit/
├── scripts/
│   ├── summarize.sh      # Script #1 from Quick Start
│   ├── analyze.sh        # Script #2: Text analysis
│   ├── transform.sh      # Script #3: Data pipeline
│   ├── integrate.sh      # Script #4: API integration
│   └── orchestrate.sh    # Script #5: Workflow
├── lib/
│   └── common.sh         # Shared functions
├── config/
│   ├── .env.example      # Template for secrets
│   └── .gitignore        # Protect sensitive files
└── README.md             # Usage documentation
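
One way to create this skeleton in a single command, using bash brace expansion:

# Create the toolkit skeleton
mkdir -p automation-toolkit/{scripts,lib,config}
touch automation-toolkit/README.md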

Shared library pattern:

Extract common functionality into lib/common.sh:

# Reusable API call function
call_ai_api() {
  local prompt="$1"
  curl -s -X POST "$API_ENDPOINT" \
    -H "x-api-key: $API_KEY" \
    -H "content-type: application/json" \
    -d "$(jq -n --arg p "$prompt" '{prompt: $p}')"
}
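
Each script then loads the library relative to its own location, so the toolkit works no matter which directory you call it from; a sketch:

# At the top of each script in scripts/
source "$(dirname "$0")/../lib/common.sh"
call_ai_api "Summarize: $INPUT"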

Move all scripts into the scripts/ directory, create lib/common.sh for shared functions, and document usage in README.md.

Organization benefits:

Organizing scripts into a library makes them discoverable, reusable, and maintainable. The lib/ directory contains shared functions that prevent code duplication. The config/ directory centralizes settings and secrets. The scripts/ directory contains user-facing commands with clear, consistent interfaces.

Configuration System

Secure secrets management:

API keys and sensitive credentials must never be hardcoded in scripts. Environment variables provide secure, flexible configuration.

Environment variable pattern:

# Load configuration from config/.env, exporting every variable it defines
set -a
source config/.env
set +a

Setup steps:

  1. Create config/.env.example with placeholder values (template below)
  2. Copy to config/.env and add real API keys
  3. Add config/.env to .gitignore
  4. Source environment in each script
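
For step 1, the template might look like this (the variable names match this chapter's examples; both values are placeholders):

# config/.env.example — copy to config/.env and fill in real values
API_ENDPOINT="https://api.example.com/v1/complete"
API_KEY="replace-with-your-key"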

Best practices:

Never commit .env files to version control. Always provide .env.example as a template. Use descriptive variable names with consistent prefixes. Document required variables in README.md.

Security checklist:

  • Verify .env is in .gitignore (see the check below)
  • Restrict file permissions to the owner (chmod 600 config/.env)
  • Rotate keys periodically
  • Never log or echo secret values
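
Two quick commands cover the first two checklist items; git check-ignore exits zero when a path is ignored:

# Confirm git ignores the secrets file, then lock down its permissions
git check-ignore -q config/.env && echo "config/.env is ignored" || \
  echo "WARNING: config/.env is NOT ignored" >&2
chmod 600 config/.env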

Testing Your Toolkit

Verification workflow:

Test each script individually before testing the orchestrated workflow. This isolates issues and validates the three-part pattern.

Per-script checklist:

  • Script runs without syntax errors
  • Accepts command-line arguments correctly
  • Makes successful API calls
  • Handles errors with helpful messages
  • Produces expected output format
  • Has executable permissions (chmod +x; see the one-liner below)
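
The first and last checklist items can be verified together; bash's -n flag parses a script without executing it:

# Syntax-check without running, then mark executable
bash -n scripts/analyze.sh && chmod +x scripts/analyze.sh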

Integration testing:

Run the orchestration script to verify all five scripts work together. Check that errors in one script properly halt or recover the workflow.

Common issues and solutions:

  • "Permission denied" when running a script: add executable permissions with chmod +x
  • Authentication errors: confirm config/.env is sourced and API_KEY is set in the environment
  • Empty or malformed output: check that jq is installed and that the response field path matches your API

What You Built

  • Text Analysis: AI-powered script that analyzes documents, emails, and research papers with structured insights
  • Data Transformation: Pipeline for cleaning, validating, and enriching CSV and JSON datasets
  • API Integration: Multi-service automation connecting research APIs, productivity tools, and databases
  • Workflow Orchestration: Master script coordinating multi-step processes with error handling and recovery
  • Organized Library: Maintainable toolkit structure with shared functions and configuration management

Key Takeaways

Pattern mastery:

The three-part script structure (input, API, output) applies to virtually all automation tasks. Recognizing this pattern accelerates development and debugging.

Composition over complexity:

Five focused scripts combined with orchestration prove more maintainable than one monolithic program. Each script has a single responsibility.

Security first:

Configuration management and secrets handling built from the beginning prevent future security issues. Never treat this as an afterthought.

Domain flexibility:

These patterns transfer across economics, software engineering, and business management. The scripts change, but the structure remains constant.

Next Steps

The next chapter explores domain-specific applications, showing how to adapt these five scripts to economics research, software development workflows, and business operations. Each domain demonstrates unique prompting strategies and integration patterns.