Automating LinkedIn Ads Creative Exports to ROAS Report in Python for B2B SaaS Growth Standups
This article shows how to turn messy LinkedIn Ads creative exports into a reproducible weekly ROAS report using Python. You will get a single-file, production-ready script (pandas + openpyxl) that normalizes creative-level rows, aggregates by week and creative_id, computes ROAS and CPA, produces an Excel + HTML summary for standups, and can be deployed as an API for scheduled runs.
What the function does — precise technical summary
Input: a CSV or Excel (.xlsx) export from LinkedIn Ads with one row per creative- or ad-level performance record. Typical columns: date (YYYY-MM-DD), campaign_id, ad_id or creative_id, creative_name, impressions (int), clicks (int), spend (float, USD), conversions (int), revenue (float, USD), creative_url (string). Files may contain messy values like "$0.00", "1,234", or HTML snippets in creative_name.
Transforms: parsing and cleaning numeric columns (remove $ and commas), normalizing dates to ISO, grouping by ISO week and creative_id, computing derived metrics: CTR = clicks/impressions, CPC = spend/clicks, ROAS = revenue/spend (guarding against zero spend), CPA = spend/conversions. The script also selects top-N creatives by ROAS and exports both a sheet-level Excel workbook and a lightweight HTML summary for the weekly standup.
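The money/count cleaning can be sketched in one pass; the column values below are illustrative, not from a real export:

```python
import pandas as pd

# Monetary/count strings as they often appear in LinkedIn exports
raw = pd.Series(["$1,234.50", "1,250", "$0.00", "", None])

# Strip "$" and thousands separators, then coerce; blanks/missing become 0.0
cleaned = pd.to_numeric(
    raw.str.replace(r"[$,]", "", regex=True), errors="coerce"
).fillna(0.0)
# cleaned -> [1234.5, 1250.0, 0.0, 0.0, 0.0]
```

Using pd.to_numeric with errors="coerce" turns anything unparseable into NaN, so a single fillna handles both blank cells and genuinely missing values.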
Output: an Excel file with sheets: "weekly_aggregates", "top_creatives", "raw_cleaned" and an HTML summary file. Example file names: roas_report_2025-11-24.xlsx, roas_summary_2025-11-24.html. Each aggregated row contains: week_start, creative_id, creative_name, impressions, clicks, spend, conversions, revenue, ctr, cpc, roas, cpa.
Real-world scenario (concrete inputs and outputs)
A growth marketer at a 35-person B2B SaaS company runs LinkedIn campaigns weekly. They export performance data as linkedin_export_2025-11-24.csv with 1,000 rows spanning 4 campaigns and 120 creatives. A few example input rows:
date,campaign_id,creative_id,creative_name,impressions,clicks,spend,conversions,revenue,creative_url
2025-11-17,cmp_102,ad_1001,"Product demo - blue CTA","1,250",73,"$3,200.00",12,"4,000.00",http://example.com/creative1
2025-11-18,cmp_102,ad_1002,"Whitepaper download",800,24,$36.50,6,"1,050.00",http://example.com/creative2
After processing, the aggregated weekly_aggregates contains rows like:
week_start,creative_id,creative_name,impressions,clicks,spend,conversions,revenue,ctr,cpc,roas,cpa
2025-11-17,ad_1001,Product demo - blue CTA,1250,73,3200.00,12,4000.00,0.0584,43.84,1.25,266.67
Example dataset
Fabricated but realistic dataset: 1,000 rows, date range 2025-11-01 to 2025-11-24, columns as above. Problems: LinkedIn exported monetary columns with "$" and commas, some creative_name values contain HTML line breaks, and revenue is sometimes missing (NaN). The script handles the normalization, fills missing revenue with 0, computes weekly aggregates, and writes a top-10 creatives sheet ranked by ROAS.
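A dataset in this shape can be generated for testing; every ID, name, and path below is made up:

```python
import random
from pathlib import Path
import pandas as pd

random.seed(7)  # reproducible fake data
dates = [d.date().isoformat() for d in pd.date_range("2025-11-01", "2025-11-24")]
rows = []
for i in range(1000):
    impressions = random.randint(100, 5000)
    rows.append({
        "date": random.choice(dates),
        "campaign_id": f"cmp_{100 + i % 4}",
        "creative_id": f"ad_{1000 + i % 120}",
        "creative_name": f"Creative {i % 120}",
        "impressions": f"{impressions:,}",            # "1,250"-style strings
        "clicks": random.randint(0, impressions // 10),
        "spend": f"${random.uniform(5, 500):,.2f}",   # "$123.45"-style strings
        "conversions": random.randint(0, 10),
        # roughly 10% of revenue cells left blank, as in the exports described above
        "revenue": "" if random.random() < 0.1 else f"{random.uniform(0, 2000):,.2f}",
        "creative_url": f"http://example.com/creative{i % 120}",
    })
Path("data").mkdir(exist_ok=True)
pd.DataFrame(rows).to_csv("data/linkedin_export_2025-11-24.csv", index=False)
```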
When to use this pattern
Use this pattern when you need repeatable weekly ROAS reporting for standups, want a single-source-of-truth script that can be scheduled or called via API, and need both raw cleaned data and a human-friendly summary (Excel + HTML) to distribute to non-technical stakeholders.
Step-by-step mini workflow
- Drop the LinkedIn CSV export into a folder (e.g., data/linkedin_export_YYYY-MM-DD.csv).
- Run the script (or call the API) to normalize columns, convert money strings to floats, and map dates to the week start (Monday).
- Aggregate by week_start and creative_id, computing CTR, CPC, ROAS, and CPA.
- Write an Excel workbook with sheets: raw_cleaned, weekly_aggregates, top_creatives.
- Save a compact HTML summary for Slack/the growth standup and upload the artifacts to a shared drive or central reporting bucket.
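The date-to-Monday mapping can be sketched on its own; the dates here are illustrative:

```python
import pandas as pd

dates = pd.to_datetime(pd.Series(["2025-11-17", "2025-11-18", "2025-11-23"]))

# 'W' periods end on Sunday, so start_time is the Monday of each row's week
week_start = dates.dt.to_period("W").dt.start_time.dt.date

# Equivalent arithmetic: subtract the weekday offset (Monday == 0)
week_start_alt = (dates - pd.to_timedelta(dates.dt.weekday, unit="D")).dt.date
# both -> 2025-11-17 for all three input dates
```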
Algorithm (high-level)
- Load the CSV into pandas, coerce dtypes, and clean strings (strip "$" and commas).
- Convert date to datetime and compute the ISO week start (Monday).
- Drop or coalesce invalid rows (impressions <= 0, or spend is NaN -> set to 0).
- Group by week_start + creative_id and sum the numeric metrics; compute derived columns with safe-divide semantics.
- Export the results as Excel and render a minimal HTML top-10 list.
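The safe-divide semantics can be vectorized with Series.where, which masks a zero denominator to NaN so the ratio becomes NaN rather than inf (numbers illustrative):

```python
import pandas as pd

agg = pd.DataFrame({
    "spend": [3200.0, 0.0],
    "revenue": [4000.0, 150.0],
    "clicks": [73.0, 0.0],
})
# where(cond) keeps values where cond is True and puts NaN elsewhere
agg["roas"] = agg["revenue"] / agg["spend"].where(agg["spend"] > 0)
agg["cpc"] = agg["spend"] / agg["clicks"].where(agg["clicks"] > 0)
# roas -> [1.25, NaN]; cpc -> [43.84 (rounded), NaN]
```

Keeping the result NaN (instead of None) leaves the column as float, so later sort_values calls still work and NaN rows sort last by default.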
Code example
Minimal, complete example using pandas and openpyxl. This function reads a CSV and writes an Excel report. It can be adapted into an API handler or a Functory main(...) wrapper.
import pandas as pd
from pathlib import Path

def process_linkedin_export(csv_path: str, out_xlsx: str, top_n: int = 10) -> str:
    df = pd.read_csv(csv_path, dtype=str)

    # Clean numeric fields: strip "$" and thousands separators, coerce to float
    for col in ('impressions', 'clicks', 'spend', 'conversions', 'revenue'):
        if col not in df.columns:
            df[col] = 0.0  # tolerate exports missing conversions/revenue
        else:
            df[col] = pd.to_numeric(
                df[col].str.replace(r'[$,]', '', regex=True), errors='coerce'
            ).fillna(0.0)

    # Normalize dates and map each row to its Monday week start
    df['date'] = pd.to_datetime(df['date'], errors='coerce')
    df = df.dropna(subset=['date'])
    df['week_start'] = df['date'].dt.to_period('W').dt.start_time.dt.date

    # Remove rows with no impressions and no spend
    df = df[(df['impressions'] > 0) | (df['spend'] > 0)]

    agg = df.groupby(['week_start', 'creative_id', 'creative_name'], dropna=False).agg(
        impressions=('impressions', 'sum'),
        clicks=('clicks', 'sum'),
        spend=('spend', 'sum'),
        conversions=('conversions', 'sum'),
        revenue=('revenue', 'sum'),
    ).reset_index()

    # Derived metrics with safe division: NaN (not inf) when the denominator is 0
    agg['ctr'] = agg['clicks'] / agg['impressions'].where(agg['impressions'] > 0)
    agg['cpc'] = agg['spend'] / agg['clicks'].where(agg['clicks'] > 0)
    agg['roas'] = agg['revenue'] / agg['spend'].where(agg['spend'] > 0)
    agg['cpa'] = agg['spend'] / agg['conversions'].where(agg['conversions'] > 0)

    # Top creatives by ROAS for the most recent week (NaN ROAS sorts last)
    last_week = agg['week_start'].max()
    top = (agg[agg['week_start'] == last_week]
           .sort_values('roas', ascending=False)
           .head(top_n))

    out_path = Path(out_xlsx)
    out_path.parent.mkdir(parents=True, exist_ok=True)
    with pd.ExcelWriter(out_path, engine='openpyxl') as writer:
        df.to_excel(writer, sheet_name='raw_cleaned', index=False)
        agg.to_excel(writer, sheet_name='weekly_aggregates', index=False)
        top.to_excel(writer, sheet_name='top_creatives', index=False)

    # Write a tiny HTML summary next to the workbook
    html_rows = top[['creative_name', 'roas', 'spend', 'revenue']].to_html(
        index=False, float_format='%.2f')
    out_path.with_suffix('.html').write_text(
        f"<h2>Top {top_n} creatives for {last_week}</h2>\n" + html_rows)
    return str(out_path)

# Example call
if __name__ == '__main__':
    print(process_linkedin_export('data/linkedin_export_2025-11-24.csv',
                                  'out/roas_report_2025-11-24.xlsx'))
How Functory Makes It Easy
On Functory you would wrap the core logic (above) into a single main(...) entrypoint that accepts typed parameters like csv_path: str, top_n: int, and output_dir: str. Functory exposes these parameters as UI fields and as JSON keys on the HTTP API so non-technical users can upload or point to the CSV and get back the generated Excel/HTML artifacts.
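A wrapper in that shape might look like the sketch below; the stub stands in for the full process_linkedin_export shown earlier, and the exact signature Functory expects is an assumption here:

```python
from pathlib import Path

def process_linkedin_export(csv_path: str, out_xlsx: str, top_n: int = 10) -> str:
    ...  # stand-in for the full function from the code example above
    return out_xlsx

def main(csv_path: str, top_n: int = 10, output_dir: str = "out") -> str:
    """Typed entrypoint: these parameters surface as UI fields / JSON keys."""
    out_dir = Path(output_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_xlsx = out_dir / f"roas_report_{Path(csv_path).stem}.xlsx"
    # Return the artifact path so the platform can expose it for download
    return process_linkedin_export(csv_path, str(out_xlsx), top_n=top_n)
```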
Practical steps to publish on Functory:
- Choose an exact Python version (for example, 3.11.11).
- Create a requirements.txt where every dependency is pinned (e.g., pandas==2.2.2, openpyxl==3.1.2).
- Structure your file so Functory can call main(csv_path: str, top_n: int = 10) -> str directly; the return value should be the path to the Excel file (Functory will expose it for download).
Benefits on Functory: no servers to manage, automatic cloud execution and autoscaling across CPU/GPU tiers, per-run billing handled by the platform, and logs captured via print(). The function can be triggered from the web UI, scheduled by an external system that POSTs JSON to the Functory API, or invoked by an LLM agent that orchestrates your weekly pipeline. You can chain this function with a separate Functory function that uploads artifacts to Google Drive or posts the HTML to Slack for the growth standup.
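Triggering a run by POSTing JSON could look like the following; the endpoint URL, auth scheme, and payload keys are placeholders, not Functory's documented API:

```python
import json
from urllib import request

payload = {"csv_path": "data/linkedin_export_2025-11-24.csv", "top_n": 10}
req = request.Request(
    "https://api.functory.example/functions/roas-report/run",  # placeholder URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer YOUR_API_KEY"},  # placeholder token
    method="POST",
)
# request.urlopen(req)  # not executed here: the endpoint above is fictional
```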
Alternatives and why this approach is better
Common alternatives are: (1) manual Google Sheets manipulation and pivot tables, (2) ad-hoc Jupyter notebooks, or (3) enterprise BI tools with heavyweight ETL. Manual sheets are error-prone and hard to reproduce; notebooks are good for experimentation but fragile for scheduled runs; BI tools require expensive connectors and engineering time. A single-file Python function that can be executed on demand or scheduled (and exposed as an API) offers reproducibility, version control, and easy automation without heavy infra.
Business benefits
Concrete benefit: automating this step typically reduces weekly reporting prep time from 2–3 hours to under 20 minutes — roughly an 85–90% time savings — and surfaces creative-level ROAS trends earlier, enabling faster creative optimization and better-informed ad spend allocation month over month.
Industry context
According to a 2024 Forrester report, 58% of B2B marketers said reporting automation was a top investment priority to free teams for strategic work (Forrester, 2024).
Conclusion: converting messy LinkedIn Ads creative exports into a reliable ROAS report is a high-impact automation that saves time and makes growth standups actionable. Next steps: customize the script to your account-level attribution window, add a scheduled trigger or connect it to a storage bucket, and publish as a Functory function for non-technical teammates to run. Try the sample script on a real export, iterate the cleaning rules, and publish your function so your team can stop wrestling with spreadsheets.
Thanks for reading.
