Why scrape from inside a spreadsheet?
Most web scraping tools make you write Python scripts or pay for expensive software. And the data always ends up in the same place: a spreadsheet. You export a CSV, import it into Google Sheets, clean the columns, then start working.
That is a lot of steps just to get data into cells.
SheetMagic skips all of that. You write a formula. You point it at a URL or keyword. The data shows up in your cells. No code. No CSV imports. No jumping between apps.
And with nine Apify integrations in the AI Chat sidebar, you can pull structured data from tricky platforms too. Google Maps, YouTube, Amazon, LinkedIn, and more. Just describe what you want in plain English.
This guide covers every scraping formula SheetMagic offers. It also covers all nine Apify integrations, how credits work, and real workflows you can start using today.
Part 1: Formula-based scraping
SheetMagic has a whole family of scraping formulas. They all work the same way: point them at a URL or keyword, tell them what to pull, and get structured data back.
=SERP() -- Search engine results
```
=SERP("best crm software for startups")
```

Returns up to 20 Google search results with titles, URLs, snippets, and ranking positions.
=SERP() is your starting point for keyword research, competitive analysis, and prospecting. Put your keywords in column A. Put =SERP(A2) in column B. Drag it down. Now you have structured SERP data for your entire keyword list.
Real use case: finding prospects
```
=SERP("saas companies series A funding 2026")
```

Pull a list of recently funded companies from search results. Use the URLs in the next step for deeper research.
=BULKSERP() -- Horizontal SERP layout
```
=BULKSERP(A2)
```

Same data as =SERP(), but laid out horizontally: one row per keyword, results spread across columns. Better for comparing multiple keywords.
Use =BULKSERP() when you want to compare SERP results across keywords. Each keyword stays on one row. That makes it easy to scan your whole keyword list at a glance.
=VISIT() -- Full page content
```
=VISIT(A2)
```

Retrieves up to 50,000 characters of text content from the URL in A2. Supports JavaScript-rendered pages.
=VISIT() pulls the raw text content of a webpage. Use it when you need the full page for summarization, content analysis, or feeding into an =AITEXT() formula.
Real use case: competitor content analysis
1. Pull competitor URLs from SERP data. Run `=SERP("your target keyword")` and extract the top 10 URLs.
2. Scrape each page's full content. Put the URLs in column A and run `=VISIT(A2)` in column B. You now have the full text of every competitor page.
3. Analyze with AI. Use `=AITEXT("Summarize this page in 3 sentences. What is its main argument and what audience is it targeting?", B2)` in column C to process each page.
=PAGEDATA() -- Bulk page metadata
```
=PAGEDATA("https://example.com/about")
```

Returns structured metadata: URL, Title, Description, H1, and HTTP Status.
=PAGEDATA() grabs page metadata (title tag, meta description, H1 heading, HTTP status) from one or more URLs at once. You can pass multiple URLs to get a table of metadata in a single call:
```
=PAGEDATA(A2, A3, A4, A5)
```

Returns one row per URL with Title, Description, H1, and Status columns. Great for bulk SEO audits.
For targeted data extraction, combine =VISIT() with =AITEXT():
```
=AITEXT("Extract: company description in one sentence, main product or service, pricing page URL if visible, contact email if visible. Format as pipe-separated values.", VISIT(A2))
```

Scrapes the page with =VISIT() and uses AI to extract specific fields. Use =SPLIT() afterward to break the result into columns.
Format your extraction instructions with pipe-separated output (using the | character). Then use =SPLIT(B2, "|") in adjacent columns to break the result into clean, separate cells. This turns messy web pages into neat spreadsheet data.
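The two-step pattern looks like this in practice (the prompt, fields, and column references are illustrative, not fixed):

```
In B2:  =AITEXT("Extract: product name, price, rating. Pipe-separated.", VISIT(A2))
In C2:  =SPLIT(B2, "|")
```

Drag both formulas down their columns to process an entire list of URLs at once.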
=GETSELECTOR() -- CSS selector scraping
```
=GETSELECTOR(A2, "div.product-price span.amount")
```

Extracts content matching a specific CSS selector. For developers, or anyone comfortable inspecting page elements.
If you know the exact HTML element you need, =GETSELECTOR() targets it directly with a CSS selector. This is the most precise scraping method. It works great when you need a specific price, rating, or data point that lives in a consistent HTML structure.
=GETIMG() -- Extract image URLs
```
=GETIMG(A2)
```

Returns image URLs found on the page at A2.
Use this for cataloging product images, building visual mood boards, or checking image assets across a website.
=DOMAINCHECK() -- Domain availability
```
=DOMAINCHECK("mybrandname.com")
```

Returns whether the domain is available for registration.
This is handy for brand naming projects. Generate name ideas with =AILIST(), then check domain availability for each one:
```
=DOMAINCHECK(A2&".com")
```

Checks .com availability for the brand name in A2. Combine with =AILIST() to generate and check names in bulk.
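As a sketch, the whole naming pipeline fits in two columns (the prompt is illustrative, and this assumes =AILIST() returns one name per row):

```
In A2:  =AILIST("10 short, brandable names for a coffee subscription service")
In B2:  =DOMAINCHECK(A2&".com")
```

Drag =DOMAINCHECK() down column B to cover every generated name.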
Page metadata formulas
These are single-purpose formulas built for speed. Each one extracts one specific piece of metadata per call:
| Formula | What it extracts | Example use |
|---|---|---|
| `=GETMETATITLE(A2)` | Title tag | SEO audit, competitor analysis |
| `=GETMETADESCRIPTION(A2)` | Meta description | SERP preview, content gap analysis |
| `=GETH1(A2)` | H1 heading | Content structure analysis |
| `=GETH2(A2)` | All H2 headings | Outline extraction, topic mapping |
| `=GETHEADINGS(A2)` | All headings (H1-H6) | Full content structure audit |
| `=GETP(A2)` | Paragraph text | Content extraction, word count |
Real use case: SEO site audit
Set up your sheet with a list of your page URLs in column A. Then:
- Column B: `=GETMETATITLE(A2)` to check for missing or duplicate titles
- Column C: `=GETMETADESCRIPTION(A2)` to find pages without meta descriptions
- Column D: `=GETH1(A2)` to verify every page has an H1
- Column E: `=AITEXT("Is this meta description under 160 characters? Does it include a call to action? Return Yes/No for each.", C2)` for an automated quality check
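Once the titles are in column B, native Sheets functions can flag duplicates without spending any extra credits (column F is illustrative):

```
In F2:  =IF(COUNTIF(B$2:B, B2)>1, "Duplicate title", "")
```

=COUNTIF() counts how often the title in B2 appears across the column; anything above 1 gets flagged. Drag it down alongside your audit columns.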
For more SEO workflows, see the complete SEO guide.
Part 2: Apify integrations -- structured data from major platforms
Some platforms are really hard to scrape with basic formulas. Google Maps, YouTube, Amazon, and LinkedIn all have complex page structures and anti-scraping measures. SheetMagic handles this with nine Apify integrations you can access through the AI Chat sidebar.
Apify integrations are available through the AI Chat sidebar (paid subscribers on Solo, Team, and Business plans). You describe what data you need in plain English. The AI picks the right Apify actor, sets up the parameters, and writes results directly into your sheet. Each Apify operation uses integration credits.
Google Maps scraper
Pull business listings for any location and category. Great for lead generation, market research, and competitive analysis.
What you can extract:
- Business name, address, phone number
- Rating, review count
- Website URL
- Business category
- Hours of operation
Example conversation with AI Chat:
You: "Find all Italian restaurants in Austin, TX with at least 4 stars. Put the results in a new sheet."
AI: Runs the Google Maps scraper, returns a structured table with business names, addresses, ratings, phone numbers, and websites.
Google Search SERP scraper
A more powerful alternative to the =SERP() formula for large-scale keyword research. It handles complex queries, geo-targeted results, and returns richer data.
YouTube channel and video scraper
Extract channel statistics, video metadata, view counts, and engagement metrics.
What you can extract:
- Channel subscriber count, total views
- Video titles, descriptions, view counts
- Upload dates, engagement rates
- Tags and categories
Example use: Build a creator outreach list by scraping channels in your niche. Filter by subscriber count. Then generate personalized outreach emails with =AITEXT().
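A sketch of that last step, assuming the scraper put channel names in column A and recent video topics in column C (both column choices are illustrative):

```
=AITEXT("Write a one-sentence outreach opener for the YouTube creator "&A2&" whose recent videos cover "&C2&". Be specific, not flattering.", "")
```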
Amazon product scraper
Pull product data including prices, ratings, review counts, and product descriptions. Really useful for e-commerce competitive research.
What you can extract:
- Product title, price, rating
- Number of reviews
- Best Seller Rank
- Product description and bullet points
- ASIN
LinkedIn company scraper
Extract company profile data for sales prospecting and market research.
What you can extract:
- Company name, industry, size
- Headquarters location
- Description, specialties
- Website URL
Additional Apify integrations
SheetMagic includes four more specialized scrapers for additional platforms and data sources. They all work through the same AI Chat interface. Just describe what you need and the AI takes care of the rest.
Part 3: How credits work
Web scraping formulas and Apify integrations use integration credits. These are separate from the AI tokens used by formulas like =AITEXT().
Credit costs
| Operation | Credits used |
|---|---|
| `=SERP()` query | 1 credit per keyword |
| `=BULKSERP()` query | 1 credit per keyword |
| `=VISIT()` page | 1 credit per URL |
| `=PAGEDATA()` extraction | 1 credit per URL |
| `=GETSELECTOR()` | 1 credit per URL |
| `=GETMETATITLE()`, `=GETMETADESCRIPTION()`, etc. | 1 credit per URL |
| `=GETIMG()` | 1 credit per URL |
| `=DOMAINCHECK()` | 1 credit per domain |
| Apify integrations | Varies by actor and results returned |
Credit limits by plan
| Plan | Integration credits/month | Price |
|---|---|---|
| Free | 50 | Free |
| Solo | 1,000 | $19/mo |
| Team | 5,000 | $79/mo |
| Business | 20,000 | $149/mo |
BYOK (Bring Your Own Key) users on lifetime deals can also purchase additional integration credit packs from the dashboard. See the pricing page for details.
Check your remaining credits anytime: open the SheetMagic sidebar under Extensions > SheetMagic, or visit the dashboard. Credits reset monthly on your billing cycle.
Smart credit usage
SheetMagic has 30-minute smart caching. If you scrape the same URL twice within 30 minutes, the second request uses cached data. No extra credits. This means you can tweak your extraction instructions and try again without burning through your quota.
Part 4: Complete workflows
Here are three end-to-end workflows that combine scraping and AI formulas.
Workflow 1: Lead generation from Google Maps
1. Scrape business listings. Open the AI Chat sidebar and ask: "Find all marketing agencies in Denver, CO. Include name, website, phone, and rating." The Apify Google Maps scraper returns results directly into your sheet.
2. Enrich with website data. With website URLs in column D, run `=AITEXT("What does this company specialize in? One sentence.", VISIT(D2))` in column E.
3. Qualify leads. Use `=AITEXT("Based on this company description, rate the likelihood they need a CRM tool on a scale of 1-5. Return only the number.", E2)` in column F.
4. Generate personalized outreach. For qualified leads (rating 4-5), generate email openers with `=AITEXT("Write a 2-sentence cold email opener for a marketing agency that specializes in "&E2&". Reference something specific about their work.", "")` in column G.
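To pull only the qualified leads onto a clean tab, a standard =FILTER() works (this assumes the ratings landed as numbers in column F of a tab named Sheet1):

```
=FILTER(Sheet1!A2:G, Sheet1!F2:F>=4)
```

=FILTER() is a native Google Sheets function, so this step uses no integration credits.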
Workflow 2: E-commerce price monitoring
1. List competitor product URLs. Put competitor product page URLs in column A.
2. Extract pricing data. Run `=AITEXT("Extract: product name, current price, original price if on sale, availability status. Pipe-separated.", VISIT(A2))` in column B.
3. Split into columns. Use `=SPLIT(B2, "|")` in columns C through F to get structured data.
4. Set up price comparison. Add your own prices in column G. Use `=IF(C2<G2, "Competitor is cheaper by "&TEXT(G2-C2, "$0.00"), "You are competitive")` in column H.
5. Generate price analysis. Run `=AITEXT("Based on these prices, suggest a competitive pricing strategy: My price: "&G2&", Competitor price: "&C2&". Keep it to 2 sentences.", "")` in column I.
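One caveat: if the scraped price comes through as text like "$49.99", the comparison in column H compares text, not numbers. A fix using native Sheets functions (assuming the raw price string is in C2):

```
=VALUE(REGEXEXTRACT(C2, "[0-9]+\.?[0-9]*"))
```

=REGEXEXTRACT() pulls the first number out of the string and =VALUE() converts it, so the =IF() comparison behaves numerically.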
Workflow 3: Content research and planning
1. Research top-ranking content. Run `=SERP("your target keyword")` to get the top-ranking URLs.
2. Extract content structure. Run `=GETHEADINGS(A2)` to pull the heading structure of each competitor page.
3. Analyze content gaps. Use `=AITEXT("These are the heading structures from 5 articles ranking for the keyword 'your keyword'. Identify 3 subtopics that none of them cover well: "&B2&" | "&B3&" | "&B4&" | "&B5&" | "&B6, "")` in a separate cell.
4. Generate a content brief. Feed the gap analysis into a brief: `=AITEXT("Create a content brief for an article that covers these underserved subtopics: "&C2&". Include 8 suggested headings, target audience, and recommended word count.", "")`
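If the heading lists sit in B2:B6, native =JOIN() can collapse the range into one argument instead of chaining & five times (same result as the gap-analysis formula above):

```
=AITEXT("These are heading structures from 5 ranking articles. Identify 3 subtopics none of them cover well: "&JOIN(" | ", B2:B6), "")
```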
For the full version of this SEO workflow, see the SEO guide.
Best practices for reliable scraping
1. Be specific about what you want. When using =VISIT() + =AITEXT() for targeted extraction, "Extract the price" works better than "Get all data." Name the exact fields you need.
2. Use pipe-separated output. Ask for results formatted with | separators. Then use =SPLIT() to break them into columns. This is the fastest way to turn messy web data into clean spreadsheet data.
3. Process in batches. When scraping hundreds of URLs, don't run all formulas at once. Work in batches of 50 to 100 rows. This avoids timeouts and makes it easier to spot issues early.
4. Take advantage of caching. SheetMagic caches scraping results for 30 minutes. If you need to adjust your extraction instructions, you can rerun formulas against the same URLs without using extra credits.
5. Check your results. Spot-check extracted data against the source page. This is especially true for pricing and contact information. If results look off, refine your extraction instructions or try =GETSELECTOR() with a specific CSS selector.
6. Combine scraping with AI. The best workflows use scraping to gather data and AI to process it. =VISIT() grabs the page content. =AITEXT() analyzes, classifies, or extracts specific fields. Use =PAGEDATA() when you just need metadata (titles, descriptions, H1s) in bulk. See the workflows above for examples.
Web scraping results depend on the structure and accessibility of the target website. Some sites block automated access or load content dynamically. If a formula returns incomplete data, try =VISIT() (which supports JavaScript-rendered pages) or use an Apify integration through the AI Chat sidebar for platforms with complex structures.
Start scraping today
Web scraping in Google Sheets works right now, and these formulas take seconds to write. You can try everything on the free tier with no credit card. You get 10 integration credits as a one-time trial, which is enough to run a few scrapes and see how it fits your work.
Install SheetMagic, open a sheet, and try =SERP("your keyword") or =PAGEDATA("any-url.com") to pull page metadata right away. If you want structured data from Google Maps, YouTube, Amazon, or LinkedIn, open the AI Chat sidebar and describe what you need.
For more on combining scraping with AI: How to use AI formulas | SEO workflows | Getting started guide
Questions about web scraping, credits, or specific integrations? Visit the FAQ.

