Nimble Skills for Claude Code: Live, Scalable Web Data, Inside Your AI Assistant
If you need a dataset from the web, your first instinct is probably to ask Claude to get it for you. But while Claude has built-in web search capabilities, it isn't designed for large-scale web data collection:
- Returns standard search results (titles, snippets, URLs)
- Single search mode, no domain/time filtering
- No structured data extraction or schema enforcement
So, to get this info, you’d probably need to set up a web scraper, which breaks whenever sites change and requires hours of configuration.
What if instead, you just told Claude:
```
Pull all active rental listings from Zillow in Austin, TX 78701
```

That's one example of what Nimble Skills makes possible. Whether you're tracking property markets, monitoring competitor pricing, or researching leads at scale, the same pattern applies: Nimble's entire web data infrastructure, accessible through Claude, with no scraping scripts, no API integrations, and no pipelines to maintain.
To show how it works in practice, we'll walk through a real estate use case — tracking live property listings across multiple markets using Zillow's pre-built agent.
Nimble Skills: Turn Your AI Assistant into a Web Data Expert
Nimble Skills are plug-and-play extensions that give Claude direct access to live web data. Install them once and Claude can search the web, extract structured data from any page, discover URLs across entire domains, and build repeatable extraction pipelines — all from natural language.
There are two skills:
- nimble-web-expert gives Claude the ability to reach out to the live web on demand. Ask it to pull a page, run a search, run a pre-built agent, map a site, or crawl a section — and it handles the rendering, bot protection, and parsing automatically. The output comes back clean and structured, ready to work with.
- nimble-agent-builder gives Claude the ability to create, refine, and publish custom extraction agents. When you need structured data from a site that doesn't have a pre-built agent, nimble-agent-builder generates one from a natural language description, validates the output, and publishes it — ready to run at scale with any input.
Each skill is independently useful. nimble-web-expert is valuable on its own for any team that needs live web data without building infrastructure around it. nimble-agent-builder stands alone for teams who need a reliable, repeatable extraction workflow for a site that isn't already covered.
Getting Started
You'll need the Nimble CLI and an API key. Grab your key from Account Settings, then run:
```
npm install -g @nimble-way/nimble-cli
export NIMBLE_API_KEY="your-api-key"
```

Then install Nimble Skills:

```
npx skills add Nimbleway/agent-skills
```

Finally, connect the MCP server so Claude has access to the full agent toolset:

```
claude mcp add --transport http nimble-mcp-server https://mcp.nimbleway.com \
  --header "Authorization: Bearer ${NIMBLE_API_KEY}"
```

That's it. Claude now has access to both skills.
From First Pull to Full Pipeline
One natural way to use the two skills together is as a proof-of-concept to production path.
Start with nimble-web-expert. Pull the data once, see what comes back, validate that it's the right signal. This takes minutes and requires no setup. If the data looks useful, you haven't committed to anything yet — you've just confirmed the source is worth building around.
Once you're confident in the data, use nimble-agent-builder to turn that one-off pull into a repeatable pipeline — parameterized by location or any other input, and ready to run across hundreds of inputs on a schedule.
This isn't the only way to use the two skills. But for teams moving from an idea to a production data feed, it's a fast, low-risk path that doesn't require any upfront infrastructure decisions.
Building a Real-Time Property Market Tracker
Here's how this plays out for tracking the property market in Austin, TX.
Step 1: The first pull
Ask Claude to pull current listings from Zillow for a specific zip code:
```
Pull all active rental listings from Zillow in Austin, TX 78701
```

Claude checks Nimble's agent library, finds the pre-built zillow_plp agent — maintained by Nimble — and runs it using nimble-web-expert:

```
nimble agent run --agent zillow_plp --params '{"zip_code": "78701", "listing_type": "rentals"}'
```

You get back a structured dataset pulled directly from Zillow's live listings page:
```json
[
  {
    "street_address": "1100 Barton Springs Rd",
    "unit": "Apt 204",
    "full_address": "1100 Barton Springs Rd Apt 204, Austin, TX 78701",
    "city": "Austin",
    "state": "TX",
    "price": 2100,
    "bedrooms": 1,
    "bathrooms": 1,
    "sqft": 720,
    "days_on_market": 4,
    "listing_type": "rentals"
  },
  {
    "street_address": "800 W 5th St",
    "unit": "Apt 310",
    "full_address": "800 W 5th St Apt 310, Austin, TX 78701",
    "city": "Austin",
    "state": "TX",
    "price": 2850,
    "bedrooms": 2,
    "bathrooms": 2,
    "sqft": 1050,
    "days_on_market": 11,
    "listing_type": "rentals"
  }
]
```

No scraping logic, no rendering to configure. Check the output — are the fields complete? Is the data current? If yes, you've validated the source in under a minute with nothing to build.
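Once the JSON is in hand, a few lines of Python are enough to sanity-check it and pull out quick market stats. A minimal sketch, using inline records that mirror the two listings above (in practice you'd load the agent's saved output instead):

```python
import json

# Inline sample mirroring the zillow_plp output shown above (assumed shape)
listings = json.loads("""
[
  {"price": 2100, "sqft": 720, "days_on_market": 4},
  {"price": 2850, "sqft": 1050, "days_on_market": 11}
]
""")

# Completeness check: every listing should have a positive price and size
assert all(l["price"] > 0 and l["sqft"] > 0 for l in listings)

# Simple market signals: average price per square foot, median days on market
avg_price_per_sqft = sum(l["price"] / l["sqft"] for l in listings) / len(listings)
median_days = sorted(l["days_on_market"] for l in listings)[len(listings) // 2]
print(f"avg $/sqft: {avg_price_per_sqft:.2f}, median days on market: {median_days}")
```

If those numbers look plausible against the live site, the source is validated and worth building a pipeline around.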
Step 2: Scale across markets
Once you've confirmed the data looks right, tell Claude explicitly that you want this running as a pipeline:
```
I want to run this every morning across zip codes 78701, 78702, 78703, and 78704 —
for rentals, sales, and recently sold. Aggregate the results into a single market snapshot.
```

That's the trigger for Claude to switch to nimble-agent-builder, which sets up the workflow — running nimble agent run across every zip code and listing type combination:
```
nimble agent run --agent zillow_plp --params '{"zip_code": "78701", "listing_type": "rentals"}'
nimble agent run --agent zillow_plp --params '{"zip_code": "78701", "listing_type": "sales"}'
nimble agent run --agent zillow_plp --params '{"zip_code": "78701", "listing_type": "sold"}'
nimble agent run --agent zillow_plp --params '{"zip_code": "78702", "listing_type": "rentals"}'
# ... and so on across all zip codes and listing types
```

Schedule that to run every morning and your team has a live, structured view of twelve market segments — updated daily, without touching a scraper or maintaining any infrastructure.
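The pipeline is just the cross product of zip codes and listing types. A hedged sketch of how those runs could be generated programmatically — the actual execution line is commented out, since running it requires the Nimble CLI and an API key:

```python
import itertools
import json

zip_codes = ["78701", "78702", "78703", "78704"]
listing_types = ["rentals", "sales", "sold"]

# One `nimble agent run` invocation per (zip code, listing type) combination
commands = [
    ["nimble", "agent", "run", "--agent", "zillow_plp",
     "--params", json.dumps({"zip_code": z, "listing_type": t})]
    for z, t in itertools.product(zip_codes, listing_types)
]

print(len(commands))  # 4 zip codes x 3 listing types = 12 market segments
# To execute each: subprocess.run(cmd, capture_output=True, text=True)
```

In practice nimble-agent-builder handles this orchestration for you; the sketch just shows how little logic the "pipeline" actually involves.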
Taking It Further
The real estate walkthrough is one example. The same two-skill pattern — nimble-web-expert for the first pull, nimble-agent-builder for the pipeline — applies to any domain where your business needs live, structured web data at scale.
Nimble maintains a library of pre-built agents across the verticals where businesses most commonly need reliable, recurring data. A few examples of what that looks like in practice:
E-commerce intelligence: Track product details, pricing, and availability across the major retail platforms:
```
nimble agent run --agent amazon_pdp --params '{"asin": "B08N5WRWNW"}'
nimble agent run --agent walmart_pdp --params '{"product_id": "123456"}'
nimble agent run --agent target_pdp --params '{"url": "https://www.target.com/p/..."}'
```

Search visibility: Monitor how your brand or keywords appear in Google Search results — including AI Overviews:
```
nimble agent run --agent google_search --params '{"query": "best CRM software 2025"}'
nimble agent run --agent google_search_aio --params '{"query": "best CRM software 2025"}'
```

Local market intelligence: Pull business listings, locations, and data from Google Maps for any search query:
```
nimble agent run --agent google_maps_search --params '{"query": "logistics companies Chicago IL"}'
```

Any site, not just the pre-built agents. When there's no pre-built agent for your target, nimble-agent-builder handles the build. Describe what you need in plain language, and Nimble generates and validates the agent. From that point it runs exactly like any other agent: parameterized, reliable, and ready to scale.
Get started at nimbleway.com.