Q: Can I use this as a data scraper for LinkedIn, for example?
I am looking for a tool to help me extract names and emails from LinkedIn.
Q: How do you get Markdown export to work?
I clicked the "Run" button when trying to export Markdown, but instead of Markdown, it only gives me the option to download JSON, XLSX, or CSV. I watched the tutorial video, but the screen in the video does not look at all like the real screen.
Q: About: Credits, Templates, Prompts, Scrape Updates, Automations, Storage, Agents & Proxies... ?
1. When website data is "updated," will it scrape automatically or on a schedule?
1b. How many credits are used for each of the "updates," versus the 10-30 credits it costs for the original scrape?
1c. Will the system alert us FIRST about changes to the original data, BEFORE initiating an update, whether automatic or manual?
2. Are there templates or prompts for AI scraping, etc.?
2b. Can we save workflows/prompts to reuse later?
2c. Can we automate the API & webhooks to push results to the next workflow, or to G-Drive, etc.?
2d. VSCode & Zapier Integrations?
3. Are Spidy Agents equal to a "Workflow" for deployment?
Ex: 5 Spidy agents/day = 5 workflows active for the day, either scraping OR on update/alerting for changes?
4. Which Proxy plan are WE on?
Thx for the replies :)
Q: Additional Credit Cost
How much do credits cost after you run out of the initial amount?
https://scrapegraphai.com/pricing
Does that mean it's not credit-based? After we run out of our initial credits here, do we have to switch to a subscription?
It's a joke! If we could at least stay in the growth plan and buy additional one-time credits, I would consider buying, but this is a scam from AppSumo.
Not sure what you're looking to build, but I did something similar to what I originally intended by combining an HTML scraper from RapidAPI with OpenAI.
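For anyone curious what that DIY approach looks like: a minimal sketch of the pattern in Python, assuming you fetch raw HTML from any scraper API and then hand the cleaned text to an LLM for extraction. The sample HTML, field names, and prompt wording below are illustrative assumptions, not anything from ScrapeGraphAI itself; the HTML-to-text step uses only the standard library.

```python
# Sketch: reduce scraped HTML to plain text, then prompt an LLM to extract fields.
# The OpenAI call is shown only as a comment; the text-cleaning step runs as-is.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())


def html_to_text(html: str) -> str:
    """Strip tags and scripts, returning space-joined visible text."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)


# Hypothetical page snippet (would come from your scraper API in practice):
sample_html = (
    "<html><head><style>p{color:red}</style></head>"
    "<body><p>Acme Widget</p><p>$19.99</p></body></html>"
)
page_text = html_to_text(sample_html)
print(page_text)  # Acme Widget $19.99

# You would then send `page_text` to an LLM, e.g. via the OpenAI chat API,
# with a prompt such as:
#   "Extract the product name and price from this page text as JSON."
```

The cleanup step matters because raw HTML wastes tokens and confuses extraction; sending only the visible text keeps the LLM call cheap and focused.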
Q: Can I use my own proxy with ScrapeGraphAI?
I would like to know if it’s possible to configure a custom proxy of my choice, instead of relying on the default setup or any proxy service included by the tool. This is important for my use case.
Marco_ScrapeGraphAI (May 9, 2025)
A: No.
They are included.