Q: Can I use my own proxy with ScrapeGraphAI?
I would like to know if it’s possible to configure a custom proxy of my choice, instead of relying on the default setup or any proxy service included by the tool. This is important for my use case.
Marco_ScrapeGraphAI
May 9, 2025
A: No.
Q: Does ScrapeGraphAI support spanish language?
Marco_ScrapeGraphAI
May 9, 2025
A: Yes.
Q: Nice product :) Questions: Credits - Templates - Prompts - UPDATES - Automations - Storage - Spidy Agents & Proxies
A few questions:
1. When website data updates, will it scrape automatically or on a schedule?
1b. How many credits are used for each of these updates, versus the 10-30 credits the original scrape costs?
1c. Are credits consumed while scanning for live updates/changes?
1d. Will the system alert us about changes to the original data first, before initiating an update, whether automatic or manual?
2. Are there templates or custom templates?
2b. Can we save workflows/prompts to reuse later?
2c. Can we automate webhooks to push to Projects, Google Drive, or similar workflows?
2d. VS Code & Zapier integration?
3. Are Spidy Agents equivalent to workflows for deployment?
Ex: 5 Spidy agents/day = 5 workflows active for the day, either scraping OR watching for updates and alerting on changes?
4. Which Proxy plan are WE on?
Thank you :)
Marco_ScrapeGraphAI
Apr 21, 2025
A: With Spidy Agent you have:
Tier 1: 20 chats a day
Tier 2: 50 chats a day
Tier 3: 100 chats a day
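Question 2c above asks about pushing results onward via webhooks. The answer doesn't address it, so as a product-independent sketch only: a generic pattern is to POST the scraped JSON to a catch-hook endpoint (e.g. a Zapier catch hook). The function names and the hook URL here are illustrative placeholders, not part of any ScrapeGraphAI API.

```python
import json
import urllib.request

def to_webhook_body(payload):
    """Serialize scraped results for a JSON webhook POST."""
    return json.dumps(payload).encode("utf-8")

def push_to_webhook(url, payload):
    """POST results to any webhook endpoint; the URL is supplied by the caller."""
    req = urllib.request.Request(
        url,
        data=to_webhook_body(payload),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)

# Usage (placeholder hook URL -- substitute your own; network call not run here):
# push_to_webhook("https://hooks.zapier.com/hooks/catch/<your-id>/",
#                 {"title": "Example product", "price": 19.99})
```

From a Zapier catch hook you can then fan out to Google Drive, project tools, etc. without the scraper needing to know about them.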
Thanks - I know it's probably not easy with the language barrier. :) But I asked about HOW the credits are used in updates, not how many we get with each tier; that's obvious from the deal table. I would love an answer please, and I'm sure many would use this app more if the communication gap could be filled somehow. :)
Q: Does it work on websites that need a subscription and require login?
Marco_ScrapeGraphAI
Apr 21, 2025
A: You have to handle the login yourself.
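Since the answer says the login is on you, one generic way to do it (Python stdlib only; the URLs and form-field names below are placeholders that vary per site) is to keep a cookie-backed session, POST the login form, then fetch the authenticated page and hand its HTML to whatever scraper you use:

```python
import http.cookiejar
import urllib.parse
import urllib.request

def build_session():
    """Create an opener that keeps cookies across requests, acting as a session."""
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
    return opener, jar

def encode_login_form(username, password):
    """Encode the login form body; the field names are site-specific assumptions."""
    return urllib.parse.urlencode({"username": username, "password": password}).encode("utf-8")

# Usage (placeholder URLs -- network calls left commented out):
# opener, jar = build_session()
# opener.open("https://example.com/login", data=encode_login_form("user", "pass"))
# html = opener.open("https://example.com/members-only").read().decode("utf-8")
# ...then feed `html` to your scraper.
```

Sites with JavaScript-driven logins or CSRF tokens need more work (e.g. a headless browser), but the session-cookie idea is the same.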
Q: Websites with login
Does it work with websites that require login?
Can I navigate between multiple pages within one website to collect different data from each?
Thanks
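On the multi-page part of this question: independent of what the product itself supports, a simple pattern is to pair each page of the site with its own extraction goal and work through them in one run. The page paths and prompts below are made-up placeholders.

```python
from urllib.parse import urljoin

# One site, several pages, a different extraction goal per page.
BASE = "https://example.com/"
PAGE_PROMPTS = {
    "products": "List every product name and price",
    "about": "Extract the company address and contact email",
}

def build_jobs(base, page_prompts):
    """Pair each absolute page URL with its own extraction prompt."""
    return [(urljoin(base, path), prompt) for path, prompt in page_prompts.items()]

jobs = build_jobs(BASE, PAGE_PROMPTS)
# Each (url, prompt) pair can then be submitted as its own scrape request.
```

Keeping the URL/prompt mapping in one place makes it easy to add or drop pages without touching the scraping loop.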