Upwork Jobs Scraper List

Categories: jobs

Difficulty: Extreme

Effortlessly browse and filter job listings on Upwork with the Upwork Jobs Scraper 🛠️💼. It extracts the essential details of each listing, including job title, posting date, job type, experience level, budget, and more, all in one place to streamline your freelance search.

Request Example

time curl -X POST "https://api.blat.ai/harvest" \
  --json '{
    "mode": "crawl",
    "id": "upwork-com-8b609c36",
    "params": {
      "start_url": "https://www.upwork.com/nx/search/jobs/?amount=5000-&client_hires=1-9&contractor_tier=3&q=%28%22web%20scraping%22%20OR%20%22data%20scraping%22%20OR%20%22data%20ingestion%22%29&sort=recency&t=1"
    }
  }' \
  -H "X-API-KEY: ${BLAT_API_KEY}"
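Note that curl's --json flag requires curl 7.82 or newer. For reference, the same request can be issued from Python with the requests library; this is a minimal sketch, assuming only the endpoint, crawler id, and X-API-KEY header shown above, with BLAT_API_KEY set in the environment:

import os
import requests

# Same payload as the curl example above
payload = {
    "mode": "crawl",
    "id": "upwork-com-8b609c36",
    "params": {
        "start_url": (
            "https://www.upwork.com/nx/search/jobs/"
            "?amount=5000-&client_hires=1-9&contractor_tier=3"
            "&q=%28%22web%20scraping%22%20OR%20%22data%20scraping%22"
            "%20OR%20%22data%20ingestion%22%29&sort=recency&t=1"
        )
    },
}

response = requests.post(
    "https://api.blat.ai/harvest",
    json=payload,  # sends Content-Type: application/json, like curl --json
    headers={"X-API-KEY": os.environ["BLAT_API_KEY"]},
)
response.raise_for_status()
data = response.json()  # expected shape: {"job_listings": [...]}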

Output Example

{
    "job_listings": [
        {
            "title": "Lead Generation Expert for Local B2C Businesses",
            "posted_date": "6 days ago",
            "job_type": "Fixed price",
            "job_url": "/jobs/Lead-Generation-Expert-for-Local-B2C-Businesses_~021839605849080858946/?referrer_url_path=/nx/search/jobs/",
            "experience_level": "Expert",
            "estimated_budget": "$10,000.00",
            "description": "Manual Lead Generation Specialist for Home Renovation / Remodeling Companies\n\nI'm seeking an experienced lead generation specialist to build high-quality lists of home renovation and remodeling companies. This is for a web design agency targeting local businesses in this niche. You must have a powerful computer with at LEAST 8GB ram.\n\nRequirements:\n\nManually gather or scrape lead information from Yellow Pages, Facebook, Instagram, Google Maps, etc. Strictly NO use of automated tools like Apollo.io.\n\nI recommend using Leadswift and verifying manually from there.\n\nProvide VALID contact details including:\n\u2022 Company name\n\u2022 Owner's first and last name\n\u2022 Phone number\n\u2022 Email address\n\u2022 Website\n\u2022 Number of reviews on google\n\u2022 Average rating (of the google reviews)\n\nEnter data clearly in a Google Spreadsheet\nVerify accuracy of all information provided\n\nImportant:\n\nAny lead with incorrect information or invalid details will be considered invalid\nAny lead that is in a different industry than what we require (for example you get the details of a roofer instead of a Home renovator) will be considered invalid\nYou will ONLY get paid for VALID leads.\n\nOnly local home renovation/remodeling businesses\nType \"peach\" in the beginning of the application so i know you're reading\nExcellent communication skills and fluency in English required\nStrong work ethic and ability to deliver quality results efficiently\n\nPayment Terms:\n\n$75 per 1,000 verified, working leads\nPayment only for leads confirmed to be valid and accurate\nRegular quality checks will be performed\nBonus: $50 per 1000 if 100%+ accuracy achieved\n\nTop performers will be paid more and move to long term position\n\nTO APPLY:\nProvide 5 sample leads meeting our criteria and share in a google sheet.\nThe leads must be in the home renovation/remodeling industry and be from Tampa, Florida USA.\n\nWe will only be working with experts!",
            "skills": [
                "Data Scraping",
                "Data Entry",
                "Data Cleaning",
                "Lead Generation",
                "List Building",
                "Prospect List",
                "Market Research"
            ]
        },
        {
            "title": "Data Engineer with Snowflake and Looker",
            "posted_date": "3 weeks ago",
            "job_type": "Fixed price",
            "job_url": "/jobs/Data-Engineer-with-Snowflake-and-Looker_~021834485974922821268/?referrer_url_path=/nx/search/jobs/",
            "experience_level": "Expert",
            "estimated_budget": "$25,000.00",
            "description": "Senior Data Engineer (Snowflake/ETL)\nExperience: 8+ Years\n\u2022\tWhat makes you a great fit:\n\u2022\tYou have 5+ years of extensive development experience using Snowflake or\n\u2022\tsimilar data warehouse technology\n\u2022\tYou have working experience with dbt and other technologies of the modern\n\u2022\tdatastack, such as Snowflake, Apache Airflow, Fivetran, Looker, AWS, git.\n\u2022\tYou have experience in agile processes, such as SCRUM\n\u2022\tYou have extensive experience in writing advanced SQL statements and\n\u2022\tperformance tuning them\n\u2022\tYou have experience inDataIngestiontechniques using custom or SAAS tool\n\u2022\tlike fivetran\n\u2022\tYou have experience in data modelling and can optimise existing/new data\n\u2022\tmodels\n\u2022\tYou have experience in data mining, data warehouse solutions, and ETL, and\n\u2022\tusing databases in a business environment with large-scale, complex datasets\n\u2022\tYou having experience architecting analytical databases (in Data Mesh\n\u2022\tarchitecture) is added advantage\n\u2022\tYou have experience working in agile cross-functional delivery team\n\u2022\tYou have high development standards, especially for code quality, code reviews,\n\u2022\tunit testing, continuous integration and deployment\n\u2022\tAs a Analytics Engineer you\u2019ll be:\n\u2022\tDeveloping end to end ETL/ELT Pipeline working with Data Analysts of business\n\u2022\tFunction.\n\u2022\tDesigning, developing, and implementing scalable, automated processes for\n\u2022\tdata extraction, processing, and analysis in a Data Mesh architecture\n\u2022\tMentoring other Junior Engineers in the Team\n\u2022\tBe a \u201cgo-to\u201d expert for data technologies and solutions\n\u2022\tAbility to provide on the ground troubleshooting and diagnosis to architecture\n\u2022\tand design challenges\n\u2022\tTroubleshooting and resolving technical issues as they arise\n\u2022\tLooking for ways of improving both what and how data pipelines are delivered\n\u2022\tby the department\n\u2022\tTranslating business requirements into technical requirements, such as entities\n\u2022\tthat need to be modelled, DBT models that need to be build, timings, tests and\n\u2022\treports\n\u2022\tOwning the delivery of data models and reports end to end\n\u2022\tPerform exploratory data analysis in order to identify data quality issues early in\n\u2022\tthe process and implement tests to ensure prevent them in the future\n\u2022\tWorking with Data Analysts to ensure that all data feeds are optimised and\n\u2022\tavailable at the required times. This can include Change Capture, Change Data\n\u2022\tControl and other \u201cdelta loading\u201d approaches\n\u2022\tDiscovering, transforming, testing, deploying and documenting data sources\n\u2022\tApplying, help defining, and championing data warehouse governance: data\n\u2022\tquality, testing, coding best practises, and peer review\n\u2022\tBuilding Looker Dashboard for use cases if required",
            "skills": [
                "Apache Airflow",
                "Looker",
                "Git",
                "SQL CLR",
                "ETL"
            ]
        },
        {
            "title": "Full-Stack Mobile App Developer for Meal Planning and Price Comparison App (iOS)",
            "posted_date": "3 weeks ago",
            "job_type": "Fixed price",
            "job_url": "/jobs/Full-Stack-Mobile-App-Developer-for-Meal-Planning-and-Price-Comparison-App-iOS_~021833478549417340407/?referrer_url_path=/nx/search/jobs/",
            "experience_level": "Expert",
            "estimated_budget": "$10,000.00",
            "description": "Job Description:\n\nWe are looking for a highly skilled full-stack mobile app developer to help us build a cross-platform mobile application (initially focusing on iOS). The app is designed to assist users in planning their meals and saving money by comparing supermarket prices. The ideal candidate will have experience with mobile app development, backend services, API integration, andwebscraping.\n\nKey Responsibilities:\n\nDevelop a Cross-Platform App: Build a responsive and user-friendly app with a beautiful UI. The app will initially focus on iOS development using Flutter or React Native.\n\nBackend Setup: Set up a secure backend using either Firebase or AWS Amplify for user authentication, data storage, and real-time syncing.WebScrapingIntegration: Implement a solution for scraping real-time supermarket prices across different countries. The scraping should be efficient, and the results should be cached for improved performance.\n\nAPI Integrations: Integrate third-party APIs to provide users with recipe search functionality and manage user-specific data such as meal plans, saved recipes, and shopping lists.\n\nAffiliate System: Build or integrate an affiliate tracking system that allows affiliates to view their earnings and track referrals.\n\nNotifications: Set up push notifications for meal prep reminders and updates on price changes.\n\nPerformance Optimization: Ensure the app remains responsive with sub-2-second load times, especially when fetching price data.\n\nRequired Skills:\n\nMobile App Development: Proficiency in Flutter or React Native with experience building cross-platform apps.\n\nBackend Services: Experience with Firebase or AWS Amplify for handling user authentication, real-time data storage, and cloud functions.WebScraping: Knowledge ofwebscrapingtechniques using tools like Scrapy, Puppeteer, or BeautifulSoup to scrape price data from supermarket websites.\n\nAPI Integration: Experience integrating third-party APIs (e.g., recipe search, geolocation).\n\nDatabase Management: Familiarity with NoSQL and SQL databases for handling structured and unstructured data.\n\nAffiliate Systems: Experience with integrating or building affiliate tracking systems is a plus.\n\nSecurity and Compliance: Understanding of best practices for app security and data compliance (GDPR).\n\nPreferred Experience:\n\nExperience with Superwall or other analytics and subscription management tools.\n\nPrevious experience in developing apps that involve price comparison or ecommerce.\n\nFamiliarity with app performance optimization, particularly caching strategies and background processing.\n\nProject Scope:\n\nThe initial version of the app will focus on core functionalities like meal planning, recipe management, price comparison, and affiliate tracking. The app will be developed for iOS, with the potential to expand to Android in future iterations.\n\nHow to Apply:\nPlease provide:\n\nA brief introduction outlining your experience in mobile app development.\n\nLinks to any relevant projects you\u2019ve worked on.\n\nYour approach to building scalable and efficient apps.\n\nAvailability and estimated time to complete the project.",
            "skills": [
                "Web Scraping",
                "AWS Amplify",
                "Firebase Realtime Database",
                "Mobile App Development",
                "Flutter Stack",
                "React"
            ]
        },
        {
            "title": "Data Engineer",
            "posted_date": "3 months ago",
            "job_type": "Fixed price",
            "job_url": "/jobs/Data-Engineer_~011791430a83dabe19/?referrer_url_path=/nx/search/jobs/",
            "experience_level": "Expert",
            "estimated_budget": "$28,000.00",
            "description": "Senior Data Engineer (Snowflake/ETL)\n\nExperience: 8+ Years\nWhat makes you a great fit:\n\u2022 You have 5+ years of extensive development experience using Snowflake or\nsimilar data warehouse technology\n\u2022 You have working experience with dbt and other technologies of the modern\ndatastack, such as Snowflake, Apache Airflow, Fivetran, Looker, AWS, and git.\n\u2022 You have experience in agile processes, such as SCRUM\n\u2022 You have extensive experience in writing advanced SQL statements and\nperformance tuning them\n\u2022 You have experience inDataIngestiontechniques using custom or SAAS tool\nlike Fivetran\n\u2022 You have experience in data modelling and can optimise existing/new data\nmodels\n\u2022 You have experience in data mining, data warehouse solutions, and ETL, and\nusing databases in a business environment with large-scale, complex datasets\n\u2022 You having experience architecting analytical databases (in Data Mesh\narchitecture) is added advantage\n\u2022 You have experience working in agile cross-functional delivery team\n\u2022 You have high development standards, especially for code quality, code reviews,\nunit testing, continuous integration and deployment\nAs a Analytics Engineer you\u2019ll be:\n\u2022 Developing end to end ETL/ELT Pipeline working with Data Analysts of business\nFunction.\n\u2022 Designing, developing, and implementing scalable, automated processes for\ndata extraction, processing, and analysis in a Data Mesh architecture\n\u2022 Mentoring other Junior Engineers in the Team\n\u2022 Be a \u201cgo-to\u201d expert for data technologies and solutions\n\u2022 Ability to provide on the ground troubleshooting and diagnosis to architecture\nand design challenges\n\u2022 Troubleshooting and resolving technical issues as they arise\n\u2022 Looking for ways of improving both what and how data pipelines are delivered\nby the department\n\u2022 Translating business requirements into technical requirements, such as entities\nthat need to be modelled, DBT models that need to be build, timings, tests and\nreports\n\u2022 Owning the delivery of data models and reports end to end\n\u2022 Perform exploratory data analysis in order to identify data quality issues early in\nthe process and implement tests to ensure prevent them in the future\n\u2022 Working with Data Analysts to ensure that all data feeds are optimised and\navailable at the required times. This can include Change Capture, Change Data\nControl and other \u201cdelta loading\u201d approaches\n\u2022 Discovering, transforming, testing, deploying and documenting data sources\n\u2022 Applying, help defining, and championing data warehouse governance: data\nquality, testing, coding best practises, and peer review\n\u2022 Building Looker Dashboard for use cases if required",
            "skills": [
                "BigQuery",
                "Snowflake",
                "MySQL",
                "Looker"
            ]
        }
    ]
}
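Once parsed, the job_listings array is plain JSON and straightforward to post-process. Continuing the Python sketch above (where data = response.json()), here is a hedged example that keeps listings tagged with a scraping-related skill and converts the estimated_budget string to a number; the field names follow the output shown here, and the parsing is deliberately defensive since budget strings may vary:

def budget_to_float(budget):
    # "$10,000.00" -> 10000.0; None for missing or non-numeric values
    try:
        return float(budget.replace("$", "").replace(",", ""))
    except (AttributeError, ValueError):
        return None

# Keep only listings tagged with a scraping-related skill
scraping_jobs = [
    job for job in data["job_listings"]
    if {"Data Scraping", "Web Scraping"} & set(job.get("skills", []))
]

# Highest budget first
scraping_jobs.sort(
    key=lambda job: budget_to_float(job.get("estimated_budget")) or 0.0,
    reverse=True,
)

for job in scraping_jobs:
    print(f"{job['title']} | {job['estimated_budget']} | {job['experience_level']}")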