
I am attempting to create an HTML page that contains an input field and a button. When the user clicks the button, I want the page to run a Node.js file.
Currently, the HTML page simply displays the input, but I would like the input to be fed into a function in the Node.js file and the output to be displayed/stored.
From my understanding, this is only possible with a live server. I have attempted to use npm live-server as well as the VS Code Live Server extension.

However, whenever I start the server I get this error in the console:

GET http://127.0.0.1:8080/RecipeScrapper/index.js net::ERR_ABORTED 404 (Not Found)
In the **Network tab** the summary says:
Request URL: http://127.0.0.1:8080/RecipeScrapper/index.js
Request Method: GET
Status Code: 404 Not Found
Referrer Policy: strict-origin-when-cross-origin

This is my File Structure

Desktop
| -- node_modules(folder)
| -- index.html
| -- index.js
| -- package.json
| -- package-lock

Relevant part of html file:

<head>
    <title>Recipe</title>
    <script type="module" src="/RecipeScrapper/index.js"></script>
</head>
<body>
    ... 
    <button onclick="Display(document.getElementById('a').value)">Display Link</button>
</body>

Relevant part of index.js:
import * as cheerio from 'cheerio';
const fetch = (...args) => import('node-fetch').then(({ default: fetch }) => fetch(...args));

async function getItems(a) {
  try {
    const response = await fetch(a);
    const body = await response.text();
    const $ = cheerio.load(body);
    const $wrapper4 = $('[data-test-id="ingredients-list"] [data-test-id="ingredient-item-shipped"]');
    let uitems = [];

    $wrapper4.each((i, div) => {
      uitems.push($(div).text());
    });
    for (let i = 0; i < uitems.length; i++) {
      ...
    }
    // print
    console.log(uitems);
  } catch (error) {
    console.log(error);
  }
}
getItems('https://www.hellofresh.com/recipes/peppercorn-steak-w06-5857fcd16121bb11c124f383');

Files Repo: https://github.com/Anthony-S4/RecipeScrapper

I am very new to web development, and I may be misunderstanding the difference between Node.js files and JS files.
From what I’ve seen, it should be possible to accomplish this task using a live server. If I am mistaken, an explanation of the difference between Node.js and JS files utilized in the browser would be appreciated. Along with that, any pointers as to how I can create a program/website to accomplish this task would also be appreciated.

Thank you.

3 Answers


  1. From what I’ve seen it should be possible to accomplish this task using a live-browser.

    It isn’t. JavaScript running in a web browser doesn’t have access to data on other websites unless that website gives explicit permission (via CORS headers).

    If I am misunderstood, an explanation of the difference between NodeJS and JS files being utilized in the browser would be appreciated.

    The key differences that are relevant here are that:

    • Node.js can resolve module names by searching the file system, while browsers work on URLs
    • Node.js and browsers have different APIs available to them (e.g. Node.js can make any HTTP request to any URL and read the response).

    any pointers as to how I can create an program/website to accomplish this task would also be appreciated

    Write a web server using Node.js. Express is a popular library to do that. Call the code you have that depends on Node.js from an Express.js endpoint. Have the browser make an HTTP request to that endpoint (e.g. by submitting a form or using Ajax).
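
    A minimal sketch of that setup. This uses Node’s built-in `http` module so it runs with no dependencies (Express tidies up the routing, but the idea is identical), and `getItems` here is only a placeholder standing in for the cheerio scraping function from the question’s index.js. The `/api/items` path and port 3000 are made-up examples:

    ```javascript
    // Server-side sketch: the Node-only scraping code lives behind an
    // HTTP endpoint that the browser can call.
    const http = require('http');

    // Placeholder: the real implementation would fetch `url` and run
    // the cheerio scraping logic from index.js over the response body.
    async function getItems(url) {
      return ['example ingredient'];
    }

    const server = http.createServer(async (req, res) => {
      const parsed = new URL(req.url, `http://${req.headers.host}`);
      if (parsed.pathname === '/api/items') {
        // The browser would call e.g.
        //   fetch('/api/items?url=' + encodeURIComponent(recipeUrl))
        const items = await getItems(parsed.searchParams.get('url'));
        res.setHeader('Content-Type', 'application/json');
        res.end(JSON.stringify(items));
      } else {
        res.statusCode = 404;
        res.end('Not found');
      }
    });

    server.listen(3000, () => console.log('Listening on http://localhost:3000'));
    ```

    The browser-side `Display` function then makes a `fetch` request to this endpoint instead of trying to import index.js directly.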

  2. Postman will help you quite a bit when using APIs: https://www.postman.com/ (it’s free). Also, depending on how your server is set up, you might not see the data in Live Server, but if you go to localhost:8080 (the port from your GET request) you will see the returned data!

    I found cheerio weird at times and very touchy. Here is a link to my resources using cheerio to scrape from one and multiple websites:
    https://github.com/Hazey8709/webScraper-Tut
    https://github.com/Hazey8709/Practice

  3. The part @Quentin is not telling you is that not all sites can be scraped this way. For what you are trying to do, I would scrape the data (if it can be), then turn the data into JSON, then use the DOM to display the information needed (on a button click or on page load).

    Web-scraper: https://github.com/kubowania/nodejs-webscraper
    To display data: https://www.youtube.com/watch?v=A7mkCmJXe_8

    Here are the 2 links; if you follow them in order and make the changes you need for your own data, you will be able to display it in your live server!

    Different route:
    Find an API that has the data you need and use CRUD operations (Create, Read, Update, Delete) accordingly, via the corresponding HTTP methods (POST, GET, PUT, DELETE).
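
    As a sketch of the display step: keeping the HTML-building in a pure function means only one line touches the DOM. The `/api/items` endpoint URL and the `output` element id here are hypothetical examples, not anything from the question’s code:

    ```javascript
    // Turn an array of scraped items into an HTML list string.
    // Pure function: no DOM needed, so it can be tested anywhere.
    function itemsToHtml(items) {
      return '<ul>' + items.map((item) => `<li>${item}</li>`).join('') + '</ul>';
    }

    // Browser-side: GET (the "Read" of CRUD) the JSON from an API
    // endpoint and render it into a (hypothetical) #output element.
    async function loadAndDisplay(apiUrl) {
      const res = await fetch(apiUrl);
      const items = await res.json();
      document.getElementById('output').innerHTML = itemsToHtml(items);
    }
    ```

    Wiring `loadAndDisplay('/api/items?url=...')` to the button’s click handler replaces the direct script import that was 404-ing.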
