
Version 102 works normally. After Firefox is upgraded to 103 or later, stream writing fails during export and the downloaded file is incomplete. #292

Closed
davidpan123 opened this issue Oct 14, 2022 · 10 comments · Fixed by #305

Comments

@davidpan123

Compared with version 102, version 103 fixes the following security vulnerabilities: https://www.mozilla.org/en-US/security/advisories/mfsa2022-28/

@ISummerRainI

I'm facing this issue too; I tried the solution from #290, but the problem is still the same.

@davidpan123 (Author)

Console error: Failed to load ''. A ServiceWorker intercepted the request and encountered an unknown error.

@TommyBacco

Any update on this matter? Has anyone managed to work around it?

@Harshit-Pratap-Singh

The download stops at 1.5 GB in Firefox.

@yxq-neuralgalaxy commented Dec 8, 2022

When I'm downloading a 500 MB file using Firefox, pipeTo never settles; neither its then nor its catch callback runs.

const fileStream = streamsaver.createWriteStream(title);
const readableStream = response.body;
if (readableStream) {
  if (readableStream.pipeTo) {
    return readableStream.pipeTo(fileStream)
      .then(() => {
        // when downloading big files, this never runs in Firefox
      })
      .catch(error => {
        // ...and neither does this
      });
  }
  // Fallback when pipeTo is unavailable: pump the stream manually.
  // (In practice this path only ever ran for small files.)
  const writer = fileStream.getWriter();
  const reader = readableStream.getReader();
  const pump = () => reader.read()
    .then(res => res.done
      ? writer.close()
      : writer.write(res.value).then(pump));
  await pump();
}

Firefox v107.0.1, streamsaver v2.0.6, web-streams-polyfill v3.2.1

The same code works normally in Chrome.

@jimmywarting (Owner)

It seems like browsers are taking extra measures to prevent a third-party origin from downloading content through a service worker hosted on another domain... Firefox may be even more restrictive nowadays.

I guess the best course of action is to provide a guide on how to download files using a self-hosted service worker.
I have found one potential iframe sandbox flag that might help:

https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox

  • allow-downloads-without-user-activation Experimental: Allows for downloads to occur without a gesture from the user.
  • allow-downloads: Allows for downloads to occur with a gesture from the user.

maybe this ☝️ can help solve the Firefox problem?
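
For anyone who wants to experiment with both ideas, here is a minimal sketch. It uses StreamSaver's documented mitm option; the your-app.example.com URL, file paths, and sandbox setup are illustrative assumptions, not a verified fix:

// 1) Self-hosted service worker: point StreamSaver at your own mitm page
//    so the download runs on your origin rather than the default
//    jimmywarting.github.io one. Assumes you copy mitm.html and sw.js
//    from the StreamSaver repo onto your own server.
import streamSaver from 'streamsaver';

streamSaver.mitm = 'https://your-app.example.com/streamsaver/mitm.html';

// 2) Sandboxed iframes: if the download UI lives inside a sandboxed
//    iframe, it needs the download flags from the MDN page above.
//    (allow-downloads-without-user-activation is experimental and may
//    not be available in Firefox.)
const frame = document.createElement('iframe');
frame.src = '/download-ui.html';
frame.setAttribute('sandbox', 'allow-scripts allow-same-origin allow-downloads');
document.body.appendChild(frame);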

@Harshit-Pratap-Singh commented Dec 14, 2022

I am using this code to write data coming in chunks from another user over a peer-to-peer connection. It works fine in Chrome and can handle large files, but when Firefox is the receiving end the download fails at 1.5 GB.

// Refs hold the active stream/writer between incoming messages
function helperReceiveFile(e) {
  // A "start" control message carries the file metadata as JSON
  if (!fileStream.current && e.data.toString().includes("start")) {
    const meta = JSON.parse(e.data);
    fileStream.current = streamsaver.createWriteStream(meta.fileName, { size: meta.size });
    fileStreamWriter.current = fileStream.current.getWriter();
    return;
  }
  handleReceiveFile(e);
}

function handleReceiveFile(e) {
  // A "done" control message closes the stream
  if (e.data.toString().includes("done")) {
    setGotFile(true);
    fileStreamWriter.current.close();
    fileStream.current = null;
    fileName.current = JSON.parse(e.data).fileName;
  } else {
    // Binary chunk: write it straight to disk
    fileStreamWriter.current.write(new Uint8Array(e.data)).catch(err => {
      console.log("write error", err);
    });
  }
}

I want to handle large files, so I have to write the data directly to disk. @jimmywarting

@arcsun commented Jan 31, 2023

(Quoting @Harshit-Pratap-Singh's comment and code above.)

If your server supports HTTP Range requests, you can split the download into smaller ranged requests and still write all the responses to one file.
I'm using 50 MB chunks and it works fine.
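
A minimal sketch of that approach, assuming the server honors Range requests; the url, title, and 50 MB chunk size here are placeholders:

import streamSaver from 'streamsaver';

const CHUNK = 50 * 1024 * 1024; // 50 MB per ranged request

async function rangedDownload(url, title) {
  // Total size from a HEAD request (assumes the server sends Content-Length)
  const head = await fetch(url, { method: 'HEAD' });
  const size = Number(head.headers.get('Content-Length'));

  const writer = streamSaver.createWriteStream(title, { size }).getWriter();

  for (let start = 0; start < size; start += CHUNK) {
    const end = Math.min(start + CHUNK, size) - 1; // Range is inclusive
    const res = await fetch(url, { headers: { Range: `bytes=${start}-${end}` } });
    // Pump this chunk's body into the single output stream
    const reader = res.body.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      await writer.write(value);
    }
  }
  await writer.close();
}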

@jeremyckahn (Contributor)

I've noticed that Firefox doesn't prematurely kill the Service Worker process when the Firefox DevTools have the sw.js file open.

jeremyckahn added a commit to jeremyckahn/StreamSaver.js that referenced this issue Feb 24, 2023
@jeremyckahn (Contributor)

It seems that #305 should fix this issue.
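
For context: a common mitigation for a browser terminating an idle Service Worker mid-download is to ping the worker periodically from the page, since handling a message resets its idle timer. A minimal sketch of that general pattern follows; the interval and message shape are illustrative assumptions, and this is not necessarily what #305 implements:

// Page side: keep the service worker alive for the duration of a download.
// sw.js must register a 'message' listener; merely receiving the event
// resets the worker's idle timer.
const keepAlive = setInterval(() => {
  if (navigator.serviceWorker.controller) {
    navigator.serviceWorker.controller.postMessage('ping');
  }
}, 10000);

// Clear the interval once the stream is fully written,
// e.g. after writer.close() resolves:
// clearInterval(keepAlive);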
