
Support img tag #1

Closed · 0x6b opened this issue Dec 16, 2017 · 6 comments · Fixed by #5

Comments

0x6b (Owner) commented Dec 16, 2017

No description provided.

0x6b (Owner, Author) commented Dec 16, 2017

allenyllee commented

Can this help? https://github.com/euangoddard/clipboard2markdown/tree/master

I use it to convert my copied content, including images, and it converts img tags correctly.
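For readers unfamiliar with the conversion: "converts img tags correctly" means each `<img>` element becomes Markdown image syntax, `![alt](src)`. A minimal sketch of just that step (illustrative only, not clipboard2markdown's actual code, which sits on top of a full HTML-to-Markdown converter):

```js
// Sketch: rewrite each <img> in an HTML fragment as Markdown ![alt](src).
// Runs in a browser, which provides DOMParser.
function imgsToMarkdown(html) {
  const doc = new DOMParser().parseFromString(html, "text/html");
  for (const img of doc.querySelectorAll("img")) {
    const alt = img.getAttribute("alt") || "";
    const src = img.getAttribute("src") || "";
    const title = img.getAttribute("title");
    const md = title ? `![${alt}](${src} "${title}")` : `![${alt}](${src})`;
    img.replaceWith(doc.createTextNode(md));
  }
  return doc.body.innerHTML;
}

// imgsToMarkdown('<p>logo: <img src="logo.png" alt="Logo"></p>')
// → '<p>logo: ![Logo](logo.png)</p>'
// (the remaining HTML would be handled by the surrounding converter)
```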

allenyllee commented

It seems there is another addon that can do this: BlackGlory/Copycat: Copy content from web powerful than ever before.

0x6b (Owner, Author) commented Jul 28, 2018

Sorry for the delay on my end; I have had some health issues recently. Anyway, thanks a lot for your comments @allenyllee. I just updated the addon with preliminary img tag support and tagged it v0.2.0. https://addons.mozilla.org/en-US/firefox/addon/copy-selection-as-markdown/ has been updated as well. Please have a look at the new version and let me know if you run into any issues. Thanks!
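The issue doesn't say how the img support is implemented internally, but one common way for a WebExtension to do this is a custom rule on top of the Turndown library. A hypothetical sketch (the addon's real code may differ):

```js
import TurndownService from "turndown";

const td = new TurndownService();

// Turndown converts <img> out of the box; a custom rule lets you control
// details, e.g. resolving relative src URLs against the current page so
// the copied Markdown still points at the right image.
td.addRule("img", {
  filter: "img",
  replacement: (_content, node) => {
    const alt = node.getAttribute("alt") || "";
    const src = new URL(node.getAttribute("src") || "", document.baseURI).href;
    return `![${alt}](${src})`;
  },
});

// td.turndown('<img src="/logo.png" alt="Logo">')
// → "![Logo](https://example.com/logo.png)"
//   when the page's base URI is https://example.com/
```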

allenyllee commented Jul 30, 2018

@0x6b That's great! In some cases, your addon is even more correct than Copycat. For example, I want to copy this page, which contains a LaTeX formula.

With your addon I get the original LaTeX formula:

I'm trying to understand how backpropagation works for a softmax/cross-entropy output layer.

The cross entropy error function is

E(t,o)=−∑jtjlogoj

E(t,o)=-\\sum\_j t\_j \\log o_j

With Copycat, I instead get the raw HTML element:

I'm trying to understand how backpropagation works for a softmax/cross-entropy output layer.

The cross entropy error function is

<nobr aria-hidden="true">E(t,o)=−∑jtjlogoj</nobr>

But I think it could be improved further: add $$...$$ (or $...$) delimiters, replace the double backslashes with single ones, and remove the duplicated plain-text rendering, like this:

I'm trying to understand how backpropagation works for a softmax/cross-entropy output layer.

The cross entropy error function is

$$E(t,o)=-\sum_j t_j \log o_j$$

Maybe I should open a new issue for this requirement.
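For illustration, a small post-processing sketch of the suggestion above (a hypothetical helper, not part of the addon): unescape the Markdown-escaped TeX the addon currently emits and wrap it in display-math delimiters.

```js
// Hypothetical post-processing for the suggestion above.
function wrapDisplayMath(escapedTex) {
  const tex = escapedTex
    .replace(/\\\\/g, "\\") // double backslash -> single: \\sum -> \sum
    .replace(/\\_/g, "_");  // unescape underscores: \_j -> _j
  return `$$${tex}$$`;
}

// Input  (current output): E(t,o)=-\\sum\_j t\_j \\log o_j
// Output:                  $$E(t,o)=-\sum_j t_j \log o_j$$
//
// Removing the duplicated plain-text rendering ("E(t,o)=−∑jtjlogoj")
// would have to happen at capture time, e.g. by skipping MathJax's
// aria-hidden nodes like the <nobr> element shown above.
```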

0x6b (Owner, Author) commented Jul 31, 2018

That's an interesting use case. Let me see how it can be implemented; I'll open a new issue for it later.

0x6b mentioned this issue Jul 31, 2018