small change to attempt to stop tempfile() problems in larger documents. #2293
abelew wants to merge 3 commits into yihui:master from
Conversation
yihui
left a comment
I wish to understand the problem before fixing it. I don't see the limitation mentioned on the help page `?tempfile`, and I tested a million temp files and it seems to work fine for me:

```r
for (i in 1:1e6) {
  f = tempfile(); file.create(f); file.remove(f)
}
```

Do you have more than a million plots in your document?
```r
> xfun::session_info()
R version 4.3.1 (2023-06-16)
Platform: aarch64-apple-darwin20 (64-bit)
Running under: macOS Ventura 13.5.2, RStudio 2023.9.1.494
Locale: en_US.UTF-8 / en_US.UTF-8 / en_US.UTF-8 / C / en_US.UTF-8 / en_US.UTF-8
Package version:
compiler_4.3.1 graphics_4.3.1 grDevices_4.3.1 rstudioapi_0.15.0 stats_4.3.1 tools_4.3.1 utils_4.3.1
xfun_0.40.2
```
I think I replied via email, but the following example is more similar to what happens in my documents, and shows that you only need ~30 plots (at least in my hands) to hit a duplicate tempfile name:
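The example referred to above is not reproduced in this excerpt. As a neutral sanity check (a sketch, not the commenter's code), one can generate a batch of temporary names without creating the files and count duplicates; note that per `?tempfile` the names are only "very likely" to be unique:

```r
# Generate 30 plot-like temp names without touching the filesystem,
# then look for duplicates among them.
nm <- replicate(30, tempfile(pattern = "plot", fileext = ".png"))
anyDuplicated(nm)  # index of the first duplicate, or 0 if all names are unique
```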
Sorry, but I don't quite understand the code, since I'm not familiar with these packages. In particular, I don't understand what […]. Anyway, it will be clearer if you can show a minimal reproducible example that actually uses knitr.
Greetings,

When knitting larger documents with many images, I sometimes get tempfile errors saying it ran out of temporary file names. When I looked more closely, it seemed to me that tempfile() was just not trying very hard, so I hacked together a quick md5-based tempfile generator which in theory takes all the same arguments as base::tempfile(). I was thinking to make it more robust and use some actual pseudorandomly generated material, but this has worked fine for all of my largest and most troublesome documents.