"Charles C. Berry" writes: > On Wed, 8 Oct 2014, Rainer M Krug wrote: > >> "Charles C. Berry" writes: >> >>> On Mon, 6 Oct 2014, Rainer M Krug wrote: >>> >>>> Hi >>>> >>>> The variable transfer of tables from org to R caused sometimes 'could >>>> not find function "read.table"' errors (e.g. when the file was tangled >>>> into a ./data directory which was loaded by the function >>>> devtools::load_all("./")). This can easily be fixed by adding the package >>>> name to the call in R, i.e. replacing =read.table()= with >>>> =utils::read.table()= which is done in this patch. >>> >>> It does fix that one case. >>> >>> But I wonder if that is the best way. >>> >>> The heart of the matter is that load_all eventually calls sys.source, >>> which can be persnickety about finding objects on the search path. See >>> ?sys.source. >>> >>> If the src block you tangle to ./data/ has any code that uses any >>> other objects from utils, stats, datasets or whatever, you will be in >>> the same pickle. >> >> Exactly - that is true. But it is the same when putting this in a >> package (as far as I am aware). >> > > Do you mean that putting `x <- rnorm(10)' into a data/*.R file will > fail when you try to build and check? > > In fact, `R CMD build' will execute it and save the result as a > data/*.rda file. And check will go through. > > devtools::load_all (calling load_data) fails to do that. Which is why > I think this is a devtools issue. OK - point taken. But I still think that the =utils::read.table()= would not hurt, rather make the variable transfer safer. Rainer > > Chuck > > -- Rainer M. Krug email: Rainerkrugsde PGP: 0x0F52F982