-
Notifications
You must be signed in to change notification settings - Fork 559
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
perl segfaults when freeing deeply nested structures #8978
Comments
From perl-5.8.0@ton.iguana.beCreated by perl-5.8.0@ton.iguana.beThe following program builds a double linked list, and then Increasing the 10000 to 1000000 makes more perl versions I was unable to reduce the program further. #! /usr/bin/perl -w for (1..10000) { # attach node in front # Make nodes single connected Perl Info
|
From enache@rdslink.roOn Sat, May 03, 2003 at 02:53:08PM -0000, perl-5.8.0@ton.iguana.be (via RT) wrote:
Simple hard stack overflow caused be deep recursion. Solution: adjust your limits. ex. $ perl 22095.pl Regards, |
From perl5-porters@ton.iguana.beIn article <20030505152443.GA2169@ratsnest.hole>,
Mmm, since this happens in a module that should be always useable without Since my program has no recursion, I assume the recursion is dec_thingy { which would then be called recursively for as many times as my linear Since e.g. on linux the default stack limit I get is only 8192, maybe This also begs the question why 5.6.1 on the same machine can handle |
From sky@nanisky.comOn måndag, maj 5, 2003, at 18:50 Europe/Stockholm, Ton Hospel wrote:
Patches welcome :) Arthur |
From perl5-porters@ton.iguana.beIn article <61BDB378-81FE-11D7-9B8A-000393CB5BC4@nanisky.com>,
Probably this thing should be looked into anyways even if it's kept |
From reto.stamm@xilinx.comCreated by reto@xilinx.comThe following script core dumps as it tries to collect $hash. # start of script This seems to happen on 32 bit Linux and 32 bit Solaris. Looking at the stack trace, it looks like it's analyzing the hash Perl Info
|
From chromatic@wgz.orgOn Wednesday 22 March 2006 14:43, Reto Stamm wrote:
Looking at the stack trace shows tens of thousands of calls. I think it's -- c |
The RT System itself - Status changed from 'new' to 'open' |
From christian@pflanze.mine.nuCreated by christian@pflanze.mine.nuSee the code below which segfaults. I don't see any cyclic reference, and running LengthOfStream It is clearly related to the stack size, increasing the stack limit It should be possible to create such streams of infinite length, being use strict; sub Delay ( & ) { sub Force ( $ ) { sub Cons ( $ $ ) { sub Car ( $ ) { sub Cdr ( $ ) { our $Nil= []; sub LengthOfStream ( $ ) { sub SequenceStream ( $ ) { if (0) { Perl Info
|
From christian@pflanze.mine.nu (It looks like the below text which I already sent as followup somehow It seems like it might be a problem with parameter passing through @_. # either use this definition instead: # sub LengthOfStream ( $ ) { # or this: sub LengthOfStream_ { sub LengthOfStream ( $ ) { But it has, of course, the ugly side effect of erasing the variable my $s= SequenceStream 100000; I find it strange, that this case also requires the undef $_[0] (and print LengthOfStream SequenceStream 100000,"\n"; Any idea? What I also don't understand is why this leaks memory and eventually sub InfiniteStream { LengthOfStream InfiniteStream(0); Any idea here? Christian. |
christian@pflanze.mine.nu - Status changed from 'new' to 'open' |
From @iabynOn Fri, Apr 28, 2006 at 04:23:49AM -0700, Christian wrote:
It's not actually closure-related; it's just that perl currently uses a my $s; It's on our list of things to fix one day... -- |
From @iabynOn Fri, Apr 28, 2006 at 02:25:25PM +0200, Christian wrote:
This is avoiding the segfault by avoidly building a recursive data -- |
From christian@pflanze.mine.nuIt seems like it might be a problem with parameter passing through # either use this definition instead: # sub LengthOfStream ( $ ) { # or this: sub LengthOfStream_ { sub LengthOfStream ( $ ) { But it has, of course, the ugly side effect of erasing the variable my $s= SequenceStream 100000; I find it strange, that this case also requires the undef $_[0] (and print LengthOfStream SequenceStream 100000,"\n"; Any idea? What I also don't understand is why this leaks memory and eventually sub InfiniteStream { LengthOfStream InfiniteStream(0); Any idea here? Christian. |
From xmltwig@gmail.comCreated by xmltwig@gmail.comI create a tree with a single root and thousands of children. The link The program crashes when the root goes out of scope. See the attached code which tries to find the exact number. Change the Note that the limit can vary (23790 and 23810 in my limited tests). This behaviour happens in Scalar::Util 1.18 and 1.19 I have also had reports of bugs in XML::Twig that look suspiciously I hope that helps -- Perl Info
|
From @andk
mr> The program crashes when the root goes out of scope. mr> See the attached code which tries to find the exact number. Change the Might be architecture dependent. I cannot reproduce under linux. I -- |
The RT System itself - Status changed from 'new' to 'open' |
From nospam-abuse@bloodgate.com-----BEGIN PGP SIGNED MESSAGE----- Moin, On Sunday 29 July 2007 11:29:42 Andreas J. Koenig wrote:
Note that the submitter had threads :) Reproducible here: # perl bug.pl # te@linux:~> perl -v This is perl, v5.8.8 built for x86_64-linux-thread-multi All the best, Tels Summary of my perl5 (revision 5 version 8 subversion 8) configuration: Characteristics of this binary (from libperl): - -- "Most people, I think, don't even know what a rootkit is, so why should -- Thomas Hesse, President of Sony BMG's global digital business division, iQEVAwUBRqxgDXcLPEOTuEwVAQIUggf+LZevVglAJVgseFy2lG46cmJ3Tu8una/V |
From @andk
> Note that the submitter had threads :) I cannot reproduce with threaded perls either. > gnulibc_version='2.4' I have 2.6 and this seems to be the key. I can reproduce it with bleadperl@31663 (without threads) on a box #0 0x080e538b in Perl_hv_undef (hv=0x850c600) at hv.c:1850 [da capo ad libitum] -- |
From blgl@hagernas.comThis looks like a stack overflow caused by the recursive implementation for (my $len=1; ; $len<<=1) { print STDERR "list length $len\n"; /Bo Lindbergh |
p5p@spam.wizbit.be - Status changed from 'open' to 'stalled' |
From @iabynThe issues related to perl seg-faulting when freeing deeply nested followed in May 2011 by the series of commits that made HVs |
@iabyn - Status changed from 'open' to 'resolved' |
Migrated from rt.perl.org#44225 (status was 'resolved')
Searchable as RT44225$
The text was updated successfully, but these errors were encountered: