[Ur] segmentation fault when inserting 10000 instead of 1000 rows?

Marc Weber marco-oweber at gmx.de
Mon Dec 20 00:29:51 EST 2010


The segfault occurs with both the http and the fastcgi implementations.
Code: https://github.com/MarcWeber/urweb-benchmark (the only change is
replacing the Enum module by a pure Ur implementation, which still
segfaults).
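
For reference, here is a minimal sketch of what such a pure-Ur Enum.to
might look like (an assumption about the repository's code, not a copy
of it):

  (* Hypothetical sketch: builds the list lo :: lo+1 :: ... up to hi.
     Not tail-recursive, so a large range builds a deep call stack. *)
  fun to (lo : int) (hi : int) : list int =
      if lo >= hi
      then []
      else lo :: to (lo + 1) hi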

Replacing the fill function with this variant (using 10000 instead of 1000) makes it segfault:

  fun fill () =
      dml (DELETE FROM t WHERE 1=1);
      List.app (fn i =>
        (nv <- nextval s;
        (dml (INSERT INTO t (Id, S1, S2, S3, S4) VALUES ({[nv]}, {["S1"]}, {["S2"]}, {["S3"]}, {["S4"]}))))
      ) (Enum.to 0 10000);
      return (page "fill" <xml>done</xml>)

However, this variant works:

  fun fill () =
      dml (DELETE FROM t WHERE 1=1);
      max <- return 10000;
      List.app ( fn i =>
        (nv <- nextval s;
        (dml (INSERT INTO t (Id, S1, S2, S3, S4) VALUES ({[nv]}, {["S1"]}, {["S2"]}, {["S3"]}, {["S4"]}))))
      ) (Enum.to 0 max);
      return (page "fill" <xml>done {[max]}</xml>)

(The only difference is that I bind max to 10000 and output it after "done".)

So this is the second piece of bad behaviour I have found using Ur/Web,
possibly due to non-idiomatic code on my part.

If Ur/Web is to be used in production, there is still a long way to go.
A comprehensive test suite that catches cases like this is missing.

A type system catching most bugs upfront doesn't help me if the
resulting application does not behave the way it should.

So who is interested in stress testing Ur/Web?
What is the best way to set up test suites that will catch regressions
in the future? Even if Ur/Web behaves correctly, gcc might introduce a
bug, and I really don't want customers to be the ones who find it.
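
One low-tech option might be a parameterized stress entry point, which
an external harness can request at increasing sizes, failing the test
on any crashed response. A sketch, assuming the table t, sequence s and
page helper from the code above (stress is a hypothetical name):

  (* Clear the table, then insert n rows; the harness varies n. *)
  fun stress (n : int) =
      dml (DELETE FROM t WHERE 1=1);
      List.app (fn i =>
        (nv <- nextval s;
        (dml (INSERT INTO t (Id, S1, S2, S3, S4) VALUES ({[nv]}, {["S1"]}, {["S2"]}, {["S3"]}, {["S4"]}))))
      ) (Enum.to 0 n);
      return (page "stress" <xml>inserted {[n]} rows</xml>)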

Marc Weber


