[vox-tech] Macro Key and Mouse recorder for Linux

vox-tech@lists.lugod.org
Wed, 16 Oct 2002 20:14:32 -0700


RapidBuilder for Linux (never used it before, but it sounds kind of like
what you want. ALERT - it does *not* appear to be Open Source!)
http://www.xstreamsoftware.com/
http://www.xstreamsoftware.com/download.htm
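
If you would rather script the key presses and mouse clicks yourself than
record them, a tool like xte from the xautomation package (or the
xmacrorec/xmacroplay pair, which records and replays X events) might do
it. A rough sketch - the coordinates and text here are made up, and check
the man page since I am going from memory:

  # click a text box at a (hypothetical) screen position 400,300,
  # type into it, tab to the next field, and submit with Enter
  xte 'mousemove 400 300' 'mouseclick 1'
  xte 'str some test input' 'key Tab' 'str more input' 'key Return'

Point each browser (galeon, mozilla, opera, netscape) at the same page,
replay the same script, and compare what comes back against your known
page.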

-ME

-----BEGIN GEEK CODE BLOCK-----
Version: 3.12
GCS/CM$/IT$/LS$/S/O$ !d--(++) !s !a+++(-----) C++$(++++) U++++$(+$) P+$>+++ 
L+++$(++) E W+++$(+) N+ o K w+$>++>+++ O-@ M+$ V-$>- !PS !PE Y+ PGP++
t@-(++) 5+@ X@ R- tv- b++ DI+++ D+ G--@ e+>++>++++ h(++)>+ r*>? z?
------END GEEK CODE BLOCK------
decode: http://www.ebb.org/ungeek/ about: http://www.geekcode.com/geek.html


On Wed, Oct 16, 2002 at 07:52:11PM -0700, andy wergedal wrote:
> wget will not work because every page checks the IIS
> session for UID/PWD.
> 
> I am using some Mercury tools to test the site. I was
> looking for the same type of thing for linux.
> 
> If I could script the key presses and entries into the text
> boxes then I could test galeon, opera, mozilla and netscape
> on linux for compatibility.
> 
> There are a number of tools to record keystrokes for
> windows, but I have not found one for linux.
> 
> -- Andy
> 
> 
> --- dugan@passwall.com wrote:
> > On Wed, Oct 16, 2002 at 10:47:54AM -0700, Andy Wergedal
> > wrote:
> > > I am doing some web-site testing for one of my clients.
> > > I am using a number of windows-based tools to do the
> > > automated part of my testing.
> > > 
> > > Does anyone know of a keystroke and mouse macro
> > > recorder for Linux? Or a web site tester.
> > > 
> > > I already dump the source code and compare against a
> > > known page. I need to automate the GUI portion in Linux.
> > 
> > 
> > If you need to check the source code of all linked pages, you may
> > want to check out tools like wget. You can use it to copy a whole
> > site: you can give it a level of recursion/jumps from the starting
> > page or use infinite recursion, and specify the number of non-local
> > sites to jump to (dissimilar hostnames to visit when starting from
> > blah.com). Each page is downloaded and stored in a separate file,
> > and the local files are stored in a hierarchy much like what you
> > find on the remote site. It (of course) does not copy server-side
> > processing instructions and directives, as it only sees what a web
> > browser would see.
> > 
> > Another tool is "checkbot", which requires some perl modules. It
> > examines all the starting pages and can be passed args for how
> > deeply it should search your site and pages. You can tell it to
> > start from your "main page" and then recursively follow all links
> > to pages, then links on those pages, and then links on those
> > pages... etc. Usually, you limit it to check just your local pages
> > *and* the first links to other sites from your site (just to make
> > sure your links to other pages work). It is a very cool tool.
> > Unlike wget (which snarfs stuff as fast as it can), checkbot is a
> > little more sane and goes slower so as to not overwhelm your web
> > server with too many requests too fast. Checkbot also allows for a
> > dump file to store "status", where it updates its link checking
> > status. From this page, you can also see pages that have broken
> > links, and the links that it thinks are broken based on the error
> > code returned by the web server.
> > 
> > Are these what you are looking for? If not, I may have
> > other ideas...
> > 
> > -ME  
> >
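
For the wget approach described above, an invocation along these lines is
a reasonable starting point (the URL and depth are placeholders, and as
you noted it won't get past the IIS session check on its own - you would
have to feed it the right cookies/credentials):

  # mirror the site 3 levels deep, politely, keeping the remote hierarchy
  wget -r -l 3 --wait=1 -np http://www.example.com/

  # then compare against a known-good dump, like you already do by hand
  diff -ru known-good/ www.example.com/

Checkbot is run from the command line as well; from memory the basic form
is something like "checkbot --url http://www.example.com/", but double
check its man page for the exact options.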