itamarst has quit [Quit: Connection closed for inactivity]
_vertig0 has joined #pypy
<cfbolz>
_vertig0: do you have a cpython2 around?
<cfbolz>
(sorry for the really bad gaps in my replies, I had a super busy week)
<cfbolz>
basically if you can get *any* python2 with pygame, you can use the viewer
<_vertig0>
cfbolz: No worries! I've read that you can't just use any cpython though, it has to be a special cpython 2 to be able to translate, and I'm not sure how to get that. In any case I ultimately had to modify pygame's source and then build it with an old PyPy nightly that had cpyext enabled. Wasn't fun, but it worked. Thanks for the help though!
<cfbolz>
ah, windows! you're right, thanks for reminding me
<cfbolz>
there's another trick though:
<cfbolz>
you can use a standard python2 *just* for pygame
<cfbolz>
if you run `cpython2-with-pygame dotviewer/sshgraphserver.py LOCAL`
<cfbolz>
then start a translation in another terminal with a pypy2 that has no pygame, and it will use the other process to show you the graph views
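A minimal sketch of that two-terminal workflow, assuming a PyPy source checkout; `cpython2-with-pygame` stands in for whichever Python 2 interpreter has pygame installed, and `mytarget.py` is a hypothetical translation target:

```
# terminal 1: any Python 2 with pygame, used only to display the graphs
cpython2-with-pygame dotviewer/sshgraphserver.py LOCAL

# terminal 2: run the translation with a pypy2 that has no pygame;
# graph views should open in the viewer started in terminal 1
# (mytarget.py is a placeholder for your actual target)
pypy2 rpython/bin/rpython mytarget.py
```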
<cfbolz>
_vertig0: what are you working on anyway?
<_vertig0>
Oh wow that is really handy to know, thanks!
<cfbolz>
yeah, it's probably under-documented :-(
<_vertig0>
cfbolz: I'm trying to do an LLVM translation backend for fun, but I've had trouble understanding how to plug in a brand new backend and where to hook it up to receive the output of the backendopt step :P
<_vertig0>
Just gimme a second, gonna save your tip in my notes for next time...
<cfbolz>
I never tested it, but maybe it's even possible to make sshgraphserver.py work on python3. that would make installation easier in the future
<cfbolz>
adding a new backend should be possible, we used to have an llvm backend (long ago)
<cfbolz>
it's a lot of work though :-(
<_vertig0>
Yep, I hijacked your llvm-translation-backend branch for that purpose :P
<_vertig0>
The hardest part so far has been figuring out where the existing C backend picks up the rtyper output and loads it into its database, since the interface for loading the database also seems to double as the set of functions used to retrieve stuff from it, at least from what I've seen so far
<_vertig0>
(I don't know why I'm talking like I figured out how to do it when I haven't)
<cfbolz>
yeah, IIRC every backend had its own "database"; there wasn't any shared infrastructure for that kind of stuff
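A hypothetical sketch (not PyPy's actual API) of what such a per-backend "database" tends to boil down to: a memo table mapping rtyped types and constants to the names the backend will emit for them in the target language:

```python
# Hypothetical per-backend "database" sketch; illustrative only, the real
# C backend's database has a different (and much richer) interface.

class BackendDatabase:
    def __init__(self):
        self._type_names = {}    # rtyped type -> name in the target language
        self._value_names = {}   # constant container -> emitted symbol
        self._counter = 0

    def _fresh(self, prefix):
        self._counter += 1
        return "%s_%d" % (prefix, self._counter)

    def get_type_name(self, lltype_obj):
        # memoize so each rtyped type is declared only once
        if lltype_obj not in self._type_names:
            self._type_names[lltype_obj] = self._fresh("t")
        return self._type_names[lltype_obj]

    def get_value_name(self, const_container):
        # constants reachable from the entry point get stable symbols;
        # "loading" and "retrieving" end up going through the same calls
        if const_container not in self._value_names:
            self._value_names[const_container] = self._fresh("g")
        return self._value_names[const_container]
```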
<korvo>
_vertig0: (One of the fun things about RPython is that sometimes the big picture will make sense even when the code is difficult to understand. You're doing great.)
<_vertig0>
korvo: I hope it'll make sense to me in the end haha, right now I don't know what the rtyped intermediate representations are like or how to turn them into target code of any kind, let alone LLVM :P
<_vertig0>
cpython considering a feature that Java has :O
<korvo>
_vertig0: No rush. It's an ambitious goal and the translation process doesn't admit a nice linear decomposition into discrete non-overlapping stages.
<_vertig0>
Haha thanks!
<korvo>
Actually, have you read much about partial evaluation? It might help to know that a partial evaluator doesn't quite have the same internal structure as a 1960s-style compiler.
<_vertig0>
Is that when you run through some code to see its structure without actually executing it? I vaguely remember something about this but not much
<cfbolz>
rpython is a pretty traditional ssa-based compiler though
<cfbolz>
at least at the backend level
<korvo>
Oh, for sure. What I was hoping to explain was the bidirectional data flow between different stages of the compiler.
<korvo>
For example, it might be confusing that the number of blocks grows during translation. But partial evaluation helps to explain this: sometimes we need to partially evaluate a hunk of code with respect to multiple different environments, and this generates multiple new blocks that didn't exist in the original input source.
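A toy illustration (nothing like RPython's real machinery) of that point: partially evaluating one hunk of code against two different static environments yields two residual blocks where the source had only one:

```python
# Toy partial evaluation sketch: one generic block, specialized against
# different static environments, produces multiple residual blocks.

def generic_block(x, y):
    # the "hunk of code" we want to specialize
    return x * y + x

def specialize(static_env):
    """Partially evaluate generic_block: arguments known at specialization
    time are folded in, the rest stay as runtime parameters."""
    if "y" in static_env:
        y = static_env["y"]

        def residual(x):
            # residual block with y baked in as a constant
            return x * y + x

        return residual
    # nothing known statically: fall back to the generic code
    return generic_block

# Two environments -> two residual blocks that didn't exist in the source.
double = specialize({"y": 2})
triple = specialize({"y": 3})
assert double(10) == 30 and triple(10) == 40
```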
<_vertig0>
I wouldn't say that's necessarily confusing, compilers sometimes do have to expand blocks like that
<_vertig0>
(Maybe expand is the wrong word for that but you get what I mean)
<cfbolz>
most (optimizing) compilers add blocks in the optimization phase, I'd expect
<korvo>
Sure. As I was typing that, I saw a bit more of cfbolz's perspective; an SSA block-splitting compiler is more like the 1990s than the 1960s, and shouldn't be thought of as the same thing.
dmalcolm has quit [Ping timeout: 244 seconds]
_vertig0 has quit [Quit: Going offline, see ya! (www.adiirc.com)]