Jul 31 20:07:13 <bgamari> hah
Jul 31 20:08:01 <bgamari> AndreasK, I suspect in most repl cases we really want to favor fast iteration time
Jul 31 20:08:13 <bgamari> then again, in that case you are generally using the bytecode interpreter, not the NCG
Jul 31 20:08:18 <AndreasK> Exactly
Jul 31 20:08:19 <alanz> bgamari, pong
Jul 31 20:08:43 <AndreasK> bgamari: Guess I will leave that as is then
Jul 31 20:08:46 <alanz> We need an API to ghci some time, but not sure how it will look
Jul 31 20:08:52 <wz1000> if there is TH involved, there might be cases when turning off optimization may make compiles slower
Jul 31 20:08:53 <bgamari> which has comparable performance characteristics to -O-∞ ;)
Jul 31 20:09:11 <bgamari> perhaps we should introduce -O-∞ as a synonym for -fbytecode
Jul 31 20:09:15 <bgamari> ;)
Jul 31 20:09:30 <bgamari> alanz, yeah, I know we've discussed this in the past
Jul 31 20:09:34 <AndreasK> bgamari: True. I wondered a few times if we couldn't run the core simplifier at least before compiling to bytecode
Jul 31 20:09:57 <alanz> I have been fighting other dragons in the meantime
Jul 31 20:10:10 <bgamari> sure
Jul 31 20:10:36 <alanz> but there is basically the vanilla ghci solution, and something that would end up in hie
Jul 31 20:10:59 <bgamari> alright, so you don't view HIE as superseding ghci-as-a-language-server?
Jul 31 20:11:01 <alanz> And I am not sure if those two are the same. Ideally hie uses ghci
Jul 31 20:11:17 <alanz> well, we can't expect people to have hie installed always
Jul 31 20:11:27 <wz1000> hopefully with an interface other than stdin/stdout
Jul 31 20:11:40 <alanz> so ghci is the basic repl, imo. But it should be able to expose a separate API to it
Jul 31 20:11:57 <alanz> yes, via some other connection, such as a named pipe, tcp session, etc
Jul 31 20:12:29 <AndreasK> I don't think much of what ghci is used for by machines falls into "basic repl functionality".
Jul 31 20:12:40 <alanz> And at the back of my mind I am thinking of doing something with the external interpreter. Which can be run to have its stdio on pipes
Jul 31 20:13:04 <alanz> AndreasK, I agree, but people need to be able to type ghci on the CLI and get a repl
Jul 31 20:13:11 <alanz> without a big monster like hie
Jul 31 20:13:44 <wz1000> we do have execStmt
Jul 31 20:13:58 <wz1000> https://www.stackage.org/haddock/nightly-2018-03-21/ghc-8.4.1/GHC.html#v:execStmt
Jul 31 20:14:04 <alanz> If external interpreter is used purely to run stuff, with io that can easily be externalised to a UI, it means the normal stdio for ghci can be used for machine comms
Jul 31 20:14:20 <AndreasK> Sure, but I think that's a separate issue from the "tools that want something language server like" one
Jul 31 20:14:49 <AndreasK> alanz: But then I'm sure you've explored that design space much more than me. So I trust your judgment there more than mine :)
Jul 31 20:14:54 <alanz> There is a whole heavily featured repl built into ghci, which partially uses readline, which is supposed to allow managing io pipes
Jul 31 20:15:01 <alanz> but ghci does not always use it
Jul 31 20:15:28 <alanz> I am told that external interpreter cuts out a number of targets, including cross compilation
Jul 31 20:15:42 <alanz> Not sure of the details, though.
Jul 31 20:15:47 <AndreasK> Isn't that also broken on windows?
Jul 31 20:15:55 <alanz> I would expect it to *help* with cross compilation
Jul 31 20:16:14 <alanz> I don't know, I know it exists, and can have its io wrapped.
Jul 31 20:16:17 <alanz> and that is all
Jul 31 20:17:14 <bgamari> alanz, well, technically it uses haskeline, not readline
Jul 31 20:17:14 <alanz> which means we can end up with something that keeps responding and has a ghc session (the normal exe), and the actual bytecode vm running in the external
Jul 31 20:17:22 <alanz> sorry, that is what I meant
Jul 31 20:17:28 <bgamari> I'm not sure how well haskeline supports pipes
Jul 31 20:17:32 <bgamari> you likely know better than I do
Jul 31 20:17:50 <alanz> And I tried to use that to separate things out, but ghci is a bit sloppy about when it uses that
Jul 31 20:18:44 <alanz> Something else to look at as a candidate is the phoityne plugin for vscode
Jul 31 20:18:44 * replay has quit (Ping timeout: 256 seconds)
Jul 31 20:19:00 <alanz> it implements a thing called the Debug Access Protocol, and basically wraps ghci
Jul 31 20:19:06 <wz1000> alanz: why don't we use execStmt or something to implement a repl in hie?
Jul 31 20:19:28 <alanz> wz1000, because 30 seconds later people will say "why doesn't it do XXXX"
Jul 31 20:19:51 <alanz> ghci is a standard repl, people expect it to behave a particular way
Jul 31 20:20:04 <bgamari> for a truly machine-readable interface I expect we'd want to have some JSON-based request protocol
Jul 31 20:20:07 <alanz> and it has evolved over years. No point throwing all that embedded learning away
Jul 31 20:20:14 <bgamari> but perhaps that is out of scope of ghci
Jul 31 20:20:33 <alanz> bgamari, that is immaterial, the main thing is to come up with a way of separating machine comms from human comms
Jul 31 20:20:40 <bgamari> right
Jul 31 20:21:07 * crobbins (~crobbins@c-73-232-226-190.hsd1.tx.comcast.net) has joined
Jul 31 20:21:24 <alanz> and ideally having separate processes is an advantage, so that a running statement does not lock everything else up
Jul 31 20:21:58 <bgamari> my point being that we need something beyond a way of having multiple textual REPL interfaces (which would still require the consumer to parse what are essentially unparseable responses)
Jul 31 20:22:10 <alanz> I agree
Jul 31 20:22:21 <alanz> separate comms channels
Jul 31 20:22:50 <alanz> it could be done by having a wrapper around ghci, and having ghci talk the multiplexed stuff
Jul 31 20:23:03 <alanz> and then the wrapper can route it appropriately
Jul 31 20:23:28 <alanz> I think to some extent that is what Intero does, I seem to recall a comment from someone there that it actually works pretty well
Jul 31 20:23:50 * replay (~replay@pdpc/supporter/student/replay) has joined
Jul 31 20:23:52 * bgamari opens a ticket for this
Jul 31 20:24:00 * bgamari should have opened such a ticket long ago
Jul 31 20:24:49 <alanz> This is the kind of thing that should have a lot of attention. Someone looking at it full time, for a year or two
Jul 31 20:25:56 <elvishjerricco> Is there any documentation on how the GHC calling convention works in LLVM (or in general)? I'm looking at mocking its implementation in LLVM when targeting wasm (which doesn't yet support tail calls), so I'd like to know what exactly I'm getting into.
Jul 31 20:26:01 <alanz> In the context of https://github.com/haskell/haskell-mode/issues/1553, having a set of tests that capture the actual machine usage would be a good step
Jul 31 20:26:15 <bgamari> alanz, I see no reason why GHCi could support multiple channels itself
Jul 31 20:26:19 <bgamari> couldn't*
Jul 31 20:26:20 <alanz> so we know when it breaks, and can talk to the relevant parties about how to manage it going forward
Jul 31 20:26:32 <bgamari> adding another layer doesn't seem to me to simplify anything
Jul 31 20:26:40 <alanz> bgamari, I think it should, to be the building block
Jul 31 20:26:52 <alanz> think of that as a logical layering
Jul 31 20:26:58 <alanz> thought experiment
Jul 31 20:27:31 <alanz> and/or running it with the plugin stuff that ezyang wrote about a little while ago
Jul 31 20:29:35 * ByronJohnson has quit (Ping timeout: 240 seconds)
Jul 31 20:30:38 * ByronJohnson (~bairyn@unaffiliated/bob0) has joined
Jul 31 20:33:35 * fendor has quit (Ping timeout: 240 seconds)
Jul 31 20:34:17 * raichoo (~raichoo@dslb-088-077-255-196.088.077.pools.vodafone-ip.de) has joined
Jul 31 20:34:53 <bgamari> alanz, https://ghc.haskell.org/trac/ghc/ticket/15461#comment:1
Jul 31 20:35:37 <alanz> "newcomer" :)
Jul 31 20:36:13 <RyanGlScott> After all, we're all nothing if not extremely experienced newcomers
Jul 31 20:36:18 <alanz> and 8.6.1. ambitious
Jul 31 20:36:32 <RyanGlScott> I like the cut of this bgamari guy's jib
Jul 31 20:36:44 <alanz> true. I think I will always be a newcomer, there is *so* much more to learn
Jul 31 20:36:57 * raichoo_ has quit (Ping timeout: 264 seconds)
Jul 31 20:37:03 <bgamari> RyanGlScott, hah
Jul 31 20:37:21 <bgamari> I do think that (1) is reasonably straightforward
Jul 31 20:37:38 <bgamari> really GHCi is just a normal (if slightly messy) haskell application
Jul 31 20:38:10 <wz1000> how would that work with different environments on the repl sessions?
Jul 31 20:38:18 <alanz> bgamari, it is, but making sure that all the io gets wrapped is the problem
Jul 31 20:38:51 <bgamari> um, right
Jul 31 20:38:52 <alanz> e.g. I am debugging a haskell module that does 'hPutStrLn stderr "stderr, baby"'
Jul 31 20:38:55 * bgamari forgot all about this
Jul 31 20:39:03 <bgamari> hmm
Jul 31 20:39:30 <alanz> so either we need to urge restraint on the repl users, which is not unreasonable
Jul 31 20:39:38 <alanz> as it is the status quo
Jul 31 20:39:42 <bgamari> I can think of some terrible hacks to make the standard handles thread-local
Jul 31 20:40:02 <alanz> or we need to run the actual interpreter in a separate process, where we *can* manage the io channels
Jul 31 20:40:05 <wz1000> I'm pretty sure `process` can redirect IO
Jul 31 20:40:22 <bgamari> alanz, but then you can't share state between your channels
Jul 31 20:40:29 <alanz> wz1000, it can, but what are you going to run with redirected io?
Jul 31 20:40:42 <bgamari> my thought was that all channels are running against the same ghc session
Jul 31 20:40:51 <wz1000> yeah, you need a separate process probably
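The redirection wz1000 mentions is the `process` package's `createProcess` with `CreatePipe`; a minimal sketch, assuming a POSIX `cat` binary is on the PATH:

```haskell
import System.IO
import System.Process

-- Spawn a child with its stdin and stdout on pipes, so the driver
-- process fully owns the io channels, as discussed above.
main :: IO ()
main = do
  (Just hin, Just hout, _, ph) <-
    createProcess (proc "cat" []) { std_in  = CreatePipe
                                  , std_out = CreatePipe }
  hPutStrLn hin "hello from the driver"  -- write to the child's stdin
  hClose hin                             -- EOF so `cat` terminates
  out <- hGetContents hout               -- read everything it echoed
  putStr out
  _ <- waitForProcess ph
  return ()
```

This redirects a *child's* stdio, which is alanz's point: something still has to run inside that child.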
Jul 31 20:40:54 <bgamari> and that if you want multiple sessions then you just start multiple ghcis
Jul 31 20:40:57 <alanz> bgamari, the closest we have to this is the external interpreter
Jul 31 20:41:10 <bgamari> alanz, right, but I really don't think that's what you want
Jul 31 20:41:28 <alanz> where we can run multiple repls against the same session, but when we execute code it is in the external process
Jul 31 20:41:45 <bgamari> yes
Jul 31 20:42:13 <bgamari> but the trouble is if you have two channels, both of which run putStrLn we wouldn't be able to deinterleave the output
Jul 31 20:42:45 <alanz> well, I would see it being one channel for machine interaction, and one for human.
Jul 31 20:42:54 <bgamari> arguably you would want each channel to be running as a separate process but sharing a heap
Jul 31 20:43:22 <alanz> Because the basic repl stuff calling the GHC API is fine, in terms of sharing
Jul 31 20:43:32 <bgamari> so one way to emulate this would be to run the various channels' computations as multiple threads
Jul 31 20:43:39 <wz1000> maybe the bytecode interpreter can be modified to treat writes to stdio differently
Jul 31 20:43:46 <alanz> and any interpreted IO can be clearly wrapped, if it is going through redirected io
Jul 31 20:43:47 <bgamari> and emulate their standard handles
Jul 31 20:44:03 <bgamari> but this may very well be more work than it's worth
Jul 31 20:44:07 <alanz> wz1000, we have that already, with fexternal-interpreter
Jul 31 20:44:18 <bgamari> wz1000, it wouldn't even require the bytecode interpreter
Jul 31 20:44:21 <bgamari> you'd just modify base
Jul 31 20:44:29 <alanz> and the hook that launches it allows you to interpose your own pipes
Jul 31 20:44:41 <bgamari> the standard handles are already MVars
Jul 31 20:45:16 <alanz> well, that sounds promising
Jul 31 20:45:21 <bgamari> you would just generalize that idea, exposing an interface of getStdin :: IO Handle
Jul 31 20:45:52 <bgamari> which in the case of ghci could be emulated as a lookup in some global map indexed by threadid or somesuch
Jul 31 20:45:55 <alanz> the bytecode interpreter can load and run non-bytecode modules?
Jul 31 20:46:09 <bgamari> you would need to make sure that new threads were given the right handles
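The per-thread lookup bgamari sketches could look something like this; `getStdin` and `registerStdin` are hypothetical names, not anything in base:

```haskell
import Control.Concurrent (ThreadId, myThreadId)
import Control.Concurrent.MVar
import qualified Data.Map.Strict as Map
import System.IO
import System.IO.Unsafe (unsafePerformIO)

-- A global table mapping each thread to its emulated standard input.
{-# NOINLINE handleTable #-}
handleTable :: MVar (Map.Map ThreadId Handle)
handleTable = unsafePerformIO (newMVar Map.empty)

-- Look up the calling thread's stdin, falling back to the real one.
getStdin :: IO Handle
getStdin = do
  tid   <- myThreadId
  table <- readMVar handleTable
  return (Map.findWithDefault stdin tid table)

-- Associate a handle with a thread; a forking wrapper would call this
-- so that new threads are given the right handles.
registerStdin :: ThreadId -> Handle -> IO ()
registerStdin tid h = modifyMVar_ handleTable (return . Map.insert tid h)

main :: IO ()
main = do
  h0  <- getStdin              -- nothing registered yet: the real stdin
  tid <- myThreadId
  registerStdin tid stderr     -- pretend stderr is this thread's stdin
  h1  <- getStdin
  print (h0 == stdin, h1 == stderr)
```

The awkward part, as noted above, is exactly the registration step: every `forkIO` would need to propagate the parent's entries.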
Jul 31 20:46:12 <geekosaur> yes, though it can't set breakpoints in them, etc.
Jul 31 20:46:17 <alanz> But I guess you can manage the MVars according to the process id
Jul 31 20:46:19 <bgamari> it's terrible
Jul 31 20:46:40 <bgamari> but so is the existence of global handles
Jul 31 20:46:54 <bgamari> so our hand is rather forced
Jul 31 20:46:55 <wz1000> Maybe we could have a way to redirect IO during a normal program. setStdOut :: Handle -> IO ()
Jul 31 20:46:57 <alanz> which is part of the POSIX spec, I think
Jul 31 20:47:04 <bgamari> indeed it is
Jul 31 20:47:19 <alanz> wz1000, I think I investigated at some point and came up blank
Jul 31 20:47:28 <alanz> but that could just be inadequate research
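Something close to wz1000's `setStdOut` can already be emulated with `hDuplicateTo` from base's `GHC.IO.Handle`, which repoints one Handle at another's underlying device; a sketch:

```haskell
import GHC.IO.Handle (hDuplicate, hDuplicateTo)
import System.Directory (removeFile)
import System.IO

-- Redirect stdout to another handle for the duration of an action,
-- restoring the original afterwards.
withStdoutTo :: Handle -> IO a -> IO a
withStdoutTo h act = do
  saved <- hDuplicate stdout     -- keep a copy of the original stdout
  hFlush stdout
  hDuplicateTo h stdout          -- stdout now writes wherever h does
  r <- act
  hFlush stdout
  hDuplicateTo saved stdout      -- restore the original
  hClose saved
  return r

main :: IO ()
main = do
  (path, tmp) <- openTempFile "." "redirect.txt"
  withStdoutTo tmp (putStrLn "captured")   -- goes to the file, not the tty
  hClose tmp
  readFile path >>= putStr                 -- prove the line landed there
  removeFile path
```

This is process-global, though, so it doesn't by itself separate the channels of concurrent repl users.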
Jul 31 20:47:40 <bgamari> alanz, to be clear I'm suggesting that all channels' requests run under the same external interpreter
Jul 31 20:47:55 <bgamari> under the same process
Jul 31 20:48:03 <alanz> bgamari, I see 3 "users"
Jul 31 20:48:03 <bgamari> that's the only way they can share a heap
Jul 31 20:48:21 <alanz> 1) the normal repl user (human) interacting with the GHC API
Jul 31 20:48:44 <alanz> 2) the interpreter, with unrestricted IO, but redirected and wrapped
Jul 31 20:48:53 <alanz> 3) the machine repl user, similar to 1
Jul 31 20:49:27 <alanz> Do we need more than that?
Jul 31 20:50:48 <bgamari> I can see cases where the answer is yes
Jul 31 20:50:52 <bgamari> IHaskell, for instance
Jul 31 20:51:04 <bgamari> where you might want multiple textual REPLs running against the same interpreter
Jul 31 20:51:09 * RyanGlScott has quit (Quit: http://www.kiwiirc.com/ - A hand crafted IRC client)
Jul 31 20:51:27 <alanz> but those would just be multiplexed input onto 1)
Jul 31 20:53:03 <alanz> I guess 1 and 3 are basically the same, just talking a different protocol, and could potentially be as many as we want
Jul 31 20:53:29 <alanz> it is just the interpreter that is unique. But if wrapped properly, we could have lots of them too. Hmm.
Jul 31 20:53:44 <AndreasK> elvishjerricco: I don't think there is a detailed writeup anywhere
Jul 31 20:54:21 <AndreasK> elvishjerricco: The best I ever found was https://ghc.haskell.org/trac/ghc/wiki/Commentary/Rts/HaskellExecution/CallingConvention
Jul 31 20:54:32 <AndreasK> Which is ... meager
Jul 31 20:55:46 <elvishjerricco> AndreasK: Heh, no kidding
Jul 31 20:56:13 <bgamari> alanz, I'm not sure we need to support multiple interpreters per ghci session
Jul 31 20:56:28 <elvishjerricco> We're considering "emulating" tail calls in LLVM's wasm backend so we can use GHC's LLVM backend instead of the unregisterised one
Jul 31 20:56:47 <elvishjerricco> So I'm just looking into what's expected of `ghccc`
Jul 31 20:56:58 <bgamari> it seems to me that there is no advantage to supporting that over simply starting multiple ghci processes
Jul 31 20:57:09 <bgamari> elvishjerricco, the best documentation for the llvm calling conv is the implementation, sadly
Jul 31 20:57:16 <bgamari> however, the implementation is fairly readable
Jul 31 20:57:24 <alanz> bgamari, the same thought was going through my mind
Jul 31 20:57:53 <elvishjerricco> bgamari: That's nice at least. As someone unfamiliar with LLVM internals, do you have any links to the implementation?
Jul 31 20:58:13 <bgamari> alanz, that being said, you can't necessarily multiplex multiple ihaskell shells onto the same textual repl
Jul 31 20:58:48 <bgamari> alanz, I may want to start a separate long-running computation in each shell, for instance
Jul 31 20:59:13 <alanz> I think the fundamental problem, as we hit with ghc-mod, is that a given process can only have a single GHC session at a time
Jul 31 20:59:35 <bgamari> right, but I think that's fine
Jul 31 20:59:42 <alanz> so if one repl is say working on a test, and another on the lib, it will be a problem
Jul 31 21:00:03 <AndreasK> elvishjerricco: I encourage you to update the page with things you find out
Jul 31 21:00:07 <bgamari> yes, the interaction with the packaging system is a bit unfortunate
Jul 31 21:00:29 <elvishjerricco> AndreasK: That would be prudent of me. I'll try to keep that in mind
Jul 31 21:00:34 <alanz> so imo there needs to be a single "logical" session, controlled by the user. But some of it the user does directly via the repl, and some via their tooling
Jul 31 21:00:57 <alanz> which is how Intero does it at present, I am pretty sure
Jul 31 21:01:15 <bgamari> elvishjerricco, https://github.com/llvm-mirror/llvm/blob/ff1d4d27d786dd78122ad199a2a2417f4b6ded17/lib/Target/X86/X86CallingConv.td#L656
Jul 31 21:01:18 <alanz> and session is literally a GHC session
Jul 31 21:01:43 <alanz> of course we could first make ghc fully re-entrant, to get more options
Jul 31 21:02:05 * alanz being sarcastic
Jul 31 21:02:11 <bgamari> hah
Jul 31 21:02:22 <bgamari> alanz, the notion of a home package is bit unfortunate
Jul 31 21:02:24 <alanz> I gather the linker is the problem
Jul 31 21:02:48 <bgamari> although I've not thought enough about it to know whether there's a better alternative
Jul 31 21:02:59 <bgamari> and it would be pretty deep surgery to change at this point
Jul 31 21:03:17 <alanz> I know I asked around a bit at one stage, and got basically that answer too
Jul 31 21:05:31 <bgamari> alright, well I updated the ticket
Jul 31 21:05:53 <alanz> On another topic completely, I see http://ircbrowse.net/browse/ghc last updated in 2016
Jul 31 21:06:07 <alanz> based on the channel title
Jul 31 21:07:11 <alanz> sorry, 2018-04-11
Jul 31 21:09:14 * fmixing (~fmixing@5.18.211.209) has joined
Jul 31 21:11:21 <tdammers> @tell RyanGlScott yes, that one. thought you had mentioned a commit that contains it somewhere
Jul 31 21:11:22 <lambdabot> Consider it noted.
Jul 31 21:13:34 * fmixing has quit (Ping timeout: 256 seconds)
Jul 31 21:13:51 <elvishjerricco> How does this sound for a trampolining calling convention (in the absence of reasonable tail calling)? Functions using the GHC calling convention will return a bool and a pointer. If the bool is `true`, then the pointer is another function to trampoline to. If it is false, the pointer points to the return value. So tail calling this convention (which would require the tail caller to have this convention) means simply
Jul 31 21:13:51 <elvishjerricco> returning `true` and the function to trampoline into. Calling this convention in non-tail position (caller can have whatever convention they want) means initiating a trampoline, and repeatedly calling the returned functions until `false` is returned.
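elvishjerricco's bool-plus-pointer scheme can be modelled in Haskell rather than LLVM IR; the `Tramp`, `trampoline`, `isEven`/`isOdd` names below are illustrative only, not part of any GHC or LLVM API:

```haskell
-- A "call" under the proposed convention returns either another
-- function to bounce into (the `true` + function-pointer case) or a
-- final result (the `false` + value-pointer case).
data Tramp a
  = Jump (() -> Tramp a)  -- true: pointer to the next function to enter
  | Done a                -- false: pointer to the return value

-- A non-tail caller initiates the bounce loop, repeatedly entering the
-- returned functions until one signals completion.
trampoline :: Tramp a -> a
trampoline (Done x)  = x
trampoline (Jump go) = trampoline (go ())

-- Mutual "tail calls" expressed as Jumps: each function returns its
-- continuation instead of calling it, so the loop runs in constant
-- stack even without real tail calls in the target (e.g. wasm).
isEven, isOdd :: Int -> Tramp Bool
isEven 0 = Done True
isEven n = Jump (\_ -> isOdd (n - 1))
isOdd  0 = Done False
isOdd  n = Jump (\_ -> isEven (n - 1))

main :: IO ()
main = print (trampoline (isEven 1000000))
```

In the actual LLVM lowering the `Jump` payload would be a raw code pointer and the loop a machine-level dispatcher, but the control flow is the same.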