
Brick

Brick is a modern, functional, OO-like language, designed to have the expressiveness you've come to expect from languages like Ruby and Python, joined with the power of Common Lisp, Rust, and ML.

Here's a quick list of features we are working on supporting in the core language:

  • Classes for organization
  • Mixins
  • Type Inference/Reconstruction
  • Parametric Polymorphism
  • Overloading
  • Hygienic Macros
  • Pattern Matching
  • Algebraic Data Types
  • Traits
  • Parallelism

Here's the obligatory Hello World:

fn main
  puts("Hello, World!")
end

And something a bit more complex:

fn main
  let nums = [1, 2]
      noms = ["Chad", "Nick", "Kristen", "Steve"]
      odds = nums.map(|x| -> x * 2 - 1)
  in
    odds.each => |num|
      puts("%s says hello from a new thread!", noms[num])
    end
  end
end

Interested?

Read the docs

Getting Involved

So you want to help out. That's great!
Whether you're a Ruby developer, a Node.js hipster, or a C wizard, we can use your help.

Here are some areas we're working on right now:

  • The language definition (Kind of a big deal to get this ironed out)
  • Writing the reference compiler/interpreter.
  • Writing the runtime for the compiler. Technologies used include:
    • libuv: Cross-platform abstraction
    • C: Glue between all the pieces
  • The website, brick-lang.org. Possible technologies include:
    • Node.js + Express
    • Ruby + Jekyll or Sinatra

About Rust and Brick

It's come to my attention that some people see Brick as a clone of Rust. This is not the case.

Brick was imagined as a sister language to Rust. Where highly performant, low-level applications may be written in Rust, Brick is intended as the flip-side of that coin: highly parallel, high-level applications.

C and C++ (and to some extent Go) are used as systems languages; Rust attempts to bring features from languages like OCaml and Haskell to this area.

On the other hand, Clojure, Scala, and OCaml are the main functional languages for application development. Clojure and Scala are tied to the JVM, while OCaml is in need of a serious overhaul of the runtime (no threading).

I want to use features from OCaml and Haskell and Rust in my daily work. But I don't do systems level programming very often. It's too low level for many of the things I do.

So Brick is being designed to integrate with other languages at a low level, such as Rust and C. But it's not a competitor at all. If you debate between using Brick and Rust, you should re-evaluate your intent.

Not to mention that Brick is nowhere near Rust's level of development right now :P


brick's Issues

Pipeline syntax

let | x = 1
    | y = 2
    | z = 3

This is nice. Why restrict it to let/cond/match? Why not extend the | syntax, hereby referred to as the 'pipeline syntax', to any unary or binary function/operator?

Example use cases:

Binary operations:

sad -> |a:String, b:String|
    a+"\r\n"+b

sad | "Hello"
    | "from a new line"
let | x = + | func1()
            | func2()
            | func3()
zip | [1,3,5]
    | [2,4,6]

Unary operation:

process -> |q:Array<Int>|
    puts('[')
    let | sum = 0
    q.each -> |i:Int|
        sum += i
        puts | i
             | ','  
    puts | sum
         | ']'
    sum

process | [1,2,3]
        | [4,5,6]
        | [7,8,9]

For a unary operation, the pipeline applies the operation to every pipelined object.

On a binary operation, the pipeline acts as a kind of upwards collapse (or a fold left, if you'd prefer). The first two are fed as inputs to the binary operation, and then the result is fed into it again along with the third input, then that result is fed in along with the 4th, and so on.
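For illustration, here's a rough Python model of those two behaviors (the pipeline function and its argument-count check are hypothetical, not proposed Brick semantics):

from functools import reduce

def pipeline(op, *inputs):
    # Unary op: apply it to every piped value.
    # Binary op: fold left -- ((a op b) op c) op d, and so on.
    if op.__code__.co_argcount == 1:
        return [op(x) for x in inputs]
    return reduce(op, inputs)

print(pipeline(lambda a, b: a + "\r\n" + b, "Hello", "from a new line"))
print(pipeline(lambda q: sum(q), [1, 2, 3], [4, 5, 6], [7, 8, 9]))  # [6, 15, 24]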

So yeah, don't restrict this syntax to let/cond/match; it looks both useful and pretty elsewhere.

Milestone 1

Implement subset 1:

  • functions
  • context modification (let!)
  • simple type system (Int)

Wiki link in the readme is broken

The wiki link points to a GitHub wiki that doesn't seem to exist yet (it doesn't 404 though, it just redirects back to the readme).

Feel free to close this issue if that was intentional.

Self-aware environments

Namely, a function should be aware of its context and be able to modify it.

Firstly, a function should have a method by which it can call itself, anonymous fn or otherwise. (It's certainly clearer if the mechanism is the same for both.)

fn -> 
    puts("Top level")
    (fn (i:Int) ->
         cond | i<10 ->
             puts("Inner level: #%d" % i)
             ^(i+1)
    )(0)

Which would output

Top level
Inner level: 0
Inner level: 1
Inner level: 2
Inner level: 3
Inner level: 4
Inner level: 5
Inner level: 6
Inner level: 7
Inner level: 8
Inner level: 9
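For comparison, roughly the same behavior sketched in Python, where the inner function has to be given a name in order to call itself (which is exactly what ^ would make unnecessary):

def main():
    print("Top level")

    def inner(i: int):
        if i < 10:
            print("Inner level: #%d" % i)
            inner(i + 1)  # in Brick, this call would just be ^(i + 1)

    inner(0)

main()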

Secondly, a function should probably have access to its :before's and :after's; i.e.:

fn example ->
    cond | ^.before.length>0 -> 
                puts("Apparently something went before me... NO MORE.")
                ^.before.each -> |k, v|
                    ^.before[k] = nil
            | * -> 
                puts("I am alone in this world, as I should be.")

fn example:before ->
    puts("This happens before the example, bwahahaha")

This is to simply allow for dynamically adding and removing before and after hooks.

It would also be useful to have a function's type signature available to the function, and maybe even a copy of the arguments it was passed.

It would also be nice to extend the syntax (^^^^^'s!) to let a function get these same attributes of its caller (and therefore its caller's caller, and its caller's caller's caller, etc...)

Async/await

C# started it, ES6 has it, even C++14 has it. Async/await is an amazingly simple construct for creating asynchronous code. Whether it's implemented with promises or continuations on the back end, we'll see later (likely continuations, since I see no indication that Brick will be event-loop based).

Regardless, the async/await keyword syntax for defining asynchronous code is a godsend; it helps create self-documenting code, simplifies asynchronous control flow, and fits snugly with already-defined chunks of brick syntax:

#=
A super-simple asynchronous IRC bot, to experiment with Brick
#=
import Net.Socket.TCP
server = "irc.freenode.org"
port = 6667
nick = "Brick"
channel = "#brick_bot"
commands = {
    "!quit": |sock:Socket, _:Array<String>| async {
        await sock.write("QUIT :Exiting")
        await sock.destroy!
        exit(0)
    },
    "!id": |sock:Socket, words:Array<String>| async -> boolean {
        await sock.write("PRIVMSG " + channel + " :" + words.implode(" "))
    }
}
async listener(sock:Socket, s:String)
    if s.substring(0, 6) == "PING :"
        await sock.write("PONG :" + s.substring(6, s.length))
    else
        let p = s.find(":")
            cs = s.substring(p, s.length)
            exp = cs.explode(" ")
        in
            if commands.has_key?(exp[0])
                puts(exp[0] + "> " + exp.drop(1).implode(" "))
                await commands.get(exp[0])(sock, exp.drop(1))
            else
                puts("> " + cs)
            end
        end
    end
end
async main
    let !sock = await TCP(server, port) in
        if not !sock.success?
            puts("Failed to connect to server")
        else
            await !sock.write("NICK " + nick)
            await !sock.write("USER " + nick + " 0 * :" + nick + " bot")
            await !sock.write("JOIN " + channel)
            !sock.listen!(listener)
        end
    end
end

Important notes on what I see as proper use: async isn't used like an access modifier as in C++ or C#; instead, it's an alternate keyword for function declaration, which implies that the return type should be wrapped in an Awaitable Promise/Future. This way, if you manually specify a type declaration for an async function, you needn't include that wrapper yourself. In effect, the async keyword adds the type -> Awaitable<T> to the end of the type line. You'll notice that for lambdas, the async keyword appears after the argument types. If I were specifying the lambda's return type, it would appear before that. This makes sense, since in practice the keyword modifies the return type of the function.
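For a rough analogue, Python's asyncio behaves the same way: async def makes the declared return value arrive through an awaitable, so the annotation stays the plain type (the function below is purely illustrative):

import asyncio

# The body returns an int, but callers receive an awaitable of int and must await it.
async def fetch_length(text: str) -> int:
    await asyncio.sleep(0)  # stand-in for real asynchronous I/O
    return len(text)

async def main() -> None:
    print(await fetch_length("Hello, World!"))

asyncio.run(main())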

As far as the specifics on how async/await interact with threading/blocking, the C# documentation section is excellent: http://msdn.microsoft.com/en-us/library/hh191443.aspx#BKMK_Threads

Multiline comment issue

I really like the way multiline comments look right now:

#=
    Yay, a comment!
#=

Here's the problem, though:

#=
    Block A
#=
    Block B
#=
    Block C
#=

What in that is actually commented? In other languages, it would depend on the direction of the multiline comment sigil, e.g.:

/*
    Block A
*/
    Block B
/*
    Block C
*/

The remedy is trivial: change the multiline comment syntax to have distinct 'opening' and 'closing' sigils.

Perhaps

#<
    Block A
>#
    Block B
#<
    Block C
>#

Which, for style, means you could

#<================================
    Block A
================================>#

If you were into that kinda thing.

Hash Syntax

This is about =>, the fat arrow.

Please try to avoid it; in Ruby it looked cool and shiny for, say, passing keyword arguments into a function, since the hash was imitating a 'send' operation, but in every other scenario it is less clear to a newer programmer what the fat arrow does.

I propose removing the 'fat arrow' syntax before things get out of hand and replacing it with either your traditional ':' or '=' defined hashes; this way the language isn't caught in the same situation Ruby is, with multiple equally valid hash syntaxes (creating bad or inconsistent style).

Example hash literals:

{1: "One", 2: "Two", "Three": 3, four: 4, [fivevar]: 5}

(where fivevar is a variable that is resolved to the key)
Really, in this example syntax, the number and string key definitions are syntactic shortcuts for

{[1]: "One", [2]: "Two", ["Three"]: 3, ["four"]: 4, [fivevar]: 5}

Incidentally, the literal could also be written as:

{"Zero", "One", "Two", "Three": 3, four: 4, [fivevar]: 5}

Here, definitions not written as key-value pairs (i.e. bare values) are implicitly assigned to increasing available indexes, starting from a 0th element.

The square-bracket variable syntax is a carry-over from PHP and Lua, where square-bracket defined tables are the norm.

A good thing (tm) about the square-bracket definition option is that when you use it, it becomes obvious to the reader whether the contents are a variable or a literal. Additionally, it allows any hashable object as the key (instead of just strings or numbers).
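For illustration, here is roughly the same distinction in Python, where dict keys are always evaluated expressions (the names are made up):

fivevar = "five"

table = {
    1: "One",                  # literal number key
    "Three": 3,                # literal string key
    fivevar: 5,                # a variable resolved to its value, like [fivevar] above
    ("any", "hashable"): True  # any hashable object works as a key
}
print(table["five"])  # 5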

Also, here's why I like the square brackets over the fat arrow in big hashes:

{
    ["A"]: "Incorrect",
    ["B"]: "Somewhat correct",
    ["C"]: -> "And then there's this", # A block returning the string
    ["D"]: -> { # A block returning a hash
        ["Success"]: "You've done good, son",
        ["Fail"]: os.exit
    },
    ["E"]: ["Just", "For", "Reference"]
}

Whereas with the fat arrow, we'd have:

{
    :A =>  "Incorrect",
    :B => "Somewhat correct",
    :C => -> "And then there's this", # A block returning the string
    :D => -> { # A block returning a hash
        :Success => "You've done good, son",
        :Fail => os.exit
    },
    :E => ["Just", "For", "Reference"]
}

Namely, the => -> bit is what bothers me. (I'm also not too keen on how I can't use variables in the keys, unless you allow for variably formatted symbols like Ruby does, but that seems like an unneeded stretch.) There's also more symmetry between the look of keys and arrays (a hash is essentially an extended array) with the square bracket syntax.

Coroutines As A Core Language Feature

Co-routines are baby threads. Or perhaps super threads. In effect, they are threads whose scheduling is undecided. Python and Lua both implement co-routines at a language level, and for good reason. In Python, they are used as an extension to generators, whereas in Lua they are used to model more complex iterator patterns and to handle concurrency issues in non-concurrent systems.

So, more to the point, the language should probably implement co-routines at the language level (for syntax prettiness purposes). A co-routine is a superclass of a function; any function is at least a one-step co-routine. What makes a co-routine different from a generator is that it can consume new inputs on every step (rather than just yielding outputs).

Since Brick is statically typed, co-routines may yield a problem - a co-routine may not want to yield the same type in all cases, and determining if the caller is expecting the correct ones can be difficult.
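For a sense of what those types look like when spelled out, Python's typing module gives a generator-based coroutine separate yield, send, and return types (a sketch for comparison, not proposed Brick syntax):

from typing import Generator

# Generator[YieldType, SendType, ReturnType]: this coroutine yields ints,
# accepts ints via send(), and returns nothing when it finishes.
def counter() -> Generator[int, int, None]:
    state = 1
    for _ in range(10):
        state += (yield state)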

So, on to the actual syntax. In Python, coroutines were kind of hacked onto generators, so the syntax for them ended up fairly ugly:

import random

def foo():
    for i in range(10):
        yield i  # generator

def bar():
    state = 1
    for i in range(10):
        state += (yield state)  # coroutine: receives a value at each step

def main():
    core = bar()
    state = next(core)  # prime the coroutine before send() works
    try:
        while True:
            print(state)
            state = core.send(random.randint(0, 20))
    except StopIteration:
        pass

Lua's syntax is a bit less unnatural, but still cumbersome.

function bar()
    local state = 1
    for i=1,10 do
        state = state + coroutine.yield(state)
    end
end

local core = coroutine.wrap(bar)

function main()
    while true do
        print(core(math.random(0, 20)))
    end
end

(Lua has some more functions, such as resume, if you want to avoid the wrap shortcut)

So, what variety of coroutine syntax would fit in well with the language...?
I think something like

fn bar ->
    let x = 1
    10.times ->
        x += ^.receive(x)

fn main ->
   while true ->
        puts(bar.send(math.random(0, 20)))

Which has none of the yield-keyword ambiguity that Python has, while avoiding the high verbosity of Lua's coroutines. Additionally, using ^ to store a function's state (if we view a function as a state machine) makes sense in this context. It also makes coroutine 'trampolining' (yielding all the way down to the initial thread/scheduler so it can schedule/start the next task) unneeded, since the child can simply go

^^.send(result)

(This is actually one of the major problems in a coroutine-based system, the need to trampoline back down to the scheduler to pass inputs around. Being able to avoid that is pretty cool.)

Partial Application Discussion

Let's talk about syntax for partial application of functions. Blame Steve. I disagree with Steve entirely, and believe it should be square brackets - and that the operation should potentially be overloadable on a per-function basis via a meta-function.

thingy(x,y,z)
#Can be (fully) partially applied as
thingy[x][y][z]

This makes it very clear when you partially apply a tuple argument without the need for further grouping symbols.

thingy2((x,y),z)
#Can be (fully) partially applied as
thingy2[x,y][z]

I think this is sensible - especially if tuples are implicit during declaration or return without grouping symbols, for example:

T -> J -> (T, J)
fn thingy(x, y)
    x, y

fn letter
    let | x = 23, 24
    thingy2[x](25)
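For comparison, a rough sketch of the same semantics using Python's functools.partial (thingy, thingy2, and the argument values are purely illustrative):

from functools import partial

def thingy(x, y, z):
    return (x, y, z)

# thingy[x][y][z] in the proposed syntax: one argument applied at a time.
print(partial(partial(partial(thingy, 1), 2), 3)())  # (1, 2, 3)

def thingy2(pair, z):
    return (*pair, z)

# thingy2[x,y][z]: the tuple argument stays grouped without extra parentheses.
print(partial(thingy2, (23, 24))(25))  # (23, 24, 25)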
