# Tide

2019-11-27
Today we’re happy to announce the release of Tide 0.4.0, which has an exciting new design. This post will (briefly) cover some of the developments around Tide, and where we’re heading.
## Our north star
The `async`/`.await` MVP was released as part of Rust 1.39 three weeks ago. We still have a long way to go, and features such as async traits, async iteration syntax, and async closures mean that the async code we’ll be writing in the future will probably be different from how we do things today.
That’s why we’d like to start with sharing where we want to bring Tide in the future, but can’t because we’re limited by the technology of our time. We’d like to make Tide a blazing fast, request-response based, streaming web framework. In typical Rust spirit: not compromising between ergonomics and performance.
Reply with “hello world”:
```rust
async fn main() -> tide::Result<()> {
    let mut app = tide::new();
    app.at("/").get(|_| "Hello, world!");
    app.listen("127.0.0.1:8080").await
}
```
Serialize + Deserialize JSON:
```rust
#[derive(Deserialize, Serialize)]
struct Cat {
    name: String,
}

async fn main() -> tide::Result<()> {
    let mut app = tide::new();
    app.at("/submit").post(async |req| {
        let cat: Cat = req.body_json()?;
        println!("cat name: {}", cat.name);
        let cat = Cat { name: "chashu".into() };
        tide::Response::new(200).body_json(cat)
    });
    app.listen("127.0.0.1:8080").await
}
```
Both examples are really short, but do quite a bit in terms of functionality. We think using async Rust should be as easy as sync Rust, and as the language features progress, this will increasingly become a reality.
## Tide today
Like we said, we’re not quite there yet. Today we’re releasing Tide 0.4.0, a first step in this direction. Our “hello world” looks like this:
```rust
use std::io;

#[async_std::main]
async fn main() -> io::Result<()> {
    let mut app = tide::new();
    app.at("/").get(|_| async move { "Hello, world!" });
    app.listen("127.0.0.1:8080").await
}
```
Notice the extra `async move {}` inside the `get` handler? That’s because async closures don’t exist yet, which means we need the block statements. We also don’t have blanket impls for regular closures yet.
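To make the blanket impl point concrete, here’s a heavily simplified, hypothetical sketch (this is not Tide’s actual `Endpoint` trait): a trait implemented for every closure that returns a future. A closure returning a plain `String` doesn’t satisfy the bound, but wrapping its body in `async move {}` does.

```rust
use std::future::Future;

// Hypothetical, simplified endpoint trait; Tide's real trait has a
// different shape and also handles requests and responses.
trait Endpoint {
    type Fut: Future<Output = String>;
    fn call(&self) -> Self::Fut;
}

// Blanket impl: any closure returning a future counts as an endpoint.
impl<F, Fut> Endpoint for F
where
    F: Fn() -> Fut,
    Fut: Future<Output = String>,
{
    type Fut = Fut;
    fn call(&self) -> Self::Fut {
        (self)()
    }
}

// Compile-time check that a value satisfies the Endpoint bound.
fn assert_endpoint<E: Endpoint>(_: &E) {}

fn main() {
    // `|| "Hello, world!".to_string()` would NOT compile here, because a
    // String is not a Future. The `async move {}` block produces one.
    let handler = || async move { "Hello, world!".to_string() };
    assert_endpoint(&handler);
}
```

Until there are blanket impls covering plain closures too, the `async move {}` block is what bridges the gap.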
```rust
use serde::{Deserialize, Serialize};
use std::io;

#[derive(Deserialize, Serialize)]
struct Cat {
    name: String,
}

#[async_std::main]
async fn main() -> io::Result<()> {
    let mut app = tide::new();
    app.at("/submit").post(|mut req: tide::Request<()>| async move {
        let cat: Cat = req.body_json().await.unwrap();
        println!("cat name: {}", cat.name);
        let cat = Cat {
            name: "chashu".into(),
        };
        tide::Response::new(200).body_json(&cat).unwrap()
    });
    app.listen("127.0.0.1:8080").await
}
```
The JSON example similarly still has a way to go. In particular, error handling could really use some work. Notice the `unwrap`s? Yeah, not great. It’s pretty high on our todo list to fix this. In general there’s still a bit of polish missing, but we’re definitely on track.
## The Tide Architecture
### Request-Response
A big change from prior Tide versions is that we’re now directly based on a request-response model. This means that a `Request` goes in, and a `Response` is returned. This might sound obvious, but for example Node.js uses the `res.end` callback to send back responses rather than returning responses from functions.
```rust
async fn(req: Request) -> Result<Response>;
```
### Middleware
Aside from requests and responses, Tide allows passing middleware, global state, and local state. Middleware wraps each request-response pair, allowing code to run before the endpoint and after it. Additionally, each middleware can choose to never yield to the endpoint and abort early. This is useful for e.g. authentication middleware.
Tide 0.4.0 ships with a request logger based on the `log` crate out of the box. This middleware will log each request when it comes in, and each response when it goes out.
```rust
use tide::middleware::RequestLogger;

#[async_std::main]
async fn main() -> Result<(), std::io::Error> {
    let mut app = tide::new();
    app.middleware(RequestLogger::new());
    app.at("/").get(|_| async move { "Hello, world!" });
    app.listen("127.0.0.1:8080").await
}
```
Tide middleware works like a stack. A simplified example of the logger middleware is something like this:
```rust
async fn log(req: Request, next: Next) -> Result<Response> {
    println!("Incoming request from {} on url {}", req.peer_addr(), req.url());
    let res = next().await?;
    println!("Outgoing response with status {}", res.status());
    Ok(res)
}
```
As a new request comes in, we perform some logic. Then we yield to the next middleware (or endpoint; we don’t know when we yield to `next`), and once that’s done, we return the `Response`. We can decide not to yield to `next` at any stage, and abort early.
The sequence in which middleware is run is:

```
          Tide
1.  →  Middleware 1  →  7.
2.  →  Middleware 2  →  6.
3.  →  Middleware 3  →  5.
       4. Endpoint
```

The request travels inward through each middleware (steps 1 to 3) down to the endpoint (step 4), and the response travels back out through the same middleware in reverse order (steps 5 to 7).
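The onion-like ordering can be modeled with plain functions. This is an illustrative, synchronous sketch using only the standard library, not Tide’s actual middleware API: each layer records a step on the way in, yields to the next layer, and records another step on the way out.

```rust
// A layer takes a log to append to; boxing lets us nest layers dynamically.
type Handler = Box<dyn Fn(&mut Vec<&'static str>)>;

// Wrap `next` in a layer that runs code before and after it.
fn wrap(on_enter: &'static str, on_exit: &'static str, next: Handler) -> Handler {
    Box::new(move |log: &mut Vec<&'static str>| {
        log.push(on_enter); // on the way in (request path)
        next(log);          // yield to the next layer (or the endpoint)
        log.push(on_exit);  // on the way out (response path)
    })
}

fn main() {
    let endpoint: Handler = Box::new(|log: &mut Vec<&'static str>| {
        log.push("4. endpoint");
    });

    // Build the stack from the inside out: endpoint, then middleware 3, 2, 1.
    let stack = wrap("1. mw1 in", "7. mw1 out",
                wrap("2. mw2 in", "6. mw2 out",
                wrap("3. mw3 in", "5. mw3 out", endpoint)));

    let mut log = Vec::new();
    stack(&mut log);
    assert_eq!(log, vec![
        "1. mw1 in", "2. mw2 in", "3. mw3 in",
        "4. endpoint",
        "5. mw3 out", "6. mw2 out", "7. mw1 out",
    ]);
}
```

Aborting early corresponds to a layer simply not calling `next`.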
### State
Middleware often needs to share values with the endpoint. This is done through “local state”. Local state is built using a typemap that’s available through `Request::local_state`.
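To give an idea of what a typemap is, here’s an illustrative sketch using only the standard library (not Tide’s actual implementation): values are stored and looked up keyed by their type, so a middleware can stash a value and the endpoint can retrieve it without agreeing on a string key. The `User` struct is a hypothetical example of what an auth middleware might store.

```rust
use std::any::{Any, TypeId};
use std::collections::HashMap;

// A minimal typemap: at most one value per type, keyed by TypeId.
#[derive(Default)]
struct TypeMap {
    map: HashMap<TypeId, Box<dyn Any>>,
}

impl TypeMap {
    fn insert<T: Any>(&mut self, value: T) {
        self.map.insert(TypeId::of::<T>(), Box::new(value));
    }
    fn get<T: Any>(&self) -> Option<&T> {
        self.map.get(&TypeId::of::<T>()).and_then(|b| b.downcast_ref())
    }
}

// Hypothetical value an auth middleware might stash for the endpoint.
struct User {
    name: String,
}

fn main() {
    let mut local = TypeMap::default();
    local.insert(User { name: "chashu".into() });
    assert_eq!(local.get::<User>().unwrap().name, "chashu");
    assert!(local.get::<u32>().is_none());
}
```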
Global state is used when a complete application needs access to a particular value. Examples of this include: database connections, websocket connections, or network-enabled config. Every `Request<State>` has an inner value that must implement `Send + Sync + Clone`, and can thus freely be shared between requests.
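Why those bounds? A sketch using only the standard library (the `DbPool` name is a hypothetical stand-in, not a Tide type): a state type holding an `Arc` is cheap to clone, and `Send + Sync` lets each request’s clone move to whatever thread handles it.

```rust
use std::sync::Arc;
use std::thread;

// Hypothetical shared resource; in a real app this might be a connection pool.
struct DbPool {
    port: u16,
}

// Cloning bumps the Arc refcount; all clones point at the same pool.
#[derive(Clone)]
struct State {
    db: Arc<DbPool>,
}

fn main() {
    let state = State { db: Arc::new(DbPool { port: 8083 }) };

    // Simulate two concurrent requests, each handed its own clone of the state.
    let handles: Vec<_> = (0..2)
        .map(|_| {
            let state = state.clone();
            thread::spawn(move || state.db.port)
        })
        .collect();

    for h in handles {
        assert_eq!(h.join().unwrap(), 8083);
    }
}
```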
By default `tide::new` will use `()` as the shared state. But if you want to create a new app with shared state you can do:
```rust
/// Shared state
struct MyState {
    db_port: u16,
}

#[async_std::main]
async fn main() -> Result<(), std::io::Error> {
    let state = MyState { db_port: 8083 };
    let mut app = tide::with_state(state);
    app.at("/").get(|_| async move { "Hello, world!" });
    app.listen("127.0.0.1:8080").await
}
```
### Extension Traits
Sometimes having global and local context can require a bit of setup. There are cases where it’d be nice if things were a little easier. This is why Tide encourages people to write extension traits.
By using an extension trait you can extend `Request` or `Response` with more functionality. For example, an authentication package could implement a `user` method on `Request`, to access the authenticated user provided by middleware. Or a GraphQL package could implement `body_graphql` methods for `Request` and `Response` as counterparts to `body_json`, so that serializing and deserializing GraphQL becomes easier.
Even more interesting is the interplay between global `State`, derives, and extension traits. There’s probably a world of ORM-adjacent extensions that could be built. And probably much more we haven’t thought of; we encourage you to experiment and share what you come up with.
An extension trait in its base form is written as such:
```rust
pub trait RequestExt {
    fn bark(&self) -> String;
}

impl<State> RequestExt for Request<State> {
    fn bark(&self) -> String {
        "woof".to_string()
    }
}
```
Tide apps will then have access to the `bark` method on `Request`.
```rust
#[async_std::main]
async fn main() -> Result<(), std::io::Error> {
    let mut app = tide::new();
    app.at("/").get(|req: tide::Request<()>| async move { req.bark() });
    app.listen("127.0.0.1:8080").await
}
```
## What’s next?
As you can tell from our JSON example, error handling isn’t great yet. The error types don’t align the way we want them to, and that’s a bit of an issue. Removing the `unwrap`s required to make Tide function properly is high on our list.
But after that we’d like to focus on expanding the set of features. There are a lot of things people want to do with web apps, and we’d like to learn what they are. In particular, WebSocket support is something we’ve heard come up regularly. But so is enabling good HTTP security out of the box.
It’s still the early days for Tide, and we’re excited for what folks will be building. We’d love to hear about your experiences using Tide. The better we understand what people are doing, the better we can make Tide a tool that helps folks succeed.
## Conclusion
In this post we’ve covered the future and present of Tide, and covered its architecture and design philosophy. It probably bears repeating that our 0.4.0 release hardly reflects a done state. Instead it’s the first step into a new direction for the project. We’re very excited for the future of Rust, and in particular async networking.
We believe Tide poses an interesting direction for writing HTTP servers; one that blends familiarity from other languages with Rust’s unique way of doing things, resulting in something that’s more than the sum of its parts. Either way, we’re excited to be sharing this with y’all. And with that I’m going on vacation, back on the 11th of December. We hope you enjoy Tide 0.4!
Thanks to Friedel Ziegelmayer, Felipe Sere, Tirr-c, Nemo157, Oli Obk, David Tolnay, and countless others for their help and assistance with this post, issues, bug fixes, designs, and all the other work that goes into making a release. Tide wouldn’t have been possible without everyone who has been involved.