Module syn::parse
Parsing interface for parsing a token stream into a syntax tree node.

Parsing in Syn is built on parser functions that take in a ParseStream and produce a Result<T> where T is some syntax tree node. Underlying these parser functions is a lower level mechanism built around the Cursor type. Cursor is a cheaply copyable cursor over a range of tokens in a token stream.
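Most parsing happens at the ParseStream level; dropping down to Cursor is rarely necessary. As a rough illustration, the following is a minimal sketch of a low level parser that advances the stream by one token using ParseStream::step (the helper name skip_one_token is hypothetical, not part of Syn).

use syn::parse::{ParseStream, Result};

// A sketch of the lower level mechanism: consume one token tree by
// stepping the cursor forward manually. `skip_one_token` is a
// hypothetical helper, not part of Syn.
fn skip_one_token(input: ParseStream) -> Result<()> {
    input.step(|cursor| {
        // `cursor` points at the next token in the stream.
        match cursor.token_tree() {
            Some((_token, next)) => Ok(((), next)),
            None => Err(cursor.error("expected a token")),
        }
    })
}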
Example
Here is a snippet of parsing code to get a feel for the style of the library. We define data structures for a subset of Rust syntax including enums (not shown) and structs, then provide implementations of the Parse trait to parse these syntax tree data structures from a token stream.

Once Parse impls have been defined, they can be called conveniently from a procedural macro through parse_macro_input! as shown at the bottom of the snippet. If the caller provides syntactically invalid input to the procedural macro, they will receive a helpful compiler error message pointing out the exact token that triggered the failure to parse.
extern crate proc_macro;

use proc_macro::TokenStream;
use syn::{braced, parse_macro_input, token, Field, Ident, Result, Token};
use syn::parse::{Parse, ParseStream};
use syn::punctuated::Punctuated;

enum Item {
    Struct(ItemStruct),
    Enum(ItemEnum),
}

struct ItemStruct {
    struct_token: Token![struct],
    ident: Ident,
    brace_token: token::Brace,
    fields: Punctuated<Field, Token![,]>,
}

impl Parse for Item {
    fn parse(input: ParseStream) -> Result<Self> {
        let lookahead = input.lookahead1();
        if lookahead.peek(Token![struct]) {
            input.parse().map(Item::Struct)
        } else if lookahead.peek(Token![enum]) {
            input.parse().map(Item::Enum)
        } else {
            Err(lookahead.error())
        }
    }
}

impl Parse for ItemStruct {
    fn parse(input: ParseStream) -> Result<Self> {
        let content;
        Ok(ItemStruct {
            struct_token: input.parse()?,
            ident: input.parse()?,
            brace_token: braced!(content in input),
            fields: content.parse_terminated(Field::parse_named)?,
        })
    }
}

#[proc_macro]
pub fn my_macro(tokens: TokenStream) -> TokenStream {
    let input = parse_macro_input!(tokens as Item);

    /* ... */
}
The syn::parse* functions
The syn::parse, syn::parse2, and syn::parse_str functions serve as an entry point for parsing syntax tree nodes that can be parsed in an obvious default way. These functions can return any syntax tree node that implements the Parse trait, which includes most types in Syn.
use syn::Type;

let t: Type = syn::parse_str("std::collections::HashMap<String, Value>")?;
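The same entry points also work outside of a procedural macro. The sketch below assumes the quote crate is available to construct a proc_macro2::TokenStream, and the function name parse_expression is only for illustration; it feeds the tokens to syn::parse2.

use quote::quote;
use syn::{Expr, Result};

fn parse_expression() -> Result<()> {
    // `quote!` produces a proc_macro2::TokenStream, which is what
    // syn::parse2 accepts; handy in unit tests and build scripts.
    let tokens = quote! { 1 + 2 * 3 };
    let _expr: Expr = syn::parse2(tokens)?;
    Ok(())
}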
The parse_quote! macro also uses this approach.
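For instance, constructing a statement node from quoted tokens might look like the following sketch (the surrounding function make_statement is only for illustration).

use syn::{parse_quote, Stmt};

fn make_statement() -> Stmt {
    // parse_quote! parses the quoted tokens into a syntax tree node,
    // with the target type inferred here from the return type.
    parse_quote! {
        let x = 1 + 1;
    }
}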
The Parser trait
Some types can be parsed in several ways depending on context. For example, an Attribute can be either "outer" like #[...] or "inner" like #![...], and parsing the wrong one would be a bug. Similarly, Punctuated may or may not allow trailing punctuation, and parsing it the wrong way would either reject valid input or accept invalid input.

The Parse trait is not implemented in these cases because there is no single behavior that makes sense as the default.
// Can't parse `Punctuated` without knowing whether trailing punctuation
// should be allowed in this context.
let path: Punctuated<PathSegment, Token![::]> = syn::parse(tokens)?;
In these cases the types provide a choice of parser functions rather than a single Parse implementation, and those parser functions can be invoked through the Parser trait.
extern crate proc_macro;

use proc_macro::TokenStream;
use syn::parse::Parser;
use syn::punctuated::Punctuated;
use syn::{Attribute, Expr, PathSegment, Result, Token};

fn call_some_parser_methods(input: TokenStream) -> Result<()> {
    // Parse a nonempty sequence of path segments separated by `::` punctuation
    // with no trailing punctuation.
    let tokens = input.clone();
    let parser = Punctuated::<PathSegment, Token![::]>::parse_separated_nonempty;
    let _path = parser.parse(tokens)?;

    // Parse a possibly empty sequence of expressions terminated by commas with
    // optional trailing punctuation.
    let tokens = input.clone();
    let parser = Punctuated::<Expr, Token![,]>::parse_terminated;
    let _args = parser.parse(tokens)?;

    // Parse zero or more outer attributes but not inner attributes.
    let tokens = input.clone();
    let parser = Attribute::parse_outer;
    let _attrs = parser.parse(tokens)?;

    Ok(())
}
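Besides Parser::parse, which takes a proc_macro::TokenStream, the Parser trait also provides parse2 and parse_str. Here is a small sketch using parse_str; the function name and the input string are made up for illustration.

use syn::parse::Parser;
use syn::punctuated::Punctuated;
use syn::{Ident, Result, Token};

fn parse_ident_list() -> Result<()> {
    // Parser::parse_str runs the chosen parser function on source text,
    // which is convenient in tests where no TokenStream is at hand.
    let parser = Punctuated::<Ident, Token![,]>::parse_terminated;
    let idents = parser.parse_str("a, b, c")?;
    assert_eq!(idents.len(), 3);
    Ok(())
}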
This module is available if Syn is built with the "parsing" feature.
Modules
discouraged | Extensions to the parsing API with niche applicability.
Structs
Error | Error returned when a Syn parser cannot parse the input tokens.
Lookahead1 | Support for checking the next token in a stream to decide how to parse.
ParseBuffer | Cursor position within a buffered token stream.
StepCursor | Cursor state associated with speculative parsing.
Traits
Parse | Parsing interface implemented by all types that can be parsed in a default way from a token stream.
Parser | Parser that can parse Rust tokens into a particular syntax tree node.
Peek | Types that can be parsed by looking at just one token.
Type Definitions
ParseStream | Input to a Syn parser function.
Result | The result of a Syn parser.