Commit 9b3ce2e

Add support for mapping old fields to new ones in TLV read macros
As we've grown, we regularly face the question of whether to "break out" of our nice TLV-based struct/enum reading/writing macros in order to map legacy fields to new ones, or to keep the legacy fields around and handle at runtime what should be handled at (de-)serialization time.

This attempts to address that tradeoff by adding support for a "legacy" TLV read style. It allows us to read a TLV which is not directly mapped to any field in the struct/enum, but which can be computed from the struct/enum's layout at write-time and incorporated into the read data at read-time. It takes a type, a `$read` expression (executed after all TLVs are read but before the struct/enum is built), and a `$write` expression (executed to calculate the value to write in the TLV). Legacy TLVs are always read as `option`s to retain the future ability to remove them.

Sadly, there are two issues which prevent doing this trivially and force us into `proc-macro` land: (a) when matching the original struct we want to list the fields in the match arm so that we have them available to write. We can't have a declarative macro emit the field name conditionally based on the field type, so we instead pass the whole match to a proc-macro which walks it to find the types and skips fields that are `legacy`. (b) when building the final struct/enum after reading, we need to list a few `$field: $expr` pairs and cannot decide whether to include a field with a declarative macro. The proc-macros to do both aren't trivial, but they aren't that bad either.

We could instead rewrite our TLV stream processing macros to handle a new set of TLVs passed via a separate argument, but as TLVs are required to be ordered by type this would require a good chunk of additional generated code in each TLV write. It would also result in a somewhat less ergonomic callsite, as the legacy TLVs would no longer fit into our existing list of TLVs.
1 parent 020be44 commit 9b3ce2e
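The write-time/read-time split described above can be sketched in plain Rust. This is a toy single-byte type/length TLV format with a hypothetical `Payment` struct, not LDK's actual BigSize-based encoding or macro output: the legacy TLV is derived from the new fields when writing (the `$write` role), and folded back into them after all TLVs are read (the `$read` role).

```rust
use std::collections::BTreeMap;

// Hypothetical struct: `part_amounts` is the new field; the old wire
// format carried a single total (TLV type 2) instead.
#[derive(Debug, PartialEq)]
struct Payment {
	part_amounts: Vec<u8>,
}

impl Payment {
	// Write TLVs in ascending type order. The type-2 "legacy" TLV is not
	// a struct field: it is computed from the new fields at write time.
	fn write(&self) -> Vec<u8> {
		let mut out = Vec::new();
		out.push(1u8); // type 1: the new per-part list
		out.push(self.part_amounts.len() as u8);
		out.extend_from_slice(&self.part_amounts);
		let total: u8 = self.part_amounts.iter().sum(); // derived legacy value
		out.push(2u8); // type 2: legacy total, written for old readers
		out.push(1u8);
		out.push(total);
		out
	}

	// Read every TLV first, then run the legacy read logic before the
	// struct is built (mirroring the macro's ordering).
	fn read(buf: &[u8]) -> Option<Payment> {
		let mut tlvs: BTreeMap<u8, Vec<u8>> = BTreeMap::new();
		let mut i = 0;
		while i + 2 <= buf.len() {
			let (t, l) = (buf[i], buf[i + 1] as usize);
			tlvs.insert(t, buf.get(i + 2..i + 2 + l)?.to_vec());
			i += 2 + l;
		}
		let mut part_amounts = tlvs.get(&1).cloned().unwrap_or_default();
		// Legacy read: a stream from an old writer has only the total.
		if part_amounts.is_empty() {
			if let Some(total) = tlvs.get(&2).and_then(|v| v.first()) {
				part_amounts = vec![*total];
			}
		}
		Some(Payment { part_amounts })
	}
}

fn main() {
	let p = Payment { part_amounts: vec![3, 4] };
	assert_eq!(Payment::read(&p.write()), Some(p));
	// An old-style stream carrying only the legacy total still decodes.
	assert_eq!(Payment::read(&[2u8, 1, 7]).unwrap().part_amounts, vec![7]);
	println!("ok");
}
```

Note that, as in the real macros, the legacy value never lives on the struct: new writers always emit it, and new readers only consult it when the new-style field is absent.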

File tree

3 files changed: +228 −14 lines

lightning-macros/src/lib.rs

Lines changed: 176 additions & 1 deletion
@@ -18,7 +18,10 @@
 #![deny(rustdoc::private_intra_doc_links)]
 #![cfg_attr(docsrs, feature(doc_auto_cfg))]
 
-use proc_macro::TokenStream;
+extern crate alloc;
+
+use alloc::string::ToString;
+use proc_macro::{Delimiter, Group, TokenStream, TokenTree};
 use quote::quote;
 use syn::spanned::Spanned;
 use syn::{parse, ImplItemFn, Token};
@@ -74,3 +77,175 @@ pub fn maybe_await(expr: TokenStream) -> TokenStream {
 
 	quoted.into()
 }
+
+fn expect_ident(token: &TokenTree, expected_name: Option<&str>) {
+	if let TokenTree::Ident(id) = &token {
+		if let Some(exp) = expected_name {
+			assert_eq!(id.to_string(), exp, "Expected ident {}, got {:?}", exp, token);
+		}
+	} else {
+		panic!("Expected ident {:?}, got {:?}", expected_name, token);
+	}
+}
+
+fn expect_punct(token: &TokenTree, expected: char) {
+	if let TokenTree::Punct(p) = &token {
+		assert_eq!(p.as_char(), expected, "Expected punctuation {}, got {}", expected, p);
+	} else {
+		panic!("Expected punctuation {}, got {:?}", expected, token);
+	}
+}
+
+/// Scans a match statement for fields which should be skipped
+///
+/// Wraps a `match self {..}` statement and scans the fields in the match patterns (in the form
+/// `ref $field_name: $field_ty`) for types marked `legacy`, skipping those fields.
+#[proc_macro]
+pub fn skip_legacy_fields(expr: TokenStream) -> TokenStream {
+	let mut stream = expr.clone().into_iter();
+	let mut res = TokenStream::new();
+
+	let match_ident = stream.next().unwrap();
+	expect_ident(&match_ident, Some("match"));
+	res.extend(proc_macro::TokenStream::from(match_ident));
+
+	let self_ident = stream.next().unwrap();
+	expect_ident(&self_ident, Some("self"));
+	res.extend(proc_macro::TokenStream::from(self_ident));
+
+	let token_to_stream = |tok| proc_macro::TokenStream::from(tok);
+
+	let arms = stream.next().unwrap();
+	if let TokenTree::Group(group) = arms {
+		let mut new_arms = TokenStream::new();
+
+		let mut arm_stream = group.stream().into_iter().peekable();
+		while arm_stream.peek().is_some() {
+			let enum_ident = arm_stream.next().unwrap();
+			let co1 = arm_stream.next().unwrap();
+			expect_punct(&co1, ':');
+			let co2 = arm_stream.next().unwrap();
+			expect_punct(&co2, ':');
+			let variant_ident = arm_stream.next().unwrap();
+			let fields = arm_stream.next().unwrap();
+			let eq = arm_stream.next().unwrap();
+			expect_punct(&eq, '=');
+			let gt = arm_stream.next().unwrap();
+			expect_punct(&gt, '>');
+			let init = arm_stream.next().unwrap();
+
+			let next_tok = arm_stream.peek();
+			if let Some(TokenTree::Punct(_)) = next_tok {
+				expect_punct(next_tok.unwrap(), ',');
+				arm_stream.next();
+			}
+
+			let mut computed_fields = proc_macro::TokenStream::new();
+			if let TokenTree::Group(group) = fields {
+				if group.delimiter() == Delimiter::Brace {
+					let mut fields_stream = group.stream().into_iter().peekable();
+
+					let mut new_fields = proc_macro::TokenStream::new();
+					loop {
+						let next_tok = fields_stream.peek();
+						if let Some(TokenTree::Punct(_)) = next_tok {
+							let dot1 = fields_stream.next().unwrap();
+							expect_punct(&dot1, '.');
+							let dot2 = fields_stream.next().expect("Missing second trailing .");
+							expect_punct(&dot2, '.');
+							let trailing_dots = [dot1, dot2];
+							new_fields.extend(trailing_dots.into_iter().map(token_to_stream));
+							assert!(fields_stream.peek().is_none());
+							break;
+						}
+
+						let ref_ident = fields_stream.next().unwrap();
+						expect_ident(&ref_ident, Some("ref"));
+						let field_name_ident = fields_stream.next().unwrap();
+						let co = fields_stream.next().unwrap();
+						expect_punct(&co, ':');
+						let ty_info = fields_stream.next().unwrap();
+						let com = fields_stream.next().unwrap();
+						expect_punct(&com, ',');
+
+						if let TokenTree::Group(group) = ty_info {
+							let first_group_tok = group.stream().into_iter().next().unwrap();
+							if let TokenTree::Ident(ident) = first_group_tok {
+								if ident.to_string() == "legacy" {
+									continue;
+								}
+							}
+						}
+
+						let field = [ref_ident, field_name_ident, com];
+						new_fields.extend(field.into_iter().map(token_to_stream));
+					}
+					let fields_group = Group::new(Delimiter::Brace, new_fields);
+					computed_fields.extend(token_to_stream(TokenTree::Group(fields_group)));
+				} else {
+					computed_fields.extend(token_to_stream(TokenTree::Group(group)));
+				}
+			}
+
+			let arm_pfx = [enum_ident, co1, co2, variant_ident];
+			new_arms.extend(arm_pfx.into_iter().map(token_to_stream));
+			new_arms.extend(computed_fields);
+			let arm_sfx = [eq, gt, init];
+			new_arms.extend(arm_sfx.into_iter().map(token_to_stream));
+		}
+
+		let new_arm_group = Group::new(Delimiter::Brace, new_arms);
+		res.extend(token_to_stream(TokenTree::Group(new_arm_group)));
+	} else {
+		panic!("Expected `match self {{..}}` and nothing else");
+	}
+
+	assert!(stream.next().is_none(), "Expected `match self {{..}}` and nothing else");
+
+	res
+}
+
+/// Scans an enum definition for fields initialized to `LDK_DROP_LEGACY_FIELD_DEFINITION` and drops
+/// that field.
+#[proc_macro]
+pub fn drop_legacy_field_definition(expr: TokenStream) -> TokenStream {
+	let mut st = if let Ok(parsed) = parse::<syn::Expr>(expr) {
+		if let syn::Expr::Struct(st) = parsed {
+			st
+		} else {
+			return (quote! {
+				compile_error!("drop_legacy_field_definitions!() can only be used on struct expressions")
+			})
+			.into();
+		}
+	} else {
+		return (quote! {
+			compile_error!("drop_legacy_field_definitions!() can only be used on expressions")
+		})
+		.into();
+	};
+	assert!(st.attrs.is_empty());
+	assert!(st.qself.is_none());
+	assert!(st.dot2_token.is_none());
+	assert!(st.rest.is_none());
+	let mut new_fields = syn::punctuated::Punctuated::new();
+	core::mem::swap(&mut new_fields, &mut st.fields);
+	for field in new_fields {
+		if let syn::Expr::Macro(syn::ExprMacro { mac, .. }) = &field.expr {
+			let macro_name = mac.path.segments.last().unwrap().ident.to_string();
+			let is_init = macro_name == "_init_tlv_based_struct_field";
+			let ty_tokens = mac.tokens.clone().into_iter().skip(2).next();
+			if let Some(proc_macro2::TokenTree::Group(group)) = ty_tokens {
+				let first_token = group.stream().into_iter().next();
+				if let Some(proc_macro2::TokenTree::Ident(ident)) = first_token {
+					if is_init && ident.to_string() == "legacy" {
+						continue;
+					}
+				}
+			}
+		}
+		st.fields.push(field);
+	}
+	let out = syn::Expr::Struct(st);
+	quote! { #out }.into()
+}
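The field-skipping that `skip_legacy_fields` performs on the match arms can be illustrated on plain strings rather than real proc-macro `TokenStream`s (all names here are hypothetical): any field whose type spec is a `(legacy, ...)` group is dropped from the generated match pattern, since it does not exist on the struct.

```rust
// Toy model of skip_legacy_fields: given (field_name, field_type_spec)
// pairs, build a match-pattern field list that omits legacy fields.
fn filter_match_fields(fields: &[(&str, &str)]) -> String {
	let kept: Vec<String> = fields
		.iter()
		// A `(legacy, ...)` type spec marks a field that must be skipped.
		.filter(|(_, ty)| !ty.trim_start_matches('(').trim_start().starts_with("legacy"))
		.map(|(name, _)| format!("ref {}", name))
		.collect();
	format!("{{ {}, .. }}", kept.join(", "))
}

fn main() {
	let fields = [
		("amount_msat", "required"),
		("total_msat", "(legacy, u64, read_expr, write_expr)"),
		("payment_hash", "option"),
	];
	assert_eq!(filter_match_fields(&fields), "{ ref amount_msat, ref payment_hash, .. }");
	println!("{}", filter_match_fields(&fields));
}
```

The real proc-macro does the same walk over token trees, which is why the macro callsite now passes `ref $field: $fieldty` patterns: the type spec must be visible to the token walker.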

lightning/Cargo.toml

Lines changed: 1 addition & 0 deletions
@@ -35,6 +35,7 @@ default = ["std", "grind_signatures"]
 [dependencies]
 lightning-types = { version = "0.1.0", path = "../lightning-types", default-features = false }
 lightning-invoice = { version = "0.32.0", path = "../lightning-invoice", default-features = false }
+lightning-macros = { version = "0.1", path = "../lightning-macros" }
 
 bech32 = { version = "0.11.0", default-features = false }
 bitcoin = { version = "0.32.2", default-features = false, features = ["secp-recovery"] }

lightning/src/util/ser_macros.rs

Lines changed: 51 additions & 13 deletions
@@ -54,6 +54,9 @@ macro_rules! _encode_tlv {
 			field.write($stream)?;
 		}
 	};
+	($stream: expr, $optional_type: expr, $optional_field: expr, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {
+		$crate::_encode_tlv!($stream, $optional_type, $write, option);
+	};
 	($stream: expr, $type: expr, $field: expr, optional_vec) => {
 		if !$field.is_empty() {
 			$crate::_encode_tlv!($stream, $type, $field, required_vec);
@@ -206,6 +209,9 @@ macro_rules! _get_varint_length_prefixed_tlv_length {
 			$len.0 += field_len;
 		}
 	};
+	($len: expr, $optional_type: expr, $optional_field: expr, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {
+		$crate::_get_varint_length_prefixed_tlv_length!($len, $optional_type, $write, option);
+	};
 	($len: expr, $type: expr, $field: expr, optional_vec) => {
 		if !$field.is_empty() {
 			$crate::_get_varint_length_prefixed_tlv_length!($len, $type, $field, required_vec);
@@ -284,6 +290,9 @@ macro_rules! _check_decoded_tlv_order {
 	($last_seen_type: expr, $typ: expr, $type: expr, $field: ident, (option, explicit_type: $fieldty: ty)) => {{
 		// no-op
 	}};
+	($last_seen_type: expr, $typ: expr, $type: expr, $field: ident, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {{
+		// no-op
+	}};
 	($last_seen_type: expr, $typ: expr, $type: expr, $field: ident, (required, explicit_type: $fieldty: ty)) => {{
 		_check_decoded_tlv_order!($last_seen_type, $typ, $type, $field, required);
 	}};
@@ -341,6 +350,9 @@ macro_rules! _check_missing_tlv {
 	($last_seen_type: expr, $type: expr, $field: ident, (option, explicit_type: $fieldty: ty)) => {{
 		// no-op
 	}};
+	($last_seen_type: expr, $type: expr, $field: ident, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {{
+		// no-op
+	}};
 	($last_seen_type: expr, $type: expr, $field: ident, (required, explicit_type: $fieldty: ty)) => {{
 		_check_missing_tlv!($last_seen_type, $type, $field, required);
 	}};
@@ -388,6 +400,10 @@ macro_rules! _decode_tlv {
 		let _field: &Option<$fieldty> = &$field;
 		_decode_tlv!($outer_reader, $reader, $field, option);
 	}};
+	($outer_reader: expr, $reader: expr, $field: ident, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {{
+		$crate::_decode_tlv!($outer_reader, $reader, $field, (option, explicit_type: $fieldty));
+		// Note that $read is only executed in impl_writeable_tlv_based_enum_upgradable!
+	}};
 	($outer_reader: expr, $reader: expr, $field: ident, (required, explicit_type: $fieldty: ty)) => {{
 		let _field: &$fieldty = &$field;
 		_decode_tlv!($outer_reader, $reader, $field, required);
@@ -449,6 +465,17 @@ macro_rules! _decode_tlv {
 	}};
 }
 
+/// Runs the read side logic for legacy read types
+#[doc(hidden)]
+#[macro_export]
+macro_rules! _run_legacy_tlv_read_logic {
+	($field: ident, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {{
+		$read;
+	}};
+	($field: ident, $fieldty: tt) => { }
+}
+
+
 /// Checks if `$val` matches `$type`.
 /// This is exported for use by other exported macros, do not use directly.
 #[doc(hidden)]
@@ -605,6 +632,9 @@ macro_rules! _decode_tlv_stream_range {
 		$({
 			$crate::_check_missing_tlv!(last_seen_type, $type, $field, $fieldty);
 		})*
+		$({
+			$crate::_run_legacy_tlv_read_logic!($field, $fieldty);
+		})*
 	} }
 }
 
@@ -771,6 +801,7 @@ macro_rules! _init_tlv_based_struct_field {
 	($field: ident, (option: $trait: ident $(, $read_arg: expr)?)) => {
 		$crate::_init_tlv_based_struct_field!($field, option)
 	};
+	// Note that legacy TLVs are eaten by `drop_legacy_field_definition`
 	($field: ident, upgradable_required) => {
 		$field.0.unwrap()
 	};
@@ -818,6 +849,9 @@ macro_rules! _init_tlv_field_var {
 	($field: ident, (option, explicit_type: $fieldty: ty)) => {
 		let mut $field: Option<$fieldty> = None;
 	};
+	($field: ident, (legacy, $fieldty: ty, $read: expr, $write: expr)) => {
+		$crate::_init_tlv_field_var!($field, (option, explicit_type: $fieldty));
+	};
 	($field: ident, (required, explicit_type: $fieldty: ty)) => {
 		let mut $field = $crate::util::ser::RequiredWrapper::<$fieldty>(None);
 	};
@@ -877,6 +911,10 @@ macro_rules! _init_and_read_tlv_stream {
 /// If `$fieldty` is `option`, then `$field` is optional field.
 /// If `$fieldty` is `optional_vec`, then `$field` is a [`Vec`], which needs to have its individual elements serialized.
 /// Note that for `optional_vec` no bytes are written if the vec is empty
+/// If `$fieldty` is `(legacy, $ty, $read, $write)` then, when writing, the expression $write will
+/// be called which returns an `Option` and is written as a TLV if `Some`. When reading, an
+/// optional field of type `$ty` is read. The code in `$read` is always executed after all TLVs
+/// have been read.
 ///
 /// For example,
 /// ```
@@ -932,11 +970,11 @@ macro_rules! impl_writeable_tlv_based {
 			$crate::_init_and_read_len_prefixed_tlv_fields!(reader, {
 				$(($type, $field, $fieldty)),*
 			});
-			Ok(Self {
+			Ok(::lightning_macros::drop_legacy_field_definition!(Self {
 				$(
 					$field: $crate::_init_tlv_based_struct_field!($field, $fieldty)
 				),*
-			})
+			}))
 		}
 	}
 }
@@ -1030,8 +1068,8 @@ macro_rules! _impl_writeable_tlv_based_enum_common {
 	$(($length_prefixed_tuple_variant_id: expr, $length_prefixed_tuple_variant_name: ident)),* $(,)?) => {
 		impl $crate::util::ser::Writeable for $st {
 			fn write<W: $crate::util::ser::Writer>(&self, writer: &mut W) -> Result<(), $crate::io::Error> {
-				match self {
-					$($st::$variant_name { $(ref $field, )* .. } => {
+				lightning_macros::skip_legacy_fields!(match self {
+					$($st::$variant_name { $(ref $field: $fieldty, )* .. } => {
 						let id: u8 = $variant_id;
 						id.write(writer)?;
 						$crate::write_tlv_fields!(writer, {
@@ -1049,7 +1087,7 @@ macro_rules! _impl_writeable_tlv_based_enum_common {
 						$crate::util::ser::BigSize(field.serialized_length() as u64).write(writer)?;
 						field.write(writer)?;
 					}),*
-				}
+				});
 				Ok(())
 			}
 		}
@@ -1119,11 +1157,11 @@ macro_rules! impl_writeable_tlv_based_enum {
 				$crate::_init_and_read_len_prefixed_tlv_fields!(reader, {
 					$(($type, $field, $fieldty)),*
 				});
-				Ok($st::$variant_name {
+				Ok(::lightning_macros::drop_legacy_field_definition!($st::$variant_name {
 					$(
 						$field: $crate::_init_tlv_based_struct_field!($field, $fieldty)
 					),*
-				})
+				}))
 			};
 			f()
 		}),*
@@ -1168,11 +1206,11 @@ macro_rules! impl_writeable_tlv_based_enum_legacy {
 				$crate::_init_and_read_len_prefixed_tlv_fields!(reader, {
 					$(($type, $field, $fieldty)),*
 				});
-				Ok($st::$variant_name {
+				Ok(::lightning_macros::drop_legacy_field_definition!($st::$variant_name {
 					$(
 						$field: $crate::_init_tlv_based_struct_field!($field, $fieldty)
 					),*
-				})
+				}))
 			};
 			f()
 		}),*
@@ -1231,11 +1269,11 @@ macro_rules! impl_writeable_tlv_based_enum_upgradable {
 				$crate::_init_and_read_len_prefixed_tlv_fields!(reader, {
 					$(($type, $field, $fieldty)),*
 				});
-				Ok(Some($st::$variant_name {
+				Ok(Some(::lightning_macros::drop_legacy_field_definition!($st::$variant_name {
 					$(
 						$field: $crate::_init_tlv_based_struct_field!($field, $fieldty)
 					),*
-				}))
+				})))
			};
			f()
		}),*
@@ -1287,11 +1325,11 @@ macro_rules! impl_writeable_tlv_based_enum_upgradable_legacy {
 				$crate::_init_and_read_len_prefixed_tlv_fields!(reader, {
 					$(($type, $field, $fieldty)),*
 				});
-				Ok(Some($st::$variant_name {
+				Ok(Some(::lightning_macros::drop_legacy_field_definition!($st::$variant_name {
 					$(
 						$field: $crate::_init_tlv_based_struct_field!($field, $fieldty)
 					),*
-				}))
+				})))
 			};
 			f()
 		}),*

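The ordered-by-type requirement mentioned in the commit message, which `_check_decoded_tlv_order` guards on reads, can be shown with a toy validator (single-byte types and lengths here, not LDK's actual BigSize encoding):

```rust
// TLV streams require strictly ascending type numbers; a reader rejects
// duplicate or out-of-order records. Toy format: one-byte type, one-byte
// length, then the value bytes.
fn check_tlv_order(buf: &[u8]) -> bool {
	let mut last_type: Option<u8> = None;
	let mut i = 0;
	while i + 2 <= buf.len() {
		let (t, l) = (buf[i], buf[i + 1] as usize);
		if let Some(prev) = last_type {
			if t <= prev {
				return false; // duplicate or out-of-order type
			}
		}
		last_type = Some(t);
		i += 2 + l;
	}
	i == buf.len() // no trailing partial record
}

fn main() {
	assert!(check_tlv_order(&[1, 1, 42, 3, 0])); // types 1 then 3: ok
	assert!(!check_tlv_order(&[3, 0, 1, 1, 42])); // 3 then 1: rejected
	println!("ok");
}
```

This constraint is why splicing legacy TLVs in via a separate macro argument would need extra generated code at each write site: the derived TLVs would have to be merged into the existing stream in type order rather than simply appended.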