Hyperparameter search

Hi
How can I pass parameter values in the search space, such as the number of attention heads or the vocab size, with Ray Tune?

I have implemented it with Optuna, but I am not getting anywhere with Ray Tune. Please help me get it working with Ray Tune as the backend in the Transformers Trainer API for hyperparameter search.

from transformers import RobertaConfig, RobertaForMaskedLM

def model_init(trial):
    if trial is not None:
        num_attention_heads = trial.suggest_int("num_attention_heads", 1, 2)
        vocab_size = trial.suggest_int("vocab_size", 1000, 1500)
    else:
        num_attention_heads = 6
        vocab_size = 1000

    config = RobertaConfig(
        vocab_size=vocab_size,  # output vocabulary size
        max_position_embeddings=514,  # position embeddings
        num_attention_heads=num_attention_heads,  # number of attention heads
        num_hidden_layers=3,
        type_vocab_size=1,
    )

    # The base model skeleton is RoBERTa with the new configuration
    model = RobertaForMaskedLM(config=config)
    print("Model parameters: {}".format(model.num_parameters()))
    return model

But how can I achieve the same with Ray Tune as the backend?