
type-ruby / t-ruby / 20573122825

29 Dec 2025 12:41PM UTC coverage: 92.341% (+13.3%) from 79.076%

push

github

web-flow
feat: improve error messages with tsc-style diagnostics (#30)

* feat: add Diagnostic class for unified error structure

- Add Diagnostic class with code, message, file, line, column attributes
- Add factory methods: from_type_check_error, from_parse_error, from_scan_error
- Add comprehensive tests for all functionality
- TR1xxx codes for parser errors, TR2xxx for type errors

* feat: add DiagnosticFormatter with tsc-style output

- Format errors as file:line:col - severity CODE: message
- Display source code snippets with line numbers
- Show error markers (~~~) under problem location
- Include Expected/Actual/Suggestion context
- Support ANSI colors with TTY auto-detection
- Format summary line: Found X errors and Y warnings
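The header format described above (`file:line:col - severity CODE: message` plus a `~~~` marker under the problem location) can be sketched in a few lines. This is an illustrative standalone sketch, not the project's actual `DiagnosticFormatter`; the `Diag` struct and `render` helper are hypothetical names:

```ruby
# Hypothetical sketch of tsc-style diagnostic rendering; field names are
# illustrative only and do not reflect the real Diagnostic class.
Diag = Struct.new(:file, :line, :column, :severity, :code, :message, keyword_init: true)

def render(diag, source_lines)
  header = "#{diag.file}:#{diag.line}:#{diag.column} - #{diag.severity} #{diag.code}: #{diag.message}"
  src = source_lines[diag.line - 1]
  marker = "#{" " * (diag.column - 1)}~~~"   # squiggle under the problem column
  [header, "  #{diag.line} | #{src}", "    | #{marker}"].join("\n")
end

diag = Diag.new(file: "src/file.trb", line: 2, column: 3,
                severity: "error", code: "TR2001", message: "type mismatch")
puts render(diag, ["def f", "  x = 1"])
```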

* feat: add ErrorReporter for collecting and reporting errors

- Collect multiple diagnostics during compilation
- Convert TypeCheckError, ParseError, ScanError to Diagnostic
- Auto-load source from file when not provided
- Report formatted output using DiagnosticFormatter
- Track error vs warning counts

* feat: integrate ErrorReporter into CLI

- Use ErrorReporter for TypeCheckError, ParseError, ScanError
- Display tsc-style formatted error output
- Include source code snippets and error markers
- Show Expected/Actual/Suggestion context
- Display summary line with error count

* refactor: use DiagnosticFormatter in Watcher

- Replace hash-based error format with Diagnostic objects
- Use DiagnosticFormatter for consistent tsc-style output
- Include source code snippets in watch mode errors
- Update tests to expect Diagnostic objects

* feat: add location info to MethodDef for better error messages

- TokenDeclarationParser: capture def token's line/column
- Parser.parse_function_with_body: add line/column to func_info
- Parser.parse_method_in_class: add line/column to method_info
- IR CodeGenerator.build_method: pass location to MethodDef

Error messages now show exact line:column position:
  src/file.trb:18:1 - error TR2001: T... (continued)

571 of 640 new or added lines in 14 files covered. (89.22%)

4 existing lines in 2 files now uncovered.

8210 of 8891 relevant lines covered (92.34%)

1046.45 hits per line

Source File: /lib/t_ruby/lsp_server.rb (97.45% covered)
# frozen_string_literal: true

require "json"

module TRuby
  # LSP (Language Server Protocol) Server for T-Ruby
  # Provides IDE integration with autocomplete, diagnostics, and navigation
  class LSPServer
    VERSION = "0.1.0"

    # LSP Error codes
    module ErrorCodes
      PARSE_ERROR = -32_700
      INVALID_REQUEST = -32_600
      METHOD_NOT_FOUND = -32_601
      INVALID_PARAMS = -32_602
      INTERNAL_ERROR = -32_603
      SERVER_NOT_INITIALIZED = -32_002
      UNKNOWN_ERROR_CODE = -32_001
    end

    # LSP Completion item kinds
    module CompletionItemKind
      TEXT = 1
      METHOD = 2
      FUNCTION = 3
      CONSTRUCTOR = 4
      FIELD = 5
      VARIABLE = 6
      CLASS = 7
      INTERFACE = 8
      MODULE = 9
      PROPERTY = 10
      UNIT = 11
      VALUE = 12
      ENUM = 13
      KEYWORD = 14
      SNIPPET = 15
      COLOR = 16
      FILE = 17
      REFERENCE = 18
      FOLDER = 19
      ENUM_MEMBER = 20
      CONSTANT = 21
      STRUCT = 22
      EVENT = 23
      OPERATOR = 24
      TYPE_PARAMETER = 25
    end

    # LSP Diagnostic severity
    module DiagnosticSeverity
      ERROR = 1
      WARNING = 2
      INFORMATION = 3
      HINT = 4
    end

    # Semantic Token Types (LSP 3.16+)
    module SemanticTokenTypes
      NAMESPACE = 0
      TYPE = 1
      CLASS = 2
      ENUM = 3
      INTERFACE = 4
      STRUCT = 5
      TYPE_PARAMETER = 6
      PARAMETER = 7
      VARIABLE = 8
      PROPERTY = 9
      ENUM_MEMBER = 10
      EVENT = 11
      FUNCTION = 12
      METHOD = 13
      MACRO = 14
      KEYWORD = 15
      MODIFIER = 16
      COMMENT = 17
      STRING = 18
      NUMBER = 19
      REGEXP = 20
      OPERATOR = 21
    end

    # Semantic Token Modifiers (bit flags)
    module SemanticTokenModifiers
      DECLARATION = 0x01
      DEFINITION = 0x02
      READONLY = 0x04
      STATIC = 0x08
      DEPRECATED = 0x10
      ABSTRACT = 0x20
      ASYNC = 0x40
      MODIFICATION = 0x80
      DOCUMENTATION = 0x100
      DEFAULT_LIBRARY = 0x200
    end

    # Token type names for capability registration
    SEMANTIC_TOKEN_TYPES = %w[
      namespace type class enum interface struct typeParameter
      parameter variable property enumMember event function method
      macro keyword modifier comment string number regexp operator
    ].freeze

    # Token modifier names
    SEMANTIC_TOKEN_MODIFIERS = %w[
      declaration definition readonly static deprecated
      abstract async modification documentation defaultLibrary
    ].freeze

    # Built-in types for completion
    BUILT_IN_TYPES = %w[String Integer Boolean Array Hash Symbol void nil].freeze

    # Type keywords for completion
    TYPE_KEYWORDS = %w[type interface def end].freeze

    def initialize(input: $stdin, output: $stdout)
      @input = input
      @output = output
      @documents = {}
      @initialized = false
      @shutdown_requested = false
      @type_alias_registry = TypeAliasRegistry.new
      # Use Compiler for unified diagnostics (same as CLI)
      @compiler = Compiler.new
    end

    # Main run loop for the LSP server
    def run
      loop do
        message = read_message
        break if message.nil?

        response = handle_message(message)
        send_response(response) if response
      end
    end


140
    # Read a single LSP message from input
141
    def read_message
1✔
142
      # Read headers
143
      headers = {}
124✔
144
      loop do
124✔
145
        line = @input.gets
247✔
146
        return nil if line.nil?
247✔
147

148
        line = line.strip
246✔
149
        break if line.empty?
246✔
150

151
        if line =~ /^([^:]+):\s*(.+)$/
123✔
152
          headers[Regexp.last_match(1)] = Regexp.last_match(2)
123✔
153
        end
154
      end
155

156
      content_length = headers["Content-Length"]&.to_i
123✔
157
      return nil unless content_length&.positive?
123✔
158

159
      # Read content
160
      content = @input.read(content_length)
123✔
161
      return nil if content.nil?
123✔
162

163
      JSON.parse(content)
123✔
164
    rescue JSON::ParserError => e
165
      { "error" => "Parse error: #{e.message}" }
1✔
166
    end
167

168
    # Send a response message
169
    def send_response(response)
1✔
170
      return if response.nil?
39✔
171

172
      content = JSON.generate(response)
39✔
173
      message = "Content-Length: #{content.bytesize}\r\n\r\n#{content}"
39✔
174
      @output.write(message)
39✔
175
      @output.flush
39✔
176
    end
177

178
    # Send a notification (no response expected)
179
    def send_notification(method, params)
1✔
180
      notification = {
181
        "jsonrpc" => "2.0",
39✔
182
        "method" => method,
183
        "params" => params,
184
      }
185
      send_response(notification)
39✔
186
    end
187

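`read_message` and `send_response` implement standard LSP base-protocol framing: a `Content-Length` header counting bytes (not characters, hence `bytesize`), a blank line, then the JSON body. The round trip can be exercised in isolation with `StringIO`; this is a standalone sketch mirroring that logic, not a use of the class above:

```ruby
require "json"
require "stringio"

body = JSON.generate({ "jsonrpc" => "2.0", "id" => 1, "method" => "shutdown" })
# Content-Length counts bytes so multi-byte UTF-8 payloads frame correctly.
wire = "Content-Length: #{body.bytesize}\r\n\r\n#{body}"

input = StringIO.new(wire)
headers = {}
# Read header lines until the blank separator line.
while (line = input.gets) && !line.strip.empty?
  key, value = line.strip.split(":", 2)
  headers[key] = value.strip
end
message = JSON.parse(input.read(headers["Content-Length"].to_i))
puts message["method"]  # → shutdown
```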
    # Handle an incoming message
    def handle_message(message)
      return error_response(nil, ErrorCodes::PARSE_ERROR, "Parse error") if message["error"]

      method = message["method"]
      params = message["params"] || {}
      id = message["id"]

      # Check if server is initialized for non-init methods
      if !@initialized && method != "initialize" && method != "exit"
        return error_response(id, ErrorCodes::SERVER_NOT_INITIALIZED, "Server not initialized")
      end

      result = dispatch_method(method, params, id)

      # For notifications (no id), don't send a response
      return nil if id.nil?

      if result.is_a?(Hash) && result[:error]
        error_response(id, result[:error][:code], result[:error][:message])
      else
        success_response(id, result)
      end
    end

    private

    def dispatch_method(method, params, _id)
      case method
      when "initialize"
        handle_initialize(params)
      when "initialized"
        handle_initialized(params)
      when "shutdown"
        handle_shutdown
      when "exit"
        handle_exit
      when "textDocument/didOpen"
        handle_did_open(params)
      when "textDocument/didChange"
        handle_did_change(params)
      when "textDocument/didClose"
        handle_did_close(params)
      when "textDocument/completion"
        handle_completion(params)
      when "textDocument/hover"
        handle_hover(params)
      when "textDocument/definition"
        handle_definition(params)
      when "textDocument/semanticTokens/full"
        handle_semantic_tokens_full(params)
      when "textDocument/diagnostic"
        handle_diagnostic(params)
      else
        { error: { code: ErrorCodes::METHOD_NOT_FOUND, message: "Method not found: #{method}" } }
      end
    end

    # === LSP Lifecycle Methods ===

    def handle_initialize(params)
      @initialized = true
      @root_uri = params["rootUri"]
      @workspace_folders = params["workspaceFolders"]

      {
        "capabilities" => {
          "textDocumentSync" => {
            "openClose" => true,
            "change" => 1, # Full sync
            "save" => { "includeText" => true },
          },
          "completionProvider" => {
            "triggerCharacters" => [":", "<", "|", "&"],
            "resolveProvider" => false,
          },
          "hoverProvider" => true,
          "definitionProvider" => true,
          "diagnosticProvider" => {
            "interFileDependencies" => false,
            "workspaceDiagnostics" => false,
          },
          "semanticTokensProvider" => {
            "legend" => {
              "tokenTypes" => SEMANTIC_TOKEN_TYPES,
              "tokenModifiers" => SEMANTIC_TOKEN_MODIFIERS,
            },
            "full" => true,
            "range" => false,
          },
        },
        "serverInfo" => {
          "name" => "t-ruby-lsp",
          "version" => VERSION,
        },
      }
    end

    def handle_initialized(_params)
      # Server is now fully initialized
      nil
    end

    def handle_shutdown
      @shutdown_requested = true
      nil
    end

    def handle_exit
      exit(@shutdown_requested ? 0 : 1)
    end

    # === Document Synchronization ===

    def handle_did_open(params)
      text_document = params["textDocument"]
      uri = text_document["uri"]
      text = text_document["text"]

      @documents[uri] = {
        text: text,
        version: text_document["version"],
      }

      # Parse and send diagnostics
      publish_diagnostics(uri, text)
      nil
    end

    def handle_did_change(params)
      text_document = params["textDocument"]
      uri = text_document["uri"]
      changes = params["contentChanges"]

      # For full sync, take the last change
      if changes && !changes.empty?
        @documents[uri] = {
          text: changes.last["text"],
          version: text_document["version"],
        }

        # Re-parse and send diagnostics
        publish_diagnostics(uri, changes.last["text"])
      end
      nil
    end

    def handle_did_close(params)
      uri = params["textDocument"]["uri"]
      @documents.delete(uri)

      # Clear diagnostics
      send_notification("textDocument/publishDiagnostics", {
                          "uri" => uri,
                          "diagnostics" => [],
                        })
      nil
    end

    # === Diagnostics ===

    # Handle pull-based diagnostics (LSP 3.17+)
    def handle_diagnostic(params)
      uri = params.dig("textDocument", "uri")
      return { "kind" => "full", "items" => [] } unless uri

      doc = @documents[uri]
      return { "kind" => "full", "items" => [] } unless doc

      text = doc[:text]
      return { "kind" => "full", "items" => [] } unless text

      diagnostics = analyze_document(text)
      { "kind" => "full", "items" => diagnostics }
    end

    def publish_diagnostics(uri, text)
      diagnostics = analyze_document(text)

      send_notification("textDocument/publishDiagnostics", {
                          "uri" => uri,
                          "diagnostics" => diagnostics,
                        })
    end

    def analyze_document(text, uri: nil)
      # Use unified Compiler.analyze for diagnostics
      # This ensures CLI and LSP show the same errors
      file_path = uri ? uri_to_path(uri) : "<source>"
      compiler_diagnostics = @compiler.analyze(text, file: file_path)

      # Convert TRuby::Diagnostic objects to LSP diagnostic format
      compiler_diagnostics.map { |d| diagnostic_to_lsp(d) }
    end

    # Convert TRuby::Diagnostic to LSP diagnostic format
    def diagnostic_to_lsp(diagnostic)
      # LSP uses 0-based line numbers
      line = (diagnostic.line || 1) - 1
      line = 0 if line.negative?

      col = (diagnostic.column || 1) - 1
      col = 0 if col.negative?

      end_col = diagnostic.end_column ? diagnostic.end_column - 1 : col + 1

      severity = case diagnostic.severity
                 when :error then DiagnosticSeverity::ERROR
                 when :warning then DiagnosticSeverity::WARNING
                 when :info then DiagnosticSeverity::INFORMATION
                 else DiagnosticSeverity::ERROR
                 end

      lsp_diag = {
        "range" => {
          "start" => { "line" => line, "character" => col },
          "end" => { "line" => line, "character" => end_col },
        },
        "severity" => severity,
        "source" => "t-ruby",
        "message" => diagnostic.message,
      }

      # Add error code if available
      lsp_diag["code"] = diagnostic.code if diagnostic.code

      lsp_diag
    end

    def uri_to_path(uri)
      # Convert file:// URI to filesystem path
      return uri unless uri.start_with?("file://")

      uri.sub(%r{^file://}, "")
    end

    def create_diagnostic(line, message, severity)
      {
        "range" => {
          "start" => { "line" => line, "character" => 0 },
          "end" => { "line" => line, "character" => 1000 },
        },
        "severity" => severity,
        "source" => "t-ruby",
        "message" => message,
      }
    end

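`diagnostic_to_lsp` bridges an off-by-one convention: compiler diagnostics are 1-based while LSP positions are 0-based, and missing or out-of-range values are clamped to 0. A standalone sketch of just that range conversion (the `to_lsp_range` helper is a hypothetical name; a plain hash stands in for a `TRuby::Diagnostic`):

```ruby
# Sketch of the 1-based → 0-based position shift, clamping negatives to 0
# and defaulting the end column to one character past the start.
def to_lsp_range(line:, column:, end_column: nil)
  l = [line - 1, 0].max
  c = [column - 1, 0].max
  ec = end_column ? end_column - 1 : c + 1
  {
    "start" => { "line" => l, "character" => c },
    "end" => { "line" => l, "character" => ec },
  }
end

r = to_lsp_range(line: 18, column: 1)
p r["start"]["line"]  # → 17
```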
    # === Completion ===

    def handle_completion(params)
      uri = params["textDocument"]["uri"]
      position = params["position"]

      document = @documents[uri]
      return { "items" => [] } unless document

      text = document[:text]
      lines = text.split("\n")
      line = lines[position["line"]] || ""
      char_pos = position["character"]

      # Get the text before cursor
      prefix = line[0...char_pos] || ""

      completions = []

      # Context-aware completion
      case prefix
      when /:\s*$/
        # After colon - suggest types
        completions.concat(type_completions)
      when /\|\s*$/
        # After pipe - suggest types for union
        completions.concat(type_completions)
      when /&\s*$/
        # After ampersand - suggest types for intersection
        completions.concat(type_completions)
      when /<\s*$/
        # Inside generic - suggest types
        completions.concat(type_completions)
      when /^\s*$/
        # Start of line - suggest keywords
        completions.concat(keyword_completions)
      when /^\s*def\s+\w*$/
        # Function definition - no completion needed
        completions = []
      when /^\s*type\s+\w*$/
        # Type alias definition - no completion needed
        completions = []
      when /^\s*interface\s+\w*$/
        # Interface definition - no completion needed
        completions = []
      else
        # Default - suggest all
        completions.concat(type_completions)
        completions.concat(keyword_completions)
      end

      # Add document-specific completions
      completions.concat(document_type_completions(text))

      { "items" => completions }
    end

    def type_completions
      BUILT_IN_TYPES.map do |type|
        {
          "label" => type,
          "kind" => CompletionItemKind::CLASS,
          "detail" => "Built-in type",
          "documentation" => "T-Ruby built-in type: #{type}",
        }
      end
    end

    def keyword_completions
      TYPE_KEYWORDS.map do |keyword|
        {
          "label" => keyword,
          "kind" => CompletionItemKind::KEYWORD,
          "detail" => "Keyword",
          "documentation" => keyword_documentation(keyword),
        }
      end
    end

    def keyword_documentation(keyword)
      case keyword
      when "type"
        "Define a type alias: type AliasName = TypeDefinition"
      when "interface"
        "Define an interface: interface Name ... end"
      when "def"
        "Define a function with type annotations: def name(param: Type): ReturnType"
      when "end"
        "End a block (interface, class, method, etc.)"
      else
        keyword
      end
    end

    def document_type_completions(text)
      parser = Parser.new(text)
      result = parser.parse

      # Add type aliases from the document
      completions = (result[:type_aliases] || []).map do |alias_info|
        {
          "label" => alias_info[:name],
          "kind" => CompletionItemKind::CLASS,
          "detail" => "Type alias",
          "documentation" => "type #{alias_info[:name]} = #{alias_info[:definition]}",
        }
      end

      # Add interfaces from the document
      (result[:interfaces] || []).each do |interface_info|
        completions << {
          "label" => interface_info[:name],
          "kind" => CompletionItemKind::INTERFACE,
          "detail" => "Interface",
          "documentation" => "interface #{interface_info[:name]}",
        }
      end

      completions
    end

    # === Hover ===

    def handle_hover(params)
      uri = params["textDocument"]["uri"]
      position = params["position"]

      document = @documents[uri]
      return nil unless document

      text = document[:text]
      lines = text.split("\n")
      line = lines[position["line"]] || ""
      char_pos = position["character"]

      # Find the word at cursor position
      word = extract_word_at_position(line, char_pos)
      return nil if word.nil? || word.empty?

      hover_info = get_hover_info(word, text)
      return nil unless hover_info

      {
        "contents" => {
          "kind" => "markdown",
          "value" => hover_info,
        },
        "range" => word_range(position["line"], line, char_pos, word),
      }
    end

    def extract_word_at_position(line, char_pos)
      return nil if char_pos > line.length

      # Find word boundaries
      start_pos = char_pos
      end_pos = char_pos

      # Move start back to word start
      start_pos -= 1 while start_pos.positive? && line[start_pos - 1] =~ /[\w<>]/

      # Move end forward to word end
      end_pos += 1 while end_pos < line.length && line[end_pos] =~ /[\w<>]/

      return nil if start_pos == end_pos

      line[start_pos...end_pos]
    end

    def word_range(line_num, line, char_pos, word)
      start_pos = line.index(word) || char_pos
      end_pos = start_pos + word.length

      {
        "start" => { "line" => line_num, "character" => start_pos },
        "end" => { "line" => line_num, "character" => end_pos },
      }
    end

    def get_hover_info(word, text)
      # Check if it's a built-in type
      if BUILT_IN_TYPES.include?(word)
        return "**#{word}** - Built-in T-Ruby type"
      end

      # Check if it's a type alias
      parser = Parser.new(text)
      result = parser.parse

      (result[:type_aliases] || []).each do |alias_info|
        if alias_info[:name] == word
          return "**Type Alias**\n\n```ruby\ntype #{alias_info[:name]} = #{alias_info[:definition]}\n```"
        end
      end

      # Check if it's an interface
      (result[:interfaces] || []).each do |interface_info|
        if interface_info[:name] == word
          members = interface_info[:members].map { |m| "  #{m[:name]}: #{m[:type]}" }.join("\n")
          return "**Interface**\n\n```ruby\ninterface #{interface_info[:name]}\n#{members}\nend\n```"
        end
      end

      # Check if it's a function
      (result[:functions] || []).each do |func|
        next unless func[:name] == word

        params = func[:params].map { |p| "#{p[:name]}: #{p[:type] || "untyped"}" }.join(", ")
        return_type = func[:return_type] || "void"
        return "**Function**\n\n```ruby\ndef #{func[:name]}(#{params}): #{return_type}\n```"
      end

      nil
    end

    # === Definition ===

    def handle_definition(params)
      uri = params["textDocument"]["uri"]
      position = params["position"]

      document = @documents[uri]
      return nil unless document

      text = document[:text]
      lines = text.split("\n")
      line = lines[position["line"]] || ""
      char_pos = position["character"]

      word = extract_word_at_position(line, char_pos)
      return nil if word.nil? || word.empty?

      # Find definition location
      location = find_definition(word, text, uri)
      return nil unless location

      location
    end

    def find_definition(word, text, uri)
      lines = text.split("\n")

      # Search for type alias definition
      lines.each_with_index do |line, idx|
        if line.match?(/^\s*type\s+#{Regexp.escape(word)}\s*=/)
          return {
            "uri" => uri,
            "range" => {
              "start" => { "line" => idx, "character" => 0 },
              "end" => { "line" => idx, "character" => line.length },
            },
          }
        end

        # Search for interface definition
        if line.match?(/^\s*interface\s+#{Regexp.escape(word)}\s*$/)
          return {
            "uri" => uri,
            "range" => {
              "start" => { "line" => idx, "character" => 0 },
              "end" => { "line" => idx, "character" => line.length },
            },
          }
        end

        # Search for function definition
        if line.match?(/^\s*def\s+#{Regexp.escape(word)}\s*\(/)
          return {
            "uri" => uri,
            "range" => {
              "start" => { "line" => idx, "character" => 0 },
              "end" => { "line" => idx, "character" => line.length },
            },
          }
        end
      end

      nil
    end

    # === Semantic Tokens ===

    def handle_semantic_tokens_full(params)
      uri = params["textDocument"]["uri"]
      document = @documents[uri]
      return { "data" => [] } unless document

      text = document[:text]
      tokens = generate_semantic_tokens(text)

      { "data" => tokens }
    end

    def generate_semantic_tokens(text)
      lines = text.split("\n")

      # Parse the document to get IR
      parser = Parser.new(text)
      parse_result = parser.parse
      parser.ir_program

      # Collect all tokens from parsing
      raw_tokens = []

      # Process type aliases
      (parse_result[:type_aliases] || []).each do |alias_info|
        lines.each_with_index do |line, line_idx|
          next unless (match = line.match(/^\s*type\s+(#{Regexp.escape(alias_info[:name])})\s*=/))

          # 'type' keyword
          type_pos = line.index("type")
          raw_tokens << [line_idx, type_pos, 4, SemanticTokenTypes::KEYWORD, SemanticTokenModifiers::DECLARATION]

          # Type name
          name_pos = match.begin(1)
          raw_tokens << [line_idx, name_pos, alias_info[:name].length, SemanticTokenTypes::TYPE, SemanticTokenModifiers::DEFINITION]

          # Type definition (after =)
          add_type_tokens(raw_tokens, line, line_idx, alias_info[:definition])
        end
      end

      # Process interfaces
      (parse_result[:interfaces] || []).each do |interface_info|
        lines.each_with_index do |line, line_idx|
          if (match = line.match(/^\s*interface\s+(#{Regexp.escape(interface_info[:name])})/))
            # 'interface' keyword
            interface_pos = line.index("interface")
            raw_tokens << [line_idx, interface_pos, 9, SemanticTokenTypes::KEYWORD, SemanticTokenModifiers::DECLARATION]

            # Interface name
            name_pos = match.begin(1)
            raw_tokens << [line_idx, name_pos, interface_info[:name].length, SemanticTokenTypes::INTERFACE, SemanticTokenModifiers::DEFINITION]
          end

          # Interface members
          interface_info[:members]&.each do |member|
            next unless (match = line.match(/^\s*(#{Regexp.escape(member[:name])})\s*:\s*/))

            prop_pos = match.begin(1)
            raw_tokens << [line_idx, prop_pos, member[:name].length, SemanticTokenTypes::PROPERTY, 0]

            # Member type
            add_type_tokens(raw_tokens, line, line_idx, member[:type])
          end
        end
      end

      # Process functions
      (parse_result[:functions] || []).each do |func|
        lines.each_with_index do |line, line_idx|
          next unless (match = line.match(/^\s*def\s+(#{Regexp.escape(func[:name])})\s*\(/))

          # 'def' keyword
          def_pos = line.index("def")
          raw_tokens << [line_idx, def_pos, 3, SemanticTokenTypes::KEYWORD, 0]

          # Function name
          name_pos = match.begin(1)
          raw_tokens << [line_idx, name_pos, func[:name].length, SemanticTokenTypes::FUNCTION, SemanticTokenModifiers::DEFINITION]

          # Parameters
          func[:params]&.each do |param|
            next unless (param_match = line.match(/\b(#{Regexp.escape(param[:name])})\s*(?::\s*)?/))

            param_pos = param_match.begin(1)
            raw_tokens << [line_idx, param_pos, param[:name].length, SemanticTokenTypes::PARAMETER, 0]

            # Parameter type if present
            if param[:type]
              add_type_tokens(raw_tokens, line, line_idx, param[:type])
            end
          end

          # Return type
          if func[:return_type]
            add_type_tokens(raw_tokens, line, line_idx, func[:return_type])
          end
        end
      end

      # Process 'end' keywords
      lines.each_with_index do |line, line_idx|
        if (match = line.match(/^\s*(end)\s*$/))
          end_pos = match.begin(1)
          raw_tokens << [line_idx, end_pos, 3, SemanticTokenTypes::KEYWORD, 0]
        end
      end

      # Sort tokens by line, then by character position
      raw_tokens.sort_by! { |t| [t[0], t[1]] }

      # Convert to delta encoding
      encode_tokens(raw_tokens)
    end

    def add_type_tokens(raw_tokens, line, line_idx, type_str)
      return unless type_str

      # Find position of the type in the line
      pos = line.index(type_str)
      return unless pos

      # Handle built-in types
      if BUILT_IN_TYPES.include?(type_str)
        raw_tokens << [line_idx, pos, type_str.length, SemanticTokenTypes::TYPE, SemanticTokenModifiers::DEFAULT_LIBRARY]
        return
      end

      # Handle generic types like Array<String>
      if type_str.include?("<")
        if (match = type_str.match(/^(\w+)<(.+)>$/))
          base = match[1]
          base_pos = line.index(base, pos)
          if base_pos
            modifier = BUILT_IN_TYPES.include?(base) ? SemanticTokenModifiers::DEFAULT_LIBRARY : 0
            raw_tokens << [line_idx, base_pos, base.length, SemanticTokenTypes::TYPE, modifier]
          end
          # Recursively process type arguments
          # (simplified - just mark them as types)
          args = match[2]
          args.split(/[,\s]+/).each do |arg|
            arg = arg.strip.gsub(/[<>]/, "")
            next if arg.empty?

            arg_pos = line.index(arg, pos)
            if arg_pos
              modifier = BUILT_IN_TYPES.include?(arg) ? SemanticTokenModifiers::DEFAULT_LIBRARY : 0
              raw_tokens << [line_idx, arg_pos, arg.length, SemanticTokenTypes::TYPE, modifier]
            end
          end
        end
        return
      end

      # Handle union types
      if type_str.include?("|")
        type_str.split("|").map(&:strip).each do |t|
          t_pos = line.index(t, pos)
          if t_pos
            modifier = BUILT_IN_TYPES.include?(t) ? SemanticTokenModifiers::DEFAULT_LIBRARY : 0
            raw_tokens << [line_idx, t_pos, t.length, SemanticTokenTypes::TYPE, modifier]
          end
        end
        return
      end

      # Handle intersection types
      if type_str.include?("&")
        type_str.split("&").map(&:strip).each do |t|
          t_pos = line.index(t, pos)
          if t_pos
            modifier = BUILT_IN_TYPES.include?(t) ? SemanticTokenModifiers::DEFAULT_LIBRARY : 0
            raw_tokens << [line_idx, t_pos, t.length, SemanticTokenTypes::TYPE, modifier]
          end
        end
        return
      end

      # Simple type
      raw_tokens << [line_idx, pos, type_str.length, SemanticTokenTypes::TYPE, 0]
    end

    def encode_tokens(raw_tokens)
      encoded = []
      prev_line = 0
      prev_char = 0

      raw_tokens.each do |token|
        line, char, length, token_type, modifiers = token

        delta_line = line - prev_line
        delta_char = delta_line.zero? ? char - prev_char : char

        encoded << delta_line
        encoded << delta_char
        encoded << length
        encoded << token_type
        encoded << modifiers

        prev_line = line
        prev_char = char
      end

      encoded
    end

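`encode_tokens` produces the flat integer array the LSP `semanticTokens` response requires: five integers per token, with the line and start character expressed as deltas from the previous token (the character delta resets to an absolute column whenever the line changes). A standalone worked example of the same encoding:

```ruby
# Sketch mirroring encode_tokens: [line, char, length, type, modifiers]
# tuples become five integers each, line/char relative to the prior token.
raw = [
  [0, 0, 4, 15, 1],  # "type" keyword at line 0, col 0
  [0, 5, 2, 1, 2],   # type name on the same line, col 5
  [2, 0, 3, 15, 0],  # "end" keyword two lines down
]
encoded = []
prev_line = prev_char = 0
raw.each do |line, char, length, type, mods|
  delta_line = line - prev_line
  # Same line: char is relative; new line: char is absolute.
  delta_char = delta_line.zero? ? char - prev_char : char
  encoded.concat([delta_line, delta_char, length, type, mods])
  prev_line, prev_char = line, char
end
p encoded  # => [0, 0, 4, 15, 1, 0, 5, 2, 1, 2, 2, 0, 3, 15, 0]
```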
    # === Response Helpers ===

    def success_response(id, result)
      {
        "jsonrpc" => "2.0",
        "id" => id,
        "result" => result,
      }
    end

    def error_response(id, code, message)
      {
        "jsonrpc" => "2.0",
        "id" => id,
        "error" => {
          "code" => code,
          "message" => message,
        },
      }
    end
  end
end