==============================================================
TAC KBP 2017 KB CONSTRUCTION: COMPONENT EAL EVALUATION RESULTS
==============================================================
Team ID:      SAFT_ISI
Organization: USC Information Sciences Institute

*************************************************************
Run ID: SAFT_ISI_KB*
See full details of run under Cold Start KB: Composite KB evaluation results
*************************************************************

#### The summary includes system performance in two conditions:
####   * WithRealis: argument extraction performance where a true positive required matching the tuple (EventType, Role, ArgumentEntity, Realis) (official)
####   * NoRealis: argument extraction performance where a true positive required matching the tuple (EventType, Role, ArgumentEntity) (diagnostic)
####
#### Performance is measured by comparison to the Rich ERE gold standard over the LDC core documents.
#### The core documents are a mix of Chinese, English, and Spanish. We report results for each language separately.
#### Systems were submitted under two conditions:
####   * Sept: process the 500-document core set and submit using the EAL-specific format
####   * CS: process the full ColdStart TAC corpus and submit using the ColdStart++ format; NIST converts the KB to the EAL format
#### We specify the type of submission in each row.
####
#### For each condition, we report scores on argument extraction and on the linking of arguments into event frames.
####
#### For the official error-based argument score, we report bootstrapped results at the median and at the 5% and 95% confidence bounds:
####   * ArgScore_5:   5% confidence bound -- from argScores/ArgScore.bootstrapped.csv (column header = 0.05)
####   * ArgScore_Med: median -- from argScores/ArgScore.bootstrapped.medians.csv (column 'LinearScore')
####   * ArgScore_95:  95% confidence bound -- from argScores/ArgScore.bootstrapped.medians.csv (column header = 0.95)
####
#### For the official linking metric (B^3), we report the median and the 5% and 95% confidence bounds:
####   * Link_5:   5% confidence bound -- from linkScores/linkScores.bootstrapped.medians.csv (column header = 0.05, row = F1)
####   * Link_Med: median -- from linkScores/linkScores.bootstrapped.medians.csv (column 'F1', row = F1)
####   * Link_95:  95% confidence bound -- from linkScores/linkScores.bootstrapped.medians.csv (column header = 0.95, row = F1)
####
#### We also report precision, recall, and F1 for argument extraction:
####   * Arg_P:  precision of argument tuples -- from nonBootstrapped/aggregateF.txt
####   * Arg_R:  recall of argument tuples -- from nonBootstrapped/aggregateF.txt
####   * Arg_F1: F1 of argument tuples -- from nonBootstrapped/aggregateF.txt

### cmn Results

### WithRealis Score (TP Matches: EventType, Role, ArgEntity, Realis)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 cmn sept 10.36 11.81 13.00 4.98 5.78 6.60 39.48 17.74 24.48
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_2 cmn cs 9.04 10.27 11.32 5.93 6.66 7.45 36.17 16.65 22.80
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_1 cmn cs 8.98 10.23 11.26 5.92 6.66 7.46 36.16 16.61 22.77
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_3 cmn cs 8.98 10.23 11.26 5.92 6.66 7.46 36.16 16.61 22.77
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_4 cmn cs 8.98 10.23 11.26 5.92 6.66 7.46 36.16 16.61 22.77
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_5 cmn cs 8.98 10.23 11.26 5.92 6.66 7.46 36.16 16.61 22.77
A2KD_Adept_KB A2KD_Adept_KB_CMN_2 cmn cs 5.73 6.46 7.20 1.43 1.85 2.31 46.59 8.54 14.43
A2KD_Adept_KB A2KD_Adept_KB_CMN_1 cmn cs 5.73 6.46 7.20 1.43 1.85 2.31 46.59 8.54 14.43
A2KD_Adept_KB A2KD_Adept_KB_XLING_1 cmn cs 5.70 6.42 7.13 1.43 1.84 2.32 46.30 8.50 14.36
A2KD_Adept_KB A2KD_Adept_KB_XLING_2 cmn cs 5.70 6.42 7.13 1.43 1.84 2.32 46.30 8.50 14.36
A2KD_Adept_KB A2KD_Adept_KB_CMN_3 cmn cs 5.59 6.28 7.01 1.41 1.84 2.32 45.72 8.43 14.23
A2KD_Adept_KB A2KD_Adept_KB_CMN_4 cmn cs 5.59 6.28 7.01 1.41 1.84 2.32 45.72 8.43 14.23
CMU_CS_Event CMU_CS_Event2 cmn sept 3.49 4.00 4.51 1.29 1.57 1.90 28.84 7.82 12.30
CMU_CS_Event CMU_CS_Event1 cmn sept 3.49 4.00 4.51 1.37 1.71 2.14 28.84 7.82 12.30
CMU_CS_Event CMU_CS_Event3 cmn sept 3.49 4.00 4.51 1.13 1.36 1.65 28.84 7.82 12.30
TinkerBell_KB TinkerBell_KB_XLING_3 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_XLING_2 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_XLING_4 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_CMN_1 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_XLING_5 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_CMN_3 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_CMN_2 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_CMN_4 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_XLING_1 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
TinkerBell_KB TinkerBell_KB_CMN_5 cmn cs 3.25 3.74 4.22 0.89 1.13 1.41 27.71 7.67 12.01
SAFT_ISI_KB SAFT_ISI_KB_CMN_1 cmn cs 3.20 3.71 4.19 1.15 1.40 1.73 28.95 6.89 11.13
CMU_CS_Event CMU_CS_Event4 cmn sept 2.22 2.73 3.13 0.42 0.62 0.84 43.39 3.39 6.29
CMU_CS_Event CMU_CS_Event5 cmn sept 2.01 2.50 2.88 0.22 0.33 0.47 43.23 3.01 5.63

### NoRealis Score (TP Matches: EventType, Role, ArgEntity)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 cmn sept 14.37 15.75 16.93 6.06 6.92 7.78 46.27 21.70 29.54
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_1 cmn cs 12.31 13.56 14.66 7.06 7.89 8.80 41.79 20.16 27.20
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_3 cmn cs 12.31 13.56 14.66 7.06 7.89 8.80 41.79 20.16 27.20
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_4 cmn cs 12.31 13.56 14.66 7.06 7.89 8.80 41.79 20.16 27.20
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_5 cmn cs 12.31 13.56 14.66 7.06 7.89 8.80 41.79 20.16 27.20
ISCAS_Sogou_KB ISCAS_Sogou_KB_CMN_2 cmn cs 12.30 13.56 14.67 7.07 7.89 8.79 41.83 20.23 27.27
A2KD_Adept_KB A2KD_Adept_KB_XLING_1 cmn cs 7.25 8.06 8.93 1.84 2.27 2.79 52.71 10.17 17.04
A2KD_Adept_KB A2KD_Adept_KB_XLING_2 cmn cs 7.25 8.06 8.93 1.84 2.27 2.79 52.71 10.17 17.04
A2KD_Adept_KB A2KD_Adept_KB_CMN_2 cmn cs 7.21 8.03 8.86 1.82 2.28 2.78 52.72 10.15 17.02
A2KD_Adept_KB A2KD_Adept_KB_CMN_1 cmn cs 7.21 8.03 8.86 1.82 2.28 2.78 52.72 10.15 17.02
A2KD_Adept_KB A2KD_Adept_KB_CMN_3 cmn cs 7.05 7.85 8.70 1.81 2.26 2.78 51.72 10.01 16.78
A2KD_Adept_KB A2KD_Adept_KB_CMN_4 cmn cs 7.05 7.85 8.70 1.81 2.26 2.78 51.72 10.01 16.78
TinkerBell_KB TinkerBell_KB_XLING_3 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_XLING_2 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_XLING_4 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_CMN_1 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_XLING_5 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_CMN_3 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_CMN_2 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_CMN_4 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_XLING_1 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
TinkerBell_KB TinkerBell_KB_CMN_5 cmn cs 5.80 6.46 7.17 1.02 1.28 1.57 37.78 10.28 16.16
CMU_CS_Event CMU_CS_Event2 cmn sept 5.22 5.88 6.43 1.72 2.06 2.49 34.83 9.91 15.43
CMU_CS_Event CMU_CS_Event1 cmn sept 5.22 5.88 6.43 1.78 2.22 2.69 34.83 9.91 15.43
CMU_CS_Event CMU_CS_Event3 cmn sept 5.22 5.88 6.43 1.61 1.95 2.31 34.83 9.91 15.43
SAFT_ISI_KB SAFT_ISI_KB_CMN_1 cmn cs 4.61 5.21 5.76 1.47 1.79 2.17 34.48 8.61 13.78
CMU_CS_Event CMU_CS_Event4 cmn sept 3.08 3.63 4.08 0.58 0.82 1.09 50.35 4.13 7.64
CMU_CS_Event CMU_CS_Event5 cmn sept 2.70 3.23 3.68 0.33 0.49 0.66 49.74 3.64 6.78

### eng Results

### WithRealis Score (TP Matches: EventType, Role, ArgEntity, Realis)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 eng sept 8.72 9.63 10.62 6.96 7.81 8.78 33.36 17.30 22.79
A2KD_Adept_KB A2KD_Adept_KB_ENG_3 eng cs 4.48 5.05 5.60 2.35 2.82 3.34 33.79 8.41 13.47
A2KD_Adept_KB A2KD_Adept_KB_ENG_4 eng cs 4.48 5.05 5.60 2.35 2.82 3.34 33.79 8.41 13.47
A2KD_Adept_KB A2KD_Adept_KB_ENG_2 eng cs 4.31 4.88 5.42 2.20 2.65 3.18 33.32 8.37 13.38
A2KD_Adept_KB A2KD_Adept_KB_ENG_1 eng cs 4.31 4.88 5.42 2.20 2.65 3.18 33.32 8.37 13.38
A2KD_Adept_KB A2KD_Adept_KB_XLING_2 eng cs 4.17 4.75 5.30 2.32 2.78 3.29 32.94 8.36 13.34
A2KD_Adept_KB A2KD_Adept_KB_XLING_1 eng cs 4.17 4.75 5.30 2.32 2.78 3.29 32.94 8.36 13.34
CMU_CS_Event CMU_CS_Event3 eng sept 2.13 2.53 2.94 1.41 1.69 2.02 21.99 6.84 10.44
CMU_CS_Event CMU_CS_Event1 eng sept 2.13 2.53 2.94 1.48 1.76 2.08 21.99 6.84 10.44
CMU_CS_Event CMU_CS_Event2 eng sept 2.13 2.53 2.94 1.30 1.56 1.87 21.99 6.84 10.44
CMU_CS_Event CMU_CS_Event4 eng sept 1.81 2.09 2.39 0.57 0.74 0.97 32.46 3.43 6.20
CMU_CS_Event CMU_CS_Event5 eng sept 1.53 1.80 2.07 0.36 0.50 0.67 30.78 2.89 5.28
BUPT_PRIS BUPT_PRIS2 eng sept 0.98 1.18 1.37 0.32 0.45 0.61 23.73 2.59 4.66
BUPT_PRIS BUPT_PRIS1 eng sept 0.59 0.75 0.90 0.22 0.32 0.43 17.50 2.55 4.45
SAFT_ISI_KB SAFT_ISI_KB_ENG_1 eng cs 0.45 0.65 0.83 0.85 1.07 1.34 11.22 5.40 7.30
BUPT_PRIS BUPT_PRIS3 eng sept 0.45 0.58 0.71 0.24 0.35 0.48 15.14 2.27 3.95
TinkerBell_KB TinkerBell_KB_XLING_5 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_XLING_1 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_ENG_3 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_XLING_3 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_ENG_1 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_ENG_5 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_XLING_4 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_XLING_2 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_ENG_2 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56
TinkerBell_KB TinkerBell_KB_ENG_4 eng cs 0.40 0.53 0.68 0.15 0.24 0.34 17.43 0.82 1.56

### NoRealis Score (TP Matches: EventType, Role, ArgEntity)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 eng sept 16.51 17.49 18.49 8.58 9.50 10.62 44.51 25.01 32.03
A2KD_Adept_KB A2KD_Adept_KB_ENG_3 eng cs 6.78 7.41 8.02 2.77 3.28 3.88 40.18 11.06 17.34
A2KD_Adept_KB A2KD_Adept_KB_ENG_4 eng cs 6.78 7.41 8.02 2.77 3.28 3.88 40.18 11.06 17.34
A2KD_Adept_KB A2KD_Adept_KB_ENG_2 eng cs 6.51 7.12 7.76 2.62 3.12 3.68 39.35 10.93 17.11
A2KD_Adept_KB A2KD_Adept_KB_ENG_1 eng cs 6.51 7.12 7.76 2.62 3.12 3.68 39.35 10.93 17.11
A2KD_Adept_KB A2KD_Adept_KB_XLING_2 eng cs 6.36 7.01 7.65 2.74 3.27 3.87 38.96 10.93 17.07
A2KD_Adept_KB A2KD_Adept_KB_XLING_1 eng cs 6.36 7.01 7.65 2.74 3.27 3.87 38.96 10.93 17.07
CMU_CS_Event CMU_CS_Event3 eng sept 4.18 4.65 5.15 1.71 2.03 2.40 28.47 9.77 14.54
CMU_CS_Event CMU_CS_Event1 eng sept 4.18 4.65 5.15 1.82 2.16 2.55 28.47 9.77 14.54
CMU_CS_Event CMU_CS_Event2 eng sept 4.18 4.65 5.15 1.61 1.92 2.27 28.47 9.77 14.54
CMU_CS_Event CMU_CS_Event4 eng sept 2.84 3.20 3.57 0.74 0.96 1.24 39.59 4.61 8.26
CMU_CS_Event CMU_CS_Event5 eng sept 2.33 2.67 3.02 0.47 0.65 0.85 37.33 3.86 7.00
BUPT_PRIS BUPT_PRIS2 eng sept 1.62 1.87 2.11 0.34 0.47 0.63 29.43 3.40 6.10
BUPT_PRIS BUPT_PRIS1 eng sept 1.29 1.52 1.77 0.29 0.41 0.55 22.81 3.53 6.11
SAFT_ISI_KB SAFT_ISI_KB_ENG_1 eng cs 1.02 1.29 1.57 1.08 1.34 1.65 15.26 8.11 10.59
BUPT_PRIS BUPT_PRIS3 eng sept 0.97 1.17 1.35 0.32 0.45 0.61 20.23 3.14 5.43
TinkerBell_KB TinkerBell_KB_XLING_5 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_XLING_1 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_ENG_3 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_XLING_3 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_ENG_1 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_ENG_5 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_XLING_4 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_XLING_2 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_ENG_2 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93
TinkerBell_KB TinkerBell_KB_ENG_4 eng cs 0.50 0.65 0.81 0.17 0.27 0.38 19.57 1.02 1.93

### spa Results

### WithRealis Score (TP Matches: EventType, Role, ArgEntity, Realis)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 spa sept 2.01 2.47 2.92 1.29 1.70 2.19 22.53 5.22 8.47
CMU_CS_Event CMU_CS_Event3 spa sept 1.28 1.56 1.85 0.24 0.38 0.55 31.45 1.95 3.67
CMU_CS_Event CMU_CS_Event2 spa sept 1.28 1.56 1.85 0.24 0.38 0.55 31.45 1.95 3.67
CMU_CS_Event CMU_CS_Event1 spa sept 1.28 1.56 1.85 0.20 0.31 0.43 31.45 1.95 3.67
SAFT_ISI_KB SAFT_ISI_KB_SPA_1 spa cs 1.08 1.35 1.61 0.19 0.32 0.46 31.31 1.63 3.10
CMU_CS_Event CMU_CS_Event4 spa sept 0.66 0.86 1.04 0.09 0.18 0.28 46.60 0.80 1.57
CMU_CS_Event CMU_CS_Event5 spa sept 0.42 0.58 0.72 0.02 0.08 0.15 40.00 0.50 0.99
TinkerBell_KB TinkerBell_KB_XLING_1 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_SPA_4 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_SPA_3 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_XLING_5 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_XLING_4 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_XLING_2 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_SPA_5 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_SPA_1 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_SPA_2 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33
TinkerBell_KB TinkerBell_KB_XLING_3 spa cs 0.09 0.17 0.25 0.00 0.03 0.07 9.71 0.17 0.33

### NoRealis Score (TP Matches: EventType, Role, ArgEntity)
### Site Sub. Lang. Cond. ArgScore_5 ArgScore_Med ArgScore_95 Link_5 Link_Med Link_95 Arg_P Arg_R Arg_F1
BBN BBN1 spa sept 3.22 3.82 4.37 1.55 2.05 2.60 28.38 7.12 11.39
CMU_CS_Event CMU_CS_Event3 spa sept 1.49 1.80 2.11 0.26 0.41 0.58 33.87 2.28 4.27
CMU_CS_Event CMU_CS_Event2 spa sept 1.49 1.80 2.11 0.26 0.41 0.58 33.87 2.28 4.27
CMU_CS_Event CMU_CS_Event1 spa sept 1.49 1.80 2.11 0.21 0.33 0.46 33.87 2.28 4.27
SAFT_ISI_KB SAFT_ISI_KB_SPA_1 spa cs 1.24 1.55 1.84 0.20 0.34 0.49 33.55 1.90 3.59
CMU_CS_Event CMU_CS_Event4 spa sept 0.78 1.00 1.20 0.10 0.19 0.30 49.51 0.92 1.81
CMU_CS_Event CMU_CS_Event5 spa sept 0.51 0.69 0.85 0.02 0.09 0.16 42.67 0.58 1.14
TinkerBell_KB TinkerBell_KB_XLING_1 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_SPA_4 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_SPA_3 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_XLING_5 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_XLING_4 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_XLING_2 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_SPA_5 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_SPA_1 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_SPA_2 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
TinkerBell_KB TinkerBell_KB_XLING_3 spa cs 0.11 0.20 0.28 0.00 0.03 0.08 11.65 0.22 0.43
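
#### Appendix: how the Arg_P/Arg_R/Arg_F1 columns and the bootstrapped 5%/median/95% values
#### above can be computed is sketched below. This is NOT the official NIST scorer (whose
#### ArgScore is an error-based LinearScore, not plain F1); it only illustrates exact-tuple
#### true-positive matching and a document-level bootstrap. All names are illustrative.

```python
import random
from typing import Dict, Set, Tuple

# An argument tuple under the WithRealis condition; drop the final
# Realis element for the NoRealis (diagnostic) condition.
ArgTuple = Tuple[str, str, str, str]  # (EventType, Role, ArgumentEntity, Realis)

def prf1(gold: Set[ArgTuple], pred: Set[ArgTuple]) -> Tuple[float, float, float]:
    """Precision/recall/F1 where a true positive is an exact tuple match."""
    tp = len(gold & pred)
    p = tp / len(pred) if pred else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f

def bootstrap_f1(gold_by_doc: Dict[str, Set[ArgTuple]],
                 pred_by_doc: Dict[str, Set[ArgTuple]],
                 n_samples: int = 1000, seed: int = 0) -> Tuple[float, float, float]:
    """Resample documents with replacement; return (5%, median, 95%) of corpus F1."""
    rng = random.Random(seed)
    docs = sorted(gold_by_doc)
    scores = []
    for _ in range(n_samples):
        sample = [rng.choice(docs) for _ in docs]
        tp = sum(len(gold_by_doc[d] & pred_by_doc.get(d, set())) for d in sample)
        n_pred = sum(len(pred_by_doc.get(d, set())) for d in sample)
        n_gold = sum(len(gold_by_doc[d]) for d in sample)
        p = tp / n_pred if n_pred else 0.0
        r = tp / n_gold if n_gold else 0.0
        scores.append(2 * p * r / (p + r) if p + r else 0.0)
    scores.sort()
    pick = lambda q: scores[min(int(q * n_samples), n_samples - 1)]
    return pick(0.05), pick(0.50), pick(0.95)
```

#### Resampling whole documents (rather than individual tuples) preserves within-document
#### correlations, which is why the reported 5%/95% bounds are wider than per-tuple
#### binomial intervals would suggest.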